[KYUUBI #410] Add Spark-3.1 trouble shooting
### _Why are the changes needed?_

For more details, see the config `spark.sql.legacy.setCommandRejectsSparkCoreConfs`.

### _How was this patch tested?_

Closes #409 from ulysses-you/spark-3-1-trouble-shooting.

e4bc899 [ulysses-you] fix
d6e1903 [ulysses-you] init

Authored-by: ulysses-you <ulyssesyou18@gmail.com>
Signed-off-by: Kent Yao <yao@apache.org>
parent 57ed76f48d · commit 3eefea6d76
@@ -234,3 +234,20 @@ Error operating EXECUTE_STATEMENT: org.apache.spark.sql.AnalysisException: Can n
If you get this exception when creating a function, check your JDK version.

You should upgrade to JDK 1.8.0_121 or later, since JDK 1.8.0_121 fixed a security issue: [Additional access restrictions for URLClassLoader.newInstance](https://www.oracle.com/java/technologies/javase/8u121-relnotes.html).
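To verify that your installed JDK meets this bar, a quick shell check like the following may help. It is a sketch, not part of the Kyuubi docs: it assumes `java` is on your `PATH` and that your `sort` supports `-V` (GNU version sort), and it exits quietly if `java` is not found.

```shell
#!/bin/sh
# Check whether the installed JDK is at least 1.8.0_121.
required="1.8.0_121"

# Bail out gracefully if java is not installed.
if ! command -v java >/dev/null 2>&1; then
  echo "java not found on PATH"
  exit 0
fi

# `java -version` prints to stderr; extract the quoted version string.
current=$(java -version 2>&1 | awk -F '"' '/version/ {print $2; exit}')

# Version-sort the two strings; if the required version sorts first (or
# equal), the current JDK is new enough.
lowest=$(printf '%s\n%s\n' "$required" "$current" | sort -V | head -n1)
if [ "$lowest" = "$required" ]; then
  echo "JDK $current is new enough"
else
  echo "JDK $current is older than $required; please upgrade"
fi
```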
### Failed to start Spark 3.1 with error message 'Cannot modify the value of a Spark config'
Here is the error message:
```java
Caused by: org.apache.spark.sql.AnalysisException: Cannot modify the value of a Spark config: spark.yarn.queue
	at org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:156)
	at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:40)
	at org.apache.kyuubi.engine.spark.session.SparkSQLSessionManager.$anonfun$openSession$2(SparkSQLSessionManager.scala:68)
	at org.apache.kyuubi.engine.spark.session.SparkSQLSessionManager.$anonfun$openSession$2$adapted(SparkSQLSessionManager.scala:56)
	at scala.collection.immutable.Map$Map4.foreach(Map.scala:236)
	at org.apache.kyuubi.engine.spark.session.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:56)
	... 12 more
```
This is because Spark 3.1 validates the configs you set and throws an exception if a config is static or belongs to another module (e.g. yarn or core).
You can set `spark.sql.legacy.setCommandRejectsSparkCoreConfs=false` in `spark-defaults.conf` to disable this check.
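For reference, this is what the entry looks like in `spark-defaults.conf` (the comment is illustrative; `spark-defaults.conf` accepts whitespace or `=` between key and value):

```properties
# Restore the pre-3.1 behavior: allow SET commands to modify
# SparkConf entries instead of rejecting them.
spark.sql.legacy.setCommandRejectsSparkCoreConfs  false
```

Note that this relaxes the check for every application using that configuration file; to scope it to a single application, it can also be passed with `--conf` on `spark-submit`.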