From 3eefea6d7673298d06e790979446cd256230dcec Mon Sep 17 00:00:00 2001
From: ulysses-you
Date: Tue, 9 Mar 2021 17:54:15 +0800
Subject: [PATCH] [KYUUBI #410] Add Spark-3.1 trouble shooting

### _Why are the changes needed?_

For more details, see the config `spark.sql.legacy.setCommandRejectsSparkCoreConfs`.

### _How was this patch tested?_

Closes #409 from ulysses-you/spark-3-1-trouble-shooting.

e4bc899 [ulysses-you] fix
d6e1903 [ulysses-you] init

Authored-by: ulysses-you
Signed-off-by: Kent Yao
---
 docs/deployment/trouble_shooting.md | 17 +++++++++++++++++
 1 file changed, 17 insertions(+)

diff --git a/docs/deployment/trouble_shooting.md b/docs/deployment/trouble_shooting.md
index 6916e8ea0..16892016b 100644
--- a/docs/deployment/trouble_shooting.md
+++ b/docs/deployment/trouble_shooting.md
@@ -234,3 +234,20 @@ Error operating EXECUTE_STATEMENT: org.apache.spark.sql.AnalysisException: Can n
 If you get this exception when creating a function, you can check your JDK version. You should update JDK to JDK1.8.0_121 or later, since JDK1.8.0_121 fixes a security issue: [Additional access restrictions for URLClassLoader.newInstance](https://www.oracle.com/java/technologies/javase/8u121-relnotes.html).
+
+### Failed to start Spark 3.1 with error msg 'Cannot modify the value of a Spark config'
+Here is the error message:
+```java
+Caused by: org.apache.spark.sql.AnalysisException: Cannot modify the value of a Spark config: spark.yarn.queue
+	at org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:156)
+	at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:40)
+	at org.apache.kyuubi.engine.spark.session.SparkSQLSessionManager.$anonfun$openSession$2(SparkSQLSessionManager.scala:68)
+	at org.apache.kyuubi.engine.spark.session.SparkSQLSessionManager.$anonfun$openSession$2$adapted(SparkSQLSessionManager.scala:56)
+	at scala.collection.immutable.Map$Map4.foreach(Map.scala:236)
+	at org.apache.kyuubi.engine.spark.session.SparkSQLSessionManager.openSession(SparkSQLSessionManager.scala:56)
+	... 12 more
+```
+
+This is because Spark 3.1 validates the configs you set and throws an exception if a config is static or is used by another module (e.g. yarn, core).
+
+You can add the config `spark.sql.legacy.setCommandRejectsSparkCoreConfs=false` to `spark-defaults.conf` to disable this behavior.
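For reference, the workaround described in the patch amounts to a single line in `spark-defaults.conf`; the sketch below assumes a standard Spark layout where that file lives under `$SPARK_HOME/conf`:

```properties
# $SPARK_HOME/conf/spark-defaults.conf
# Restore the pre-3.1 behavior: allow SET on Spark core configs instead of
# rejecting them with "Cannot modify the value of a Spark config"
spark.sql.legacy.setCommandRejectsSparkCoreConfs  false
```

Alternatively, the same setting can be passed per application, e.g. `--conf spark.sql.legacy.setCommandRejectsSparkCoreConfs=false` on the `spark-submit` command line.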