[KYUUBI #7112] Enhance test 'capture error from spark process builder' for Spark 4.0

### Why are the changes needed?

The test asserts that the error captured from the Spark process contains `org.apache.hadoop.hive.ql.metadata.HiveException:`, but with Spark 4.0 the surfaced root cause is `Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient`, so the assertion times out and the test fails:
```
- capture error from spark process builder *** FAILED ***
  The code passed to eventually never returned normally. Attempted 167 times over 1.50233072485 minutes. Last failure message: "org.apache.kyuubi.KyuubiSQLException: 	Suppressed: org.apache.spark.util.Utils$OriginalTryStackTraceException: Full stacktrace of original doTryWithCallerStacktrace caller
   See more: /builds/lakehouse/kyuubi/kyuubi-server/target/work/kentyao/kyuubi-spark-sql-engine.log.2
  	at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:69)
  	at org.apache.kyuubi.engine.ProcBuilder.$anonfun$start$1(ProcBuilder.scala:234)
  	at java.base/java.lang.Thread.run(Thread.java:840)
  .
  FYI: The last 4096 line(s) of log are:
...
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
  	at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1742)
  	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:83)
  	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:133)
  	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
  	at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3607)
  	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3659)
  	at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3639)
  	at org.apache.spark.sql.hive.client.HiveClientImpl$.$anonfun$getHive$5(HiveClientImpl.scala:1458)
...
  25/06/19 18:20:08 INFO SparkContext: Successfully stopped SparkContext
  25/06/19 18:20:08 INFO ShutdownHookManager: Shutdown hook called
  25/06/19 18:20:08 INFO ShutdownHookManager: Deleting directory /tmp/spark-791ea5a0-44d2-4750-a549-a3ea232546b2
  25/06/19 18:20:08 INFO ShutdownHookManager: Deleting directory /tmp/spark-1ab9d4a0-707d-4619-bc83-232c29c891f9
  25/06/19 18:20:08 INFO ShutdownHookManager: Deleting directory /builds/lakehouse/kyuubi/kyuubi-server/target/work/kentyao/artifacts/spark-9ee628b1-0c29-4d32-8078-c023d1f812d7" did not contain "org.apache.hadoop.hive.ql.metadata.HiveException:". (SparkProcessBuilderSuite.scala:79)
```

### How was this patch tested?

Pass GHA.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #7112 from pan3793/ut-spark-4.0.

bd4a24bea [Cheng Pan] Enhance test 'capture error from spark process builder' for Spark 4.0

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
Date: 2025-07-02 18:19:28 +08:00
Parent: 4717987e37
Commit: e98ad7bf32
GPG Key ID: 8001952629BCC75D


```diff
@@ -79,7 +79,9 @@ class SparkProcessBuilderSuite extends KerberizedTestHelper with MockitoSugar {
     eventually(timeout(90.seconds), interval(500.milliseconds)) {
       val error1 = processBuilder1.getError
       assert(
-        error1.getMessage.contains("org.apache.hadoop.hive.ql.metadata.HiveException:"))
+        error1.getMessage.contains("org.apache.hadoop.hive.ql.metadata.HiveException:") ||
+          error1.getMessage.contains("Unable to instantiate " +
+            "org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient"))
     }
   }
```
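The relaxed check boils down to accepting either error signature, since Spark 3.x reports a wrapping `HiveException` while Spark 4.0 surfaces the underlying metastore instantiation failure directly. A minimal standalone sketch (`containsHiveFailure` is a hypothetical helper introduced here for illustration, not part of the Kyuubi test suite):

```scala
// Sketch of the relaxed assertion: a message matches if it carries
// either the Spark 3.x-style or the Spark 4.0-style failure text.
object ErrorCheck {
  def containsHiveFailure(msg: String): Boolean =
    msg.contains("org.apache.hadoop.hive.ql.metadata.HiveException:") ||
      msg.contains("Unable to instantiate " +
        "org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient")

  def main(args: Array[String]): Unit = {
    // Spark 3.x-style message: wrapped in a HiveException.
    val spark3Msg = "org.apache.hadoop.hive.ql.metadata.HiveException: boom"
    // Spark 4.0-style message: the raw metastore instantiation failure.
    val spark4Msg = "java.lang.RuntimeException: Unable to instantiate " +
      "org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient"
    assert(containsHiveFailure(spark3Msg))
    assert(containsHiveFailure(spark4Msg))
    assert(!containsHiveFailure("some unrelated error"))
  }
}
```

Checking both variants keeps the test green on Spark 3.x while also passing on Spark 4.0.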