[KYUUBI #6405] Spark engine supports both javax and jakarta ws.rs namespaces

# 🔍 Description

Spark 4.0 upgraded Jersey from 2 to 3 and migrated from `javax.ws.rs` to `jakarta.ws.rs` in SPARK-47118, which breaks compilation of the Spark SQL engine against Spark 4.0.
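
The namespace split can be bridged without a compile-time dependency on either Jersey generation. Below is a minimal, illustrative sketch using plain JDK reflection rather than Kyuubi's `DynMethods` helper; the try-both-namespaces loop and the plain-`URI` fallback are this sketch's assumptions, while the actual fix selects the namespace by Spark version:

```java
import java.net.URI;

// Illustrative sketch only: try the Jersey 3 (jakarta) namespace first, then
// Jersey 2 (javax), and rebuild the URI by hand when neither is present.
public class UriFragmentDemo {
    static URI buildURI(URI uri, String fragment) throws Exception {
        for (String cls : new String[] {
                "jakarta.ws.rs.core.UriBuilder",   // Jersey 3 (Spark 4.0+)
                "javax.ws.rs.core.UriBuilder"}) {  // Jersey 2 (Spark 3.x)
            try {
                Class<?> c = Class.forName(cls);
                // UriBuilder.fromUri(uri).fragment(fragment).build(), via reflection
                Object builder = c.getMethod("fromUri", URI.class).invoke(null, uri);
                builder = c.getMethod("fragment", String.class).invoke(builder, fragment);
                return (URI) c.getMethod("build", Object[].class)
                    .invoke(builder, (Object) new Object[0]);
            } catch (ClassNotFoundException notOnClasspath) {
                // this namespace is absent; fall through to the next one
            }
        }
        // Neither Jersey generation on the classpath: plain JDK fallback.
        return new URI(uri.getScheme(), uri.getSchemeSpecificPart(), fragment);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(buildURI(new URI("hdfs:///tmp/env.tar.gz"), "environment"));
    }
}
```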

## Types of changes 🔖

- [x] Bugfix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)

## Test Plan 🧪

```
build/mvn clean install -Pscala-2.13 -Pspark-master \
  -pl externals/kyuubi-spark-sql-engine -am -DskipTests
```

Before:
```
[INFO] --- scala-maven-plugin:4.8.0:compile (scala-compile-first) @ kyuubi-spark-sql-engine_2.13 ---
[INFO] Compiler bridge file: /home/kyuubi/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.13-1.8.0-bin_2.13.8__61.0-1.8.0_20221110T195421.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.13.8,1.7.13,null)
[INFO] compiling 61 Scala sources to /home/kyuubi/apache-kyuubi/externals/kyuubi-spark-sql-engine/target/scala-2.13/classes ...
[ERROR] [Error] /home/kyuubi/apache-kyuubi/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/operation/ExecutePython.scala:27: object ws is not a member of package javax
[ERROR] [Error] /home/kyuubi/apache-kyuubi/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/operation/ExecutePython.scala:307: not found: value UriBuilder
[ERROR] [Error] /home/kyuubi/apache-kyuubi/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/operation/ExecutePython.scala:320: not found: value UriBuilder
```

After:
```
[INFO] --- scala-maven-plugin:4.8.0:compile (scala-compile-first) @ kyuubi-spark-sql-engine_2.13 ---
[INFO] Compiler bridge file: /home/kyuubi/.sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.13-1.8.0-bin_2.13.8__61.0-1.8.0_20221110T195421.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.13.8,1.7.13,null)
[INFO] compiling 61 Scala sources to /home/kyuubi/apache-kyuubi/externals/kyuubi-spark-sql-engine/target/scala-2.13/classes ...
[INFO] compile in 19.2 s
```

---

# Checklist 📝

- [x] This patch was not authored or co-authored using [Generative Tooling](https://www.apache.org/legal/generative-tooling.html)

Closes #6405 from pan3793/jersey.

6cce23b01 [Cheng Pan] SPARK-47118 Jersey

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
Commit 9c1b779b10 (parent 586f6008bd) · Cheng Pan · 2024-05-22 14:29:12 +08:00
2 changed files with 42 additions and 3 deletions

**externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/KyuubiSparkUtil.scala**

```diff
@@ -17,6 +17,7 @@
 package org.apache.kyuubi.engine.spark

+import java.net.URI
 import java.time.{Instant, LocalDateTime, ZoneId}

 import scala.annotation.meta.getter
@@ -28,6 +29,7 @@ import org.apache.spark.util.kvstore.KVIndex
 import org.apache.kyuubi.Logging
 import org.apache.kyuubi.config.ConfigEntry
 import org.apache.kyuubi.util.SemanticVersion
+import org.apache.kyuubi.util.reflect.DynMethods

 object KyuubiSparkUtil extends Logging {
@@ -113,4 +115,42 @@ object KyuubiSparkUtil extends Logging {
       SparkSQLEngine.kyuubiConf.get(configEntry)
     }
   }
+
+  // SPARK-47118 (4.0.0) upgrades Jersey from 2 to 3 which also changes javax.ws.rs to
+  // jakarta.ws.rs, this is an equivalent implementation using reflection of the following
+  // plain invocation:
+  //   {javax|jakarta}.ws.rs.core.UriBuilder.fromUri(uri).fragment(fragment).build()
+  def buildURI(uri: URI, fragment: String): URI = {
+    if (SPARK_ENGINE_RUNTIME_VERSION >= "4.0") {
+      var uriBuilder = DynMethods.builder("fromUri")
+        .impl("jakarta.ws.rs.core.UriBuilder", classOf[URI])
+        .build()
+        .invoke[AnyRef](uri)
+      uriBuilder = DynMethods.builder("fragment")
+        .impl("jakarta.ws.rs.core.UriBuilder", classOf[String])
+        .build(uriBuilder)
+        .invoke[AnyRef](fragment)
+      DynMethods.builder("build")
+        .impl("jakarta.ws.rs.core.UriBuilder")
+        .build(uriBuilder)
+        .invoke[URI]()
+    } else {
+      var uriBuilder = DynMethods.builder("fromUri")
+        .impl("javax.ws.rs.core.UriBuilder", classOf[URI])
+        .build()
+        .invoke[AnyRef](uri)
+      uriBuilder = DynMethods.builder("fragment")
+        .impl("javax.ws.rs.core.UriBuilder", classOf[String])
+        .build(uriBuilder)
+        .invoke[AnyRef](fragment)
+      DynMethods.builder("build")
+        .impl("javax.ws.rs.core.UriBuilder")
+        .build(uriBuilder)
+        .invoke[URI]()
+    }
+  }
 }
```

**externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/operation/ExecutePython.scala**

```diff
@@ -24,7 +24,6 @@ import java.nio.file.{Files, Path, Paths}
 import java.util.concurrent.RejectedExecutionException
 import java.util.concurrent.atomic.AtomicBoolean
 import java.util.concurrent.locks.ReentrantLock
-import javax.ws.rs.core.UriBuilder

 import scala.collection.JavaConverters._
@@ -304,7 +303,7 @@ object ExecutePython extends Logging {
       archive =>
         var uri = new URI(archive)
         if (uri.getFragment == null) {
-          uri = UriBuilder.fromUri(uri).fragment(DEFAULT_SPARK_PYTHON_ENV_ARCHIVE_FRAGMENT).build()
+          uri = buildURI(uri, DEFAULT_SPARK_PYTHON_ENV_ARCHIVE_FRAGMENT)
         }
         spark.sparkContext.addArchive(uri.toString)
         Paths.get(SparkFiles.get(uri.getFragment), pythonEnvExecPath)
@@ -317,7 +316,7 @@ object ExecutePython extends Logging {
       archive =>
         var uri = new URI(archive)
         if (uri.getFragment == null) {
-          uri = UriBuilder.fromUri(uri).fragment(DEFAULT_SPARK_PYTHON_HOME_ARCHIVE_FRAGMENT).build()
+          uri = buildURI(uri, DEFAULT_SPARK_PYTHON_HOME_ARCHIVE_FRAGMENT)
         }
         spark.sparkContext.addArchive(uri.toString)
         Paths.get(SparkFiles.get(uri.getFragment))
```
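
For context, `ExecutePython` only attaches a fragment when the archive URI arrives without one, so a user-supplied fragment always wins. A minimal sketch of that defaulting step, assuming a hypothetical fragment value (the real value is the `DEFAULT_SPARK_PYTHON_ENV_ARCHIVE_FRAGMENT` constant in `ExecutePython`, and the class and method names here are illustrative):

```java
import java.net.URI;

public class ArchiveFragmentDemo {
    // Hypothetical placeholder; the real value lives in ExecutePython.
    static final String DEFAULT_FRAGMENT = "python_env";

    // Attach a default fragment only when the caller did not supply one,
    // so SparkFiles.get(fragment) has a stable directory name to resolve.
    static URI withDefaultFragment(URI uri) throws Exception {
        if (uri.getFragment() != null) {
            return uri; // keep the user-supplied fragment
        }
        return new URI(uri.getScheme(), uri.getSchemeSpecificPart(), DEFAULT_FRAGMENT);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(withDefaultFragment(new URI("hdfs:///tmp/env.tar.gz")));
        System.out.println(withDefaultFragment(new URI("hdfs:///tmp/env.tar.gz#custom")));
    }
}
```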