[KYUUBI #6609] Bump Spark 3.5.2
# 🔍 Description

Spark 3.5.2 was released recently. Release notes are available at https://spark.apache.org/releases/spark-release-3-5-2.html

## Types of changes 🔖

- [ ] Bugfix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)

## Test Plan 🧪

Pass GHA.

---

# Checklist 📝

- [x] This patch was not authored or co-authored using [Generative Tooling](https://www.apache.org/legal/generative-tooling.html)

**Be nice. Be informative.**

Closes #6609 from pan3793/spark-3.5.2.

Closes #6609

587cf1dd3 [Cheng Pan] Bump Spark 3.5.2

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
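A bump like this touches the same version string in several files (workflow, docs, tests, pom). As a minimal sketch of a post-bump sanity check (the paths assume the Kyuubi repo layout; the grep patterns and the check itself are hypothetical, not part of this patch), one could scan for stale pins of the old version:

```shell
# Hypothetical post-bump check: list any file that still pins Spark 3.5.1
# after bumping to 3.5.2. Paths assume the Kyuubi repository layout.
stale=$(grep -rl 'spark:3\.5\.1\|<spark\.version>3\.5\.1' pom.xml .github/workflows 2>/dev/null || true)
if [ -n "$stale" ]; then
  echo "stale Spark 3.5.1 references in: $stale"
else
  echo "no stale Spark 3.5.1 references"
fi
```

Running this from the repository root after applying the patch should report no stale references.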
This commit is contained in:
parent
5970af508c
commit
b7effd9d3a
.github/workflows/master.yml (vendored): 4 changes
```diff
@@ -434,8 +434,8 @@ jobs:
           # https://minikube.sigs.k8s.io/docs/handbook/pushing/#7-loading-directly-to-in-cluster-container-runtime
           minikube image load apache/kyuubi:latest
           # pre-install spark into minikube
-          docker pull apache/spark:3.5.1
-          minikube image load apache/spark:3.5.1
+          docker pull apache/spark:3.5.2
+          minikube image load apache/spark:3.5.2
       - name: kubectl pre-check
         run: |
           kubectl get nodes
```
```diff
@@ -181,8 +181,8 @@ Examples:
   $0 -r docker.io/myrepo -t v1.8.1 build
   $0 -r docker.io/myrepo -t v1.8.1 push

-  - Build and push with tag "v1.8.1" and Spark-3.5.1 as base image to docker.io/myrepo
-  $0 -r docker.io/myrepo -t v1.8.1 -b BASE_IMAGE=repo/spark:3.5.1 build
+  - Build and push with tag "v1.8.1" and Spark-3.5.2 as base image to docker.io/myrepo
+  $0 -r docker.io/myrepo -t v1.8.1 -b BASE_IMAGE=repo/spark:3.5.2 build
   $0 -r docker.io/myrepo -t v1.8.1 push

   - Build and push for multiple archs to docker.io/myrepo
```
```diff
@@ -42,8 +42,8 @@ Examples:
   $0 -r docker.io/myrepo -t v1.8.1 build
   $0 -r docker.io/myrepo -t v1.8.1 push

-  - Build and push with tag "v1.8.1" and Spark-3.5.1 as base image to docker.io/myrepo
-  $0 -r docker.io/myrepo -t v1.8.1 -b BASE_IMAGE=repo/spark:3.5.1 build
+  - Build and push with tag "v1.8.1" and Spark-3.5.2 as base image to docker.io/myrepo
+  $0 -r docker.io/myrepo -t v1.8.1 -b BASE_IMAGE=repo/spark:3.5.2 build
   $0 -r docker.io/myrepo -t v1.8.1 push

   - Build and push for multiple archs to docker.io/myrepo
```
````diff
@@ -117,7 +117,7 @@ Sometimes, it may be incompatible with other Spark distributions, then you may n
 For example,

 ```shell
-build/mvn clean package -pl :kyuubi-spark-lineage_2.12 -am -DskipTests -Dspark.version=3.5.1
+build/mvn clean package -pl :kyuubi-spark-lineage_2.12 -am -DskipTests -Dspark.version=3.5.2
 ```

 The available `spark.version`s are shown in the following table.
````
```diff
@@ -56,7 +56,7 @@ class KyuubiOnKubernetesWithSparkTestsBase extends WithKyuubiServerOnKubernetes
     Map(
       "spark.master" -> s"k8s://$miniKubeApiMaster",
       // We should update spark docker image in ./github/workflows/master.yml at the same time
-      "spark.kubernetes.container.image" -> "apache/spark:3.5.1",
+      "spark.kubernetes.container.image" -> "apache/spark:3.5.2",
      "spark.kubernetes.container.image.pullPolicy" -> "IfNotPresent",
      "spark.executor.memory" -> "512M",
      "spark.driver.memory" -> "1024M",
```
```diff
@@ -51,7 +51,7 @@ abstract class SparkOnKubernetesSuiteBase
   // TODO Support more Spark version
   // Spark official docker image: https://hub.docker.com/r/apache/spark/tags
   KyuubiConf().set("spark.master", s"k8s://$apiServerAddress")
-    .set("spark.kubernetes.container.image", "apache/spark:3.5.1")
+    .set("spark.kubernetes.container.image", "apache/spark:3.5.2")
     .set("spark.kubernetes.container.image.pullPolicy", "IfNotPresent")
     .set("spark.executor.instances", "1")
     .set("spark.executor.memory", "512M")
```
pom.xml: 4 changes
```diff
@@ -198,7 +198,7 @@
     DO NOT forget to change the following properties when change the minor version of Spark:
     `delta.version`, `delta.artifact`, `maven.plugin.scalatest.exclude.tags`
     -->
-    <spark.version>3.5.1</spark.version>
+    <spark.version>3.5.2</spark.version>
     <spark.binary.version>3.5</spark.binary.version>
     <spark.archive.scala.suffix></spark.archive.scala.suffix>
     <spark.archive.name>spark-${spark.version}-bin-hadoop3${spark.archive.scala.suffix}.tgz</spark.archive.name>
```
```diff
@@ -1973,7 +1973,7 @@
         <module>extensions/spark/kyuubi-spark-connector-hive</module>
       </modules>
       <properties>
-        <spark.version>3.5.1</spark.version>
+        <spark.version>3.5.2</spark.version>
         <spark.binary.version>3.5</spark.binary.version>
         <delta.version>3.2.0</delta.version>
         <delta.artifact>delta-spark_${scala.binary.version}</delta.artifact>
```
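The comment in the first pom hunk warns that sibling properties must move together with the Spark minor version. A patch-level bump like this one leaves them untouched because of a simple invariant, sketched below in plain shell (the variable names are hypothetical; the values are taken from the pom): `spark.binary.version` must equal the major.minor prefix of `spark.version`.

```shell
# Invariant behind the pom properties: spark.binary.version is the
# major.minor prefix of spark.version (3.5 for 3.5.2), so a patch-level
# bump does not require touching spark.binary.version or its siblings.
spark_version="3.5.2"
spark_binary_version="3.5"
major_minor="${spark_version%.*}"   # strip the patch component
if [ "$major_minor" = "$spark_binary_version" ]; then
  echo "spark.binary.version is consistent with spark.version"
else
  echo "mismatch: $major_minor vs $spark_binary_version"
fi
```

A minor-version bump (say, to 3.6.0) would break the check and flag the properties the pom comment lists.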