Compare commits

...

10 Commits

Author SHA1 Message Date
wuziyi
2ca6089fad
[KYUUBI #7153] Share JAAS configuration for Zookeeper client to avoid server OOM
### Why are the changes needed?

Share the JAAS configuration for ZooKeeper clients that use the same keytab and principal, to avoid server OOM caused by nested JAAS configurations.

Fixes https://github.com/apache/kyuubi/issues/7153
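The idea in this fix can be sketched as caching one JAAS configuration per (keytab, principal) pair, rather than nesting a fresh configuration around the current one on every ZooKeeper client creation. This is an illustrative sketch, not Kyuubi's actual code; `JaasConfigCache` and the string stand-in for the configuration object are assumptions:

```scala
import scala.collection.concurrent.TrieMap

// Hypothetical sketch: clients with the same keytab and principal share one
// JAAS configuration entry. Wrapping a new configuration around the current
// one on every client creation grows the chain without bound and can OOM the
// server; a cache keyed by (keytab, principal) avoids that.
object JaasConfigCache {
  private val cache = TrieMap.empty[(String, String), String]

  def getOrCreate(keytab: String, principal: String): String =
    cache.getOrElseUpdate(
      (keytab, principal),
      // The real fix builds a javax.security.auth.login.Configuration once;
      // a plain String stands in for it in this sketch.
      s"""Client { com.sun.security.auth.module.Krb5LoginModule required
         |  useKeyTab=true keyTab="$keytab" principal="$principal"; };""".stripMargin)
}
```

Repeated calls with the same pair return the same cached object, so the configuration chain no longer grows per client.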

### How was this patch tested?

UT.

### Was this patch authored or co-authored using generative AI tooling?

no

Closes #7154 from Z1Wu/fix/comm_reuse_zk_jass.

Closes #7153

3b0169a00 [Cheng Pan] Update kyuubi-ha/src/main/scala/org/apache/kyuubi/ha/client/zookeeper/ZookeeperClientProvider.scala
5873d12f3 [Cheng Pan] Update kyuubi-ha/src/main/scala/org/apache/kyuubi/ha/client/zookeeper/ZookeeperClientProvider.scala
0d8a18a4e [wuziyi] nit
ffa7d29fc [wuziyi] [fix] share jaas configuration for zookeeper client with same keytab and principal to avoid server oom due to recursive jaas configuration.

Lead-authored-by: wuziyi <wuziyi02@corp.netease.com>
Co-authored-by: Cheng Pan <pan3793@gmail.com>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit f7e10e65d3)
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-08-15 14:21:56 +08:00
Wang, Fei
a1846d9fe0 [KYUUBI #7163][SPARK] Check whether engine context stopped in engine terminating checker
### Why are the changes needed?

To close #7163, this PR checks whether the engine context has stopped in the engine terminating checker.
1. The Spark context stopped due to an OOM in `spark-listener-group-shared`, which called `tryOrStopSparkContext`.

```
25/08/03 19:08:06 ERROR Utils: uncaught error in thread spark-listener-group-shared, stopping SparkContext
java.lang.OutOfMemoryError: GC overhead limit exceeded
25/08/03 19:08:06 INFO OperationAuditLogger: operation=a7f134b9-373b-402d-a82b-2d42df568807 opType=ExecuteStatement state=INITIALIZED   user=b_hrvst    session=6a90d01c-7627-4ae6-a506-7ba826355489
...
25/08/03 19:08:23 INFO SparkSQLSessionManager: Opening session for b_hrvst10.147.254.115
25/08/03 19:08:23 ERROR SparkTBinaryFrontendService: Error opening session:
org.apache.kyuubi.KyuubiSQLException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:951)
org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:337)
org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:415)
org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)
The currently active SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:951)
org.apache.kyuubi.engine.spark.SparkSQLEngine$.createSpark(SparkSQLEngine.scala:337)
org.apache.kyuubi.engine.spark.SparkSQLEngine$.main(SparkSQLEngine.scala:415)
org.apache.kyuubi.engine.spark.SparkSQLEngine.main(SparkSQLEngine.scala)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:732)

    at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:69)
    at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:73)
```

2. The Kyuubi engine stopped after 12 hours.
```
25/08/04 07:13:25 ERROR ZookeeperDiscoveryClient: Zookeeper client connection state changed to: LOST, but failed to reconnect in 3 seconds. Give up retry and stop gracefully .
25/08/04 07:13:25 INFO ClientCnxn: Session establishment complete on server zeus-slc-zk-3.vip.hadoop.ebay.com/10.147.141.240:2181, sessionid = 0x3939e22c983032e, negotiated timeout = 40000
25/08/04 07:13:25 INFO ConnectionStateManager: State change: RECONNECTED
25/08/04 07:13:25 INFO ZookeeperDiscoveryClient: Zookeeper client connection state changed to: RECONNECTED
25/08/04 07:13:25 INFO SparkSQLEngine: Service: [SparkTBinaryFrontend] is stopping.
25/08/04 07:13:25 INFO SparkTBinaryFrontendService: Service: [EngineServiceDiscovery] is stopping.
25/08/04 07:13:25 WARN EngineServiceDiscovery: The Zookeeper ensemble is LOST
25/08/04 07:13:25 INFO EngineServiceDiscovery: Service[EngineServiceDiscovery] is stopped.
25/08/04 07:13:25 INFO SparkTBinaryFrontendService: Service[SparkTBinaryFrontend] is stopped.
25/08/04 07:13:25 INFO SparkTBinaryFrontendService: SparkTBinaryFrontend has stopped
25/08/04 07:13:25 INFO SparkSQLEngine: Service: [SparkSQLBackendService] is stopping.
25/08/04 07:13:25 INFO SparkSQLBackendService: Service: [SparkSQLSessionManager] is stopping.
25/08/04 07:13:25 INFO SparkSQLSessionManager: Service: [SparkSQLOperationManager] is stopping.
25/08/04 07:13:45 INFO SparkSQLOperationManager: Service[SparkSQLOperationManager] is stopped.
25/08/04 07:13:45 INFO SparkSQLSessionManager: Service[SparkSQLSessionManager] is stopped.
```

3. It seems the shutdown hook does not work in such a case:
9a0c49e791/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/SparkSQLEngine.scala (L375-L376)

4. And `SparkSQLEngineListener` did not receive the `ApplicationEnd` message, maybe because the `spark-listener-group-shared` thread hit the OOM? I do not have a jstack for that, so I cannot check whether the thread was alive.
9a0c49e791/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/spark/kyuubi/SparkSQLEngineListener.scala (L55-L63)
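The change described above can be sketched as an extra condition in the periodic terminating check: if the SparkContext is already stopped, stop the engine instead of waiting for a deregister event that may never arrive. The names below (`EngineHandle`, `terminatingCheck`) are placeholders, not Kyuubi's API:

```scala
// Hypothetical sketch of the terminating-checker condition added by this PR.
// `contextStopped` stands in for SparkContext.isStopped; in Kyuubi the real
// checker would call engine.stop() instead of flipping a flag.
final case class EngineHandle(var contextStopped: Boolean, var engineStopped: Boolean = false)

def terminatingCheck(engine: EngineHandle, deregistered: Boolean): Unit = {
  if ((deregistered || engine.contextStopped) && !engine.engineStopped) {
    engine.engineStopped = true
  }
}
```

With the added `contextStopped` check, an engine whose listener thread died of OOM no longer lingers even though no `ApplicationEnd` event is delivered.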

### How was this patch tested?

Existing GA.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #7167 from turboFei/check_spark_stopped.

Closes #7163

835cb3dec [Wang, Fei] SparkContext
cd542decb [Wang, Fei] Revert "no hard code"
cf9e40ef6 [Wang, Fei] no hard code
ca551c23d [Wang, Fei] check engine context stopped

Authored-by: Wang, Fei <fwang12@ebay.com>
Signed-off-by: Wang, Fei <fwang12@ebay.com>
(cherry picked from commit b31663f569)
Signed-off-by: Wang, Fei <fwang12@ebay.com>
2025-08-07 01:28:01 -07:00
Cheng Pan
54db2bcf7f
[KYUUBI #7165] Remove banned action-shellcheck
### Why are the changes needed?

The plugin is banned by ASF.

> ludeeus/action-shellcheck@1.1.0 is not allowed to be used in apache/kyuubi.

https://github.com/apache/kyuubi/actions/runs/16745477309

### How was this patch tested?

Pass GHA.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #7165 from pan3793/rm-shellcheck.

Closes #7165

dfda2314c [Cheng Pan] Remove banned action-shellcheck

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 2c64e4e5fc)
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-08-07 15:05:10 +08:00
dependabot[bot]
91d78f1bfc
[KYUUBI #7157] Bump form-data from 4.0.0 to 4.0.4 in /kyuubi-server/web-ui
Bumps [form-data](https://github.com/form-data/form-data) from 4.0.0 to 4.0.4.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/form-data/form-data/releases">form-data's releases</a>.</em></p>
<blockquote>
<h2>v4.0.1</h2>
<h3>Fixes</h3>
<ul>
<li>npmignore temporary build files (<a href="https://redirect.github.com/form-data/form-data/issues/532">#532</a>)</li>
<li>move util.isArray to Array.isArray (<a href="https://redirect.github.com/form-data/form-data/issues/564">#564</a>)</li>
</ul>
<h3>Tests</h3>
<ul>
<li>migrate from travis to GHA</li>
</ul>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/form-data/form-data/blob/master/CHANGELOG.md">form-data's changelog</a>.</em></p>
<blockquote>
<h2><a href="https://github.com/form-data/form-data/compare/v4.0.3...v4.0.4">v4.0.4</a> - 2025-07-16</h2>
<h3>Commits</h3>
<ul>
<li>[meta] add <code>auto-changelog</code> <a href="811f68282f"><code>811f682</code></a></li>
<li>[Tests] handle predict-v8-randomness failures in node &lt; 17 and node &gt; 23 <a href="1d11a76434"><code>1d11a76</code></a></li>
<li>[Fix] Switch to using <code>crypto</code> random for boundary values <a href="3d1723080e"><code>3d17230</code></a></li>
<li>[Tests] fix linting errors <a href="5e340800b5"><code>5e34080</code></a></li>
<li>[meta] actually ensure the readme backup isn’t published <a href="316c82ba93"><code>316c82b</code></a></li>
<li>[Dev Deps] update <code>ljharb/eslint-config</code> <a href="58c25d7640"><code>58c25d7</code></a></li>
<li>[meta] fix readme capitalization <a href="2300ca1959"><code>2300ca1</code></a></li>
</ul>
<h2><a href="https://github.com/form-data/form-data/compare/v4.0.2...v4.0.3">v4.0.3</a> - 2025-06-05</h2>
<h3>Fixed</h3>
<ul>
<li>[Fix] <code>append</code>: avoid a crash on nullish values <a href="https://redirect.github.com/form-data/form-data/issues/577"><code>[#577](https://github.com/form-data/form-data/issues/577)</code></a></li>
</ul>
<h3>Commits</h3>
<ul>
<li>[eslint] use a shared config <a href="426ba9ac44"><code>426ba9a</code></a></li>
<li>[eslint] fix some spacing issues <a href="20941917f0"><code>2094191</code></a></li>
<li>[Refactor] use <code>hasown</code> <a href="81ab41b46f"><code>81ab41b</code></a></li>
<li>[Fix] validate boundary type in <code>setBoundary()</code> method <a href="8d8e469309"><code>8d8e469</code></a></li>
<li>[Tests] add tests to check the behavior of <code>getBoundary</code> with non-strings <a href="837b8a1f75"><code>837b8a1</code></a></li>
<li>[Dev Deps] remove unused deps <a href="870e4e6659"><code>870e4e6</code></a></li>
<li>[meta] remove local commit hooks <a href="e6e83ccb54"><code>e6e83cc</code></a></li>
<li>[Dev Deps] update <code>eslint</code> <a href="4066fd6f65"><code>4066fd6</code></a></li>
<li>[meta] fix scripts to use prepublishOnly <a href="c4bbb13c0e"><code>c4bbb13</code></a></li>
</ul>
<h2><a href="https://github.com/form-data/form-data/compare/v4.0.1...v4.0.2">v4.0.2</a> - 2025-02-14</h2>
<h3>Merged</h3>
<ul>
<li>[Fix] set <code>Symbol.toStringTag</code> when available <a href="https://redirect.github.com/form-data/form-data/pull/573"><code>[#573](https://github.com/form-data/form-data/issues/573)</code></a></li>
<li>fix (npmignore): ignore temporary build files <a href="https://redirect.github.com/form-data/form-data/pull/532"><code>[#532](https://github.com/form-data/form-data/issues/532)</code></a></li>
</ul>
<h3>Fixed</h3>
<ul>
<li>[Fix] set <code>Symbol.toStringTag</code> when available (<a href="https://redirect.github.com/form-data/form-data/issues/573">#573</a>) <a href="https://redirect.github.com/form-data/form-data/issues/396"><code>[#396](https://github.com/form-data/form-data/issues/396)</code></a></li>
<li>[Fix] set <code>Symbol.toStringTag</code> when available <a href="https://redirect.github.com/form-data/form-data/issues/396"><code>[#396](https://github.com/form-data/form-data/issues/396)</code></a></li>
</ul>
<h3>Commits</h3>
<ul>
<li>Merge tags v2.5.3 and v3.0.3 <a href="92613b9208"><code>92613b9</code></a></li>
<li>[Tests] migrate from travis to GHA <a href="806eda7774"><code>806eda7</code></a></li>
<li>[Tests] migrate from travis to GHA <a href="8fdb3bc6b5"><code>8fdb3bc</code></a></li>
</ul>
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a href="41996f5ac7"><code>41996f5</code></a> v4.0.4</li>
<li><a href="316c82ba93"><code>316c82b</code></a> [meta] actually ensure the readme backup isn’t published</li>
<li><a href="2300ca1959"><code>2300ca1</code></a> [meta] fix readme capitalization</li>
<li><a href="811f68282f"><code>811f682</code></a> [meta] add <code>auto-changelog</code></li>
<li><a href="5e340800b5"><code>5e34080</code></a> [Tests] fix linting errors</li>
<li><a href="1d11a76434"><code>1d11a76</code></a> [Tests] handle predict-v8-randomness failures in node &lt; 17 and node &gt; 23</li>
<li><a href="58c25d7640"><code>58c25d7</code></a> [Dev Deps] update <code>ljharb/eslint-config</code></li>
<li><a href="3d1723080e"><code>3d17230</code></a> [Fix] Switch to using <code>crypto</code> random for boundary values</li>
<li><a href="d8d67dc8ac"><code>d8d67dc</code></a> v4.0.3</li>
<li><a href="e6e83ccb54"><code>e6e83cc</code></a> [meta] remove local commit hooks</li>
<li>Additional commits viewable in <a href="https://github.com/form-data/form-data/compare/v4.0.0...v4.0.4">compare view</a></li>
</ul>
</details>
<details>
<summary>Maintainer changes</summary>
<p>This version was pushed to npm by <a href="https://www.npmjs.com/~ljharb">ljharb</a>, a new releaser for form-data since your current version.</p>
</details>
<br />

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=form-data&package-manager=npm_and_yarn&previous-version=4.0.0&new-version=4.0.4)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Closes #7157 from dependabot[bot]/dependabot/npm_and_yarn/kyuubi-server/web-ui/form-data-4.0.4.

Closes #7157

4d754d973 [dependabot[bot]] Bump form-data from 4.0.0 to 4.0.4 in /kyuubi-server/web-ui

Authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 87f3f90120)
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-08-07 15:04:11 +08:00
Cheng Pan
d9a060f402
[KYUUBI #7162] Bump kafka-clients 3.9.1
https://kafka.apache.org/cve-list.html

Pass GHA.

No.

Closes #7162 from pan3793/kafka-3.9.1.

Closes #7162

108e5690b [Cheng Pan] Bump kafka-clients 3.9.1

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit dc9c75b8e6)
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-08-05 16:59:56 +08:00
Cheng Pan
cee5f1a19a
[KYUUBI #6928] Bump Spark 4.0.0
### Why are the changes needed?

Test Spark 4.0.0 RC1
https://lists.apache.org/thread/3sx86qhnmot1p519lloyprxv9h7nt2xh

### How was this patch tested?

GHA.

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #6928 from pan3793/spark-4.0.0.

Closes #6928

a910169bd [Cheng Pan] Bump Spark 4.0.0

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-08-04 14:49:02 +08:00
liangzhaoyuan
e599bc6844
[KYUUBI #7158] Spark engine respects session-level idle timeout threshold
### Why are the changes needed?
Fixes the same class of issue as
https://github.com/apache/kyuubi/pull/7138

Previously, `sessionIdleTimeoutThreshold` was initialized only once during session creation using `sessionManager.getConf`, preventing dynamic updates when clients pass new configurations during connection.

We now:
- Allow clients to set a session-specific `kyuubi.session.idle.timeout` during connection
- Dynamically adjust idle timeout per session
- Prevent connection pile-up by timely recycling idle sessions
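A minimal sketch of the per-session resolution described above. Only the config key `kyuubi.session.idle.timeout` comes from the PR; the helper name and millisecond parsing are illustrative (Kyuubi configs typically go through its typed conf API):

```scala
// Resolve the idle timeout from the session's own conf first, falling back
// to the server-level default, so a value the client passes at connection
// time takes effect for that session instead of the once-read global value.
def resolveIdleTimeoutMs(sessionConf: Map[String, String], serverDefaultMs: Long): Long =
  sessionConf.get("kyuubi.session.idle.timeout").map(_.toLong).getOrElse(serverDefaultMs)
```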

Closes #7158 from 1358035421/lc/sessio_idle_timeout_threshold.

Closes #7158

abe513eed [liangzhaoyuan] fix review comments
3face844a [liangzhaoyuan] Use per-session idle timeout threshold instead of global sessionManager's value

Authored-by: liangzhaoyuan <lwlzyl19940916@gmail.com>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 9a50bfa814)
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-08-04 11:28:22 +08:00
Cheng Pan
ef5e2150a7
[KYUUBI #7151] Bump commons-lang3 3.18.0
### Why are the changes needed?

This version contains CVE-2025-48924 fix.

### How was this patch tested?

Pass GHA.

### Was this patch authored or co-authored using generative AI tooling?

No.

Closes #7151 from pan3793/lang3-3.18.

Closes #7151

fbbedce33 [Cheng Pan] Bump commons-lang3 3.18.0

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 8bea66a412)
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-07-24 14:47:06 +08:00
Cheng Pan
7af96688f7
[KYUUBI #7139] Fix Spark extension rules to support RebalancePartitions
As title.

UT are modified.

No.

Closes #7139 from pan3793/rebalance.

Closes #7139

edb070afd [Cheng Pan] fix
4d3984a92 [Cheng Pan] Fix Spark extension rules to support RebalancePartitions

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 5f4b1f0de5)
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-07-18 12:00:12 +08:00
Cheng Pan
688162e23e
[KYUUBI #7135] Fix cannot access /tmp/engine-archives: No such file or directory
### Why are the changes needed?

Fix
```
Run ls -lh /tmp/engine-archives
ls: cannot access '/tmp/engine-archives': No such file or directory
Error: Process completed with exit code 2.
```
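The fix makes the cache-listing step tolerant of a missing directory. A sketch of the changed command (the path is the one used by the workflow; GitHub Actions fails a step when its command exits non-zero):

```shell
# `ls` exits with status 2 when the path does not exist, which fails the
# workflow step; appending `|| echo ''` keeps the step's exit status at 0.
ls -lh /tmp/engine-archives || echo ''
```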

### How was this patch tested?

GHA

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #7135 from pan3793/gha-cache-fix.

Closes #7135

99ef56082 [Cheng Pan] Fix cannot access /tmp/engine-archives: No such file or directory

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit c0d4980dab)
Signed-off-by: Cheng Pan <chengpan@apache.org>
2025-07-11 10:52:19 +08:00
23 changed files with 410 additions and 97 deletions

View File

@ -27,4 +27,4 @@ runs:
key: engine-archives
- name: Show cached engine archives
shell: bash
run: ls -lh /tmp/engine-archives
run: ls -lh /tmp/engine-archives || echo ''

View File

@ -41,10 +41,10 @@ jobs:
distribution: temurin
java-version: 8
cache: 'maven'
- run: >-
build/mvn org.apache.rat:apache-rat-plugin:check
-Ptpcds -Pkubernetes-it
-Pspark-3.3 -Pspark-3.4 -Pspark-3.5
- run: |
build/mvn org.apache.rat:apache-rat-plugin:check \
-Ptpcds -Pkubernetes-it \
-Pspark-3.3 -Pspark-3.4 -Pspark-3.5 -Pspark-4.0
- name: Upload rat report
if: failure()
uses: actions/upload-artifact@v4

View File

@ -77,7 +77,7 @@ jobs:
comment: 'verify-on-spark-3.4-binary'
- java: 17
spark: '3.5'
spark-archive: '-Pscala-2.13 -Dspark.archive.mirror=https://archive.apache.org/dist/spark/spark-4.0.0-preview2 -Dspark.archive.name=spark-4.0.0-preview2-bin-hadoop3.tgz'
spark-archive: '-Pscala-2.13 -Dspark.archive.mirror=https://archive.apache.org/dist/spark/spark-4.0.0 -Dspark.archive.name=spark-4.0.0-bin-hadoop3.tgz'
exclude-tags: '-Dmaven.plugin.scalatest.exclude.tags=org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.PaimonTest,org.apache.kyuubi.tags.SparkLocalClusterTest'
comment: 'verify-on-spark-4.0-binary'
env:

View File

@ -132,9 +132,3 @@ jobs:
with:
name: super-linter-log
path: super-linter.log
- name: check bin directory
uses: ludeeus/action-shellcheck@1.1.0
with:
# TODO: enable for all folders step by step
scandir: './bin'
severity: error

View File

@ -30,7 +30,7 @@ arrow-vector/16.0.0//arrow-vector-16.0.0.jar
checker-qual/3.42.0//checker-qual-3.42.0.jar
classgraph/4.8.138//classgraph-4.8.138.jar
commons-codec/1.17.1//commons-codec-1.17.1.jar
commons-lang3/3.17.0//commons-lang3-3.17.0.jar
commons-lang3/3.18.0//commons-lang3-3.18.0.jar
error_prone_annotations/2.23.0//error_prone_annotations-2.23.0.jar
failsafe/3.3.2//failsafe-3.3.2.jar
failureaccess/1.0.2//failureaccess-1.0.2.jar
@ -97,7 +97,7 @@ jetty-util-ajax/9.4.57.v20241219//jetty-util-ajax-9.4.57.v20241219.jar
jetty-util/9.4.57.v20241219//jetty-util-9.4.57.v20241219.jar
jline/2.14.6//jline-2.14.6.jar
jul-to-slf4j/1.7.36//jul-to-slf4j-1.7.36.jar
kafka-clients/3.5.2//kafka-clients-3.5.2.jar
kafka-clients/3.9.1//kafka-clients-3.9.1.jar
kubernetes-client-api/6.13.1//kubernetes-client-api-6.13.1.jar
kubernetes-client/6.13.1//kubernetes-client-6.13.1.jar
kubernetes-httpclient-okhttp/6.13.1//kubernetes-httpclient-okhttp-6.13.1.jar

View File

@ -45,15 +45,6 @@ trait RepartitionBuilderWithRebalance extends RepartitionBuilder {
}
}
}
override def canInsertRepartitionByExpression(plan: LogicalPlan): Boolean = {
super.canInsertRepartitionByExpression(plan) && {
plan match {
case _: RebalancePartitions => false
case _ => true
}
}
}
}
/**

View File

@ -118,6 +118,7 @@ trait RepartitionBeforeWriteHelper extends Rule[LogicalPlan] {
case _: Window => true
case s: Sort if s.global => true
case _: RepartitionOperation => true
case _: RebalancePartitions => true
case _: GlobalLimit => true
case _ => false
}.isDefined
@ -131,8 +132,8 @@ trait RepartitionBeforeWriteHelper extends Rule[LogicalPlan] {
case SubqueryAlias(_, child) => canInsert(child)
case Limit(_, _) => false
case _: Sort => false
case _: RepartitionByExpression => false
case _: Repartition => false
case _: RepartitionOperation => false
case _: RebalancePartitions => false
case _ => true
}

View File

@ -50,12 +50,12 @@ trait InsertZorderHelper extends Rule[LogicalPlan] with ZorderBuilder {
def canInsertZorder(query: LogicalPlan): Boolean = query match {
case Project(_, child) => canInsertZorder(child)
case _: RepartitionByExpression | _: Repartition
case _: RepartitionOperation | _: RebalancePartitions
if !conf.getConf(KyuubiSQLConf.ZORDER_GLOBAL_SORT_ENABLED) => true
// TODO: actually, we can force zorder even if existed some shuffle
case _: Sort => false
case _: RepartitionByExpression => false
case _: Repartition => false
case _: RepartitionOperation => false
case _: RebalancePartitions => false
case _ => true
}

View File

@ -28,18 +28,21 @@ import org.apache.kyuubi.sql.KyuubiSQLConf
class RebalanceBeforeWritingSuite extends KyuubiSparkSQLExtensionTest {
test("check rebalance exists") {
def check(df: => DataFrame, expectedRebalanceNum: Int = 1): Unit = {
def check(
df: => DataFrame,
expectedRebalanceNumEnabled: Int = 1,
expectedRebalanceNumDisabled: Int = 0): Unit = {
withSQLConf(KyuubiSQLConf.INSERT_REPARTITION_BEFORE_WRITE_IF_NO_SHUFFLE.key -> "true") {
assert(
df.queryExecution.analyzed.collect {
case r: RebalancePartitions => r
}.size == expectedRebalanceNum)
}.size == expectedRebalanceNumEnabled)
}
withSQLConf(KyuubiSQLConf.INSERT_REPARTITION_BEFORE_WRITE_IF_NO_SHUFFLE.key -> "false") {
assert(
df.queryExecution.analyzed.collect {
case r: RebalancePartitions => r
}.isEmpty)
}.size == expectedRebalanceNumDisabled)
}
}
@ -69,6 +72,14 @@ class RebalanceBeforeWritingSuite extends KyuubiSparkSQLExtensionTest {
check(sql("INSERT INTO TABLE tmp1 SELECT * FROM VALUES(1),(2),(3) AS t(c1)"))
}
withTable("tmp1") {
sql(s"CREATE TABLE tmp1 (c1 int) $storage")
check(
sql("INSERT INTO TABLE tmp1 SELECT /*+ REBALANCE */ * FROM VALUES(1),(2),(3) AS t(c1)"),
1,
1)
}
withTable("tmp1", "tmp2") {
sql(s"CREATE TABLE tmp1 (c1 int) $storage")
sql(s"CREATE TABLE tmp2 (c1 int) $storage")

View File

@ -45,15 +45,6 @@ trait RepartitionBuilderWithRebalance extends RepartitionBuilder {
}
}
}
override def canInsertRepartitionByExpression(plan: LogicalPlan): Boolean = {
super.canInsertRepartitionByExpression(plan) && {
plan match {
case _: RebalancePartitions => false
case _ => true
}
}
}
}
/**

View File

@ -99,6 +99,7 @@ trait RepartitionBeforeWriteHelper extends Rule[LogicalPlan] {
case _: Window => true
case s: Sort if s.global => true
case _: RepartitionOperation => true
case _: RebalancePartitions => true
case _: GlobalLimit => true
case _ => false
}.isDefined
@ -112,8 +113,8 @@ trait RepartitionBeforeWriteHelper extends Rule[LogicalPlan] {
case SubqueryAlias(_, child) => canInsert(child)
case Limit(_, _) => false
case _: Sort => false
case _: RepartitionByExpression => false
case _: Repartition => false
case _: RepartitionOperation => false
case _: RebalancePartitions => false
case _ => true
}

View File

@ -49,12 +49,12 @@ trait InsertZorderHelper extends Rule[LogicalPlan] with ZorderBuilder {
def canInsertZorder(query: LogicalPlan): Boolean = query match {
case Project(_, child) => canInsertZorder(child)
case _: RepartitionByExpression | _: Repartition
case _: RepartitionOperation | _: RebalancePartitions
if !conf.getConf(KyuubiSQLConf.ZORDER_GLOBAL_SORT_ENABLED) => true
// TODO: actually, we can force zorder even if existed some shuffle
case _: Sort => false
case _: RepartitionByExpression => false
case _: Repartition => false
case _: RepartitionOperation => false
case _: RebalancePartitions => false
case _ => true
}

View File

@ -29,19 +29,22 @@ import org.apache.kyuubi.sql.KyuubiSQLConf
class RebalanceBeforeWritingSuite extends KyuubiSparkSQLExtensionTest {
test("check rebalance exists") {
def check(df: => DataFrame, expectedRebalanceNum: Int = 1): Unit = {
def check(
df: => DataFrame,
expectedRebalanceNumEnabled: Int = 1,
expectedRebalanceNumDisabled: Int = 0): Unit = {
withSQLConf(KyuubiSQLConf.INSERT_REPARTITION_BEFORE_WRITE_IF_NO_SHUFFLE.key -> "true") {
withListener(df) { write =>
assert(write.collect {
case r: RebalancePartitions => r
}.size == expectedRebalanceNum)
}.size == expectedRebalanceNumEnabled)
}
}
withSQLConf(KyuubiSQLConf.INSERT_REPARTITION_BEFORE_WRITE_IF_NO_SHUFFLE.key -> "false") {
withListener(df) { write =>
assert(write.collect {
case r: RebalancePartitions => r
}.isEmpty)
}.size == expectedRebalanceNumDisabled)
}
}
}
@ -72,6 +75,14 @@ class RebalanceBeforeWritingSuite extends KyuubiSparkSQLExtensionTest {
check(sql("INSERT INTO TABLE tmp1 SELECT * FROM VALUES(1),(2),(3) AS t(c1)"))
}
withTable("tmp1") {
sql(s"CREATE TABLE tmp1 (c1 int) $storage")
check(
sql("INSERT INTO TABLE tmp1 SELECT /*+ REBALANCE */ * FROM VALUES(1),(2),(3) AS t(c1)"),
1,
1)
}
withTable("tmp1", "tmp2") {
sql(s"CREATE TABLE tmp1 (c1 int) $storage")
sql(s"CREATE TABLE tmp2 (c1 int) $storage")

View File

@ -45,15 +45,6 @@ trait RepartitionBuilderWithRebalance extends RepartitionBuilder {
}
}
}
override def canInsertRepartitionByExpression(plan: LogicalPlan): Boolean = {
super.canInsertRepartitionByExpression(plan) && {
plan match {
case _: RebalancePartitions => false
case _ => true
}
}
}
}
/**


@ -99,6 +99,7 @@ trait RepartitionBeforeWriteHelper extends Rule[LogicalPlan] {
case _: Window => true
case s: Sort if s.global => true
case _: RepartitionOperation => true
case _: RebalancePartitions => true
case _: GlobalLimit => true
case _ => false
}.isDefined
@ -112,8 +113,8 @@ trait RepartitionBeforeWriteHelper extends Rule[LogicalPlan] {
case SubqueryAlias(_, child) => canInsert(child)
case Limit(_, _) => false
case _: Sort => false
case _: RepartitionByExpression => false
case _: Repartition => false
case _: RepartitionOperation => false
case _: RebalancePartitions => false
case _ => true
}


@ -49,12 +49,12 @@ trait InsertZorderHelper extends Rule[LogicalPlan] with ZorderBuilder {
def canInsertZorder(query: LogicalPlan): Boolean = query match {
case Project(_, child) => canInsertZorder(child)
case _: RepartitionByExpression | _: Repartition
case _: RepartitionOperation | _: RebalancePartitions
if !conf.getConf(KyuubiSQLConf.ZORDER_GLOBAL_SORT_ENABLED) => true
// TODO: actually, we can force zorder even if existed some shuffle
case _: Sort => false
case _: RepartitionByExpression => false
case _: Repartition => false
case _: RepartitionOperation => false
case _: RebalancePartitions => false
case _ => true
}


@ -29,19 +29,22 @@ import org.apache.kyuubi.sql.KyuubiSQLConf
class RebalanceBeforeWritingSuite extends KyuubiSparkSQLExtensionTest {
test("check rebalance exists") {
def check(df: => DataFrame, expectedRebalanceNum: Int = 1): Unit = {
def check(
df: => DataFrame,
expectedRebalanceNumEnabled: Int = 1,
expectedRebalanceNumDisabled: Int = 0): Unit = {
withSQLConf(KyuubiSQLConf.INSERT_REPARTITION_BEFORE_WRITE_IF_NO_SHUFFLE.key -> "true") {
withListener(df) { write =>
assert(write.collect {
case r: RebalancePartitions => r
}.size == expectedRebalanceNum)
}.size == expectedRebalanceNumEnabled)
}
}
withSQLConf(KyuubiSQLConf.INSERT_REPARTITION_BEFORE_WRITE_IF_NO_SHUFFLE.key -> "false") {
withListener(df) { write =>
assert(write.collect {
case r: RebalancePartitions => r
}.isEmpty)
}.size == expectedRebalanceNumDisabled)
}
}
}
@ -72,6 +75,14 @@ class RebalanceBeforeWritingSuite extends KyuubiSparkSQLExtensionTest {
check(sql("INSERT INTO TABLE tmp1 SELECT * FROM VALUES(1),(2),(3) AS t(c1)"))
}
withTable("tmp1") {
sql(s"CREATE TABLE tmp1 (c1 int) $storage")
check(
sql("INSERT INTO TABLE tmp1 SELECT /*+ REBALANCE */ * FROM VALUES(1),(2),(3) AS t(c1)"),
1,
1)
}
withTable("tmp1", "tmp2") {
sql(s"CREATE TABLE tmp1 (c1 int) $storage")
sql(s"CREATE TABLE tmp2 (c1 int) $storage")


@ -222,4 +222,8 @@ class SparkSQLSessionManager private (name: String, spark: SparkSession)
opHandle: OperationHandle): Path = {
new Path(getSessionResultSavePath(sessionHandle), opHandle.identifier.toString)
}
override private[kyuubi] def isEngineContextStopped = {
spark.sparkContext.isStopped
}
}


@ -24,6 +24,7 @@ import org.apache.spark.sql.{AnalysisException, SparkSession}
import org.apache.spark.ui.SparkUIUtils.formatDuration
import org.apache.kyuubi.KyuubiSQLException
import org.apache.kyuubi.config.KyuubiConf.SESSION_IDLE_TIMEOUT
import org.apache.kyuubi.config.KyuubiReservedKeys.KYUUBI_SESSION_HANDLE_KEY
import org.apache.kyuubi.engine.spark.events.SessionEvent
import org.apache.kyuubi.engine.spark.operation.SparkSQLOperationManager
@ -57,6 +58,13 @@ class SparkSessionImpl(
}
}
override val sessionIdleTimeoutThreshold: Long = {
conf.get(SESSION_IDLE_TIMEOUT.key)
.map(_.toLong)
.getOrElse(
sessionManager.getConf.get(SESSION_IDLE_TIMEOUT))
}
private val sessionEvent = SessionEvent(this)
override def open(): Unit = {

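The `sessionIdleTimeoutThreshold` hunk above prefers a session-level `SESSION_IDLE_TIMEOUT` setting and falls back to the server-wide default when none is set. A minimal Java sketch of the same fallback, assuming a plain string conf map and the real Kyuubi key name (the class and method names are illustrative):

```java
import java.util.Map;
import java.util.Optional;

public class IdleTimeoutResolver {
    // Prefer the session-level override; otherwise use the server default,
    // mirroring conf.get(...).map(_.toLong).getOrElse(...) in the hunk above.
    static long resolveIdleTimeout(Map<String, String> sessionConf, long serverDefaultMs) {
        return Optional.ofNullable(sessionConf.get("kyuubi.session.idle.timeout"))
                .map(Long::parseLong)
                .orElse(serverDefaultMs);
    }

    public static void main(String[] args) {
        // Session override wins; absent key falls back to the server default.
        assert resolveIdleTimeout(Map.of("kyuubi.session.idle.timeout", "60000"), 360000L) == 60000L;
        assert resolveIdleTimeout(Map.of(), 360000L) == 360000L;
        System.out.println("ok");
    }
}
```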

@ -345,10 +345,15 @@ abstract class SessionManager(name: String) extends CompositeService(name) {
if (idleTimeout > 0) {
val checkTask = new Runnable {
override def run(): Unit = {
if (!shutdown && System.currentTimeMillis() - latestLogoutTime > idleTimeout &&
getActiveUserSessionCount <= 0) {
info(s"Idled for more than $idleTimeout ms, terminating")
stop()
if (!shutdown) {
if (System.currentTimeMillis() - latestLogoutTime > idleTimeout &&
getActiveUserSessionCount <= 0) {
info(s"Idled for more than $idleTimeout ms, terminating")
stop()
} else if (isEngineContextStopped) {
error(s"Engine's SparkContext is stopped, terminating")
stop()
}
}
}
}
@ -360,4 +365,6 @@ abstract class SessionManager(name: String) extends CompositeService(name) {
TimeUnit.MILLISECONDS)
}
}
private[kyuubi] def isEngineContextStopped: Boolean = false
}
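The `SessionManager` hunk above restructures the periodic check so the service stops either when it has idled past the threshold with no active sessions, or when the engine's SparkContext is already stopped. The decision logic can be sketched as a pure function (names here are illustrative, not Kyuubi's):

```java
public class TimeoutChecker {
    // Mirrors the branching in the hunk above: nothing fires during shutdown;
    // otherwise stop on idle expiry with no sessions, or on a dead engine context.
    static boolean shouldStop(boolean shutdown, long idleMs, long idleTimeoutMs,
                              int activeSessions, boolean engineContextStopped) {
        if (shutdown) {
            return false;
        }
        if (idleMs > idleTimeoutMs && activeSessions <= 0) {
            return true;
        }
        return engineContextStopped;
    }

    public static void main(String[] args) {
        assert !shouldStop(true, 999_999, 1000, 0, true);  // already shutting down
        assert shouldStop(false, 2000, 1000, 0, false);    // idle expired, no sessions
        assert !shouldStop(false, 2000, 1000, 3, false);   // active sessions keep it alive
        assert shouldStop(false, 10, 1000, 3, true);       // SparkContext stopped: stop anyway
        System.out.println("ok");
    }
}
```

The key behavioral change is the last branch: before this patch a stopped SparkContext left the engine process lingering until the idle timeout; now it terminates promptly. The base class stubs `isEngineContextStopped` to `false`, and the Spark session manager overrides it with `spark.sparkContext.isStopped`.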


@ -19,6 +19,7 @@ package org.apache.kyuubi.ha.client.zookeeper
import java.io.{File, IOException}
import java.nio.charset.StandardCharsets
import java.util.concurrent.ConcurrentHashMap
import javax.security.auth.login.Configuration
import scala.util.Random
@ -38,6 +39,13 @@ import org.apache.kyuubi.util.reflect.DynConstructors
object ZookeeperClientProvider extends Logging {
/**
* Share the JAAS configuration for ZooKeeper clients with the same keytab and principal to
* avoid server OOM, since each new JAAS configuration references the previous instance.
* See KYUUBI #7154 for more details.
*/
val jaasConfigurationCache = new ConcurrentHashMap[(String, String), Configuration]()
/**
* Create a [[CuratorFramework]] instance to be used as the ZooKeeper client
* Use the [[ZookeeperACLProvider]] to create appropriate ACLs
@ -113,22 +121,26 @@ object ZookeeperClientProvider extends Logging {
System.setProperty("zookeeper.server.principal", zkServerPrincipal)
}
val zkClientPrincipal = KyuubiHadoopUtils.getServerPrincipal(principal)
// HDFS-16591 makes breaking change on JaasConfiguration
val jaasConf = DynConstructors.builder()
.impl( // Hadoop 3.3.5 and above
"org.apache.hadoop.security.authentication.util.JaasConfiguration",
classOf[String],
classOf[String],
classOf[String])
.impl( // Hadoop 3.3.4 and previous
// scalastyle:off
"org.apache.hadoop.security.token.delegation.ZKDelegationTokenSecretManager$JaasConfiguration",
// scalastyle:on
classOf[String],
classOf[String],
classOf[String])
.build[Configuration]()
.newInstance("KyuubiZooKeeperClient", zkClientPrincipal, keytab)
val jaasConf = jaasConfigurationCache.computeIfAbsent(
(principal, keytab),
_ => {
// HDFS-16591 makes breaking change on JaasConfiguration
DynConstructors.builder()
.impl( // Hadoop 3.3.5 and above
"org.apache.hadoop.security.authentication.util.JaasConfiguration",
classOf[String],
classOf[String],
classOf[String])
.impl( // Hadoop 3.3.4 and previous
// scalastyle:off
"org.apache.hadoop.security.token.delegation.ZKDelegationTokenSecretManager$JaasConfiguration",
// scalastyle:on
classOf[String],
classOf[String],
classOf[String])
.build[Configuration]()
.newInstance("KyuubiZooKeeperClient", zkClientPrincipal, keytab)
})
Configuration.setConfiguration(jaasConf)
case _ =>
}
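The fix above keys the cache on `(principal, keytab)` and builds the JAAS `Configuration` at most once per key via `ConcurrentHashMap.computeIfAbsent`; previously every new ZooKeeper client built a fresh `Configuration` that referenced the previous one, forming an unbounded chain. A minimal, self-contained Java sketch of the same caching pattern (the classes here are stand-ins, not Kyuubi's actual types):

```java
import java.util.AbstractMap.SimpleImmutableEntry;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

public class JaasCacheSketch {
    // Counts constructions; stands in for the reflective JaasConfiguration build above.
    static final AtomicInteger built = new AtomicInteger();

    static Object buildConf(String principal, String keytab) {
        built.incrementAndGet();   // the expensive, chain-forming construction happens here
        return new Object();
    }

    // Cache keyed by (principal, keytab), as in the fix.
    static final Map<Map.Entry<String, String>, Object> cache = new ConcurrentHashMap<>();

    static Object getOrCreate(String principal, String keytab) {
        return cache.computeIfAbsent(
            new SimpleImmutableEntry<>(principal, keytab),
            k -> buildConf(k.getKey(), k.getValue()));
    }

    public static void main(String[] args) {
        Object a = getOrCreate("kyuubi/host@REALM", "/etc/kyuubi.keytab");
        Object b = getOrCreate("kyuubi/host@REALM", "/etc/kyuubi.keytab");
        assert a == b;             // the same instance is reused, so nothing chains
        System.out.println("ok");
    }
}
```

`computeIfAbsent` also guarantees the mapping function runs at most once per key even under concurrent callers, which matches the goal of installing a single shared `Configuration` per keytab/principal pair.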


@ -1597,6 +1597,18 @@
"node": ">=8"
}
},
"node_modules/call-bind-apply-helpers": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
"integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
"dependencies": {
"es-errors": "^1.3.0",
"function-bind": "^1.1.2"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/callsites": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
@ -1947,6 +1959,19 @@
"node": ">=12"
}
},
"node_modules/dunder-proto": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
"integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
"dependencies": {
"call-bind-apply-helpers": "^1.0.1",
"es-errors": "^1.3.0",
"gopd": "^1.2.0"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/element-plus": {
"version": "2.2.13",
"resolved": "https://registry.npmjs.org/element-plus/-/element-plus-2.2.13.tgz",
@ -1984,6 +2009,47 @@
"url": "https://github.com/fb55/entities?sponsor=1"
}
},
"node_modules/es-define-property": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
"integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
"engines": {
"node": ">= 0.4"
}
},
"node_modules/es-errors": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
"integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==",
"engines": {
"node": ">= 0.4"
}
},
"node_modules/es-object-atoms": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
"integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
"dependencies": {
"es-errors": "^1.3.0"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/es-set-tostringtag": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
"integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
"dependencies": {
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.6",
"has-tostringtag": "^1.0.2",
"hasown": "^2.0.2"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/esbuild": {
"version": "0.18.20",
"resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.18.20.tgz",
@ -2536,12 +2602,14 @@
}
},
"node_modules/form-data": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz",
"integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==",
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.4.tgz",
"integrity": "sha512-KrGhL9Q4zjj0kiUt5OO4Mr/A/jlI2jDYs5eHBpYHPcBEVSiipAvn2Ko2HnPe20rmcuuvMHNdZFp+4IlGTMF0Ow==",
"dependencies": {
"asynckit": "^0.4.0",
"combined-stream": "^1.0.8",
"es-set-tostringtag": "^2.1.0",
"hasown": "^2.0.2",
"mime-types": "^2.1.12"
},
"engines": {
@ -2568,6 +2636,14 @@
"node": "^8.16.0 || ^10.6.0 || >=11.0.0"
}
},
"node_modules/function-bind": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
"integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==",
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/functional-red-black-tree": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/functional-red-black-tree/-/functional-red-black-tree-1.0.1.tgz",
@ -2583,6 +2659,41 @@
"node": "*"
}
},
"node_modules/get-intrinsic": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
"integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
"dependencies": {
"call-bind-apply-helpers": "^1.0.2",
"es-define-property": "^1.0.1",
"es-errors": "^1.3.0",
"es-object-atoms": "^1.1.1",
"function-bind": "^1.1.2",
"get-proto": "^1.0.1",
"gopd": "^1.2.0",
"has-symbols": "^1.1.0",
"hasown": "^2.0.2",
"math-intrinsics": "^1.1.0"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/get-proto": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
"integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
"dependencies": {
"dunder-proto": "^1.0.1",
"es-object-atoms": "^1.0.0"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/get-stdin": {
"version": "8.0.0",
"resolved": "https://registry.npmjs.org/get-stdin/-/get-stdin-8.0.0.tgz",
@ -2661,6 +2772,17 @@
"url": "https://github.com/sponsors/sindresorhus"
}
},
"node_modules/gopd": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
"integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/grapheme-splitter": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/grapheme-splitter/-/grapheme-splitter-1.0.4.tgz",
@ -2676,6 +2798,42 @@
"node": ">=8"
}
},
"node_modules/has-symbols": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
"integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/has-tostringtag": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
"integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
"dependencies": {
"has-symbols": "^1.0.3"
},
"engines": {
"node": ">= 0.4"
},
"funding": {
"url": "https://github.com/sponsors/ljharb"
}
},
"node_modules/hasown": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
"integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
"dependencies": {
"function-bind": "^1.1.2"
},
"engines": {
"node": ">= 0.4"
}
},
"node_modules/html-encoding-sniffer": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/html-encoding-sniffer/-/html-encoding-sniffer-3.0.0.tgz",
@ -3089,6 +3247,14 @@
"semver": "bin/semver.js"
}
},
"node_modules/math-intrinsics": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
"integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
"engines": {
"node": ">= 0.4"
}
},
"node_modules/md5-hex": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/md5-hex/-/md5-hex-3.0.1.tgz",
@ -5665,6 +5831,15 @@
"integrity": "sha512-b6Ilus+c3RrdDk+JhLKUAQfzzgLEPy6wcXqS7f/xe1EETvsDP6GORG7SFuOs6cID5YkqchW/LXZbX5bc8j7ZcQ==",
"dev": true
},
"call-bind-apply-helpers": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
"integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
"requires": {
"es-errors": "^1.3.0",
"function-bind": "^1.1.2"
}
},
"callsites": {
"version": "3.1.0",
"resolved": "https://registry.npmjs.org/callsites/-/callsites-3.1.0.tgz",
@ -5933,6 +6108,16 @@
"webidl-conversions": "^7.0.0"
}
},
"dunder-proto": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
"integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
"requires": {
"call-bind-apply-helpers": "^1.0.1",
"es-errors": "^1.3.0",
"gopd": "^1.2.0"
}
},
"element-plus": {
"version": "2.2.13",
"resolved": "https://registry.npmjs.org/element-plus/-/element-plus-2.2.13.tgz",
@ -5961,6 +6146,35 @@
"integrity": "sha512-o4q/dYJlmyjP2zfnaWDUC6A3BQFmVTX+tZPezK7k0GLSU9QYCauscf5Y+qcEPzKL+EixVouYDgLQK5H9GrLpkg==",
"dev": true
},
"es-define-property": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
"integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g=="
},
"es-errors": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
"integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw=="
},
"es-object-atoms": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
"integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
"requires": {
"es-errors": "^1.3.0"
}
},
"es-set-tostringtag": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/es-set-tostringtag/-/es-set-tostringtag-2.1.0.tgz",
"integrity": "sha512-j6vWzfrGVfyXxge+O0x5sh6cvxAog0a/4Rdd2K36zCMV5eJ+/+tOAngRO8cODMNWbVRdVlmGZQL2YS3yR8bIUA==",
"requires": {
"es-errors": "^1.3.0",
"get-intrinsic": "^1.2.6",
"has-tostringtag": "^1.0.2",
"hasown": "^2.0.2"
}
},
"esbuild": {
"version": "0.18.20",
"resolved": "https://registry.npmjs.org/esbuild/-/esbuild-0.18.20.tgz",
@ -6366,12 +6580,14 @@
"integrity": "sha512-wWN62YITEaOpSK584EZXJafH1AGpO8RVgElfkuXbTOrPX4fIfOyEpW/CsiNd8JdYrAoOvafRTOEnvsO++qCqFA=="
},
"form-data": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.0.tgz",
"integrity": "sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==",
"version": "4.0.4",
"resolved": "https://registry.npmjs.org/form-data/-/form-data-4.0.4.tgz",
"integrity": "sha512-KrGhL9Q4zjj0kiUt5OO4Mr/A/jlI2jDYs5eHBpYHPcBEVSiipAvn2Ko2HnPe20rmcuuvMHNdZFp+4IlGTMF0Ow==",
"requires": {
"asynckit": "^0.4.0",
"combined-stream": "^1.0.8",
"es-set-tostringtag": "^2.1.0",
"hasown": "^2.0.2",
"mime-types": "^2.1.12"
}
},
@ -6388,6 +6604,11 @@
"dev": true,
"optional": true
},
"function-bind": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
"integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA=="
},
"functional-red-black-tree": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/functional-red-black-tree/-/functional-red-black-tree-1.0.1.tgz",
@ -6400,6 +6621,32 @@
"integrity": "sha512-8vXOvuE167CtIc3OyItco7N/dpRtBbYOsPsXCz7X/PMnlGjYjSGuZJgM1Y7mmew7BKf9BqvLX2tnOVy1BBUsxQ==",
"dev": true
},
"get-intrinsic": {
"version": "1.3.0",
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
"integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
"requires": {
"call-bind-apply-helpers": "^1.0.2",
"es-define-property": "^1.0.1",
"es-errors": "^1.3.0",
"es-object-atoms": "^1.1.1",
"function-bind": "^1.1.2",
"get-proto": "^1.0.1",
"gopd": "^1.2.0",
"has-symbols": "^1.1.0",
"hasown": "^2.0.2",
"math-intrinsics": "^1.1.0"
}
},
"get-proto": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
"integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
"requires": {
"dunder-proto": "^1.0.1",
"es-object-atoms": "^1.0.0"
}
},
"get-stdin": {
"version": "8.0.0",
"resolved": "https://registry.npmjs.org/get-stdin/-/get-stdin-8.0.0.tgz",
@ -6451,6 +6698,11 @@
"slash": "^3.0.0"
}
},
"gopd": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
"integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg=="
},
"grapheme-splitter": {
"version": "1.0.4",
"resolved": "https://registry.npmjs.org/grapheme-splitter/-/grapheme-splitter-1.0.4.tgz",
@ -6463,6 +6715,27 @@
"integrity": "sha512-EykJT/Q1KjTWctppgIAgfSO0tKVuZUjhgMr17kqTumMl6Afv3EISleU7qZUzoXDFTAHTDC4NOoG/ZxU3EvlMPQ==",
"dev": true
},
"has-symbols": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
"integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ=="
},
"has-tostringtag": {
"version": "1.0.2",
"resolved": "https://registry.npmjs.org/has-tostringtag/-/has-tostringtag-1.0.2.tgz",
"integrity": "sha512-NqADB8VjPFLM2V0VvHUewwwsw0ZWBaIdgo+ieHtK3hasLz4qeCRjYcqfB6AQrBggRKppKF8L52/VqdVsO47Dlw==",
"requires": {
"has-symbols": "^1.0.3"
}
},
"hasown": {
"version": "2.0.2",
"resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
"integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
"requires": {
"function-bind": "^1.1.2"
}
},
"html-encoding-sniffer": {
"version": "3.0.0",
"resolved": "https://registry.npmjs.org/html-encoding-sniffer/-/html-encoding-sniffer-3.0.0.tgz",
@ -6782,6 +7055,11 @@
}
}
},
"math-intrinsics": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
"integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g=="
},
"md5-hex": {
"version": "3.0.1",
"resolved": "https://registry.npmjs.org/md5-hex/-/md5-hex-3.0.1.tgz",

pom.xml

@ -134,7 +134,7 @@
<commons-codec.version>1.17.1</commons-codec.version>
<commons-collections.version>3.2.2</commons-collections.version>
<commons-io.version>2.16.1</commons-io.version>
<commons-lang3.version>3.17.0</commons-lang3.version>
<commons-lang3.version>3.18.0</commons-lang3.version>
<delta.artifact>delta-spark_${scala.binary.version}</delta.artifact>
<delta.version>3.2.0</delta.version>
<failsafe.verion>3.3.2</failsafe.verion>
@ -172,7 +172,7 @@
<jetty.version>9.4.57.v20241219</jetty.version>
<jline.version>2.14.6</jline.version>
<junit.version>4.13.2</junit.version>
<kafka.version>3.5.2</kafka.version>
<kafka.version>3.9.1</kafka.version>
<kubernetes-client.version>6.13.1</kubernetes-client.version>
<kyuubi-relocated.version>0.4.1</kyuubi-relocated.version>
<kyuubi-relocated-zookeeper.artifacts>kyuubi-relocated-zookeeper-34</kyuubi-relocated-zookeeper.artifacts>
@ -2042,18 +2042,19 @@
<module>extensions/spark/kyuubi-spark-connector-hive</module>
</modules>
<properties>
<spark.version>4.0.0-preview2</spark.version>
<spark.version>4.0.0</spark.version>
<spark.binary.version>4.0</spark.binary.version>
<antlr4.version>4.13.1</antlr4.version>
<!-- TODO: update once Delta support Spark 4.0.0-preview2 -->
<!-- TODO: update once Delta supports Spark 4.0 -->
<delta.version>4.0.0rc1</delta.version>
<delta.artifact>delta-spark_${scala.binary.version}</delta.artifact>
<!-- TODO: update once Hudi support Spark 4.0 -->
<hudi.artifact>hudi-spark3.5-bundle_${scala.binary.version}</hudi.artifact>
<!-- TODO: update once Iceberg support Spark 4.0 -->
<iceberg.artifact>iceberg-spark-runtime-3.5_${scala.binary.version}</iceberg.artifact>
<!-- TODO: update once Paimon support Spark 4.0 -->
<paimon.artifact>paimon-spark-3.5</paimon.artifact>
<!-- TODO: update once Paimon supports Spark 4.0.
paimon-spark-3.5 contains Scala 2.12 classes that conflict with Scala 2.13 -->
<paimon.artifact>paimon-common</paimon.artifact>
<maven.plugin.scalatest.exclude.tags>org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.PaimonTest,org.apache.kyuubi.tags.HudiTest</maven.plugin.scalatest.exclude.tags>
<spark.archive.name>spark-${spark.version}-bin-hadoop3.tgz</spark.archive.name>
</properties>