[KYUUBI #5830] Test Hive engine with CDH Hive 2.1.1-cdh6.3.2
# 🔍 Description

## Issue References 🔗

CDH Hive 2.1.1-cdh6.3.2 is the latest free version of CDH and is widely adopted, so it would be great if the Kyuubi Hive engine supported this version.

## Describe Your Solution 🔧

The Kyuubi Hive engine already works with CDH Hive 2.1.1-cdh6.3.2; this pull request adds an integration test and fixes one test case.

## Types of changes 🔖

- [ ] Bugfix (non-breaking change which fixes an issue)
- [x] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)

## Test Plan 🧪

#### Behavior Without This Pull Request ⚰️

#### Behavior With This Pull Request 🎉

#### Related Unit Tests

---

# Checklists

## 📝 Author Self Checklist

- [x] My code follows the [style guidelines](https://kyuubi.readthedocs.io/en/master/contributing/code/style.html) of this project
- [x] I have performed a self-review
- [ ] I have commented my code, particularly in hard-to-understand areas
- [x] I have made corresponding changes to the documentation
- [x] My changes generate no new warnings
- [x] I have added tests that prove my fix is effective or that my feature works
- [x] New and existing unit tests pass locally with my changes
- [x] This patch was not authored or co-authored using [Generative Tooling](https://www.apache.org/legal/generative-tooling.html)

## 📝 Committer Pre-Merge Checklist

- [x] Pull request title is okay.
- [x] No license issues.
- [x] Milestone correctly set?
- [x] Test coverage is ok
- [x] Assignees are selected.
- [x] Minimum number of approvals
- [x] No changes are requested

**Be nice. Be informative.**

Closes #5830 from pan3793/hive-211-cdh6.

Closes #5830

afa80bd82 [Cheng Pan] nit
89c9ae96e [Cheng Pan] nit
e402e49cd [Cheng Pan] nit
3c126a2a6 [Cheng Pan] Test Hive engine with CDH Hive 2.1.1-cdh6.3.2

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
parent 4463cc8f97
commit b594105f14

Changed file: `.github/workflows/master.yml` (vendored)
```diff
@@ -248,6 +248,9 @@ jobs:
           - java: 8
             hive-archive: '-Dhive.archive.mirror=https://archive.apache.org/dist/hive/hive-2.3.9 -Dhive.archive.name=apache-hive-2.3.9-bin.tar.gz'
             comment: 'verify-on-hive-2.3-binary'
+          - java: 8
+            hive-archive: '-Dhive.archive.mirror=https://github.com/pan3793/cdh-hive/releases/download/cdh6.3.2-release -Dhive.archive.name=apache-hive-2.1.1-cdh6.3.2-bin.tar.gz'
+            comment: 'verify-on-hive-2.1-cdh6-binary'
     steps:
       - uses: actions/checkout@v4
       - name: Tune Runner VM
```
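The `hive-archive` entries above pass two Maven properties that tell the integration test where to fetch a Hive binary distribution. Presumably the mirror and archive name simply combine into a `<mirror>/<name>` download URL; a minimal sketch of that assumption (not Kyuubi build code):

```python
def hive_archive_url(mirror: str, name: str) -> str:
    """Combine hive.archive.mirror and hive.archive.name into a download URL
    (assumed behavior, illustrating the CI matrix properties above)."""
    return f"{mirror.rstrip('/')}/{name}"

# The new CDH 6 matrix entry resolves to a pre-built Hive 2.1.1-cdh6.3.2 tarball.
cdh_url = hive_archive_url(
    "https://github.com/pan3793/cdh-hive/releases/download/cdh6.3.2-release",
    "apache-hive-2.1.1-cdh6.3.2-bin.tar.gz",
)
```

Pointing the mirror at a GitHub release works because the CDH 6 Hive binaries are not published to the Apache archive used by the existing 2.3.9 entry.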
```diff
@@ -36,23 +36,23 @@ For quick start deployment, we need to prepare the following stuffs:
 These essential components are JVM-based applications. So, the JRE needs to be
 pre-installed and the ``JAVA_HOME`` is correctly set to each component.
 
-================ ============ =============== ===========================================
-Component        Role         Version         Remarks
-================ ============ =============== ===========================================
-**Java**         JRE          8/11/17         Officially released against JDK8
-**Kyuubi**       Gateway      \ |release| \   - Kyuubi Server
-                 Engine lib                   - Kyuubi Engine
-                 Beeline                      - Kyuubi Hive Beeline
-**Spark**        Engine       >=3.1           A Spark distribution
-**Flink**        Engine       1.16/1.17/1.18  A Flink distribution
-**Trino**        Engine       >=363           A Trino cluster
-**Doris**        Engine       N/A             A Doris cluster
-**Hive**         Engine       - 3.1.x         - A Hive distribution
-                 Metastore    - N/A           - An optional and external metadata store,
-                                                whose version is decided by engines
+================ ============ ==================== ===========================================
+Component        Role         Version              Remarks
+================ ============ ==================== ===========================================
+**Java**         JRE          8/11/17              Officially released against JDK8
+**Kyuubi**       Gateway      \ |release| \        - Kyuubi Server
+                 Engine lib                        - Kyuubi Engine
+                 Beeline                           - Kyuubi Hive Beeline
+**Spark**        Engine       3.1 to 3.5           A Spark distribution
+**Flink**        Engine       1.16/1.17/1.18       A Flink distribution
+**Trino**        Engine       >=363                A Trino cluster
+**Doris**        Engine       N/A                  A Doris cluster
+**Hive**         Engine       - 2.1-cdh6/2.3/3.1   - A Hive distribution
+                 Metastore    - N/A                - An optional and external metadata store,
+                                                     whose version is decided by engines
 **Zookeeper**    HA           >=3.4.x
-**Disk**         Storage      N/A             N/A
-================ ============ =============== ===========================================
+**Disk**         Storage      N/A                  N/A
+================ ============ ==================== ===========================================
 
 The other internal or external parts listed in the above sheet can be used individually
 or all together. For example, you can use Kyuubi, Spark and Flink to build a streaming
```
```diff
@@ -230,11 +230,12 @@ trait HiveEngineTests extends HiveJDBCTestHelper {
     assume(SystemUtils.isJavaVersionAtMost(JavaVersion.JAVA_1_8))
     withJdbcStatement() { statement =>
       val resultSet = statement.getConnection.getMetaData.getTableTypes
       // Hive3 removes support for INDEX_TABLE
-      val hive2Expected = Set("TABLE", "VIEW", "MATERIALIZED_VIEW", "INDEX_TABLE")
+      val hive2_1Expected = Set("TABLE", "VIEW", "INDEX_TABLE")
+      val hive2_3Expected = Set("TABLE", "VIEW", "MATERIALIZED_VIEW", "INDEX_TABLE")
       val hive3Expected = Set("TABLE", "VIEW", "MATERIALIZED_VIEW")
       val tableTypes = JdbcUtils.mapResultSet(resultSet) { rs => rs.getString(TABLE_TYPE) }.toSet
-      assert(tableTypes === hive2Expected || tableTypes === hive3Expected)
+      assert(tableTypes === hive2_1Expected || tableTypes === hive2_3Expected ||
+        tableTypes === hive3Expected)
     }
```
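The test fix splits the single Hive 2 expectation into two: Hive 2.1 reports no `MATERIALIZED_VIEW` table type, Hive 2.3 reports all four, and Hive 3 drops `INDEX_TABLE`. The version-to-expectation mapping behind the assertion can be sketched as follows (an illustration only, not Kyuubi code; the version keys are hypothetical labels):

```python
# Expected JDBC DatabaseMetaData.getTableTypes() results per Hive line,
# as encoded in the updated test assertion above.
EXPECTED_TABLE_TYPES = {
    "2.1-cdh6": {"TABLE", "VIEW", "INDEX_TABLE"},
    "2.3": {"TABLE", "VIEW", "MATERIALIZED_VIEW", "INDEX_TABLE"},
    "3.1": {"TABLE", "VIEW", "MATERIALIZED_VIEW"},  # Hive 3 removed INDEX_TABLE
}

def table_types_ok(observed: set[str]) -> bool:
    """Accept the observed set if it matches any supported Hive line,
    mirroring the chained === comparisons in the Scala test."""
    return any(observed == expected for expected in EXPECTED_TABLE_TYPES.values())
```

Accepting any of the three sets lets the same test suite pass against every Hive binary in the CI matrix without probing the server version.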