[KYUUBI #5284] Support Hudi Alter Table Command in Authz
### _Why are the changes needed?_

To close #5284. Hudi is also a commonly used data format; since Kyuubi already supports Iceberg and Delta, it should support Hudi as well. This PR adds authz support for Hudi ALTER TABLE commands, using Hudi 0.14.0 by default. Spark 3.1 through 3.4 are supported; Hudi does not support Spark 3.5 yet:

- [x] Spark 3.1
- [x] Spark 3.2
- [x] Spark 3.3
- [x] Spark 3.4
- [ ] Spark 3.5

Also, since Hudi only supports Scala 2.12, the Hudi dependency is placed in a separate Maven profile so it is not imported when Scala 2.13 is enabled.

### _How was this patch tested?_

- [x] Add some test cases that check the changes thoroughly including negative and positive cases if possible
- [ ] Add screenshots for manual tests if appropriate
- [ ] [Run test](https://kyuubi.readthedocs.io/en/master/contributing/code/testing.html#running-tests) locally before make a pull request

### _Was this patch authored or co-authored using generative AI tooling?_

No

Closes #5287 from AngersZhuuuu/KYUUBI-5284.
Closes #5284

f171e11af [Angerszhuuuu] Update pom.xml
3f57a3dc5 [Angerszhuuuu] follow comment
f6c764028 [Angerszhuuuu] follow comment
51797e25c [Angerszhuuuu] trigger
b3c059af9 [Angerszhuuuu] Update HudiCatalogRangerSparkExtensionSuite.scala
3510e7601 [liangbowen] remove scope in dependencyManagement
14ea0d498 [liangbowen] change to use `spark.binary.version` for hudi dependency by default
354260eb0 [liangbowen] remove the abbreviation tite
658bddbab [liangbowen] remove clarification and use ALTERTABLE_PROPERTIES for opType
150edcd40 [Angerszhuuuu] update
30c417b19 [Angerszhuuuu] trigger
56e5cb17b [Angerszhuuuu] Update HudiCatalogRangerSparkExtensionSuite.scala
fe9b75270 [Angerszhuuuu] update
888943831 [Angerszhuuuu] Update HudiCatalogRangerSparkExtensionSuite.scala
db749a277 [Angerszhuuuu] update
9b09e78c2 [Angerszhuuuu] Update HudiCommands.scala
87de62e52 [Angerszhuuuu] follow comment
2d551d112 [Angerszhuuuu] Update master.yml
89082e06b [Angerszhuuuu] Update master.yml
7c7846378 [Angerszhuuuu] Merge branch 'KYUUBI-5284' of https://github.com/AngersZhuuuu/incubator-kyuubi into KYUUBI-5284
d32ca9839 [Angerszhuuuu] Update master.yml
ec43e2a7b [Angerszhuuuu] Merge branch 'master' into KYUUBI-5284
b3611fd3e [Angerszhuuuu] update
2a0dfa74f [Angerszhuuuu] Update AuthZUtils.scala
45ee9e251 [Angerszhuuuu] update
0560a5e14 [Angerszhuuuu] Update pom.xml
97c50f622 [Angerszhuuuu] update
f57ee0093 [Angerszhuuuu] Update table_command_spec.json
fb72197e6 [Angerszhuuuu] update
2154cf928 [Angerszhuuuu] trigger
44469359f [Angerszhuuuu] trigger
b0e768cb8 [Angerszhuuuu] Update HoodieCatalogRangerSparkExtensionSuite.scala
83795ed63 [Angerszhuuuu] Update pom.xml
eed190f92 [Angerszhuuuu] update
361660145 [Angerszhuuuu] update
1ed1f3ab6 [Angerszhuuuu] Update
7ee3c7dd5 [Angerszhuuuu] Merge branch 'KYUUBI-5284' of https://github.com/AngersZhuuuu/incubator-kyuubi into KYUUBI-5284
ee0916f63 [Angerszhuuuu] Update HoodieCatalogRangerSparkExtensionSuite.scala
010260fa4 [Angerszhuuuu] Merge branch 'master' into KYUUBI-5284
c11d02def [Angerszhuuuu] update
b84f91f65 [Angerszhuuuu] update
42fbb0ffa [Angerszhuuuu] Update HoodieCatalogRangerSparkExtensionSuite.scala
c1346adb1 [Angerszhuuuu] update
2ec63ae94 [Angerszhuuuu] Update pom.xml
39bce7468 [Angerszhuuuu] update
c70b0ea2f [Angerszhuuuu] Update pom.xml
e1d85ff77 [Angerszhuuuu] Update pom.xml
59012ac25 [Angerszhuuuu] Update pom.xml
a46de65b5 [Angerszhuuuu] Update HoodieTest.java
b8173b893 [Angerszhuuuu] update
055713329 [Angerszhuuuu] Update table_command_spec.json
d7b21e820 [Angerszhuuuu] Update HoodieCatalogRangerSparkExtensionSuite.scala
0a93ff794 [Angerszhuuuu] [KYUUBI #5284] Kyuubi authz support Hoodie Alter Table Command

Lead-authored-by: Angerszhuuuu <angers.zhu@gmail.com>
Co-authored-by: liangbowen <liangbowen@gf.com.cn>
Signed-off-by: Kent Yao <yao@apache.org>
This commit is contained in: parent c6113c3dc5, commit 143b26b6e8.

8 changes: `.github/workflows/master.yml` (vendored)
```diff
@@ -60,17 +60,17 @@ jobs:
       - java: 8
         spark: '3.4'
         spark-archive: '-Dspark.archive.mirror=https://archive.apache.org/dist/spark/spark-3.1.3 -Dspark.archive.name=spark-3.1.3-bin-hadoop3.2.tgz -Pzookeeper-3.6'
-        exclude-tags: '-Dmaven.plugin.scalatest.exclude.tags=org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.SparkLocalClusterTest'
+        exclude-tags: '-Dmaven.plugin.scalatest.exclude.tags=org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.HudiTest,org.apache.kyuubi.tags.SparkLocalClusterTest'
         comment: 'verify-on-spark-3.1-binary'
       - java: 8
         spark: '3.4'
         spark-archive: '-Dspark.archive.mirror=https://archive.apache.org/dist/spark/spark-3.2.4 -Dspark.archive.name=spark-3.2.4-bin-hadoop3.2.tgz -Pzookeeper-3.6'
-        exclude-tags: '-Dmaven.plugin.scalatest.exclude.tags=org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.SparkLocalClusterTest'
+        exclude-tags: '-Dmaven.plugin.scalatest.exclude.tags=org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.HudiTest,org.apache.kyuubi.tags.SparkLocalClusterTest'
         comment: 'verify-on-spark-3.2-binary'
       - java: 8
         spark: '3.4'
         spark-archive: '-Dspark.archive.mirror=https://archive.apache.org/dist/spark/spark-3.3.3 -Dspark.archive.name=spark-3.3.3-bin-hadoop3.tgz -Pzookeeper-3.6'
-        exclude-tags: '-Dmaven.plugin.scalatest.exclude.tags=org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.SparkLocalClusterTest'
+        exclude-tags: '-Dmaven.plugin.scalatest.exclude.tags=org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.HudiTest,org.apache.kyuubi.tags.SparkLocalClusterTest'
         comment: 'verify-on-spark-3.3-binary'
       - java: 8
         spark: '3.4'
@@ -108,7 +108,7 @@ jobs:
         run: |
           TEST_MODULES="dev/kyuubi-codecov"
           ./build/mvn clean install ${MVN_OPT} -pl ${TEST_MODULES} -am \
-            -Pspark-${{ matrix.spark }} ${{ matrix.spark-archive }} ${{ matrix.exclude-tags }}
+            -Pspark-${{ matrix.spark }} -Pspark-authz-hudi-test ${{ matrix.spark-archive }} ${{ matrix.exclude-tags }}
       - name: Code coverage
         if: |
           matrix.java == 8 &&
```
```diff
@@ -336,6 +336,23 @@
   </build>

   <profiles>
+    <!--
+     The spark-authz-hudi-test profile is added here to avoid importing Apache Hudi when Scala 2.13 is enabled.
+     Remove this profile once Apache Hudi supports Scala 2.13.
+     https://issues.apache.org/jira/browse/HUDI-6296
+    -->
+    <profile>
+      <id>spark-authz-hudi-test</id>
+      <dependencies>
+        <dependency>
+          <groupId>org.apache.hudi</groupId>
+          <artifactId>hudi-spark${hudi.spark.binary.version}-bundle_${scala.binary.version}</artifactId>
+          <version>${hudi.version}</version>
+          <scope>test</scope>
+        </dependency>
+      </dependencies>
+    </profile>
+
     <profile>
       <id>gen-policy</id>
       <build>
```
```diff
@@ -1409,4 +1409,101 @@
     "fieldName" : "query",
     "fieldExtractor" : "LogicalPlanQueryExtractor"
   } ]
+}, {
+  "classname" : "org.apache.spark.sql.hudi.command.AlterHoodieTableAddColumnsCommand",
+  "tableDescs" : [ {
+    "fieldName" : "tableId",
+    "fieldExtractor" : "TableIdentifierTableExtractor",
+    "columnDesc" : {
+      "fieldName" : "colsToAdd",
+      "fieldExtractor" : "StructFieldSeqColumnExtractor"
+    },
+    "actionTypeDesc" : null,
+    "tableTypeDesc" : null,
+    "catalogDesc" : null,
+    "isInput" : false,
+    "setCurrentDatabaseIfMissing" : false
+  } ],
+  "opType" : "ALTERTABLE_ADDCOLS",
+  "queryDescs" : [ ]
+}, {
+  "classname" : "org.apache.spark.sql.hudi.command.AlterHoodieTableChangeColumnCommand",
+  "tableDescs" : [ {
+    "fieldName" : "tableIdentifier",
+    "fieldExtractor" : "TableIdentifierTableExtractor",
+    "columnDesc" : {
+      "fieldName" : "columnName",
+      "fieldExtractor" : "StringColumnExtractor"
+    },
+    "actionTypeDesc" : null,
+    "tableTypeDesc" : null,
+    "catalogDesc" : null,
+    "isInput" : false,
+    "setCurrentDatabaseIfMissing" : false
+  } ],
+  "opType" : "ALTERTABLE_REPLACECOLS",
+  "queryDescs" : [ ]
+}, {
+  "classname" : "org.apache.spark.sql.hudi.command.AlterHoodieTableDropPartitionCommand",
+  "tableDescs" : [ {
+    "fieldName" : "tableIdentifier",
+    "fieldExtractor" : "TableIdentifierTableExtractor",
+    "columnDesc" : {
+      "fieldName" : "partitionSpecs",
+      "fieldExtractor" : "PartitionSeqColumnExtractor"
+    },
+    "actionTypeDesc" : null,
+    "tableTypeDesc" : null,
+    "catalogDesc" : null,
+    "isInput" : false,
+    "setCurrentDatabaseIfMissing" : false
+  } ],
+  "opType" : "ALTERTABLE_DROPPARTS",
+  "queryDescs" : [ ]
+}, {
+  "classname" : "org.apache.spark.sql.hudi.command.AlterHoodieTableRenameCommand",
+  "tableDescs" : [ {
+    "fieldName" : "oldName",
+    "fieldExtractor" : "TableIdentifierTableExtractor",
+    "columnDesc" : null,
+    "actionTypeDesc" : null,
+    "tableTypeDesc" : {
+      "fieldName" : "oldName",
+      "fieldExtractor" : "TableIdentifierTableTypeExtractor",
+      "skipTypes" : [ "TEMP_VIEW" ]
+    },
+    "catalogDesc" : null,
+    "isInput" : false,
+    "setCurrentDatabaseIfMissing" : false
+  } ],
+  "opType" : "ALTERTABLE_RENAME",
+  "queryDescs" : [ ]
+}, {
+  "classname" : "org.apache.spark.sql.hudi.command.AlterTableCommand",
+  "tableDescs" : [ {
+    "fieldName" : "table",
+    "fieldExtractor" : "CatalogTableTableExtractor",
+    "columnDesc" : null,
+    "actionTypeDesc" : null,
+    "tableTypeDesc" : null,
+    "catalogDesc" : null,
+    "isInput" : false,
+    "setCurrentDatabaseIfMissing" : false
+  } ],
+  "opType" : "ALTERTABLE_PROPERTIES",
+  "queryDescs" : [ ]
+}, {
+  "classname" : "org.apache.spark.sql.hudi.command.Spark31AlterTableCommand",
+  "tableDescs" : [ {
+    "fieldName" : "table",
+    "fieldExtractor" : "CatalogTableTableExtractor",
+    "columnDesc" : null,
+    "actionTypeDesc" : null,
+    "tableTypeDesc" : null,
+    "catalogDesc" : null,
+    "isInput" : false,
+    "setCurrentDatabaseIfMissing" : false
+  } ],
+  "opType" : "ALTERTABLE_PROPERTIES",
+  "queryDescs" : [ ]
 } ]
```
```diff
@@ -86,6 +86,12 @@ private[authz] object AuthZUtils {
   lazy val SPARK_RUNTIME_VERSION: SemanticVersion = SemanticVersion(SPARK_VERSION)
   lazy val isSparkV32OrGreater: Boolean = SPARK_RUNTIME_VERSION >= "3.2"
   lazy val isSparkV33OrGreater: Boolean = SPARK_RUNTIME_VERSION >= "3.3"
   lazy val isSparkV34OrGreater: Boolean = SPARK_RUNTIME_VERSION >= "3.4"
+  lazy val isSparkV35OrGreater: Boolean = SPARK_RUNTIME_VERSION >= "3.5"
+
+  lazy val SCALA_RUNTIME_VERSION: SemanticVersion =
+    SemanticVersion(scala.util.Properties.versionNumberString)
+  lazy val isScalaV213: Boolean = SCALA_RUNTIME_VERSION >= "2.13"
+
   def quoteIfNeeded(part: String): String = {
     if (part.matches("[a-zA-Z0-9_]+") && !part.matches("\\d+")) {
```
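The version gates above compare only the major and minor components of the runtime version. A minimal self-contained sketch of that comparison logic (a hypothetical `VersionGate.atLeast` helper, not Kyuubi's actual `SemanticVersion`) could look like:

```scala
// Hypothetical stand-in for the major.minor comparison behind the
// isSparkVxyOrGreater / isScalaV213 flags (illustration only).
object VersionGate {
  // Extract (major, minor) from a version string like "3.4.1" or "2.13".
  private def majorMinor(v: String): (Int, Int) = {
    val parts = v.split("\\.")
    (parts(0).toInt, parts(1).takeWhile(_.isDigit).toInt)
  }

  // True when the runtime version is >= the target version on major.minor.
  def atLeast(runtime: String, target: String): Boolean = {
    val (rMaj, rMin) = majorMinor(runtime)
    val (tMaj, tMin) = majorMinor(target)
    rMaj > tMaj || (rMaj == tMaj && rMin >= tMin)
  }
}
```

Patch-level components (the `.1` in `3.4.1`) are deliberately ignored, matching how the flags gate features on Spark or Scala minor releases.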
```diff
@@ -40,6 +40,7 @@ object RangerTestNamespace {
   val defaultDb = "default"
   val sparkCatalog = "spark_catalog"
   val icebergNamespace = "iceberg_ns"
+  val hudiNamespace = "hudi_ns"
   val namespace1 = "ns1"
   val namespace2 = "ns2"
 }
```
@@ -0,0 +1,82 @@ (new file)

```scala
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements.  See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License.  You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.apache.kyuubi.plugin.spark.authz.gen

import org.apache.kyuubi.plugin.spark.authz.OperationType._
import org.apache.kyuubi.plugin.spark.authz.serde._
import org.apache.kyuubi.plugin.spark.authz.serde.TableType._

object HudiCommands {
  val AlterHoodieTableAddColumnsCommand = {
    val cmd = "org.apache.spark.sql.hudi.command.AlterHoodieTableAddColumnsCommand"
    val columnDesc = ColumnDesc("colsToAdd", classOf[StructFieldSeqColumnExtractor])
    val tableDesc = TableDesc("tableId", classOf[TableIdentifierTableExtractor], Some(columnDesc))
    TableCommandSpec(cmd, Seq(tableDesc), ALTERTABLE_ADDCOLS)
  }

  val AlterHoodieTableChangeColumnCommand = {
    val cmd = "org.apache.spark.sql.hudi.command.AlterHoodieTableChangeColumnCommand"
    val columnDesc = ColumnDesc("columnName", classOf[StringColumnExtractor])
    val tableDesc =
      TableDesc("tableIdentifier", classOf[TableIdentifierTableExtractor], Some(columnDesc))
    TableCommandSpec(cmd, Seq(tableDesc), ALTERTABLE_REPLACECOLS)
  }

  val AlterHoodieTableDropPartitionCommand = {
    val cmd = "org.apache.spark.sql.hudi.command.AlterHoodieTableDropPartitionCommand"
    val columnDesc = ColumnDesc("partitionSpecs", classOf[PartitionSeqColumnExtractor])
    val tableDesc =
      TableDesc("tableIdentifier", classOf[TableIdentifierTableExtractor], Some(columnDesc))
    TableCommandSpec(cmd, Seq(tableDesc), ALTERTABLE_DROPPARTS)
  }

  val AlterHoodieTableRenameCommand = {
    val cmd = "org.apache.spark.sql.hudi.command.AlterHoodieTableRenameCommand"
    val oldTableTableTypeDesc =
      TableTypeDesc(
        "oldName",
        classOf[TableIdentifierTableTypeExtractor],
        Seq(TEMP_VIEW))
    val oldTableD = TableDesc(
      "oldName",
      classOf[TableIdentifierTableExtractor],
      tableTypeDesc = Some(oldTableTableTypeDesc))

    TableCommandSpec(cmd, Seq(oldTableD), ALTERTABLE_RENAME)
  }

  val AlterTableCommand = {
    val cmd = "org.apache.spark.sql.hudi.command.AlterTableCommand"
    val tableDesc = TableDesc("table", classOf[CatalogTableTableExtractor], None)
    TableCommandSpec(cmd, Seq(tableDesc), ALTERTABLE_PROPERTIES)
  }

  val Spark31AlterTableCommand = {
    val cmd = "org.apache.spark.sql.hudi.command.Spark31AlterTableCommand"
    val tableDesc = TableDesc("table", classOf[CatalogTableTableExtractor], None)
    TableCommandSpec(cmd, Seq(tableDesc), ALTERTABLE_PROPERTIES)
  }

  val data: Array[TableCommandSpec] = Array(
    AlterHoodieTableAddColumnsCommand,
    AlterHoodieTableChangeColumnCommand,
    AlterHoodieTableDropPartitionCommand,
    AlterHoodieTableRenameCommand,
    AlterTableCommand,
    Spark31AlterTableCommand)
}
```
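Each of these specs is serialized into an entry of `table_command_spec.json`. A simplified, self-contained sketch of that mapping (stand-in case classes and a hand-rolled renderer, not Kyuubi's actual serde) might look like:

```scala
// Hypothetical, simplified stand-ins for Kyuubi's spec types (illustration only).
case class ColumnDesc(fieldName: String, fieldExtractor: String)
case class TableDesc(
    fieldName: String,
    fieldExtractor: String,
    columnDesc: Option[ColumnDesc] = None)
case class TableCommandSpec(classname: String, tableDescs: Seq[TableDesc], opType: String)

object SpecJsonSketch {
  // Render one spec roughly in the shape of a table_command_spec.json entry.
  def toJson(spec: TableCommandSpec): String = {
    val tables = spec.tableDescs.map { t =>
      val col = t.columnDesc
        .map(c =>
          s""""columnDesc" : { "fieldName" : "${c.fieldName}", "fieldExtractor" : "${c.fieldExtractor}" }""")
        .getOrElse(""""columnDesc" : null""")
      s"""{ "fieldName" : "${t.fieldName}", "fieldExtractor" : "${t.fieldExtractor}", $col }"""
    }.mkString("[ ", ", ", " ]")
    s"""{ "classname" : "${spec.classname}", "tableDescs" : $tables, "opType" : "${spec.opType}" }"""
  }
}
```

At authz check time, the plugin looks up the runtime plan's class name in this spec table and applies the named extractors to pull out the table and columns to authorize; the sketch only shows the spec-to-JSON direction.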
```diff
@@ -43,7 +43,7 @@ class JsonSpecFileGenerator extends AnyFunSuite {
   // scalastyle:on
   test("check spec json files") {
     writeCommandSpecJson("database", DatabaseCommands.data)
-    writeCommandSpecJson("table", TableCommands.data ++ IcebergCommands.data)
+    writeCommandSpecJson("table", TableCommands.data ++ IcebergCommands.data ++ HudiCommands.data)
     writeCommandSpecJson("function", FunctionCommands.data)
     writeCommandSpecJson("scan", Scans.data)
   }
```
@@ -0,0 +1,132 @@ (new file)

```scala
/* Apache License, Version 2.0 header */
package org.apache.kyuubi.plugin.spark.authz.ranger

import org.apache.spark.SparkConf
import org.scalatest.Outcome

import org.apache.kyuubi.Utils
import org.apache.kyuubi.plugin.spark.authz.AccessControlException
import org.apache.kyuubi.plugin.spark.authz.RangerTestNamespace._
import org.apache.kyuubi.plugin.spark.authz.RangerTestUsers._
import org.apache.kyuubi.plugin.spark.authz.util.AuthZUtils._
import org.apache.kyuubi.tags.HudiTest
import org.apache.kyuubi.util.AssertionUtils.interceptContains

/**
 * Tests for RangerSparkExtensionSuite on Hudi SQL.
 * Running this suite requires enabling the `spark-authz-hudi-test` profile.
 */
@HudiTest
class HudiCatalogRangerSparkExtensionSuite extends RangerSparkExtensionSuite {
  override protected val catalogImpl: String = "hive"
  // TODO: Apache Hudi does not support Spark 3.5 or Scala 2.13 yet;
  // update this once it does.
  private def isSupportedVersion = !isSparkV35OrGreater && !isScalaV213

  override protected val sqlExtensions: String =
    if (isSupportedVersion) {
      "org.apache.spark.sql.hudi.HoodieSparkSessionExtension"
    } else {
      ""
    }

  override protected val extraSparkConf: SparkConf =
    new SparkConf()
      .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

  val namespace1 = hudiNamespace
  val table1 = "table1_hoodie"
  val table2 = "table2_hoodie"
  val outputTable1 = "outputTable_hoodie"

  override def withFixture(test: NoArgTest): Outcome = {
    assume(isSupportedVersion)
    test()
  }

  override def beforeAll(): Unit = {
    if (isSupportedVersion) {
      if (isSparkV32OrGreater) {
        spark.conf.set(
          s"spark.sql.catalog.$sparkCatalog",
          "org.apache.spark.sql.hudi.catalog.HoodieCatalog")
        spark.conf.set(s"spark.sql.catalog.$sparkCatalog.type", "hadoop")
        spark.conf.set(
          s"spark.sql.catalog.$sparkCatalog.warehouse",
          Utils.createTempDir("hudi-hadoop").toString)
      }
      super.beforeAll()
    }
  }

  override def afterAll(): Unit = {
    if (isSupportedVersion) {
      super.afterAll()
      spark.sessionState.catalog.reset()
      spark.sessionState.conf.clear()
    }
  }

  test("AlterTableCommand") {
    withCleanTmpResources(Seq((s"$namespace1.$table1", "table"), (namespace1, "database"))) {
      doAs(admin, sql(s"CREATE DATABASE IF NOT EXISTS $namespace1"))
      doAs(
        admin,
        sql(
          s"""
             |CREATE TABLE IF NOT EXISTS $namespace1.$table1(id int, name string, city string)
             |USING hudi
             |OPTIONS (
             | type = 'cow',
             | primaryKey = 'id',
             | 'hoodie.datasource.hive_sync.enable' = 'false'
             |)
             |PARTITIONED BY(city)
             |""".stripMargin))

      // AlterHoodieTableAddColumnsCommand
      interceptContains[AccessControlException](
        doAs(someone, sql(s"ALTER TABLE $namespace1.$table1 ADD COLUMNS(age int)")))(
        s"does not have [alter] privilege on [$namespace1/$table1/age]")

      // AlterHoodieTableChangeColumnCommand
      interceptContains[AccessControlException](
        doAs(someone, sql(s"ALTER TABLE $namespace1.$table1 CHANGE COLUMN id id bigint")))(
        s"does not have [alter] privilege" +
          s" on [$namespace1/$table1/id]")

      // AlterHoodieTableDropPartitionCommand
      interceptContains[AccessControlException](
        doAs(someone, sql(s"ALTER TABLE $namespace1.$table1 DROP PARTITION (city='test')")))(
        s"does not have [alter] privilege" +
          s" on [$namespace1/$table1/city]")

      // AlterHoodieTableRenameCommand
      interceptContains[AccessControlException](
        doAs(someone, sql(s"ALTER TABLE $namespace1.$table1 RENAME TO $namespace1.$table2")))(
        s"does not have [alter] privilege" +
          s" on [$namespace1/$table1]")

      // AlterTableCommand && Spark31AlterTableCommand
      sql("set hoodie.schema.on.read.enable=true")
      interceptContains[AccessControlException](
        doAs(someone, sql(s"ALTER TABLE $namespace1.$table1 ADD COLUMNS(age int)")))(
        s"does not have [alter] privilege on [$namespace1/$table1]")
    }
  }
}
```
@@ -0,0 +1,29 @@ (new file)

```java
/* Apache License, Version 2.0 header */
package org.apache.kyuubi.tags;

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import org.scalatest.TagAnnotation;

@TagAnnotation
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.METHOD, ElementType.TYPE})
public @interface HudiTest {}
```
16 changes: `pom.xml`
```diff
@@ -158,6 +158,8 @@
     <hive.archive.download.skip>false</hive.archive.download.skip>
     <httpclient.version>4.5.14</httpclient.version>
     <httpcore.version>4.4.16</httpcore.version>
+    <hudi.version>0.14.0</hudi.version>
+    <hudi.spark.binary.version>${spark.binary.version}</hudi.spark.binary.version>
     <iceberg.version>1.4.0</iceberg.version>
     <jackson.version>2.15.0</jackson.version>
     <jakarta.servlet-api.version>4.0.4</jakarta.servlet-api.version>
@@ -234,7 +236,7 @@
     <maven.plugin.frontend.version>1.12.1</maven.plugin.frontend.version>
     <maven.plugin.scala.version>4.8.0</maven.plugin.scala.version>
     <maven.plugin.scalatest.version>2.2.0</maven.plugin.scalatest.version>
-    <maven.plugin.scalatest.exclude.tags>org.scalatest.tags.Slow,org.apache.kyuubi.tags.IcebergTest</maven.plugin.scalatest.exclude.tags>
+    <maven.plugin.scalatest.exclude.tags>org.scalatest.tags.Slow,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.HudiTest</maven.plugin.scalatest.exclude.tags>
     <maven.plugin.scalatest.include.tags></maven.plugin.scalatest.include.tags>
     <maven.plugin.scalatest.debug.enabled>false</maven.plugin.scalatest.debug.enabled>
     <maven.plugin.spotless.version>2.30.0</maven.plugin.spotless.version>
@@ -1475,6 +1477,12 @@
       <artifactId>threeten-extra</artifactId>
       <version>${threeten.version}</version>
     </dependency>
+
+    <dependency>
+      <groupId>org.apache.hudi</groupId>
+      <artifactId>hudi-spark${hudi.spark.binary.version}-bundle_${scala.binary.version}</artifactId>
+      <version>${hudi.version}</version>
+    </dependency>
   </dependencies>
 </dependencyManagement>
@@ -2239,7 +2247,7 @@
     <delta.version>2.4.0</delta.version>
     <spark.version>3.4.1</spark.version>
     <spark.binary.version>3.4</spark.binary.version>
-    <maven.plugin.scalatest.exclude.tags>org.scalatest.tags.Slow,org.apache.kyuubi.tags.IcebergTest</maven.plugin.scalatest.exclude.tags>
+    <maven.plugin.scalatest.exclude.tags>org.scalatest.tags.Slow,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.HudiTest</maven.plugin.scalatest.exclude.tags>
   </properties>
 </profile>
@@ -2250,6 +2258,8 @@
   </modules>
   <properties>
     <delta.version>2.4.0</delta.version>
+    <!-- Remove this when Hudi supports Spark 3.5 -->
+    <hudi.spark.binary.version>3.4</hudi.spark.binary.version>
     <spark.version>3.5.0</spark.version>
     <spark.binary.version>3.5</spark.binary.version>
     <maven.plugin.scalatest.exclude.tags>org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.PySparkTest</maven.plugin.scalatest.exclude.tags>
@@ -2260,7 +2270,7 @@
   <id>spark-master</id>
   <properties>
     <spark.version>4.0.0-SNAPSHOT</spark.version>
-    <maven.plugin.scalatest.exclude.tags>org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.PySparkTest</maven.plugin.scalatest.exclude.tags>
+    <maven.plugin.scalatest.exclude.tags>org.scalatest.tags.Slow,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.HudiTest,org.apache.kyuubi.tags.PySparkTest</maven.plugin.scalatest.exclude.tags>
   </properties>
   <repositories>
     <repository>
```