[KYUUBI #5192] Make Spark sql lineage plugin compilable on Scala 2.13

### _Why are the changes needed?_

- To make the Spark SQL lineage plugin compilable on Scala 2.13 with Spark 3.2/3.3/3.4 (Spark 3.1 does not support Scala 2.13), verified with:
  `mvn clean install -DskipTests -Pflink-provided,hive-provided,spark-provided -Pscala-2.13 -pl :kyuubi-spark-lineage_2.13 -Pspark-3.3`
- Fix the following type mismatch by explicitly converting the `Iterable` result into a `ListMap`:
```
[ERROR] [Error] /Users/bw/dev/incubator-kyuubi/extensions/spark/kyuubi-spark-lineage/src/main/scala/org/apache/kyuubi/plugin/lineage/helper/SparkSQLLineageParseHelper.scala:220: type mismatch;
 found   : scala.collection.immutable.Iterable[(org.apache.spark.sql.catalyst.expressions.Attribute, org.apache.spark.sql.catalyst.expressions.AttributeSet)]
 required: LineageParser.this.AttributeMap[org.apache.spark.sql.catalyst.expressions.AttributeSet]
    (which expands to)  scala.collection.immutable.ListMap[org.apache.spark.sql.catalyst.expressions.Attribute,org.apache.spark.sql.catalyst.expressions.AttributeSet]
```
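The error stems from the Scala 2.13 collections rework: `zipWithIndex` on a `Map` is inherited from `IterableOps`, whose collection type for maps is plain `Iterable`, and the 2.12-era `CanBuildFrom` machinery that could rebuild a `ListMap` from the mapped result no longer exists. A minimal sketch reproducing the mismatch and the explicit conversion (the names here are illustrative, not taken from the plugin):

```scala
import scala.collection.immutable.ListMap

val m = ListMap("a" -> 1, "b" -> 2)

// On Scala 2.13 the static type here is Iterable[(String, Int)], not ListMap:
// zipWithIndex yields Iterable[((String, Int), Int)], and map keeps Iterable.
val mapped: Iterable[(String, Int)] =
  m.zipWithIndex.map { case ((k, v), i) => k -> (v + i) }

// Restore the required ListMap type explicitly, as the patch does,
// preserving the original insertion order of the entries.
val restored: ListMap[String, Int] = ListMap(mapped.toSeq: _*)

println(restored)
```

The explicit `ListMap(... : _*)` rebuild is needed because assigning `mapped` to a `ListMap`-typed value fails to compile on 2.13, while on 2.12 the implicit `CanBuildFrom` resolution made the same expression type-check.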

### _How was this patch tested?_
- [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible

- [ ] Add screenshots for manual tests if appropriate

- [x] [Run test](https://kyuubi.readthedocs.io/en/master/contributing/code/testing.html#running-tests) locally before making a pull request

### _Was this patch authored or co-authored using generative AI tooling?_

No.

Closes #5192 from bowenliang123/scala213-lineage.

a68ba8457 [liangbowen] adapt spark lineage plugin to Scala 2.13

Authored-by: liangbowen <liangbowen@gf.com.cn>
Signed-off-by: liangbowen <liangbowen@gf.com.cn>
Committed by liangbowen on 2023-08-23 17:15:31 +08:00
parent 22a47044e9
commit 9e3ac23df7

```diff
@@ -217,10 +217,11 @@ trait LineageParser {
         getField[LogicalPlan](plan, "plan")
       }
-      extractColumnsLineage(query, parentColumnsLineage).zipWithIndex.map {
+      val lineages = extractColumnsLineage(query, parentColumnsLineage).zipWithIndex.map {
         case ((k, v), i) if outputCols.nonEmpty => k.withName(s"$view.${outputCols(i)}") -> v
         case ((k, v), _) => k.withName(s"$view.${k.name}") -> v
-      }
+      }.toSeq
+      ListMap[Attribute, AttributeSet](lineages: _*)
     case p if p.nodeName == "CreateDataSourceTableAsSelectCommand" =>
       val table = getV1TableName(getField[CatalogTable](plan, "table").qualifiedName)
```