[KYUUBI #377] build/dist support --spark-provided



### _Why are the changes needed?_
close #377

### _How was this patch tested?_
- [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible

- [ ] Add screenshots for manual tests if appropriate

- [ ] [Run test](https://kyuubi.readthedocs.io/en/latest/tools/testing.html#running-tests) locally before making a pull request

- [x] Manual test

`build/dist --spark-provided`

```
(kyuubi) ➜  kyuubi git:(master) ✗ tree dist
dist
├── LICENSE
├── RELEASE
├── bin
│   ├── kyuubi
│   ├── kyuubi-logo
│   └── load-kyuubi-env.sh
├── conf
│   ├── kyuubi-defaults.conf.template
│   ├── kyuubi-env.sh.template
│   └── log4j.properties.template
├── externals
│   └── engines
│       └── spark
│           └── kyuubi-spark-sql-engine-1.1.0-SNAPSHOT.jar
├── jars
│   ├── apacheds-i18n-2.0.0-M15.jar
│   ├── apacheds-kerberos-codec-2.0.0-M15.jar
│   ├── api-asn1-api-1.0.0-M20.jar
│   ├── api-util-1.0.0-M20.jar
│   ├── audience-annotations-0.5.0.jar
│   ├── commons-beanutils-1.7.0.jar
│   ├── commons-beanutils-core-1.8.0.jar
│   ├── commons-codec-1.4.jar
│   ├── commons-collections-3.2.2.jar
│   ├── commons-configuration-1.6.jar
│   ├── commons-digester-1.8.jar
│   ├── commons-io-2.4.jar
│   ├── commons-lang-2.6.jar
│   ├── commons-lang3-3.10.jar
│   ├── commons-math-2.2.jar
│   ├── curator-client-2.7.1.jar
│   ├── curator-framework-2.7.1.jar
│   ├── curator-recipes-2.7.1.jar
│   ├── curator-test-2.7.1.jar
│   ├── guava-11.0.2.jar
│   ├── hadoop-annotations-2.7.4.jar
│   ├── hadoop-auth-2.7.4.jar
│   ├── hadoop-common-2.7.4.jar
│   ├── hive-service-rpc-2.3.7.jar
│   ├── httpclient-4.5.6.jar
│   ├── httpcore-4.4.12.jar
│   ├── javassist-3.18.1-GA.jar
│   ├── jcl-over-slf4j-1.7.30.jar
│   ├── jsr305-3.0.2.jar
│   ├── kyuubi-common-1.1.0-SNAPSHOT.jar
│   ├── kyuubi-ha-1.1.0-SNAPSHOT.jar
│   ├── kyuubi-main-1.1.0-SNAPSHOT.jar
│   ├── libfb303-0.9.3.jar
│   ├── libthrift-0.9.3.jar
│   ├── log4j-1.2.17.jar
│   ├── netty-3.7.0.Final.jar
│   ├── scala-library-2.12.10.jar
│   ├── slf4j-api-1.7.30.jar
│   ├── slf4j-log4j12-1.7.30.jar
│   ├── spotbugs-annotations-3.1.9.jar
│   └── zookeeper-3.4.14.jar
├── logs
├── pid
└── work
```

Bundle sizes:
```
(kyuubi) ➜  kyuubi git:(KYUUBI-377) ll | grep tar.gz
-rw-r--r--   1 chengpan  staff   230M Feb 26 12:59 kyuubi-1.1.0-SNAPSHOT-bin-spark-3.0.2.tar.gz
-rw-r--r--   1 chengpan  staff    20M Feb 26 12:56 kyuubi-1.1.0-SNAPSHOT-bin-without-spark.tar.gz
```
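For context, a hedged sketch of how the slimmer `without-spark` package is meant to be deployed: Spark must already be present on the host, and Kyuubi is pointed at it via `SPARK_HOME` in `conf/kyuubi-env.sh`. The `/opt/spark` path below is purely hypothetical.

```shell
# Hypothetical deployment of the without-spark package; assumes a Spark
# installation already exists at /opt/spark on the target host.
tar -xzf kyuubi-1.1.0-SNAPSHOT-bin-without-spark.tar.gz
cd kyuubi-1.1.0-SNAPSHOT-bin-without-spark
echo 'export SPARK_HOME=/opt/spark' >> conf/kyuubi-env.sh
bin/kyuubi start
```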

Closes #378 from pan3793/KYUUBI-377.

442569e [Cheng Pan] add profile spark-provided for skip download spark binary
0d8da24 [Cheng Pan] update doc
5b7c41f [Cheng Pan] [KYUUBI #377] build/dist support --spark-provided

Authored-by: Cheng Pan <379377944@qq.com>
Signed-off-by: ulysses-you <ulyssesyou18@gmail.com>
Commit 2f8df2355d (parent 968ed4f079), authored by Cheng Pan on 2021-02-26 15:00:42 +08:00, committed by ulysses-you.
4 changed files with 55 additions and 13 deletions.

`build/dist`:

```diff
@@ -31,6 +31,7 @@ set -x
 KYUUBI_HOME="$(cd "`dirname "$0"`/.."; pwd)"
 DISTDIR="$KYUUBI_HOME/dist"
 MAKE_TGZ=false
+SPARK_PROVIDED=false
 NAME=none
 MVN="$KYUUBI_HOME/build/mvn"
@@ -39,11 +40,12 @@ function usage {
   echo "./build/dist - Tool for making binary distributions of Kyuubi"
   echo ""
   echo "Usage:"
-  echo "+------------------------------------------------------+"
-  echo "| ./build/dist [--name] [--tgz] <maven build options> |"
-  echo "+------------------------------------------------------+"
-  echo "name: - custom binary name, using project version if undefined"
-  echo "tgz: - whether to make a whole bundled package"
+  echo "+--------------------------------------------------------------------------------------+"
+  echo "| ./build/dist [--name <custom_name>] [--tgz] [--spark-provided] <maven build options> |"
+  echo "+--------------------------------------------------------------------------------------+"
+  echo "name: - custom binary name, using project version if undefined"
+  echo "tgz: - whether to make a whole bundled package"
+  echo "spark-provided: - whether to make a package without Spark binary"
   echo ""
 }
@@ -59,6 +61,9 @@ while (( "$#" )); do
     --tgz)
       MAKE_TGZ=true
       ;;
+    --spark-provided)
+      SPARK_PROVIDED=true
+      ;;
     --name)
       NAME="$2"
       shift
@@ -136,7 +141,11 @@ HIVE_VERSION=$("$MVN" help:evaluate -Dexpression=hive.version $@ 2>/dev/null\
 echo "Building Kyuubi package of version $VERSION against Spark version - $SPARK_VERSION"
 if [[ "$NAME" == "none" ]]; then
-  NAME="spark-"$SPARK_VERSION
+  if [[ "$SPARK_PROVIDED" == "true" ]]; then
+    NAME="without-spark"
+  else
+    NAME="spark-"$SPARK_VERSION
+  fi
 fi
 if [[ "$MAKE_TGZ" == "true" ]]; then
@@ -145,7 +154,12 @@ else
   echo "Making distribution for Kyuubi $VERSION named $NAME in '$DISTDIR'..."
 fi
-BUILD_COMMAND=("$MVN" -T 1C clean package -DskipTests $@)
+MVN_DIST_OPT="-DskipTests"
+
+if [[ "$SPARK_PROVIDED" == "true" ]]; then
+  MVN_DIST_OPT="$MVN_DIST_OPT -Pspark-provided"
+fi
+
+BUILD_COMMAND=("$MVN" -T 1C clean package $MVN_DIST_OPT $@)
 echo -e "\nBuilding with..."
 echo -e "\$ ${BUILD_COMMAND[@]}\n"
@@ -176,7 +190,10 @@ else
   HIVE_VERSION_SUFFIX="-hive1.2"
 fi
-cp -r "$KYUUBI_HOME/externals/kyuubi-download/target/spark-$SPARK_VERSION-bin-hadoop${HADOOP_VERSION:0:3}$HIVE_VERSION_SUFFIX/" "$DISTDIR/externals/spark-$SPARK_VERSION-bin-hadoop${HADOOP_VERSION:0:3}$HIVE_VERSION_SUFFIX/"
+if [[ "$SPARK_PROVIDED" != "true" ]]; then
+  cp -r "$KYUUBI_HOME/externals/kyuubi-download/target/spark-$SPARK_VERSION-bin-hadoop${HADOOP_VERSION:0:3}$HIVE_VERSION_SUFFIX/" \
+    "$DISTDIR/externals/spark-$SPARK_VERSION-bin-hadoop${HADOOP_VERSION:0:3}$HIVE_VERSION_SUFFIX/"
+fi
 cp "$KYUUBI_HOME/externals/kyuubi-spark-sql-engine/target/kyuubi-spark-sql-engine-$VERSION.jar" "$DISTDIR/externals/engines/spark"

 # Copy license and ASF files
```

Documentation for `build/dist`:

````diff
@@ -15,11 +15,12 @@ For more information on usage, run `./build/dist --help`
 ./build/dist - Tool for making binary distributions of Kyuubi Server
 Usage:
-+------------------------------------------------------+
-| ./build/dist [--name] [--tgz] <maven build options> |
-+------------------------------------------------------+
-name: - custom binary name, using project version if undefined
-tgz: - whether to make a whole bundled package
++--------------------------------------------------------------------------------------+
+| ./build/dist [--name <custom_name>] [--tgz] [--spark-provided] <maven build options> |
++--------------------------------------------------------------------------------------+
+name: - custom binary name, using project version if undefined
+tgz: - whether to make a whole bundled package
+spark-provided: - whether to make a package without Spark binary
 ```
@@ -29,3 +30,11 @@ For instance,
 ```
 This results a Kyuubi distribution named `kyuubi-{version}-bin-custom-name.tar.gz` for you.
+
+If you are planing to deploy Kyuubi where `spark` is provided, in other word, it's not required to bundle spark binary, use
+```bash
+./build/dist --tgz --spark-provided
+```
+Then you will get a Kyuubi distribution without spark binary named `kyuubi-{version}-bin-without-spark.tar.gz`.
````

`externals/kyuubi-download/pom.xml`:

```diff
@@ -32,6 +32,7 @@
   <name>Kyuubi Project Download Externals</name>
   <properties>
+    <spark.download.skip>false</spark.download.skip>
     <spark.archive.name>spark-${spark.version}-bin-hadoop2.7.tgz</spark.archive.name>
     <!-- see more at http://www.apache.org/mirrors/, e.g. https://mirror.bit.edu.cn, https://mirrors.tuna.tsinghua.edu.cn e.t.c -->
     <!-- spark.archive.mirror>https://mirrors.bfsu.edu.cn</spark.archive.mirror -->
@@ -44,6 +45,9 @@
       <plugin>
         <groupId>com.googlecode.maven-download-plugin</groupId>
         <artifactId>download-maven-plugin</artifactId>
+        <configuration>
+          <skip>${spark.download.skip}</skip>
+        </configuration>
         <executions>
           <execution>
             <id>download-spark-release</id>
@@ -65,4 +69,12 @@
     </plugins>
   </build>
+  <profiles>
+    <profile>
+      <id>spark-provided</id>
+      <properties>
+        <spark.download.skip>true</spark.download.skip>
+      </properties>
+    </profile>
+  </profiles>
 </project>
```
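Equivalently, the new profile can be activated by invoking Maven directly; a sketch of roughly what `build/dist --spark-provided` assembles under the hood, assuming it is run from the Kyuubi source root:

```shell
# -Pspark-provided flips spark.download.skip to true, so the
# download-maven-plugin skips fetching the Spark binary archive.
build/mvn -T 1C clean package -DskipTests -Pspark-provided
```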

Root `pom.xml`:

```diff
@@ -1123,6 +1123,10 @@
       </modules>
     </profile>
     <profile>
+      <id>spark-provided</id>
+    </profile>
+    <profile>
       <id>spark-2.4</id>
       <properties>
```