kyuubi/docs/develop_tools/building.md

# Building Kyuubi

## Building Kyuubi with Apache Maven

Kyuubi is built on Apache Maven. To build the whole project, run:

```bash
./build/mvn clean package -DskipTests
```

This builds all sub-modules of the Kyuubi project without running any unit tests.

If you want to test it manually, you can start Kyuubi directly from the project root by running:

```bash
bin/kyuubi start
```
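Once you are done testing, the same launcher script can shut the server down again (a sketch, assuming the standard `stop` sub-command of the `bin/kyuubi` control script):

```shell
# Stop the locally started Kyuubi server
bin/kyuubi stop
```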

## Building a Submodule Individually

For instance, you can build the Kyuubi Common module using:

```bash
build/mvn clean package -pl :kyuubi-common -DskipTests
```

## Building Multiple Submodules

For instance, you can build the Kyuubi Common and Kyuubi High Availability modules together using:

```bash
build/mvn clean package -pl :kyuubi-common,:kyuubi-ha -DskipTests
```

## Skipping Some Modules

For instance, you can build the Kyuubi modules without the Kyuubi Codecov and Assembly modules using:

```bash
build/mvn clean install -pl '!:kyuubi-codecov,!:kyuubi-assembly' -DskipTests
```

## Building Kyuubi against Different Apache Spark Versions

Since v1.1.0, Kyuubi supports building with different Spark profiles:

| Profile     | Default | Since |
|-------------|---------|-------|
| -Pspark-3.0 | Yes     | 1.0.0 |
| -Pspark-3.1 | No      | 1.1.0 |
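A non-default profile is selected with Maven's `-P` flag. For instance, a sketch of building against the Spark 3.1 profile from the table above:

```shell
# Activate the spark-3.1 profile instead of the default spark-3.0
build/mvn clean package -Pspark-3.1 -DskipTests
```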

## Defining the Apache Mirror for Spark

By default, we use `https://archive.apache.org/dist/spark/` to download the built-in Spark release package. If you find it hard to reach, or the download speed is too slow, you can set the `spark.archive.mirror` property to a suitable Apache mirror site. For instance,

```bash
build/mvn clean package -Dspark.archive.mirror=https://mirrors.bfsu.edu.cn/apache/spark/spark-3.0.1
```

Visit Apache Mirrors and choose a mirror based on your region.

Specifically, developers in mainland China can use the predefined profile named `mirror-cn`, which uses `mirrors.bfsu.edu.cn` to speed up downloading the Spark binary. For instance,

```bash
build/mvn clean package -Pmirror-cn
```
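Maven allows activating several profiles in one invocation, so the mirror profile can be combined with a Spark version profile. A sketch, assuming the profiles listed in the sections above:

```shell
# Use the China mirror for the Spark download and build against Spark 3.1
build/mvn clean package -Pmirror-cn -Pspark-3.1 -DskipTests
```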