# Playground

## For Users

### Setup
- Install Docker and Docker Compose;
- Go to `docker/playground`, and use `docker compose up -d` to run the compose services as daemons;
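After `docker compose up -d` returns, it can help to confirm that the services actually started. A minimal sketch, assuming the compose service running the Kyuubi server is named `kyuubi` (matching the `docker exec -it kyuubi ...` commands used below):

```shell
# From docker/playground: list the compose services and their state
docker compose ps

# Tail the Kyuubi server log to check that startup finished without errors
docker compose logs --tail 50 kyuubi
```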
### Play
- Connect using `beeline`:

  `docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/tpcds/tiny'`;
- Connect using DBeaver

  Add a Kyuubi datasource with
  - connection url: `jdbc:hive2://0.0.0.0:10009/tpcds/tiny`
  - username: `anonymous`
  - password: `<empty>`
- Use built-in datasets

  Kyuubi supplies some built-in datasets. After Kyuubi starts, you can run the following commands to load them:
  - To load the TPC-DS tiny dataset into `spark_catalog.tpcds_tiny`, run `docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/' -f /opt/load_data/load-dataset-tpcds-tiny.sql`
  - To load the TPC-H tiny dataset into `spark_catalog.tpch_tiny`, run `docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/' -f /opt/load_data/load-dataset-tpch-tiny.sql`
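After loading a dataset, a quick sanity check can be run non-interactively with beeline's `-e` flag. A sketch, assuming the TPC-DS tiny dataset was loaded as above (`store_sales` is a standard TPC-DS table name):

```shell
# Count rows in a TPC-DS table to verify the load succeeded
docker exec -it kyuubi /opt/kyuubi/bin/beeline \
  -u 'jdbc:hive2://0.0.0.0:10009/' \
  -e 'SELECT COUNT(*) FROM spark_catalog.tpcds_tiny.store_sales'
```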
### Access Service
- MinIO: http://localhost:9001
- PostgreSQL: localhost:5432 (username: `postgres`, password: `postgres`)
- Spark UI: http://localhost:4040 (available after a Spark application is launched by Kyuubi; the port may be 4041, 4042, ... if you launch more than one Spark application)
### Shutdown

- Stop the compose services with `docker compose down`;
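`docker compose down` stops and removes the containers but keeps named volumes. If you also want to discard any persisted data (e.g. loaded datasets), Docker Compose's standard `-v` flag removes the volumes as well:

```shell
# Stop services and also remove the volumes created by the playground
docker compose down -v
```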
## For Maintainers

### Build
- Build images with `docker/playground/build-image.sh`;
- Optionally, use `buildx` to build and publish cross-platform images: `BUILDX=1 docker/playground/build-image.sh`;