
Playground

For Users

Setup

  1. Install Docker and Docker Compose;
  2. Go to docker/playground and run docker compose up -d to start the compose services as daemons;
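If you prefer driving the setup from a script, the two steps above can be wrapped in a small helper. This is an illustrative sketch, not part of the Kyuubi repo: the compose function simply shells out to the same docker compose CLI.

```python
import shutil
import subprocess

def compose(*args: str) -> None:
    """Run `docker compose <args>` inside docker/playground, if Docker is installed.

    `compose` is a hypothetical convenience helper; the plain commands in the
    steps above work just as well.
    """
    if shutil.which("docker") is None:
        print("docker CLI not found; install Docker first")
        return
    subprocess.run(["docker", "compose", *args], cwd="docker/playground", check=True)

# compose("up", "-d")  # start the playground services as daemons
# compose("down")      # stop them again
```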

Play

  1. Connect using beeline

docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/tpcds/tiny';

  2. Connect using DBeaver

Add a Kyuubi datasource with

  • connection url jdbc:hive2://0.0.0.0:10009/tpcds/tiny
  • username: anonymous
  • password: <empty>
  3. Use built-in dataset

Kyuubi ships with some built-in datasets. After Kyuubi has started, you can run one of the following commands to load a dataset:

  • To load the TPC-DS tiny dataset into spark_catalog.tpcds_tiny, run docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/' -f /opt/load_data/load-dataset-tpcds-tiny.sql
  • To load the TPC-H tiny dataset into spark_catalog.tpch_tiny, run docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/' -f /opt/load_data/load-dataset-tpch-tiny.sql
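The two load commands differ only in the benchmark name, so they can be assembled programmatically. In this sketch, the load_dataset_cmd helper is hypothetical (not shipped with Kyuubi); it only builds the argument list and does not execute anything against Docker.

```python
# Illustrative helper: assemble the `docker exec` beeline command that loads
# one of the playground's built-in datasets ("tpcds" or "tpch").
def load_dataset_cmd(benchmark: str, scale: str = "tiny") -> list[str]:
    sql_file = f"/opt/load_data/load-dataset-{benchmark}-{scale}.sql"
    return [
        "docker", "exec", "-it", "kyuubi",
        "/opt/kyuubi/bin/beeline",
        "-u", "jdbc:hive2://0.0.0.0:10009/",
        "-f", sql_file,
    ]

print(" ".join(load_dataset_cmd("tpcds")))
print(" ".join(load_dataset_cmd("tpch")))
```

Passing the returned list to subprocess.run would execute the same commands shown in the bullets above.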

Access Service

  • MinIO: http://localhost:9001
  • PostgreSQL: localhost:5432 (username: postgres, password: postgres)
  • Spark UI: http://localhost:4040 (available after Kyuubi launches a Spark application; the port may be 4041, 4042, and so on if more than one Spark application is running)
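If one of these endpoints seems unavailable, a quick TCP probe tells you whether the published port is reachable at all. This stdlib-only sketch is illustrative (the is_open helper and the service map are not part of the playground):

```python
import socket

def is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ports published by the playground, per the list above.
SERVICES = {"MinIO console": 9001, "PostgreSQL": 5432, "Kyuubi frontend": 10009}
for name, port in SERVICES.items():
    state = "reachable" if is_open("localhost", port) else "not reachable"
    print(f"{name} on localhost:{port}: {state}")
```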

Shutdown

  1. Stop the compose services with docker compose down;

For Maintainers

Build

  1. Build the images with docker/playground/build-image.sh;
  2. Optionally, use buildx to build and publish cross-platform images: BUILDX=1 docker/playground/build-image.sh;