# Playground

## For Users

### Setup

- Install Docker and Docker Compose;
- Go to `docker/playground`, then use `docker compose up` to start the compose services in the foreground, or `docker compose up -d` to run them as a daemon;
### Play

- Connect using `beeline`:

  `docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/'`

- Connect using DBeaver: add a Kyuubi datasource with
  - connection URL: `jdbc:hive2://0.0.0.0:10009/`
  - username: `anonymous`
  - password: `<empty>`
- Use the built-in datasets

  Kyuubi ships with some built-in datasets. After Kyuubi starts, you can run the following commands to load them:

  - To load the TPC-DS tiny dataset into `spark_catalog.tpcds_tiny`, run

    `docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/' -f /opt/load_data/load-dataset-tpcds-tiny.sql`

  - To load the TPC-H tiny dataset into `spark_catalog.tpch_tiny`, run

    `docker exec -it kyuubi /opt/kyuubi/bin/beeline -u 'jdbc:hive2://0.0.0.0:10009/' -f /opt/load_data/load-dataset-tpch-tiny.sql`
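After a load script finishes, a quick sanity check can be run through the same `beeline` entry point. The sketch below assumes the compose services are up and that the tiny-dataset script creates the standard TPC-DS table names (`store_sales` is taken from the TPC-DS specification, not verified against the script itself); the analogous check for TPC-H would query a table such as `spark_catalog.tpch_tiny.orders`:

```shell
# Hedged sanity check: count rows in one table of the tiny TPC-DS dataset.
# Assumes the playground services are running and the load script above has
# completed; `store_sales` is a standard TPC-DS table name (an assumption here).
docker exec -it kyuubi /opt/kyuubi/bin/beeline \
  -u 'jdbc:hive2://0.0.0.0:10009/' \
  -e 'SELECT count(*) FROM spark_catalog.tpcds_tiny.store_sales;'
```

A non-zero count indicates the load succeeded; an error about a missing table most likely means the load script has not been run yet.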
### Access Services

- MinIO: http://localhost:9001
- PostgreSQL: localhost:5432 (username: `postgres`, password: `postgres`)
- Spark UI: http://localhost:4040 (available after a Spark application is launched by Kyuubi; the port may be 4041, 4042, ... if more than one Spark application is launched)
### Shutdown

- Stop the compose services by pressing `CTRL+C` if they are running in the foreground, or with `docker compose down` if they are running as a daemon;
- Remove the stopped containers with `docker compose rm`;
## For Maintainers

### Build

- Build images with `docker/playground/build-image.sh`;
- Optionally, use `buildx` to build and publish cross-platform images: `BUILDX=1 docker/playground/build-image.sh`;