# Access Kyuubi with Hive JDBC and ODBC Drivers
## Instructions
Kyuubi does not provide its own JDBC driver so far.
Instead, it is fully compatible with the Hive JDBC and ODBC drivers, which let you connect popular Business Intelligence (BI) tools to query, analyze, and visualize data through Spark SQL engines.
## Install Hive JDBC
For programming, the easiest way to get `hive-jdbc` is from [Maven Central](https://mvnrepository.com/artifact/org.apache.hive/hive-jdbc). For example:
- **maven**
```xml
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>2.3.8</version>
</dependency>
```
- **sbt**
```scala
libraryDependencies += "org.apache.hive" % "hive-jdbc" % "2.3.8"
```
- **gradle**
```gradle
implementation group: 'org.apache.hive', name: 'hive-jdbc', version: '2.3.8'
```
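
With the dependency above on the classpath, connecting to Kyuubi follows the standard JDBC pattern. The following is a minimal sketch, not part of the Kyuubi docs proper; the host, port (`10009` is the default frontend port, and the one used in the example URL below), database, and user name are assumptions to adjust for your deployment:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class KyuubiHiveJdbcExample {
    // Build a basic Kyuubi JDBC URL: jdbc:hive2://<host>:<port>/<dbName>
    static String buildUrl(String host, int port, String dbName) {
        return "jdbc:hive2://" + host + ":" + port + "/" + dbName;
    }

    public static void main(String[] args) throws SQLException {
        // Assumed deployment: a Kyuubi server on localhost:10009.
        String url = buildUrl("localhost", 10009, "default");
        // hive-jdbc registers its driver via the JDBC service loader,
        // so no explicit Class.forName call is needed on JDBC 4+.
        try (Connection conn = DriverManager.getConnection(url, "anonymous", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT 1")) {
            while (rs.next()) {
                System.out.println(rs.getInt(1));
            }
        }
    }
}
```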
For BI tools, please refer to [Quick Start](../quick_start/index.html) for the guide to the BI tool you are using.
If there is no specific document for your BI tool, don't worry: the configuration steps are basically the same for all BI tools.
Also, we would appreciate your help in improving the documentation.
## JDBC URL
JDBC URLs have the following format:
```
jdbc:hive2://<host>:<port>/<dbName>;<sessionConfs>?<sparkConfs>#<[spark|hive]Vars>
```
JDBC Parameter | Description
---------------| -----------
host | The cluster node hosting the Kyuubi Server.
port | The port number on which the Kyuubi Server is listening.
dbName | Optional database name to run the query against; `default` is used if absent.
sessionConfs | Optional semicolon(`;`)-separated `key=value` parameters for the JDBC/ODBC driver. All of these are set on the engine via `SparkSession.conf`, which only accepts [Runtime SQL Configurations](http://spark.apache.org/docs/latest/configuration.html#runtime-sql-configuration).
sparkConfs | Optional semicolon(`;`)-separated `key=value` parameters used by the Kyuubi server to create the corresponding engine; ignored if the engine already exists.
[spark\|hive]Vars | Optional semicolon(`;`)-separated `key=value` parameters for Spark/Hive variables used for variable substitution.
## Example
```
jdbc:hive2://localhost:10009/default;spark.sql.adaptive.enabled=true?spark.ui.enabled=false#var_x=y
```
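
A URL like this can be assembled programmatically from its parts. The following helper is a hypothetical sketch (the class and method names are illustrative, not part of the Hive driver API); it joins each parameter group with `;` and attaches the groups with the `;`, `?`, and `#` separators from the format above:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class KyuubiUrlBuilder {
    // Join key=value pairs with semicolons, e.g. {a=1, b=2} -> "a=1;b=2"
    static String join(Map<String, String> kv) {
        return kv.entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining(";"));
    }

    // Assemble jdbc:hive2://<host>:<port>/<dbName>;<sessionConfs>?<sparkConfs>#<vars>
    static String buildUrl(String host, int port, String dbName,
                           Map<String, String> sessionConfs,
                           Map<String, String> sparkConfs,
                           Map<String, String> vars) {
        StringBuilder url = new StringBuilder("jdbc:hive2://")
                .append(host).append(':').append(port).append('/').append(dbName);
        if (!sessionConfs.isEmpty()) url.append(';').append(join(sessionConfs));
        if (!sparkConfs.isEmpty()) url.append('?').append(join(sparkConfs));
        if (!vars.isEmpty()) url.append('#').append(join(vars));
        return url.toString();
    }

    public static void main(String[] args) {
        Map<String, String> session = new LinkedHashMap<>();
        session.put("spark.sql.adaptive.enabled", "true");
        Map<String, String> spark = new LinkedHashMap<>();
        spark.put("spark.ui.enabled", "false");
        Map<String, String> vars = new LinkedHashMap<>();
        vars.put("var_x", "y");
        // Reproduces the example URL above.
        System.out.println(buildUrl("localhost", 10009, "default", session, spark, vars));
    }
}
```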
## Unsupported Hive Features
- Connect to HiveServer2 using HTTP transport (`transportMode=http`).