[KYUUBI #500] Speed up spark sql engine diagnostics

![ulysses-you](https://badgen.net/badge/Hello/ulysses-you/green) [![Closes #500](https://badgen.net/badge/Preview/Closes%20%23500/blue)](https://github.com/yaooqinn/kyuubi/pull/500) ![4](https://badgen.net/badge/%2B/4/red) ![3](https://badgen.net/badge/-/3/green) ![1](https://badgen.net/badge/commits/1/yellow) ![Target Issue](https://badgen.net/badge/Missing/Target%20Issue/ff0000) ![Test Plan](https://badgen.net/badge/Missing/Test%20Plan/ff0000) [<img width="16" alt="Powered by Pull Request Badge" src="https://user-images.githubusercontent.com/1393946/111216524-d2bb8e00-85d4-11eb-821b-ed4c00989c02.png">](https://pullrequestbadge.com/?utm_medium=github&utm_source=yaooqinn&utm_campaign=badge_info)<!-- PR-BADGE: PLEASE DO NOT REMOVE THIS COMMENT -->

<!--
Thanks for sending a pull request!

Here are some tips for you:
  1. If this is your first time, please read our contributor guidelines: https://kyuubi.readthedocs.io/en/latest/community/contributions.html
  2. If the PR is related to an issue in https://github.com/yaooqinn/kyuubi/issues, add '[KYUUBI #XXXX]' in your PR title, e.g., '[KYUUBI #XXXX] Your PR title ...'.
  3. If the PR is unfinished, add '[WIP]' in your PR title, e.g., '[WIP][KYUUBI #XXXX] Your PR title ...'.
-->

### _Why are the changes needed?_
<!--
Please clarify why the changes are needed. For instance,
  1. If you add a feature, you can talk about the use case of it.
  2. If you fix a bug, you can clarify why it is a bug.
-->
Mark `diagnostics` as a `lazy val` so the string is built once and cached, instead of being rebuilt on every call.
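
A minimal sketch (not the Kyuubi source; the counters and object name are hypothetical) contrasting the two forms: a `def` re-evaluates its body on every call, while a `lazy val` evaluates its body once, on first access, and caches the result.

```scala
// Hypothetical demo object, for illustration only.
object DiagnosticsDemo {
  var defEvals = 0   // counts how many times the def body runs
  var lazyEvals = 0  // counts how many times the lazy val body runs

  // Re-built on every call, like the old `def diagnostics(spark)`.
  def diagnosticsDef: String = {
    defEvals += 1
    "diagnostics text"
  }

  // Built once on first access and cached, like the new `lazy val diagnostics`.
  lazy val diagnosticsLazy: String = {
    lazyEvals += 1
    "diagnostics text"
  }
}

// Usage: calling diagnosticsDef twice runs its body twice;
// accessing diagnosticsLazy twice runs its body only once.
```

This works here because the diagnostics text does not change after the `SparkSession` is up, so caching it is safe.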

### _How was this patch tested?_
Pass existing CI.

Closes #500 from ulysses-you/speed-up-diagnostics.

Closes #500

d4a61f1 [ulysses-you] init

Authored-by: ulysses-you <ulyssesyou18@gmail.com>
Signed-off-by: fwang12 <fwang12@ebay.com>
Commit b0aeb26230 (parent d45c498346), authored by ulysses-you on 2021-04-11 09:37:42 +08:00, committed by fwang12.
3 changed files with 4 additions and 3 deletions.

```diff
@@ -23,7 +23,8 @@ import org.apache.spark.sql.SparkSession
 object KyuubiSparkUtil {
-  def diagnostics(spark: SparkSession): String = {
+  lazy val diagnostics: String = {
+    val spark = SparkSession.active
     val sc = spark.sparkContext
     val webUrl = sc.getConf.getOption(
       "spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES")
```

```diff
@@ -116,7 +116,7 @@ object SparkSQLEngine extends Logging {
     try {
       spark = createSpark()
       engine = startEngine(spark)
-      info(KyuubiSparkUtil.diagnostics(spark))
+      info(KyuubiSparkUtil.diagnostics)
       // blocking main thread
       countDownLatch.await()
     } catch {
```

```diff
@@ -67,7 +67,7 @@ class ExecuteStatement(
   private def executeStatement(): Unit = {
     try {
       setState(OperationState.RUNNING)
-      info(KyuubiSparkUtil.diagnostics(spark))
+      info(KyuubiSparkUtil.diagnostics)
       Thread.currentThread().setContextClassLoader(spark.sharedState.jarClassLoader)
       spark.sparkContext.setJobGroup(statementId, statement, forceCancel)
       result = spark.sql(statement)
```