From 47533100f6f69b02a90985637a8fcc3f2cebb28c Mon Sep 17 00:00:00 2001
From: Wang Zhen
Date: Tue, 4 Jan 2022 15:48:05 +0800
Subject: [PATCH] [KYUUBI #1674] Uncache cached tables when session closed
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

### _Why are the changes needed?_

As mentioned in SPARK-29911, tables cached during a session and never uncached can also leak memory in Kyuubi, so cached temporary views should be uncached when the session closes. #1674

### _How was this patch tested?_

- [ ] Add some test cases that check the changes thoroughly including negative and positive cases if possible

- [X] Add screenshots for manual tests if appropriate

![WeChat screenshot 20220104145456](https://user-images.githubusercontent.com/17894939/148021323-9f3e4268-ff54-4ad3-bb7e-2044d73ae154.png)

close session:

![WeChat screenshot 20220104145530](https://user-images.githubusercontent.com/17894939/148021331-2a816934-2c07-47dd-b939-01117f93bb5b.png)

- [X] [Run test](https://kyuubi.readthedocs.io/en/latest/develop_tools/testing.html#running-tests) locally before making a pull request

Closes #1676 from wForget/KYUUBI-1674.
Closes #1674

e2bec4ec [Wang Zhen] [KYUUBI-1674] Uncache cached tables when session closed

Authored-by: Wang Zhen
Signed-off-by: ulysses-you
---
 .../apache/kyuubi/engine/spark/session/SparkSessionImpl.scala | 1 +
 1 file changed, 1 insertion(+)

diff --git a/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/session/SparkSessionImpl.scala b/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/session/SparkSessionImpl.scala
index 98ca7d190..95d22ea54 100644
--- a/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/session/SparkSessionImpl.scala
+++ b/externals/kyuubi-spark-sql-engine/src/main/scala/org/apache/kyuubi/engine/spark/session/SparkSessionImpl.scala
@@ -69,6 +69,7 @@ class SparkSessionImpl(
     sessionEvent.endTime = System.currentTimeMillis()
     EventLoggingService.onEvent(sessionEvent)
     super.close()
+    spark.sessionState.catalog.getTempViewNames().foreach(spark.catalog.uncacheTable(_))
     sessionManager.operationManager.asInstanceOf[SparkSQLOperationManager].closeILoop(handle)
   }
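The one-line cleanup above can be exercised outside the engine, e.g. in a `spark-shell`. This is a minimal sketch and not part of the patch; the view name `t`, the sample data, and the `local[1]` master are illustrative only:

```scala
import org.apache.spark.sql.SparkSession

// Stand-in for the engine's session-scoped SparkSession.
val spark = SparkSession.builder()
  .master("local[1]")
  .appName("uncache-on-close-demo")
  .getOrCreate()
import spark.implicits._

// Cache a temporary view, as a user query might do during a Kyuubi session.
Seq(1, 2, 3).toDF("id").createOrReplaceTempView("t")
spark.catalog.cacheTable("t")
assert(spark.catalog.isCached("t"))

// The cleanup added in close(): uncache every temp view of this session,
// releasing the cached blocks that would otherwise outlive the session.
spark.sessionState.catalog.getTempViewNames().foreach(spark.catalog.uncacheTable(_))
assert(!spark.catalog.isCached("t"))
```

Run it with `spark-shell -i demo.scala` (or paste into the shell); the second assertion confirms the cache entry is gone while the temp view itself still exists.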