[KYUUBI #7158] Spark engine respects session-level idle timeout threshold
### Why are the changes needed?

Fixes the same class of issue as https://github.com/apache/kyuubi/pull/7138

Previously, `sessionIdleTimeoutThreshold` was initialized only once during session creation using `sessionManager.getConf`, preventing dynamic updates when clients pass new configurations during connection. We now:

- Allow clients to set a session-specific `kyuubi.session.idle.timeout` during connection
- Dynamically adjust the idle timeout per session
- Prevent connection pile-up by recycling idle sessions in a timely manner

Closes #7158 from 1358035421/lc/sessio_idle_timeout_threshold.

Closes #7158

abe513eed [liangzhaoyuan] fix review comments
3face844a [liangzhaoyuan] Use per-session idle timeout threshold instead of global sessionManager's value

Authored-by: liangzhaoyuan <lwlzyl19940916@gmail.com>
Signed-off-by: Cheng Pan <chengpan@apache.org>
parent 0b4b5cabc4
commit 9a50bfa814
```diff
@@ -24,6 +24,7 @@ import org.apache.spark.sql.{AnalysisException, SparkSession}
 import org.apache.spark.ui.SparkUIUtils.formatDuration
 
 import org.apache.kyuubi.KyuubiSQLException
+import org.apache.kyuubi.config.KyuubiConf.SESSION_IDLE_TIMEOUT
 import org.apache.kyuubi.config.KyuubiReservedKeys.KYUUBI_SESSION_HANDLE_KEY
 import org.apache.kyuubi.engine.spark.events.SessionEvent
 import org.apache.kyuubi.engine.spark.operation.SparkSQLOperationManager
@@ -57,6 +58,13 @@ class SparkSessionImpl(
     }
   }
 
+  override val sessionIdleTimeoutThreshold: Long = {
+    conf.get(SESSION_IDLE_TIMEOUT.key)
+      .map(_.toLong)
+      .getOrElse(
+        sessionManager.getConf.get(SESSION_IDLE_TIMEOUT))
+  }
+
   private val sessionEvent = SessionEvent(this)
 
   override def open(): Unit = {
```
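The heart of the change is a per-session lookup with a fallback to the engine-wide default. A minimal sketch of that fallback pattern, written in Java for illustration rather than the project's Scala, with a plain `Map` standing in for Kyuubi's conf objects and an assumed 6-hour engine-wide default:

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch: prefer the session-level conf value supplied at
// connection time, falling back to the engine-wide default otherwise.
public class IdleTimeoutDemo {
    // Assumed engine-wide default (6 hours), standing in for
    // sessionManager.getConf.get(SESSION_IDLE_TIMEOUT)
    static final long GLOBAL_IDLE_TIMEOUT_MS = 6 * 60 * 60 * 1000L;

    // Mirrors conf.get(SESSION_IDLE_TIMEOUT.key).map(_.toLong).getOrElse(...)
    static long idleTimeoutThreshold(Map<String, String> sessionConf) {
        return Optional.ofNullable(sessionConf.get("kyuubi.session.idle.timeout"))
                .map(Long::parseLong)
                .orElse(GLOBAL_IDLE_TIMEOUT_MS);
    }

    public static void main(String[] args) {
        // Session override wins: 60000
        System.out.println(idleTimeoutThreshold(
                Map.of("kyuubi.session.idle.timeout", "60000")));
        // No override: falls back to the global default, 21600000
        System.out.println(idleTimeoutThreshold(Map.of()));
    }
}
```

The design point is that the threshold is resolved once per session from that session's own conf, so two concurrent sessions on the same engine can carry different idle timeouts.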