Seeking Solutions for Spark Context Stopping Without Engine Termination #6992
wangzhigang1999 asked this question in Q&A (unanswered)
Health Check Mechanism in Spark Engine
Is there a health check mechanism between the Spark engine and the Spark context?
I often encounter an issue where the Spark context stops, but the engine process does not terminate.
I have reviewed issue 1800, but it only addresses the case of Out of Memory (OOM) errors, which does not fully cover the problem I am facing.
I am currently on version 1.7.3. Do newer versions of Kyuubi fully resolve this issue? Alternatively, is there a specific mechanism or configuration that can avoid it? After examining the latest code, I could not find an obvious health check mechanism.
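As a stopgap, I am considering adding a watchdog on the engine side along these lines. This is only a minimal sketch, not anything from Kyuubi itself: the object name, poll interval, and exit code are my own choices. It simply polls `SparkContext.isStopped` from a daemon thread and kills the JVM once the context is gone:

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical watchdog: not part of Kyuubi, just an illustration.
object ContextWatchdog {
  def start(spark: SparkSession, intervalMs: Long = 10000L): Unit = {
    val t = new Thread(() => {
      // Poll until the SparkContext reports that it has stopped.
      while (!spark.sparkContext.isStopped) {
        Thread.sleep(intervalMs)
      }
      // The context is gone but the engine JVM is still alive;
      // exit so the process does not linger as a zombie.
      System.exit(1)
    }, "spark-context-watchdog")
    t.setDaemon(true) // must not keep the JVM alive on its own
    t.start()
  }
}
```

An event-based alternative might be to register a `SparkListener` and call `System.exit` from `onApplicationEnd`, though polling would also catch a context that stopped before the listener could be registered.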
If anyone has insights or solutions, I would greatly appreciate your input!