Description
What happened?
- ✋ I have searched the open/closed issues and my issue is not listed.
In v2.4.0, the pod template file `--conf` flags were updated to always use a default pod spec if the user does not provide one in the submitted SparkApplication. This is a breaking change for users who were not writing out the entire driver and executor specs in their SparkApplication YAML and instead relied on defaults provided in a `spark-defaults.conf` mounted to the Spark Operator pods. The change also overrides any user-provided Spark `--conf` flags for the template variables, as illustrated below.
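For illustration, here is a minimal sketch of a SparkApplication that passes the template flags through `sparkConf` instead of `spec.driver.template`/`spec.executor.template`. The application name, namespace, image, and file paths are hypothetical:

```yaml
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: my-app            # hypothetical name
  namespace: spark-jobs   # hypothetical namespace
spec:
  type: Scala
  mode: cluster
  image: spark:3.5.7
  sparkVersion: "3.5.7"
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples.jar
  sparkConf:
    # User-provided template flags; in v2.4.0 the operator overrides
    # these with a generated temporary file instead of honoring them.
    spark.kubernetes.driver.podTemplateFile: /mnt/templates/driver-template.yaml
    spark.kubernetes.executor.podTemplateFile: /mnt/templates/executor-template.yaml
  driver:
    # Note: no `template` field is set here or under `executor`.
    cores: 1
    memory: 1g
  executor:
    instances: 2
    cores: 1
    memory: 1g
```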
Reproduction Code
- Set `spark.kubernetes.[Driver/Executor].podTemplateFile` in a `spark-defaults.conf` mounted to the Spark Operator, referencing a template mounted to the Spark Operator pod, or provide the template flags as key/values within the `sparkConf` of the SparkApplication (a sketch of the `spark-defaults.conf` flavor follows this list)
- Submit a SparkApplication spec without `app.Spec.[Driver/Executor].template`
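A sketch of the `spark-defaults.conf` flavor of the first step, assuming the operator deployment mounts both the defaults file and the referenced template files; the paths are hypothetical:

```
# spark-defaults.conf mounted to the Spark Operator pod (hypothetical paths)
spark.kubernetes.driver.podTemplateFile    /mnt/templates/driver-template.yaml
spark.kubernetes.executor.podTemplateFile  /mnt/templates/executor-template.yaml
```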
Expected behavior
The Spark Operator should use the default assigned in `spark-defaults.conf` (if one exists) or the user-provided `sparkConf` values.
Actual behavior
The Spark Operator writes a blank pod spec to a temporary file on the operator pod and overrides the template conf values to point at that temporary file location; see the hypothetical illustration below.
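To make the failure concrete, a hypothetical sketch of the effective configuration after submission. The temporary paths below are placeholders, not real operator output; the point is that the user-provided values from `spark-defaults.conf` or `sparkConf` are replaced by paths to blank generated specs:

```
# Hypothetical effective configuration after the operator's rewrite
spark.kubernetes.driver.podTemplateFile    /tmp/<generated>/driver-pod-template.yaml    # blank generated spec
spark.kubernetes.executor.podTemplateFile  /tmp/<generated>/executor-pod-template.yaml  # blank generated spec
```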
Environment & Versions
- Kubernetes Version: 1.33.3
- Spark Operator Version: 2.4.0
- Apache Spark Version: 3.5.7
Additional context
No response
Impacted by this bug?
Give it a 👍. We prioritize the issues with the most 👍.