
Pod Template defaults and SparkConfs being ignored #2793

@ggaskins-slack

Description


What happened?

  • ✋ I have searched the open/closed issues and my issue is not listed.

In v2.4.0, pod template handling was changed so that the operator always generates a default pod Spec and sets the podTemplateFile --conf flags to point at it when the user does not provide a template in the submitted SparkApplication. This is a breaking change for users who do not write out the entire Driver and Executor specs in their SparkApplication YAML and instead rely on defaults provided in a spark-defaults.conf mounted into the Spark Operator pods. The new behavior also overrides any user-provided spark --conf flags for the pod template properties.
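
For context, these are the Spark properties involved. A minimal sketch of the spark-defaults.conf entries that were previously respected (the file paths here are illustrative, not the exact ones in use):

```properties
# spark-defaults.conf mounted into the Spark Operator pod (paths are illustrative)
spark.kubernetes.driver.podTemplateFile=/etc/spark/templates/driver-pod-template.yaml
spark.kubernetes.executor.podTemplateFile=/etc/spark/templates/executor-pod-template.yaml
```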

Reproduction Code

  1. Set spark.kubernetes.driver.podTemplateFile and spark.kubernetes.executor.podTemplateFile in a spark-defaults.conf mounted into the Spark Operator, referencing a template file that is also mounted into the operator pod, or provide the same keys as key/value entries in the sparkConf of the SparkApplication (see the example after this list)
  2. Submit a SparkApplication spec without app.Spec.[Driver/Executor].Template
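
A minimal SparkApplication along these lines reproduces the setup; the name, image, paths, and resource values below are illustrative, not the exact application used:

```yaml
# Minimal SparkApplication for the reproduction; names, image, and paths are illustrative.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: pod-template-repro
  namespace: default
spec:
  type: Scala
  mode: cluster
  image: spark:3.5.7
  mainClass: org.apache.spark.examples.SparkPi
  mainApplicationFile: local:///opt/spark/examples/jars/spark-examples_2.12-3.5.7.jar
  sparkVersion: "3.5.7"
  sparkConf:
    # User-provided pod template files; in v2.4.0 these get overwritten with the
    # operator-generated (blank) template paths.
    spark.kubernetes.driver.podTemplateFile: /etc/spark/templates/driver-pod-template.yaml
    spark.kubernetes.executor.podTemplateFile: /etc/spark/templates/executor-pod-template.yaml
  driver:
    cores: 1
    memory: 512m
    serviceAccount: spark-operator-spark
    # note: no `template` field here
  executor:
    instances: 1
    cores: 1
    memory: 512m
    # note: no `template` field here
```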

Expected behavior

The Spark Operator should use the default assigned in spark-defaults.conf (if it exists) or the user-provided sparkConf values.

Actual behavior

The Spark Operator writes a blank Spec to a temporary file inside the operator pod and overwrites the pod template conf values to point at that tmp file location.

Environment & Versions

  • Kubernetes Version: 1.33.3
  • Spark Operator Version: 2.4.0
  • Apache Spark Version: 3.5.7

Additional context

No response

Impacted by this bug?

Give it a 👍. We prioritize the issues with the most 👍.
