Add spark application env #61204
Open
sanchalitorpe-source wants to merge 3 commits into apache:main from sanchalitorpe-source:add-spark-application-env
+199 −2
Changes from all commits
3 commits:
- 5e370eb: Persist DataTable column visibility in localStorage (sanchalitorpe-source)
- c9e4410: Add documentation for default logging configuration in Airflow (sanchalitorpe-source)
- 60bddbb: Add SPARK_APPLICATION_NAME env variable to SparkKubernetesOperator pods (sanchalitorpe-source)
airflow-core/src/airflow/airflow/docs/apache-airflow/logging-configuration.rst (new file: 70 additions, 0 deletions)
@@ -0,0 +1,70 @@
Default Logging in Apache Airflow
=================================

Apache Airflow has multiple loggers for different components, which can be confusing for new users.
This section explains the default loggers, their purposes, and how to modify their behavior.

Default Loggers
---------------

+---------------------------+------------------+-----------------------------------------------------------+---------------------------------------------------------------------------+
| Logger Name               | Component        | Output                                                    | Notes                                                                     |
+===========================+==================+===========================================================+===========================================================================+
| root                      | Webserver        | stdout / webserver log                                    | Default root logger used by webserver.                                    |
+---------------------------+------------------+-----------------------------------------------------------+---------------------------------------------------------------------------+
| airflow.task              | Scheduler/Worker | logs/<dag_id>/<task_id>/<execution_date>/<try_number>.log | A new log file is created per task instance and try. Shown in the Web UI. |
+---------------------------+------------------+-----------------------------------------------------------+---------------------------------------------------------------------------+
| airflow.processor         | Scheduler/Worker | logs/<dag_file_name>.log                                  | Logs DAG parsing for scheduler and workers.                               |
+---------------------------+------------------+-----------------------------------------------------------+---------------------------------------------------------------------------+
| airflow.processor_manager | Scheduler        | logs/<dag_file_name>.log                                  | Logs task instance execution control.                                     |
+---------------------------+------------------+-----------------------------------------------------------+---------------------------------------------------------------------------+
| flask_appbuilder          | Webserver        | filters verbose FAB logs                                  | Typically used for filtering; no config needed by most users.             |
+---------------------------+------------------+-----------------------------------------------------------+---------------------------------------------------------------------------+

Logging by Airflow Component
----------------------------

- **Webserver**: Uses the root logger. Logs to stdout and the webserver log file.
- **Worker**: Uses ``airflow.task`` and ``airflow.processor``. Task logs are stored per task instance; DAG parsing logs are stored per DAG file.
- **Scheduler**: Uses ``airflow.processor``, ``airflow.processor_manager``, and the root logger.

Customizing Logging
-------------------

You can influence the logging configuration using the following methods:

1. **Configuration via airflow.cfg**

   The ``[logging]`` section allows changing:

   - the base log folder (``base_log_folder``)
   - remote logging settings
   - the logging format

2. **Custom Python logging configuration**

   Airflow applies its logging setup via ``airflow.utils.log.logging_config``; the default
   dictionary, ``DEFAULT_LOGGING_CONFIG``, lives in ``airflow.config_templates.airflow_local_settings``.
   You can override ``LOGGING_CONFIG`` in ``airflow_local_settings.py``. Example:

   .. code-block:: python

       import copy

       from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

       LOGGING_CONFIG = copy.deepcopy(DEFAULT_LOGGING_CONFIG)
       LOGGING_CONFIG["handlers"]["console"]["level"] = "INFO"

3. **Environment Variables**

   Some logging options can be set via environment variables, e.g.:

   - ``AIRFLOW__LOGGING__BASE_LOG_FOLDER``
   - ``AIRFLOW__LOGGING__REMOTE_LOGGING``

Recommendations
---------------

- Use ``airflow.task`` logs to debug task failures.
- Use ``airflow.processor`` to debug DAG parsing issues.
- For production, consider remote logging (S3, GCS, Elasticsearch) for scalability.
- Do **not** modify the ``flask_appbuilder`` logger unless needed.

References
----------

- :doc:`/configuration/logging`
- :ref:`task-logs`
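As a companion to the doc above, here is a minimal sketch (an editorial addition, not part of this PR's diff) of the two mechanisms it describes: the AIRFLOW__{SECTION}__{KEY} environment-variable overrides and logging through the airflow.task logger from a task callable. The folder path, variable values, and callable name are illustrative assumptions.

import logging
import os

# Environment overrides follow the AIRFLOW__{SECTION}__{KEY} naming convention.
# In practice they are exported in the scheduler/worker environment before
# Airflow starts; the values here are illustrative only.
os.environ["AIRFLOW__LOGGING__BASE_LOG_FOLDER"] = "/tmp/airflow-logs"
os.environ["AIRFLOW__LOGGING__REMOTE_LOGGING"] = "False"


def my_task_callable(**context):
    # Inside a running task, records sent to the "airflow.task" logger land in the
    # per-try log file described in the table above and appear in the Web UI.
    task_log = logging.getLogger("airflow.task")
    task_log.info("Debugging a task failure; context keys: %s", sorted(context))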
airflow-core/src/airflow/airflow/providers/cncf/kubernetes/operators/spark_kubernetes.py (new file: 107 additions, 0 deletions)
@@ -0,0 +1,107 @@
# airflow/providers/cncf/kubernetes/operators/spark_kubernetes.py

from kubernetes.client import V1EnvVar

from airflow.models import BaseOperator


class SparkKubernetesOperator(BaseOperator):
    """
    SparkKubernetesOperator launches Spark driver and executor pods in Kubernetes.

    This version adds a SPARK_APPLICATION_NAME environment variable to both pods.
    """

    def __init__(
        self,
        *,
        application_name: str,
        namespace: str = "default",
        # other arguments
        **kwargs,
    ):
        super().__init__(**kwargs)
        self.application_name = application_name
        self.namespace = namespace
        # Initialize other needed fields

    def execute(self, context):
        """Build and submit the driver and executor pods."""
        dag_run = context.get("dag_run")
        self.dag_run = dag_run  # store the DAG run to use in _get_spark_app_name

        # --- Example: create driver pod spec ---
        driver_spec = self._build_driver_pod_spec()

        # Add the SPARK_APPLICATION_NAME env variable to the driver pod
        driver_env = driver_spec["spec"]["containers"][0].get("env") or []
        driver_env.append(
            V1EnvVar(
                name="SPARK_APPLICATION_NAME",
                value=self._get_spark_app_name(),
            )
        )
        driver_spec["spec"]["containers"][0]["env"] = driver_env

        # --- Example: create executor pod spec ---
        executor_spec = self._build_executor_pod_spec()

        # Add the SPARK_APPLICATION_NAME env variable to the executor pod
        executor_env = executor_spec["spec"]["containers"][0].get("env") or []
        executor_env.append(
            V1EnvVar(
                name="SPARK_APPLICATION_NAME",
                value=self._get_spark_app_name(),
            )
        )
        executor_spec["spec"]["containers"][0]["env"] = executor_env

        # Submit driver and executor pods
        self._submit_driver(driver_spec)
        self._submit_executors(executor_spec)

        # Other existing logic...

    def _get_spark_app_name(self) -> str:
        """
        Return the Spark application name for this DAG run.

        Combines the base application name with the DAG run ID for deterministic uniqueness.
        """
        dag_run = getattr(self, "dag_run", None)
        suffix = dag_run.run_id if dag_run is not None else "manual"
        return f"{self.application_name}-{suffix}"

    # Placeholder methods for building and submitting pods
    def _build_driver_pod_spec(self):
        # Existing logic to create the driver pod spec
        return {
            "spec": {
                "containers": [
                    {
                        "name": "spark-driver",
                        "env": [],
                    }
                ]
            }
        }

    def _build_executor_pod_spec(self):
        # Existing logic to create the executor pod spec
        return {
            "spec": {
                "containers": [
                    {
                        "name": "spark-executor",
                        "env": [],
                    }
                ]
            }
        }

    def _submit_driver(self, driver_spec):
        # Existing submission logic
        pass

    def _submit_executors(self, executor_spec):
        # Existing submission logic
        pass
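To make the intended behavior concrete, here is a hypothetical DAG using the operator as sketched in the diff above (an editorial sketch, not part of the PR); the DAG id, task id, application name, and namespace are made-up values, and the import path assumes the standard providers package location.

from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import SparkKubernetesOperator

with DAG(
    dag_id="spark_app_env_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # With the change above, each run would inject
    # SPARK_APPLICATION_NAME="wordcount-<run_id>" into both the driver and
    # executor pods via _get_spark_app_name().
    submit_spark_app = SparkKubernetesOperator(
        task_id="submit_spark_app",
        application_name="wordcount",
        namespace="spark-jobs",
    )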
All files should have license headers. This operator is already defined in providers/cncf/kubernetes/src/airflow/providers/cncf/kubernetes/operators/spark_kubernetes.py. The PR also includes unrelated changes from your other PRs. Can you please check your git workflow, disclose your usage of AI, and test your changes before submitting PRs?
https://github.com/apache/airflow/blob/main/contributing-docs/05_pull_requests.rst
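For context on the first point in the review: Airflow source files begin with the standard ASF license header. A representative version is shown below (line wrapping may differ slightly from the repository's exact template).

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#   http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied.  See the License for the
# specific language governing permissions and limitations
# under the License.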