[draft] AIP-85 PoC #45684

Draft: wants to merge 1 commit into base: main

IKholopov
Contributor

This is a simple draft with a PoC of AIP-85: https://cwiki.apache.org/confluence/display/AIRFLOW/AIP-85+Extendable+DAG+parsing+controls.

The key highlights and proposed interfaces/implementations are:

  1. airflow/dag_processing/dag_importer.py: DagImporter - an abstraction over how a "path" is translated into one or more parsed DAGs.

    • The default implementation is FSDagImporter (airflow/dag_processing/fs_dag_importer.py), which reads DAGs from definitions in Python files.
    • An alternative "demo" implementation is NotebooksImporter (providers/src/airflow/providers/google/common/importers/notebooks_importer.py), which imports Python definitions from Jupyter notebook files (for demonstration purposes only, not part of the actual AIP).
  2. airflow/dag_processing/dag_ingester.py: DagIngester - an abstraction over the logic of parsing, adding, updating, and removing DAGs and their import metadata (i.e. import errors/warnings). Sample implementations:

    • Continuous ingester (airflow/dag_processing/continuous_ingester.py) - replicates the current regular dag-processor parsing logic (by reusing the current DagFileProcessingManagers)
    • Once ingester (airflow/dag_processing/once_ingester.py) - an ingester that imports DAGs only once
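
To make the DagImporter contract from point 1 concrete, here is a minimal sketch. The class and method names below are illustrative assumptions, not the actual PoC interface; the real implementation works with Airflow DAG objects rather than the stand-in dataclass used here.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class ParsedDag:
    """Stand-in for a parsed DAG; the real PoC deals in Airflow DAG objects."""
    dag_id: str
    source_path: str


class DagImporter(ABC):
    """Hypothetical sketch of the DagImporter idea: translate a 'path'
    into zero or more parsed DAGs."""

    @abstractmethod
    def import_path(self, path: str) -> list[ParsedDag]:
        """Resolve a path into the DAGs it defines."""


class InMemoryDagImporter(DagImporter):
    """Toy importer backed by a dict, standing in for FSDagImporter,
    which would instead execute a Python file and collect DAGs from it."""

    def __init__(self, registry: dict[str, list[str]]):
        self._registry = registry

    def import_path(self, path: str) -> list[ParsedDag]:
        return [
            ParsedDag(dag_id=dag_id, source_path=path)
            for dag_id in self._registry.get(path, [])
        ]
```

The point of the abstraction is that a "path" need not be a Python file on disk: NotebooksImporter, for example, resolves the same kind of path into DAGs defined in a Jupyter notebook.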

These two interfaces (which providers and core Airflow are allowed to extend) build a foundation for addressing the AIP-85 requirements: increasing the flexibility of how DAGs are updated and how they can be defined.
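The ingester side can be sketched in the same spirit. Everything below is an illustrative assumption (the importer is reduced to a plain callable and the "store" to a dict); it only shows the reconcile loop that a once-style ingester performs, not the actual PoC code.

```python
from abc import ABC, abstractmethod
from typing import Callable, Iterable

# Illustrative stand-in: an "importer" is reduced to a callable mapping
# a path to the DAG ids it defines. The real PoC uses DagImporter.
Importer = Callable[[str], list[str]]


class DagIngester(ABC):
    """Hypothetical sketch of the DagIngester idea: drive the
    parse/add/update/remove cycle against a store of known DAGs."""

    @abstractmethod
    def run(self, importer: Importer, paths: Iterable[str], store: dict[str, str]) -> None:
        """Parse the given paths and reconcile the results into the store."""


class OnceIngester(DagIngester):
    """Single import pass, akin to the 'once ingester': import every
    path once, then remove DAGs that are no longer present."""

    def run(self, importer: Importer, paths: Iterable[str], store: dict[str, str]) -> None:
        seen: set[str] = set()
        for path in paths:
            for dag_id in importer(path):
                store[dag_id] = path  # add or update
                seen.add(dag_id)
        for stale in set(store) - seen:  # remove vanished DAGs
            del store[stale]
```

A continuous ingester would wrap the same reconcile step in a loop (as the current dag-processor does), which is why the two can share one interface.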

Apart from that, DagBag usage is replaced in most places with one of the following components:

  • DagStore (airflow/dag_processing/dag_store.py) - access to Airflow DAGs in the metadata DB (no access to DAG sources required). Used by any component that needs access to DAG metadata (e.g. the scheduler, the public API, etc.).
  • DagParser (airflow/dag_processing/dag_parser.py) - access to parsed DAGs (used by worker and dag-processor).

A lot of things are "swept under the rug" here, as this is just a PoC based on an older Airflow branch from before most of the Airflow 3 changes landed. Big things that are missing:

  • Callbacks are not yet moved to a separate component (temporarily disabled)
  • Integration with bundles
  • Missing optimizations around module importing
  • DagCode is not propagated for non-FSDagImporter

@boring-cyborg boring-cyborg bot added area:API Airflow's REST/HTTP API area:CLI area:Scheduler including HA (high availability) scheduler area:serialization labels Jan 15, 2025