
[Bug] Airflow 3 Assets not generated using DBT Fusion #2263

@ah12068

Description

Astronomer Cosmos Version

1.12.0

dbt-core version

1.11.2

Versions of dbt adapters

dbt-snowflake==1.11.0
dbt fusion 2.0.0-preview.92

LoadMode

DBT_LS_MANIFEST

ExecutionMode

LOCAL

InvocationMode

SUBPROCESS

airflow version

3.1.6rc1

Operating System

macOS Tahoe 26.1

If you think it's a UI issue, what browsers are you seeing the problem on?

No response

Deployment

Docker-Compose

Deployment details

We build, via Docker Compose, an Airflow image plus a custom Python library that includes dbt-snowflake and Cosmos.

Set up as follows:

  1. We have repos that perform dbt tasks - for the relevant tasks an Airflow asset is generated so that downstream DAGs can be run (the Cosmos wiring is sketched below).
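
Roughly, the Cosmos side of that wiring looks like the following (a minimal sketch, not our exact code: the profiles.yml path is illustrative, while the Fusion binary path, profile name and target name match the log output further down):

import os

from cosmos import ExecutionConfig, ProfileConfig
from cosmos.constants import ExecutionMode, InvocationMode

# Point Cosmos at the dbt Fusion binary baked into the image
# (the same path that shows up in the log output below).
execution_config = ExecutionConfig(
    execution_mode=ExecutionMode.LOCAL,
    invocation_mode=InvocationMode.SUBPROCESS,
    dbt_executable_path="/home/airflow/dbt-fusion/dbt",
)

profile_config = ProfileConfig(
    profile_name="default",
    target_name="my_target",
    profiles_yml_filepath="/opt/airflow/dbt/profiles.yml",  # illustrative path
)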

What happened?

What happened: Using dbt Fusion, no Airflow assets are generated, meaning that asset-scheduled DAGs will not run when the project is executed with dbt Fusion. With regular dbt-core everything works fine.

What we expected to happen: Using dbt Fusion should generate Airflow assets, and therefore the asset-scheduled DAGs should have run on dbt Fusion.

Relevant log output

[2026-01-05 15:27:40] INFO - DAG bundles loaded: dags-folder
[2026-01-05 15:27:40] INFO - Filling up the DagBag from /opt/airflow/dags/data-product-salesforce/airflow_dags/dags/__init__.py
[2026-01-05 15:27:42] INFO - Trying to parse the dbt project `dbt` using a dbt manifest...
[2026-01-05 15:27:42] INFO - Total nodes: 200
[2026-01-05 15:27:42] INFO - Total filtered nodes: 200
[2026-01-05 15:27:42] INFO - Cosmos performance (salesforce__salesforce__dbt) -  [dev-airflow-worker-00000-0000|1149]: It took 0.129s to parse the dbt project for DAG using LoadMode.DBT_MANIFEST
[2026-01-05 15:27:43] INFO - Cosmos performance (salesforce__salesforce__dbt) - [dev-airflow-worker-00000-0000|1149]: It took 0.767s to build the Airflow DAG.
[2026-01-05 15:27:43] INFO - Cloning project to writable temp directory /tmp/tmph8k3sb_x from /opt/airflow/dags/data-product-salesforce/dbt
[2026-01-05 15:27:43] INFO - Copying the manifest from /opt/airflow/dags/data-product-salesforce/dbt/target/manifest.json...
[2026-01-05 15:27:43] INFO - Partial parse is enabled and the latest partial parse file is /tmp/cosmos/salesforce__salesforce__dbt/target/partial_parse.msgpack
[2026-01-05 15:27:43] INFO - Profile caching is enable.
[2026-01-05 15:27:43] INFO - Profile found in cache using profile: /tmp/cosmos/profile/00000/profiles.yml.
[2026-01-05 15:27:43] INFO - Trying to run the command:
 ['/home/airflow/dbt-fusion/dbt', 'test', '--select', 'accounts', '--project-dir', '/tmp/tmph8k3sb_x', '--profiles-dir', '/tmp/cosmos/profile/00000', '--profile', 'default', '--target', 'my_target']
From /tmp/tmph8k3sb_x
[2026-01-05 15:27:43] INFO - Tmp dir root location: 
 /tmp
[2026-01-05 15:27:43] INFO - Running command: ['/home/airflow/dbt-fusion/dbt', 'test', '--select', 'accounts', '--project-dir', '/tmp/tmph8k3sb_x', '--profiles-dir', '/tmp/cosmos/profile/00000', '--profile', 'default', '--target', 'my_target']
[2026-01-05 15:27:43] INFO - Command output:
[2026-01-05 15:27:44] INFO - dbt-fusion 2.0.0-preview.92
[2026-01-05 15:27:44] INFO -    Loading ../cosmos/profile/00000/profiles.yml
[2026-01-05 15:27:44] INFO -   Fetching packages.yml
[2026-01-05 15:27:48] INFO -  Downloading BRONZE_CUSTOMER.SALESFORCE.accounts (schema)
[2026-01-05 15:27:51] INFO -     Passed [  0.60s] test  dbt_test__audit.dbt_utils_at_least_one_accounts_account_id
[2026-01-05 15:27:52] INFO - 
[2026-01-05 15:27:52] INFO - =================== Errors and Warnings ====================
[2026-01-05 15:27:52] INFO - warning: dbt1700: --cache-selected is no longer supported
[2026-01-05 15:27:52] INFO - 
[2026-01-05 15:27:52] INFO - ==================== Execution Summary =====================
[2026-01-05 15:27:52] INFO - Finished 'test' with 1 warning for target 'my_target' [8.6s]
[2026-01-05 15:27:52] INFO - Processed: 1 test
[2026-01-05 15:27:52] INFO - Summary: 1 total | 1 success
[2026-01-05 15:27:52] INFO - Command exited with return code 0
[2026-01-05 15:27:52] INFO - dbt-fusion 2.0.0-preview.92   Loading ../cosmos/profile/00000/profiles.yml  Fetching packages.yml Downloading BRONZE_CUSTOMER.SALESFORCE.accounts (schema)    Passed [  0.60s] test  dbt_test__audit.dbt_utils_at_least_one_accounts_account_id=================== Errors and Warnings ====================warning: dbt1700: --cache-selected is no longer supported==================== Execution Summary =====================Finished 'test' with 1 warning for target 'my_target' [8.6s]Processed: 1 testSummary: 1 total | 1 success
[2026-01-05 15:27:52] WARNING - Artifact schema version: https://schemas.getdbt.com/dbt/manifest/v20.json is above dbt-ol tested version 12. Newer versions have not been tested and may not be compatible.
[2026-01-05 15:27:52] INFO - Inlets: [] <<<< ISSUE HERE
[2026-01-05 15:27:52] INFO - Outlets: []
[2026-01-05 15:27:52] INFO - Assigning outlets with DatasetAlias in Airflow 3

How to reproduce

  1. Create a DAG with Cosmos that generates an asset automatically. Make sure that you are using dbt Fusion 2.0.0.
  2. Create another DAG that is scheduled on an asset produced in step 1.
  3. Execute the DAG from step 1, then check whether the asset appears in Airflow and whether the DAG from step 2 is triggered (a sketch of both DAGs follows this list).
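
A minimal sketch of the two DAGs (not our production code: the project paths mirror the logs above, the downstream asset URI is purely illustrative and would need to match whatever asset Cosmos actually emits for the model, and the airflow.sdk import path assumes Airflow 3):

from datetime import datetime

from airflow.sdk import DAG, Asset  # Airflow 3 import path; may differ in other versions
from cosmos import DbtDag, ExecutionConfig, ProfileConfig, ProjectConfig, RenderConfig
from cosmos.constants import InvocationMode, LoadMode

# DAG 1: Cosmos renders the dbt project from its manifest and should emit one
# Airflow Asset per materialised model - this is what stops happening on Fusion.
dbt_dag = DbtDag(
    dag_id="salesforce__salesforce__dbt",
    schedule="@daily",
    start_date=datetime(2026, 1, 1),
    project_config=ProjectConfig(
        "/opt/airflow/dags/data-product-salesforce/dbt",
        manifest_path="/opt/airflow/dags/data-product-salesforce/dbt/target/manifest.json",
    ),
    profile_config=ProfileConfig(
        profile_name="default",
        target_name="my_target",
        profiles_yml_filepath="/opt/airflow/dbt/profiles.yml",  # illustrative path
    ),
    execution_config=ExecutionConfig(
        invocation_mode=InvocationMode.SUBPROCESS,
        dbt_executable_path="/home/airflow/dbt-fusion/dbt",  # swap to dbt-core and assets appear again
    ),
    render_config=RenderConfig(load_method=LoadMode.DBT_MANIFEST),
)

# DAG 2: scheduled on the asset DAG 1 is expected to emit for the `accounts`
# model; this DAG never runs when DAG 1 executes via dbt Fusion.
with DAG(
    dag_id="downstream_of_accounts",
    schedule=[Asset("snowflake://BRONZE_CUSTOMER/SALESFORCE/accounts")],  # illustrative URI
    start_date=datetime(2026, 1, 1),
):
    ...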

Anything else :)?

We believe the problem is in cosmos/operators/local.py, in the AbstractDbtLocalBase class: its calculate_openlineage_events_completes() method relies on the OpenLineage package, and the following piece of code raises an error:

openlineage_processor = DbtLocalArtifactProcessor(
    producer=OPENLINEAGE_PRODUCER,
    job_namespace=settings.LINEAGE_NAMESPACE,
    project_dir=project_dir,
    profile_name=self.profile_config.profile_name,
    target=self.profile_config.target_name,
)

This is because the run_results.json produced by dbt Fusion differs significantly from the one produced by regular dbt-core.

In dbt Fusion, run_results.json (results omitted for brevity) looks like:

{
    "metadata": {
        "dbt_schema_version": "https://schemas.getdbt.com/dbt/run-results/v6.json",
        "dbt_version": "2.0.0-preview.92",
        "generated_at": "YYYY-MM-DDT11:08:54.601858Z",
        "invocation_id": "00000000-0000-0000-0000-000000000000",
        "env": {}
    },
    "results": [],
    "args": {
        "command": "run",
        "which": "run",
        "__other__": {
            "command": "run",
            "which": "run"
        }
    }
}

In regular dbt-core:

{
  "metadata": {
    "dbt_schema_version": "https://schemas.getdbt.com/dbt/run-results/v6.json",
    "dbt_version": "1.10.15",
    "generated_at": "YYYY-MM-DDT11:41:37.499928Z",
    "invocation_id": "00000000-0000-0000-0000-000000000000",
    "invocation_started_at": "YYYY-MM-DDT11:41:28.260832Z",
    "env": {}
  },
  "results": [],
  "elapsed_time": 6.718767166137695,
  "args": {
    "profiles_dir": "path_to_profiles",
    ...
  }
}
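
For illustration only, here is one way a caller could detect the Fusion-style artifact before handing it to OpenLineage's DbtLocalArtifactProcessor (a hypothetical guard, not existing Cosmos code; the field names come from the two examples above, and the version heuristic is our assumption):

import json
from pathlib import Path


def is_fusion_run_results(target_dir: str) -> bool:
    """Return True when run_results.json looks like it was produced by dbt Fusion."""
    payload = json.loads((Path(target_dir) / "run_results.json").read_text())
    metadata = payload.get("metadata", {})
    # dbt Fusion reports versions like "2.0.0-preview.92" while dbt-core reports
    # "1.x.y"; Fusion also omits top-level keys such as "elapsed_time" (see above).
    return metadata.get("dbt_version", "").startswith("2.") or "elapsed_time" not in payload


# Example usage: skip (or special-case) the OpenLineage extraction for Fusion
# artifacts instead of silently ending up with empty inlets/outlets.
if is_fusion_run_results("/tmp/tmph8k3sb_x/target"):
    print("run_results.json was produced by dbt Fusion; OpenLineage parsing may fail")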

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Contact Details

[email protected]

Labels

bug (Something isn't working), triage-needed (Items need to be reviewed / assigned to milestone)
