Releases: fivetran/dbt_fivetran_log
v2.3.1 dbt_fivetran_log
PR #169 includes the following updates:
Bug Fix
- Ensures that `return()` is only called once in the `is_databricks_all_purpose_cluster()` macro. This resolves the following error that dbt Fusion users may have received:

```
Failed to add template invalid operation: return() is called in a non-block context
(in fivetran_log.default__is_databricks_all_purpose_cluster:7:12)
  --> fivetran_log.default__is_databricks_all_purpose_cluster:7:12
```
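For context, dbt Fusion's Jinja engine is stricter about where `return()` may appear. A common way to satisfy it is to branch into a variable and exit once; the following is a hypothetical sketch of that pattern, not the macro's actual body (the `target.http_path` check is an illustrative assumption):

```sql
{# Hypothetical sketch: branch into a variable, then call return() exactly once #}
{% macro default__is_databricks_all_purpose_cluster() %}
    {% if target.type == 'databricks' and '/warehouses/' not in target.http_path | lower %}
        {% set result = true %}
    {% else %}
        {% set result = false %}
    {% endif %}
    {{ return(result) }}
{% endmacro %}
```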
Full Changelog: v2.3.0...v2.3.1
v2.3.0 dbt_fivetran_log
PR #164 includes the following updates:
Schema Data Changes
3 total changes • 3 possible breaking changes
To prevent potential errors from naming and materialization updates, a `--full-refresh` is required after upgrading.
| Data Model | Change Type | Old | New | Notes |
|---|---|---|---|---|
| fivetran_platform__audit_table | Model materialization (only for Databricks SQL Warehouse runtimes) | table | incremental | Added incremental model support for Databricks SQL Warehouses using the merge strategy. Previously, incremental support was limited to Databricks All Purpose Clusters via the insert_overwrite strategy. This update extends incremental functionality to SQL Warehouses, enabling more efficient model builds. |
| fivetran_platform__audit_table | New column | | write_to_table_start_day | Changed the partitioning column from sync_start_day to the new write_to_table_start_day. The previous column could contain null values, which are not ideal for partitioning and may lead to unexpected behavior in incremental models. |
| fivetran_platform__audit_table | Deleted column | sync_start_day | | No longer in use given the above. |
Bug Fixes
- Updated `fivetran_platform__mar_table_history` to include consumption records that do not have an associated active connection and/or destination.
  - As a result, this table may now contain additional records that were previously excluded.
  - For more details, see the corresponding DECISIONLOG entry.
dbt Fusion Compatibility Updates
- Updated the package to maintain compatibility with dbt-core versions both before and after v1.10.6, which introduced a breaking change to multi-argument test syntax (e.g., `unique_combination_of_columns`).
- Temporarily removed unsupported tests to avoid errors and ensure smoother upgrades across different dbt-core versions. These tests will be reintroduced once a safe migration path is available.
  - Removed all `dbt_utils.unique_combination_of_columns` tests.
  - Moved `loaded_at_field: _fivetran_synced` under the `config:` block in `src_fivetran_log.yml`.
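Illustratively, that move in `src_fivetran_log.yml` looks like the following sketch (the source and table names here are trimmed placeholders, not the file's full contents):

```yml
sources:
  - name: fivetran_platform
    tables:
      - name: log
        config:
          loaded_at_field: _fivetran_synced  # previously a top-level table property
```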
Under the Hood
- Updated the `is_incremental_compatible()` macro to include Databricks SQL Warehouses.
- Introduced a new macro, `is_databricks_all_purpose_cluster()`, to distinguish between Databricks All Purpose Clusters and SQL Warehouses.
- Updated conditions in `.github/workflows/auto-release.yml`.
- Added `.github/workflows/generate-docs.yml`.
Full Changelog: v2.2.2...v2.3.0
v2.3.0-a1 dbt_fivetran_log
PR #162 includes the following updates:
Bug Fixes
- Updated `fivetran_platform__mar_table_history` to include consumption records not associated with an active connection.
  - As a result, this table may now contain additional records that were previously excluded.
Under the Hood - July 2025 Updates
PR #161 includes the following updates:
- Updated conditions in `.github/workflows/auto-release.yml`.
- Added `.github/workflows/generate-docs.yml`.
- Added `+docs: show: False` to `integration_tests/dbt_project.yml`.
- Migrated flags (e.g., `send_anonymous_usage_stats`, `use_colors`) from `sample.profiles.yml` to `integration_tests/dbt_project.yml`.
- Updated `maintainer_pull_request_template.md` with an improved checklist.
- Updated the Python image version to `3.10.13` in `pipeline.yml`.
- Updated `.gitignore` to exclude additional dbt, Python, and system artifacts.
Full Changelog: v2.2.2...v2.3.0-a1
v2.2.2 dbt_fivetran_log
PR #160 includes the following updates:
Under the Hood
- Added BigQuery JSON field support for the following model and column:
  - `stg_fivetran_platform__log`: `message_data` column.
    - Added the `field_conversion` cte to apply the conversion to `message_data` prior to additional transformations.
- Added the `json_to_string()` macro for BigQuery to convert JSON fields to strings for reliable downstream parsing.
- Included JSON versions in the integration tests to ensure JSON data type compatibility.
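As an illustration only (the package's actual macro may differ), a dispatched macro of this kind could wrap BigQuery's `to_json_string` function and fall back to a no-op on other warehouses:

```sql
{# Hypothetical sketch of a json-to-string conversion macro #}
{% macro json_to_string(field) %}
    {{ adapter.dispatch('json_to_string', 'fivetran_log')(field) }}
{% endmacro %}

{% macro bigquery__json_to_string(field) %}
    to_json_string({{ field }})
{% endmacro %}

{% macro default__json_to_string(field) %}
    {{ field }}
{% endmacro %}
```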
Full Changelog: v2.2.1...v2.2.2
v2.2.1 dbt_fivetran_log
PR #153 includes the following updates:
Under the Hood
- Incorporated `fivetran_platform__credits_pricing` and `fivetran_platform_using_transformations` into the `quickstart.yml` file.
Full Changelog: v2.2.0...v2.2.1
v2.2.0 dbt_fivetran_log
PR #154 includes the following updates:
Breaking Change for dbt Core < 1.9.6
Note: This is not relevant to Fivetran Quickstart users.
Migrated `freshness` from a top-level source property to a source `config` in alignment with recent updates from dbt Core. This resolves the following deprecation warning that users running dbt >= 1.9.6 may have received:

```
[WARNING]: Deprecated functionality
Found `freshness` as a top-level property of `fivetran_platform` in file
`models/src_fivetran_platform.yml`. The `freshness` top-level property should be moved
into the `config` of `fivetran_platform`.
```
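As a sketch of the new layout (the `warn_after` threshold below is illustrative, not the package's actual setting), the `freshness` block now nests under `config`:

```yml
sources:
  - name: fivetran_platform
    config:
      freshness:  # previously a top-level source property
        warn_after: {count: 72, period: hour}
```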
IMPORTANT: Users running dbt Core < 1.9.6 will not be able to utilize freshness tests in this release or any subsequent releases, as older versions of dbt will not recognize `freshness` as a source config and therefore will not run the tests.

If you are using dbt Core < 1.9.6 and want to continue running Fivetran Platform freshness tests, please elect one of the following options:
- (Recommended) Upgrade to dbt Core >= 1.9.6
- Do not upgrade your installed version of the `fivetran_log` package. Pin your dependency on v2.1.0 in your `packages.yml` file.
- Utilize a dbt override to overwrite the package's `fivetran_platform` source and apply freshness via the old top-level property route. This will require you to copy and paste the entirety of the `src_fivetran_platform.yml` file and add an `overrides: fivetran_log` property.
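For the second option, pinning the package at v2.1.0 in `packages.yml` would look like this:

```yml
packages:
  - package: fivetran/fivetran_log
    version: 2.1.0  # last release with freshness as a top-level property
```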
Bug fixes
- Updated logic for identifying broken connections. Connection `sync_end` events having `log_status = 'FAILURE'`, in addition to `SEVERE` event types, are now considered broken connections. (PR #155)
Under the Hood
- Updated the package maintainer PR template.
Full Changelog: v2.1.0...v2.2.0
v2.1.1-a1 dbt_fivetran_log
PR #153 includes the following updates:
Under the Hood
- Incorporated `fivetran_platform__credits_pricing` and `fivetran_platform_using_transformations` into the `quickstart.yml` file.
- Updated the package maintainer PR template.
Full Changelog: v2.1.0...v2.1.1-a1
v2.1.0 dbt_fivetran_log
PR #150 includes the following updates:
Dependency Changes
- Removed the dependency on calogica/dbt_date as it is no longer actively maintained. To maintain functionality, key date macros have been replicated within the `fivetran_date_macros` folder with minimal modifications. Only macro versions supporting the Fivetran Log supported destinations are retained, and all have been prefixed with `fivetran_` to avoid naming conflicts.
  - `date_part` -> `fivetran_date_part`
  - `day_name` -> `fivetran_day_name`
  - `day_of_month` -> `fivetran_day_of_month`
Under the Hood
- Created a consistency test on `fivetran_platform__audit_user_activity` to ensure `day_name` and `day_of_month` counts match.
Full Changelog: v2.0.0...v2.1.0
v2.0.0 dbt_fivetran_log
PR #144 includes the following updates:
Breaking Changes - Action Required
A `--full-refresh` is required after upgrading to prevent errors caused by naming and materialization changes. Additionally, downstream queries must be updated to reflect new model and column names.
- The materialization of all `stg_*` staging models has been updated from `table` to `view`.
  - Previously, `stg_*_tmp` models were views while the non-`*_tmp` versions were tables. Now all are views to eliminate redundant data storage.
- Source Table Transition:
  - The `CONNECTOR` source table is deprecated and replaced by `CONNECTION`. During a brief transition period, both tables will be identical, but `CONNECTOR` will stop receiving data and be removed at a later time.
    - This change clarifies the distinction: Connectors facilitate the creation of connections between sources and destinations.
  - The `CONNECTION` table is now the default source.
    - For Quickstart users: `CONNECTOR` will automatically be used if `CONNECTION` is not yet available.
    - For dbt Core users: Users without the `CONNECTION` source can continue using `CONNECTOR` by adding the following variable to your root `dbt_project.yml` file:

      ```yml
      vars:
        fivetran_platform_using_connection: false # default: true
      ```
    - For more details, refer to the README.
- New Columns:
  - As part of the `CONNECTION` updates, the following columns have been added alongside their `connector_*` equivalents:
    - INCREMENTAL_MAR: `connection_name`
    - LOG: `connection_id`
- Renamed Models:
  - `fivetran_platform__connector_status` → `fivetran_platform__connection_status`
  - `fivetran_platform__connector_daily_events` → `fivetran_platform__connection_daily_events`
  - `fivetran_platform__usage_mar_destination_history` → `fivetran_platform__usage_history`
  - `stg_fivetran_platform__connector` → `stg_fivetran_platform__connection`
  - `stg_fivetran_platform__connector_tmp` → `stg_fivetran_platform__connection_tmp`

  NOTE: Ensure any downstream queries are updated to reflect the new model names.
- Renamed Columns:
  - Renamed `connector_id` to `connection_id` and `connector_name` to `connection_name` in the following models:
    - `fivetran_platform__connection_status`
      - Also renamed `connector_health` to `connection_health`
    - `fivetran_platform__mar_table_history`
    - `fivetran_platform__connection_daily_events`
    - `fivetran_platform__audit_table`
    - `fivetran_platform__audit_user_activity`
    - `fivetran_platform__schema_changelog`
    - `stg_fivetran_platform__connection`
    - `stg_fivetran_platform__log` (`connector_id` to `connection_id` only)
    - `stg_fivetran_platform__incremental_mar` (`connector_name` to `connection_name` only)

  NOTE: Ensure any downstream queries are updated to reflect the new column names.
Features
- Added the `coalesce_cast` macro to ensure consistent data types when using `coalesce`, preventing potential errors.
- Added the `get_connection_columns` macro for the new `CONNECTION` source.
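A minimal sketch of what a macro like `coalesce_cast` could look like, purely for illustration (the package's actual implementation and argument names may differ):

```sql
{# Hypothetical sketch: cast every argument up front so all coalesce branches share one type #}
{% macro coalesce_cast(column_list, datatype) %}
    coalesce(
        {%- for column in column_list %}
        cast({{ column }} as {{ datatype }}){{ ',' if not loop.last }}
        {%- endfor %}
    )
{% endmacro %}
```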
Documentation
- Updated documentation to reflect all renames and the source table transition.
Under the Hood (Maintainers Only)
- Updated consistency and integrity tests to align with naming changes.
- Refactored seeds and `get_*_columns` macros to reflect renames.
- Added a new seed for the `CONNECTION` table.
- Updated `run_models` to test the new var `fivetran_platform_using_connection`.
Full Changelog: v1.11.0...v2.0.0
v1.11.0 dbt_fivetran_log
PR #141 includes the following updates:
Schema Changes: Adding the Transformation Runs Table
- This package now accounts for the `transformation_runs` source table. Therefore, a new staging model `stg_fivetran_platform__transformation_runs` has been added. Note that not all customers have the `transformation_runs` source table, particularly if they are not using Fivetran Transformations. If the table doesn't exist, `stg_fivetran_platform__transformation_runs` will persist as an empty model and respective downstream fields will be null.
- In addition, the following fields have been added to the `fivetran_platform__usage_mar_destination_history` end model:
  - `paid_model_runs`
  - `free_model_runs`
  - `total_model_runs`
Documentation Updates
- Included documentation about the `transformation_runs` source table and the aggregated `*_model_runs` fields.
- Added information about manually configuring the `fivetran_platform_using_transformations` variable in the DECISION LOG.
- Added Quickstart model counts to the README. (#145)
- Corrected references to connectors and connections in the README. (#145)
Under the Hood
- Introduced the variable `fivetran_platform_using_transformations` to control the `stg_fivetran_platform__transformation_runs` output. It is configured based on whether the `transformation_runs` table exists. For more information, refer to the DECISION LOG.
- Added the `get_transformation_runs_columns()` macro to ensure all required columns are present.
- Added `transformation_runs` seed data in `integration_tests/seeds/`.
- Added a `run_count__usage_mar_destination_history` validation test to check model run counts across the staging and end models.
- (Redshift only) Updated to use `limit 1` instead of `limit 0` for empty tables. This ensures that Redshift will respect the package's datatype casts.
Full Changelog: v1.10.0...v1.11.0