Settings have default values that can be overridden using:

- dbt project variables (and therefore also by CLI variable override)
- environment variables
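
For instance, you can override the `bq_region` variable shown later on this page from the CLI, or set a value through an environment variable (a quick sketch; the environment variable name follows the `DBT_BQ_MONITORING_*` pattern used in the settings tables, and the values are placeholders):

```bash
# Override a dbt project variable from the CLI
dbt build --vars '{"bq_region": "eu"}'

# Override a package setting through an environment variable
export DBT_BQ_MONITORING_GCP_BILLING_EXPORT_DATASET='my_billing_dataset'
```
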
Please note that the default region is `us` and, at the time of writing, there's no way to query cross-region tables. You can, however, run this package in each region you want to monitor and [then replicate the tables to a central region](https://cloud.google.com/bigquery/docs/data-replication) to build an aggregated view.

To find the region associated with a job in the BigQuery UI, open the `Job history` bottom panel, click a job, and check its `Location` field. You can also find the region of a dataset or table by opening its details panel and checking the `Data location` field.
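
Since the package relies on the region-qualified `INFORMATION_SCHEMA` views, you can also check jobs and their locations with a query (a minimal sketch; swap `region-us` for the region prefix you use):

```sql
-- Most recent jobs visible in the us region
SELECT job_id, user_email, creation_time, total_bytes_processed
FROM `region-us`.INFORMATION_SCHEMA.JOBS
ORDER BY creation_time DESC
LIMIT 10;
```
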
:::tip
To get the best out of this package, you should probably configure all data sources and settings:
- Choose the [Baseline mode](#modes) that fits your GCP setup
- [Add metadata to queries](#add-metadata-to-queries-recommended-but-optional)
- Review the [Settings](/configuration/package-settings) (especially the pricing ones)
:::

## Modes

### Region mode (default)
In this mode, the package will monitor all the GCP projects in the region specified in the `dbt_project.yml` file.

```yml
vars:
  # dbt bigquery monitoring vars
  bq_region: 'us'
```
**Requirements**

- The execution project needs to be the same as the storage project; otherwise you'll need to use the project mode described below.
- If you have multiple GCP projects in the same region, you should use the "project mode" (with the `input_gcp_projects` setting to specify them), otherwise you will run into errors such as: `Within a standard SQL view, references to tables/views require explicit project IDs unless the entity is created in the same project that is issuing the query, but these references are not project-qualified: "region-us.INFORMATION_SCHEMA.JOBS"`.
### Project mode

To enable the "project mode", you'll need to explicitly define one mandatory setting in the `dbt_project.yml` file:
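
A minimal sketch, assuming the setting in question is the `input_gcp_projects` list mentioned above (the project IDs are placeholders):

```yml
vars:
  # dbt bigquery monitoring vars
  input_gcp_projects: ['my-gcp-project-1', 'my-gcp-project-2']
```
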
## Add metadata to queries (Recommended but optional)

To enhance your query metadata with dbt model information, the package provides a dedicated macro that leverages dbt "query comments" (the header set at the top of each query).

To configure the query comments, add the following config to `dbt_project.yml`.
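
A sketch of the expected shape, assuming the package's macro is named `get_query_comment` (check the package documentation for the exact macro name):

```yml
query-comment:
  comment: '{{ dbt_bigquery_monitoring.get_query_comment(node) }}'
  job-label: true # also propagates the metadata as BigQuery job labels
```
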
### GCP Billing export

GCP Billing export is a feature that allows you to export your billing data to BigQuery. It allows the package to track the real cost of your queries and storage over time.

To enable it on the GCP end, you can follow the [official documentation](https://cloud.google.com/billing/docs/how-to/export-data-bigquery) to set up the export.
Then, to enable GCP billing export monitoring in the package, you'll need to define the following settings in the `dbt_project.yml` file ([you might use environment variables as well](#gcp-bigquery-audit-logs-configuration)):

| Variable | Environment variable | Description | Default |
|----------|----------------------|-------------|---------|
| `gcp_billing_export_dataset` | `DBT_BQ_MONITORING_GCP_BILLING_EXPORT_DATASET` | The dataset for GCP billing export data | `'placeholder'` if enabled, `None` otherwise |
| `gcp_billing_export_table` | `DBT_BQ_MONITORING_GCP_BILLING_EXPORT_TABLE` | The table for GCP billing export data | `'placeholder'` if enabled, `None` otherwise |
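
For example (a minimal sketch; the dataset and table names are placeholders for your own billing export location):

```yml
vars:
  # dbt bigquery monitoring vars
  gcp_billing_export_dataset: 'my_billing_dataset'
  gcp_billing_export_table: 'gcp_billing_export_resource_v1_XXXXXX'
```
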
### GCP BigQuery Audit logs configuration

See [GCP BigQuery Audit logs](/configuration/audit-logs) for more information.
## Contributing

You're free to use the environment management tools you prefer, but if you're familiar with them, you can use the following:
- pipx (to isolate the global tools from your local environment)
- SQLFluff (to lint SQL)
- changie (to generate CHANGELOG entries)

Then you'll be able to install SQLFluff with pipx:
```bash
pipx install sqlfluff
```
To install changie, there are a few options depending on your OS.
See the [installation guide](https://changie.dev/guide/installation/) for more details.

To configure your dbt profile, run the following command and follow the prompts:
```bash
dbt init
```
- Fork the repo
- Create a branch from `main`
- Make your changes
- Run the tests
- Create your changelog entry with `changie new` (don't edit the CHANGELOG.md directly)
- Commit your changes (it will run the linter through pre-commit)
- Push your branch and open a PR on the repository

We use SQLFluff to keep SQL style consistent.
Lint all models in the /models directory:
```bash
sqlfluff lint
```
Fix all models in the /models directory:
```bash
sqlfluff fix
```
Lint (or substitute lint with fix) a specific model:
```bash
sqlfluff lint -- models/path/to/model.sql
```
Lint (or substitute lint with fix) a specific directory:
```bash
sqlfluff lint -- models/path/to/directory
```
#### Rules

Enforced rules are defined within `.sqlfluff`. To view the full list of available rules and their configuration, see the [SQLFluff documentation](https://docs.sqlfluff.com/en/stable/rules.html).