DOC-591 Switch embedded example code to CodeExample component (#27605)
## Summary & Motivation
See title -- need to do this before re-implementing versioned docs.
## How I Tested These Changes
## Changelog
> Insert changelog entry or delete this section.
---------
Signed-off-by: nikki everett <[email protected]>
docs/docs-beta/CONTRIBUTING.md (+1 −1)
@@ -129,7 +129,7 @@ To include code snippets, use the following format:
<CodeExample path="path/to/file.py" />
```
-You can optionally include [additional properties](https://github.com/dagster-io/dagster/blob/master/docs/docs-beta/src/components/CodeExample.tsx#L4), such as `language`, `title`, `lineStart`, `lineEnd`, `startAfter`, and `endBefore`:
+You can optionally include [additional properties](https://github.com/dagster-io/dagster/blob/master/docs/docs-beta/src/components/CodeExample.tsx#L6), such as `language`, `title`, `lineStart`, `lineEnd`, `startAfter`, and `endBefore`:
As you can see, our assets use an [I/O manager](/guides/build/io-managers/) named `snowflake_io_manager`. Using I/O managers and other resources allows us to swap out implementations per environment without modifying our business logic.
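For context, a minimal sketch of an asset wired to that I/O manager might look like the following (the asset names and data here are hypothetical, not taken from this diff):

```python
import pandas as pd
from dagster import asset


@asset(io_manager_key="snowflake_io_manager")
def items() -> pd.DataFrame:
    # Hypothetical upstream data; in the guide this would be fetched from an API.
    return pd.DataFrame({"id": [1, 2], "type": ["comment", "story"]})


@asset(io_manager_key="snowflake_io_manager")
def comments(items: pd.DataFrame) -> pd.DataFrame:
    # The I/O manager handles reading and writing these DataFrames to Snowflake,
    # so the asset body contains only business logic.
    return items[items["type"] == "comment"]
```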
@@ -119,46 +72,7 @@ Dagster automatically sets certain [environment variables](/dagster-plus/deploym
Refer to the [Dagster+ environment variables documentation](/dagster-plus/deployment/management/environment-variables/) for more info about available environment variables.
Because we want to configure our assets to write to Snowflake using a different set of credentials and database in each environment, we'll configure a separate I/O manager for each environment:
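As a rough sketch (not the code removed in this diff), that per-environment mapping could look something like this, assuming the `SnowflakePandasIOManager` from `dagster-snowflake-pandas` and hypothetical database and environment variable names:

```python
from dagster import EnvVar
from dagster_snowflake_pandas import SnowflakePandasIOManager

# One I/O manager per environment; credentials come from environment variables,
# so the asset code itself never changes between deployments.
resources = {
    "branch": {
        "snowflake_io_manager": SnowflakePandasIOManager(
            account=EnvVar("SNOWFLAKE_ACCOUNT"),
            user=EnvVar("SNOWFLAKE_USER"),
            password=EnvVar("SNOWFLAKE_PASSWORD"),
            database="PRODUCTION_CLONE",  # hypothetical clone database name
        ),
    },
    "prod": {
        "snowflake_io_manager": SnowflakePandasIOManager(
            account=EnvVar("SNOWFLAKE_ACCOUNT"),
            user=EnvVar("SNOWFLAKE_USER"),
            password=EnvVar("SNOWFLAKE_PASSWORD"),
            database="PRODUCTION",  # hypothetical production database name
        ),
    },
}
```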
@@ -177,89 +91,17 @@ these tasks, like viewing them in the Global Asset Graph.
We've defined `drop_database_clone` and `clone_production_database` to utilize the <PyObject section="libraries" object="SnowflakeResource" module="dagster_snowflake" />. The Snowflake resource will use the same configuration as the Snowflake I/O manager to generate a connection to Snowflake. However, while our I/O manager writes outputs to Snowflake, the Snowflake resource executes queries against Snowflake.
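A minimal sketch of what such ops could look like (the database names are hypothetical, and this is not the exact code removed in this diff):

```python
from dagster import In, Nothing, op
from dagster_snowflake import SnowflakeResource


@op
def drop_database_clone(snowflake: SnowflakeResource):
    # Executes a query against Snowflake instead of writing an output through
    # the I/O manager.
    with snowflake.get_connection() as conn:
        conn.cursor().execute("DROP DATABASE IF EXISTS PRODUCTION_CLONE")


@op(ins={"start": In(Nothing)})
def clone_production_database(snowflake: SnowflakeResource):
    # Zero-copy clone of production, run after the stale clone is dropped.
    with snowflake.get_connection() as conn:
        conn.cursor().execute("CREATE DATABASE PRODUCTION_CLONE CLONE PRODUCTION")
```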
We now need to define resources that configure our jobs for the current environment. We can modify the resource mapping by environment as follows:
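One way to sketch that mapping, assuming the `DAGSTER_CLOUD_IS_BRANCH_DEPLOYMENT` environment variable that Dagster+ sets in branch deployments and placeholder resource dictionaries:

```python
import os

from dagster import Definitions


def get_current_environment() -> str:
    # Dagster+ sets DAGSTER_CLOUD_IS_BRANCH_DEPLOYMENT to "1" in branch deployments.
    return "branch" if os.getenv("DAGSTER_CLOUD_IS_BRANCH_DEPLOYMENT") == "1" else "prod"


# Placeholder per-environment resource dictionaries; in the guide these hold the
# Snowflake I/O manager and Snowflake resource configured for each environment.
resources = {
    "branch": {},
    "prod": {},
}

defs = Definitions(resources=resources[get_current_environment()])
```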
## Step 4: Create our database clone upon opening a branch
@@ -268,37 +110,7 @@ defs = Definitions(
The `branch_deployments.yml` file located in `.github/workflows/branch_deployments.yml` defines a `dagster_cloud_build_push` job with a series of steps that launch a branch deployment. Because we want to queue a run of `clone_prod` within each deployment after it launches, we'll add an additional step at the end of `dagster_cloud_build_push`. This job is triggered on multiple pull request events: `opened`, `synchronize`, `reopened`, and `closed`. This means that upon future pushes to the branch, we'll trigger a run of `clone_prod`. The `if` condition below ensures that `clone_prod` will not run if the pull request is closed:
Opening a pull request for our current branch will automatically kick off a branch deployment. After the deployment launches, we can confirm that the `clone_prod` job has run:
@@ -315,53 +127,7 @@ We can also view our database in Snowflake to confirm that a clone exists for ea
The `.gitlab-ci.yaml` script contains a `deploy` job that defines a series of steps that launch a branch deployment. Because we want to queue a run of `clone_prod` within each deployment after it launches, we'll add an additional step at the end of `deploy`. This job is triggered when a merge request is created or updated. This means that upon future pushes to the branch, we'll trigger a run of `clone_prod`.
Opening a merge request for our current branch will automatically kick off a branch deployment. After the deployment launches, we can confirm that the `clone_prod` job has run:
@@ -382,91 +148,14 @@ We can also view our database in Snowflake to confirm that a clone exists for ea
Finally, we can add a step to our `branch_deployments.yml` file that queues a run of our `drop_prod_clone` job: