# An example of a bundle that creates and uses a secret scope (#83)
# Databricks job that reads a secret from a secret scope

This example demonstrates how to define a secret scope, and a job with a task that reads from it, in a Databricks Asset Bundle.

It deploys an example secret scope, and a job with a task that reads a secret from that scope, to a Databricks workspace.

For more information about Databricks secrets, see the [documentation](https://docs.databricks.com/aws/en/security/secrets).
## Prerequisites

* Databricks CLI v0.252.0 or above
## Usage

Modify `databricks.yml`:
* Update the `host` field under `workspace` to the Databricks workspace to deploy to.

Run `databricks bundle deploy` to deploy the bundle.
Run the following commands to write a secret to the secret scope (set `DATABRICKS_PROFILE` to the name of your CLI profile):

```
SECRET_SCOPE_NAME=$(databricks bundle summary -o json | jq -r '.resources.secret_scopes.my_secret_scope.name')
databricks secrets put-secret ${SECRET_SCOPE_NAME} example-key --string-value example-value --profile ${DATABRICKS_PROFILE}
```
Run the job:

```
databricks bundle run example_python_job
```
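The job defined in `databricks.yml` hands the scope name to the task as a single `--scope_name=<value>` argument (the resolved form of `--scope_name={{job.parameters.scope_name}}`). As a minimal local sketch, assuming only standard-library `argparse` behavior, this is how such an argument parses:

```python
import argparse

def parse_scope_name(argv):
    # The bundle passes the parameter as one token: --scope_name=<value>.
    # argparse splits on the first "=" for long options, so this Just Works.
    parser = argparse.ArgumentParser()
    parser.add_argument("-s", "--scope_name", help="Name of the secret scope")
    args = parser.parse_args(argv)
    return args.scope_name

# Simulate the argv the task receives after {{job.parameters.scope_name}} resolves
print(parse_scope_name(["--scope_name=secrets-scope-1"]))  # secrets-scope-1
```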
The bundle configuration, `databricks.yml`:

```yaml
bundle:
  name: job-read-secret-example

# workspace:
#   host: https://myworkspace.cloud.databricks.com

resources:
  secret_scopes:
    my_secret_scope:
      name: secrets-scope-1
      permissions:
        - level: CAN_VIEW
          group_name: users
        - level: CAN_MANAGE
          group_name: admins
  jobs:
    example_python_job:
      name: "example-python-job"
      parameters:
        - name: "scope_name"
          default: ${resources.secret_scopes.my_secret_scope.name}
      tasks:
        - task_key: example_python_task
          spark_python_task:
            python_file: "src/example_spark_python_task.py"
            parameters:
              - --scope_name={{job.parameters.scope_name}}

# Defines the targets for this bundle.
# Targets allow you to deploy the same bundle to different Databricks workspaces.
targets:
  prod: {
    # No overrides
  }

  dev:
    # This target is for development purposes.
    # It defaults to the current Databricks workspace.
    default: true
    mode: development
    resources:
      secret_scopes:
        my_secret_scope:
          name: ${workspace.current_user.short_name}-my-secrets
      jobs:
        example_python_job:
          name: "${workspace.current_user.short_name}-example-python-job"
```

In review, a contributor suggested simplifying this by interpolating the secret scope name directly, so it would not have to be passed as a job parameter. The author preferred to keep the job-parameter approach in the example, since it is more idiomatic with the jobs documentation and it documents this particular (not quite straightforward) way to achieve the result.
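The reviewer's simplification would interpolate the scope name straight into the task parameters, dropping the job-level `parameters` block. A sketch of that alternative (reconstructed from the review discussion, not the shape the example ultimately uses):

```yaml
tasks:
  - task_key: example_python_task
    spark_python_task:
      python_file: "src/example_spark_python_task.py"
      parameters:
        - --scope_name=${resources.secret_scopes.my_secret_scope.name}
```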
The task script, `src/example_spark_python_task.py`:

```python
#!/usr/bin/env python

import argparse
from datetime import datetime

# `dbutils` is provided by the Databricks runtime when this task runs on a cluster.


def main():
    # Get current timestamp
    now = datetime.now().strftime("%Y-%m-%d %H:%M:%S")

    # Print job information
    print(f"Example Python job started at: {now}")

    # Read a secret from the passed secret scope
    try:
        parser = argparse.ArgumentParser()
        parser.add_argument("-s", "--scope_name", help="Name of the secret scope")
        args = parser.parse_args()
        scope_name = args.scope_name

        secret_value = dbutils.secrets.get(scope=scope_name, key="example-key")
        print(
            f"Successfully retrieved secret. First few characters: {secret_value[:3]}***"
        )
    except Exception as e:
        print(f"Could not access secret: {str(e)}")

    print("Example Python job completed successfully")


if __name__ == "__main__":
    main()
```
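The secret-reading pattern in the task can be exercised off-cluster by stubbing the `dbutils.secrets` API. A minimal sketch, where the stub classes, the helper name, and the sample values are all hypothetical (not part of the example):

```python
# Hypothetical stand-ins for the Databricks-provided dbutils.secrets API,
# for local testing only.
class FakeSecrets:
    def __init__(self, store):
        self._store = store  # maps (scope, key) -> secret value

    def get(self, scope, key):
        return self._store[(scope, key)]


class FakeDbutils:
    def __init__(self, store):
        self.secrets = FakeSecrets(store)


def read_and_redact(dbutils, scope_name, key="example-key"):
    # Same pattern as the task: fetch the secret, expose only a redacted prefix
    secret_value = dbutils.secrets.get(scope=scope_name, key=key)
    return f"{secret_value[:3]}***"


fake = FakeDbutils({("secrets-scope-1", "example-key"): "example-value"})
print(read_and_redact(fake, "secrets-scope-1"))  # exa***
```

Injecting the `dbutils` handle as an argument like this keeps the secret-handling logic testable; the deployed script instead relies on the runtime-provided global.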