
refactor(gcp): workspace_deployment module #228

Open
ashkan-db wants to merge 1 commit into main from gcp

Conversation

@ashkan-db
Contributor

- Uniform resource naming: ${resource_prefix}-<resource>-${deployment_suffix}
- Add serverless workspace support (compute_mode = SERVERLESS)
- Add flexible DNS for PSC: create zone + records, use existing zone, or manual
- Add tunnel.<region> DNS A-record for SCC relay (fixes cluster launch timeout)
- Add workspace hardening: IP access lists, verbose audit, DBFS browser off, 90-day tokens
- Add resource_owner admin assignment with skip_user_lookup for safe destroy
- Add use_cmek master flag; support creating fresh KMS keyring+key or reusing existing
- Replace deprecated default_catalog_name with databricks_default_namespace_setting
- Gate db_subnet_ingress firewall on !use_existing_vpc (BYO creates zero GCP resources)
- Decouple Databricks VPC endpoint registration from use_existing_vpc
- Fix network_name exceeding 30-char Databricks API limit
- Fix DNS record count depending on computed attributes (plan-time error)
- Comment out GCS backend in examples (default to local state)
- Bump Databricks provider to >=1.113.0
- Update end-to-end example: enable PSC + CMEK + hardened network
- Document googleapis.com DNS zone prerequisite for PSC + hardened network
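The uniform naming and the 30-character `network_name` fix could look roughly like the sketch below. This is a hypothetical illustration, not the module's actual code; the local and variable names are assumptions.

```hcl
locals {
  # Uniform pattern: "${resource_prefix}-<resource>-${deployment_suffix}".
  # substr() caps the result at the 30-character Databricks API limit
  # for network names (assumed truncation strategy).
  network_name = substr(
    "${var.resource_prefix}-network-${var.deployment_suffix}",
    0,
    30,
  )
}
```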
@ashkan-db ashkan-db requested a review from AleksCallebat April 20, 2026 20:03
Contributor

@AleksCallebat AleksCallebat left a comment


great job - 2 small issues:

  • one bit I don't understand and fear is missing (it could cause errors at use time if the uuid is not specified).
  • one bit where the firewall rule is not defined in the right place

```hcl
# value     = databricks_mws_workspaces.this.token[0].token_value
# sensitive = true
# }
output "deployment_suffix" {
```


Which part generates the deployment_suffix when it's empty in the tfvars? We need to validate that, when it's empty, the suffix is generated via uuid as advertised.
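One way to satisfy the reviewer's ask is a uuid fallback like the sketch below. This is an assumption about how the module might implement it, using the `hashicorp/random` provider; the names are hypothetical.

```hcl
# Generate a stable random uuid once per state; unused unless needed.
resource "random_uuid" "suffix" {}

locals {
  # Fall back to the first 8 characters of a generated uuid
  # when var.deployment_suffix is left empty in the tfvars.
  deployment_suffix = (
    var.deployment_suffix != ""
    ? var.deployment_suffix
    : substr(random_uuid.suffix.result, 0, 8)
  )
}
```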

```hcl
direction          = "EGRESS"
priority           = 1000
destination_ranges = [
  # ADD REGIONAL IPS as listed here: https://docs.databricks.com/gcp/en/resources/ip-domain-region
```


This will leave the rule empty by default (nobody will scroll that far).
Need to:

  1. Add a variable for the regional IP ranges (ip-domain)
  2. Add this link to the documentation as part of the var file

I'm conscious this is a default the previous version already had.
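The variable the reviewer asks for could be sketched as follows. The variable name and validation are assumptions for illustration; the docs link is the one cited in the diff.

```hcl
variable "databricks_regional_ips" {
  description = <<-EOT
    Databricks control-plane CIDR ranges for your region.
    Look up your region's values here:
    https://docs.databricks.com/gcp/en/resources/ip-domain-region
  EOT
  type        = list(string)
  default     = []

  validation {
    # Fail at plan time instead of silently creating an empty egress rule.
    condition     = length(var.databricks_regional_ips) > 0
    error_message = "Set databricks_regional_ips to the regional Databricks CIDRs."
  }
}
```

The firewall rule would then reference `destination_ranges = var.databricks_regional_ips` instead of a hardcoded list.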
