
A sample demonstrating write support from Databricks to Spanner.#5

Open
MaxKsyunz wants to merge 4 commits into integ/databricks_write_example from databricks_write_example

Conversation

@MaxKsyunz

No description provided.

MaxKsyunz and others added 4 commits January 15, 2026 22:14
…icks to Spanner.

Signed-off-by: Max Ksyunz <max.ksyunz@improving.com>
To ensure most recent file is uploaded.

Signed-off-by: Max Ksyunz <max.ksyunz@improving.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: Max Ksyunz <max.ksyunz@improving.com>

## Step 1: Create a Databricks on Google Cloud Workspace

1. Navigate to the [Google Cloud Marketplace listing for Databricks](https://console.cloud.google.com/marketplace/product/databricks/databricks).

This link resulted in a "Failed to load" message. Should it instead be
https://console.cloud.google.com/marketplace/product/databricks-prod/databricks ?

1. Navigate to the [Google Cloud Marketplace listing for Databricks](https://console.cloud.google.com/marketplace/product/databricks/databricks).
2. Click **Subscribe** and follow the prompts to enable the Databricks API.
3. Once subscribed, click **MANAGE ON PROVIDER**. You will be redirected to the Databricks account console to configure your workspace.
4. Follow the workspace creation wizard. When prompted, ensure you select a region and enable **Unity Catalog**. This will create a new Unity Catalog metastore for your workspace.

Do we need extra guidance here, such as which option to choose for Storage & Compute? I think it has to be "Use your existing cloud account" rather than the serverless option, doesn't it?
If so, there would be options for cloud credentials and storage, which should be set up as prerequisites.

Author


I don't have much guidance here. I didn't set this up for us because I lacked the necessary permissions on both GCP and AWS.

I don't believe there was a serverless option when subscribing via the cloud provider -- going that route explicitly ties the Databricks instance to the current cloud provider and account.

