A sample demonstrating write support from Databricks to Spanner. #5

MaxKsyunz wants to merge 4 commits into
Conversation
…icks to Spanner. Signed-off-by: Max Ksyunz <max.ksyunz@improving.com>
To ensure most recent file is uploaded. Signed-off-by: Max Ksyunz <max.ksyunz@improving.com>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: Max Ksyunz <max.ksyunz@improving.com>
> ## Step 1: Create a Databricks on Google Cloud Workspace
>
> 1. Navigate to the [Google Cloud Marketplace listing for Databricks](https://console.cloud.google.com/marketplace/product/databricks/databricks).
This link resulted in a "Failed to load" message. Should it instead be
https://console.cloud.google.com/marketplace/product/databricks-prod/databricks ?
> 2. Click **Subscribe** and follow the prompts to enable the Databricks API.
> 3. Once subscribed, click **MANAGE ON PROVIDER**. You will be redirected to the Databricks account console to configure your workspace.
> 4. Follow the workspace creation wizard. When prompted, ensure you select a region and enable **Unity Catalog**. This will create a new Unity Catalog metastore for your workspace.
Do we need extra guidance here, such as which option to choose for Storage & Compute? I think it has to be "Use your existing cloud account" rather than the serverless option, doesn't it?
If so, there would be options for cloud credentials and storage, which should be set up as prerequisites.
I don't have much guidance here. I didn't set it up for us because I lacked the necessary permissions on both GCP and AWS.
I don't believe there was a serverless option when subscribing via the cloud provider; going that route explicitly ties the Databricks instance to the current cloud provider and account.
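Since the sample's write path isn't quoted in this thread, here is a minimal sketch of one concern any Databricks-to-Spanner write has to respect: Spanner's per-commit mutation limit. The limit value (80,000 at the time of writing) and the table/column names in the comment are assumptions to verify against current Spanner quotas; the batching helper itself uses only the standard library so it stays independent of the Spanner client.

```python
from typing import List, Sequence


def chunk_rows(rows: Sequence[Sequence], columns: Sequence[str],
               max_mutations: int = 80_000) -> List[List[Sequence]]:
    """Split rows into batches that stay under Spanner's per-commit
    mutation limit. Each written cell counts toward the limit, so a row
    with N columns contributes roughly N mutations per commit.

    80_000 is the documented per-commit cap at the time of writing;
    check current Spanner quotas before relying on it.
    """
    per_row = max(1, len(columns))          # mutations contributed by one row
    rows_per_batch = max(1, max_mutations // per_row)
    return [list(rows[i:i + rows_per_batch])
            for i in range(0, len(rows), rows_per_batch)]


# Hypothetical usage with the google-cloud-spanner client (not imported
# here so the sketch stays self-contained): one batch context per chunk.
#
#   for chunk in chunk_rows(rows, columns):
#       with database.batch() as batch:
#           batch.insert(table="Singers", columns=columns, values=chunk)
```

A Databricks notebook would typically apply this per partition (e.g. inside `foreachPartition`) so each executor commits its own bounded batches.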
No description provided.