
Writing Tables to Non-Unity Catalog Databricks using the Databricks ODBC driver #932

@mrworthington

Description

Hi,

I have a simple test script that attempts to write a local R data frame to a table in Databricks. Since it was easy to retrieve tables from Databricks using the new odbc::databricks() connector, I figured it shouldn't be that hard to push a table back up. However, I keep striking out, and I can't find any documentation, issues, or blog posts online that speak to whether you can write tables to Databricks over an odbc::databricks() connection.

Any clarity on this would be appreciated!

library(DBI)

# Create Connection
sc <- DBI::dbConnect(
  odbc::databricks(),
  workspace = Sys.getenv("DATABRICKS_HOST"),
  httpPath = Sys.getenv("DATABRICKS_HTTP_PATH")
)
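
# Sanity check (placeholder table name): reads over this
# connection work without issue
existing <- dbGetQuery(sc, "SELECT * FROM schema_name.table_name LIMIT 10")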

# Create a local dataframe called 'my_df'
my_df <- data.frame(
  id = 1:100,
  value = rnorm(100)
)

# Write to Databricks
dbWriteTable(sc, Id(catalog = "catalog_name", schema = "schema_name", table = "table_name"), my_df, overwrite = TRUE)

However, I keep getting this error:

> dbWriteTable(sc, Id(catalog = "catalog_name", schema = "schema_name", table = "table_name"), my_df, overwrite = TRUE)
Error in `dbWriteTable()`:
! ODBC failed with error 56038 from [Simba][Hardy][UC_NOT_ENABLED][UC_NOT_ENABLED].
✖  (80) Syntax or semantic analysis error thrown in server while executing query. Error message from server:
  org.apache.hive.service.cli.HiveSQLException: Error running query: org.apache.spark.sql.AnalysisException: Unity Catalog is not
  enabled on this cluster. SQLSTATE: 56038
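
One thing I'm wondering: the error says Unity Catalog isn't enabled on the cluster, and as far as I can tell the three-part catalog.schema.table identifier only resolves through Unity Catalog, so maybe a two-part identifier against the legacy Hive metastore would work instead? An untested sketch (names are placeholders):

# Sketch: drop the catalog component on a non-Unity-Catalog cluster,
# writing to schema_name.table_name in the legacy metastore
dbWriteTable(sc, Id(schema = "schema_name", table = "table_name"), my_df, overwrite = TRUE)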
