Error when writing a table to databricks with dbWriteTable() #904

@emilianomm

Description

Hi

While trying to automate the upload of Excel files to Databricks, I encountered an error with dbWriteTable(). I'm using code very much like this:

library(readxl)
library(DBI)
library(odbc)
library(dplyr)
library(janitor)

# Read the Excel files
file1 <- read_excel("path/to/your/excel1.xlsx")
file2 <- read_excel("path/to/your/excel2.xlsx")

# Combine the files and clean the column names
df_final <- bind_rows(
  file1,
  file2
) %>%
  clean_names()

# Define the connection details
access_token <- "your_access_token_here"

# Connect to Databricks via ODBC
con <- dbConnect(
  odbc::databricks(),
  workspace = "your_workspace_url",
  httpPath = "your_http_path",
  uid = "token",
  pwd = access_token
)

# Error part
dbWriteTable(
  con,
  name = Id(schema = "your_schema", table = "your_table", catalog = "your_catalog"),
  value = df_final,
  append = TRUE
)

But the last R statement returns the following error:

[UNBOUND_SQL_PARAMETER] Found the unbound parameter: _1980. Please, fix `args` and provide a mapping of the parameter to either a SQL literal or collection constructor functions such as `map()`, `array()`, `struct()`. SQLSTATE: 42P02; line 2 pos 8

Reviewing the monitoring tab of the SQL warehouse (within Databricks), I can see the same error in the job history.
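
In case it helps, here is a stripped-down sketch of the same call with a small toy data frame instead of the Excel-derived one (the connection details are placeholders, as above; I have not confirmed whether this minimal version triggers the same error):

library(DBI)
library(odbc)

# Placeholder connection details, same as above
con <- dbConnect(
  odbc::databricks(),
  workspace = "your_workspace_url",
  httpPath = "your_http_path",
  uid = "token",
  pwd = "your_access_token_here"
)

# Small toy data frame standing in for the Excel-derived data
toy_df <- data.frame(id = 1:3, value = c("a", "b", "c"))

# Same dbWriteTable() call, just with the toy data frame
dbWriteTable(
  con,
  name = Id(schema = "your_schema", table = "your_table", catalog = "your_catalog"),
  value = toy_df,
  append = TRUE
)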

What may be causing this error?

Thanks
