---
page_title: "airbyte_destination_databricks Resource - terraform-provider-airbyte"
subcategory: ""
description: |-
  DestinationDatabricks Resource
---

# airbyte_destination_databricks (Resource)

DestinationDatabricks Resource

## Example Usage

```terraform
resource "airbyte_destination_databricks" "my_destination_databricks" {
  configuration = {
    accept_terms = false
    authentication = {
      personal_access_token = {
        personal_access_token = "...my_personal_access_token..."
      }
    }
    database            = "...my_database..."
    hostname            = "abc-12345678-wxyz.cloud.databricks.com"
    http_path           = "sql/1.0/warehouses/0000-1111111-abcd90"
    port                = "443"
    purge_staging_data  = false
    raw_schema_override = "...my_raw_schema_override..."
    schema              = "default"
  }
  definition_id = "fb6a88f5-a304-46f5-ab8b-4280a6d91f99"
  name          = "...my_name..."
  workspace_id  = "2615758c-c904-459e-9fd6-c8a55cba9327"
}
```

## Schema

### Required

- `configuration` (Attributes) (see below for nested schema)
- `name` (String) Name of the destination, e.g. `dev-mysql-instance`.
- `workspace_id` (String)

### Optional

- `definition_id` (String) The UUID of the connector definition. One of `configuration.destinationType` or `definitionId` must be provided. Requires replacement if changed.

### Read-Only

- `created_at` (Number)
- `destination_id` (String)
- `destination_type` (String)
- `resource_allocation` (Attributes) Actor- or actor-definition-specific resource requirements. If `default` is set, these are the requirements applied to all jobs run for this actor definition; they are overridden by the job-type-specific configurations. If not set, the platform uses its defaults. These values are in turn overridden by configuration at the connection level. (see below for nested schema)

### Nested Schema for `configuration`

Required:

- `authentication` (Attributes) Authentication mechanism for staging files and running queries (see below for nested schema)
- `database` (String) The name of the Unity Catalog for the database.
- `hostname` (String) Databricks Cluster Server Hostname.
- `http_path` (String) Databricks Cluster HTTP Path.

Optional:

- `accept_terms` (Boolean) You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: `false`
- `port` (String) Databricks Cluster Port. Default: `"443"`
- `purge_staging_data` (Boolean) Defaults to `true`. Set it to `false` for debugging purposes. Default: `true`
- `raw_schema_override` (String) The schema to write raw tables into. Default: `"airbyte_internal"`
- `schema` (String) The default schema tables are written to. If not otherwise specified, `"default"` is used. Default: `"default"`

### Nested Schema for `configuration.authentication`

Optional:

- `o_auth2_recommended` (Attributes) (see below for nested schema)
- `personal_access_token` (Attributes) (see below for nested schema)

### Nested Schema for `configuration.authentication.o_auth2_recommended`

Required:

- `client_id` (String)
- `secret` (String, Sensitive)
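As a sketch, the resource in the example above can be adapted to use this `o_auth2_recommended` mechanism instead of a personal access token; the credential values below are placeholders, not working credentials:

```terraform
resource "airbyte_destination_databricks" "my_destination_databricks_oauth" {
  configuration = {
    authentication = {
      # OAuth2 (recommended): client ID and secret instead of a personal access token.
      o_auth2_recommended = {
        client_id = "...my_client_id..." # placeholder
        secret    = "...my_secret..."    # placeholder
      }
    }
    database  = "...my_database..."
    hostname  = "abc-12345678-wxyz.cloud.databricks.com"
    http_path = "sql/1.0/warehouses/0000-1111111-abcd90"
  }
  name         = "...my_name..."
  workspace_id = "2615758c-c904-459e-9fd6-c8a55cba9327"
}
```

Since `secret` is marked Sensitive, consider supplying it via a Terraform variable rather than hard-coding it in configuration.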

### Nested Schema for `configuration.authentication.personal_access_token`

Required:

- `personal_access_token` (String, Sensitive)

### Nested Schema for `resource_allocation`

Read-Only:

- `default` (Attributes) (see below for nested schema)
- `job_specific` (Attributes) (see below for nested schema)

### Nested Schema for `resource_allocation.default`

Read-Only:

- `cpu_limit` (String)
- `cpu_request` (String)
- `ephemeral_storage_limit` (String)
- `ephemeral_storage_request` (String)
- `memory_limit` (String)
- `memory_request` (String)

### Nested Schema for `resource_allocation.job_specific`

Read-Only:

- `job_type` (String) Enum describing the different types of jobs the platform runs. Must be one of `["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"]`.
- `resource_requirements` (Attributes) Optional resource requirements to run workers (blank for unbounded allocations) (see below for nested schema)

### Nested Schema for `resource_allocation.job_specific.resource_requirements`

Read-Only:

- `cpu_limit` (String)
- `cpu_request` (String)
- `ephemeral_storage_limit` (String)
- `ephemeral_storage_request` (String)
- `memory_limit` (String)
- `memory_request` (String)

## Import

Import is supported using the following syntax:

```shell
terraform import airbyte_destination_databricks.my_airbyte_destination_databricks ""
```
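For example, assuming the import ID is the resource's `destination_id` (the UUID below is hypothetical; substitute the real ID from your Airbyte workspace):

```shell
# Hypothetical destination_id; replace with the actual value.
terraform import airbyte_destination_databricks.my_airbyte_destination_databricks "ab1c2d3e-4567-89f0-ab12-cd34ef56ab78"
```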