airbyte_destination_snowflake Resource - terraform-provider-airbyte

DestinationSnowflake Resource
resource "airbyte_destination_snowflake" "my_destination_snowflake" {
configuration = {
credentials = {
o_auth20 = {
access_token = "...my_access_token..."
client_id = "...my_client_id..."
client_secret = "...my_client_secret..."
refresh_token = "...my_refresh_token..."
}
}
database = "AIRBYTE_DATABASE"
disable_type_dedupe = true
host = "accountname.us-east-2.aws.snowflakecomputing.com"
jdbc_url_params = "...my_jdbc_url_params..."
raw_data_schema = "...my_raw_data_schema..."
retention_period_days = 9
role = "AIRBYTE_ROLE"
schema = "AIRBYTE_SCHEMA"
use_merge_for_upsert = false
username = "AIRBYTE_USER"
warehouse = "AIRBYTE_WAREHOUSE"
}
definition_id = "fce231ce-04a4-46ec-a244-d1436db0281f"
name = "...my_name..."
workspace_id = "058d9730-38a6-485c-8631-dc0cc86125f9"
}
Schema

Required:

configuration (Attributes) (see below for nested schema)
name (String) Name of the destination e.g. dev-mysql-instance.
workspace_id (String)

Optional:

definition_id (String) The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.

Read-Only:

created_at (Number)
destination_id (String)
destination_type (String)
resource_allocation (Attributes) Actor or actor definition specific resource requirements. If default is set, these are the requirements that should be set for ALL jobs run for this actor definition. They are overridden by the job type specific configurations. If not set, the platform will use defaults. These values will be overridden by configuration at the connection level. (see below for nested schema)
Nested Schema for configuration

Required:

database (String) Enter the name of the database you want to sync data into
host (String) Enter your Snowflake account's locator (in the format <account_locator>.<region>.<cloud>.snowflakecomputing.com)
role (String) Enter the role that you want to use to access Snowflake
schema (String) Enter the name of the default schema
username (String) Enter the name of the user you want to use to access the database
warehouse (String) Enter the name of the warehouse that you want to use as a compute cluster
Optional:

credentials (Attributes) (see below for nested schema)
disable_type_dedupe (Boolean) Disable Writing Final Tables. WARNING! The data format in _airbyte_data is likely stable but there are no guarantees that other metadata columns will remain the same in future versions. Default: false
jdbc_url_params (String) Enter the additional properties to pass to the JDBC URL string when connecting to the database (formatted as key=value pairs separated by the symbol &). Example: key1=value1&key2=value2&key3=value3
raw_data_schema (String) The schema to write raw tables into (default: airbyte_internal)
retention_period_days (Number) The number of days of Snowflake Time Travel to enable on the tables. See Snowflake's documentation for more information. Setting a nonzero value will incur increased storage costs in your Snowflake instance. Default: 1
use_merge_for_upsert (Boolean) Use MERGE for de-duplication of final tables. This option has no effect if final tables are disabled or the sync mode is not DEDUPE. Default: false
Nested Schema for configuration.credentials

Optional:

key_pair_authentication (Attributes) (see below for nested schema)
o_auth20 (Attributes) (see below for nested schema)
username_and_password (Attributes) (see below for nested schema)
Nested Schema for configuration.credentials.key_pair_authentication

Required:

private_key (String, Sensitive) RSA Private key to use for Snowflake connection. See the docs for more information on how to obtain this key.

Optional:

private_key_password (String, Sensitive) Passphrase for the private key
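As an illustration, a configuration using key pair authentication could look like the following sketch (var.snowflake_private_key and var.snowflake_private_key_passphrase are assumed input variables, not part of the provider schema):

configuration = {
  credentials = {
    key_pair_authentication = {
      # Assumed variables holding the RSA private key and its passphrase
      private_key          = var.snowflake_private_key
      private_key_password = var.snowflake_private_key_passphrase
    }
  }
  # ...other configuration attributes as in the example above...
}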
Nested Schema for configuration.credentials.o_auth20

Required:

access_token (String, Sensitive) Enter your application's Access Token
refresh_token (String, Sensitive) Enter your application's Refresh Token

Optional:

client_id (String, Sensitive) Enter your application's Client ID
client_secret (String, Sensitive) Enter your application's Client secret
Nested Schema for configuration.credentials.username_and_password

Required:

password (String, Sensitive) Enter the password associated with the username.
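Similarly, a configuration using username and password authentication might look like this sketch (var.snowflake_password is an assumed input variable; the remaining attributes follow the example above):

configuration = {
  credentials = {
    username_and_password = {
      # Assumed variable holding the password for the username below
      password = var.snowflake_password
    }
  }
  username = "AIRBYTE_USER"
  # ...other configuration attributes as in the example above...
}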
Nested Schema for resource_allocation

Read-Only:

default (Attributes) Optional resource requirements to run workers (blank for unbounded allocations) (see below for nested schema)
job_specific (Attributes List) (see below for nested schema)
Nested Schema for resource_allocation.default

Read-Only:

cpu_limit (String)
cpu_request (String)
ephemeral_storage_limit (String)
ephemeral_storage_request (String)
memory_limit (String)
memory_request (String)
Nested Schema for resource_allocation.job_specific

Read-Only:

job_type (String) Enum that describes the different types of jobs that the platform runs. Must be one of ["get_spec", "check_connection", "discover_schema", "sync", "reset_connection", "connection_updater", "replicate"]
resource_requirements (Attributes) Optional resource requirements to run workers (blank for unbounded allocations) (see below for nested schema)
Nested Schema for resource_allocation.job_specific.resource_requirements

Read-Only:

cpu_limit (String)
cpu_request (String)
ephemeral_storage_limit (String)
ephemeral_storage_request (String)
memory_limit (String)
memory_request (String)
Import is supported using the following syntax:
terraform import airbyte_destination_snowflake.my_airbyte_destination_snowflake ""