Describe the bug
When using the semantic layer resources (dbtcloud_semantic_layer_configuration, dbtcloud_semantic_layer_credential_service_token_mapping), the first run creates the resources as expected. On the second run, however, Terraform complains that the resources already exist and throws errors. This is unexpected: the second run should be a no-op and should not attempt to create the resources again, so no conflict should occur.
Terraform also reports after the first apply that the created dbtcloud_semantic_layer_configuration resource could not be found and removes it from the state, even though the resource was created correctly in dbt Cloud. Presumably this state removal is what makes the next apply attempt a re-create, which then conflicts with the resource that already exists in dbt Cloud.
Error message
Error message for dbtcloud_semantic_layer_configuration:
15:21:13.601 STDOUT terraform: ╷
15:21:13.601 STDOUT terraform: │ Warning: Resource not found
15:21:13.601 STDOUT terraform: │
15:21:13.601 STDOUT terraform: │ with dbtcloud_semantic_layer_configuration.semantic_layer["analytical_layer"],
15:21:13.602 STDOUT terraform: │ on dbt_cloud_semantic_layer.tf line 34, in resource "dbtcloud_semantic_layer_configuration" "semantic_layer":
15:21:13.602 STDOUT terraform: │ 34: resource "dbtcloud_semantic_layer_configuration" "semantic_layer" {
15:21:13.602 STDOUT terraform: │
15:21:13.602 STDOUT terraform: │ The Semantic Layer configuration was not found and has been removed from
15:21:13.602 STDOUT terraform: │ the state.
15:21:13.602 STDOUT terraform: │
15:21:13.602 STDOUT terraform: │ (and one more similar warning elsewhere)
15:21:13.602 STDOUT terraform: ╵
15:21:13.602 STDERR terraform: ╷
15:21:13.602 STDERR terraform: │ Error: Unable to create Semantic Layer configuration
15:21:13.602 STDERR terraform: │
15:21:13.602 STDERR terraform: │ with dbtcloud_bigquery_semantic_layer_credential.semantic_layer["analytical_layer-semantic layer analytical layer"],
15:21:13.602 STDERR terraform: │ on dbt_cloud_semantic_layer.tf line 48, in resource "dbtcloud_bigquery_semantic_layer_credential" "semantic_layer":
15:21:13.602 STDERR terraform: │ 48: resource "dbtcloud_bigquery_semantic_layer_credential" "semantic_layer" {
15:21:13.602 STDERR terraform: │
15:21:13.602 STDERR terraform: │ Error: resource-not-found:
15:21:13.602 STDERR terraform: │ {"status":{"code":400,"is_success":false,"user_message":"Resource could not
15:21:13.602 STDERR terraform: │ be created or updated because it conflicts with an existing
15:21:13.602 STDERR terraform: │ resource.","developer_message":"duplicate key value violates unique
15:21:13.602 STDERR terraform: │ constraint \"sl_creds_project_name_unique\"\nDETAIL: Key (project_id,
15:21:13.603 STDERR terraform: │ name)=(<redacted>, semantic layer analytical layer) already
15:21:13.603 STDERR terraform: │ exists.\n"},"data":{},"extra":{},"error_code":null}
15:21:13.603 STDERR terraform: ╵
Error for dbtcloud_semantic_layer_credential_service_token_mapping:
05:58:00.694 STDERR terraform: ╷
05:58:00.694 STDERR terraform: │ Error: Error Creating Semantic Layer Credential Service Token Mapping
05:58:00.694 STDERR terraform: │
05:58:00.694 STDERR terraform: │ with dbtcloud_semantic_layer_credential_service_token_mapping.semantic_layer["analytical_layer-semantic layer analytical layer-<redacted>"],
05:58:00.694 STDERR terraform: │ on dbt_cloud_semantic_layer.tf line 71, in resource "dbtcloud_semantic_layer_credential_service_token_mapping" "semantic_layer":
05:58:00.694 STDERR terraform: │ 71: resource "dbtcloud_semantic_layer_credential_service_token_mapping" "semantic_layer" {
05:58:00.694 STDERR terraform: │
05:58:00.694 STDERR terraform: │ resource-not-found:
05:58:00.695 STDERR terraform: │ {"status":{"code":400,"is_success":false,"user_message":"Resource could not
05:58:00.695 STDERR terraform: │ be created or updated because it conflicts with an existing
05:58:00.695 STDERR terraform: │ resource.","developer_message":"duplicate key value violates unique
05:58:00.695 STDERR terraform: │ constraint \"sl_token_creds_service_token_project_unique\"\nDETAIL: Key
05:58:00.695 STDERR terraform: │ (service_token_id, project_id)=(<redacted>, <redacted>) already
05:58:00.695 STDERR terraform: │ exists.\n"},"data":{},"extra":{},"error_code":null}
05:58:00.695 STDERR terraform: ╵
Resource configuration
locals {
semantic_layer_projects = {
for k, v in local.projects_config : k => v if try(v.semantic_layer, null) != null
}
semantic_layer_credentials = flatten([
for k, v in local.semantic_layer_projects : [
for credential in v.semantic_layer.credentials : [
merge(
credential,
{ "project_key" : k }
)
]
]
])
semantic_layer_credentials_unique_keys = {
for credential in local.semantic_layer_credentials :
"${credential.project_key}-${credential.name}" => credential
}
semantic_layer_service_tokens = flatten([
for credential_key, credential in local.semantic_layer_credentials_unique_keys : [
for token in credential.service_tokens : {
project_name = credential.project_key,
credential_key = credential_key,
service_token_id = token
}
]
])
semantic_layer_service_tokens_unique_keys = {
for token in local.semantic_layer_service_tokens:
"${token.credential_key}-${token.service_token_id}" => token
}
}
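For reference, here is a hypothetical projects_config entry (attribute names and values are my illustration, chosen to match the addresses in the error messages above) and the for_each keys the locals derive from it:

```hcl
locals {
  # Hypothetical input shape assumed by the locals above.
  projects_config = {
    analytical_layer = {
      semantic_layer = {
        environment_id = 123
        credentials = [
          {
            name            = "semantic layer analytical layer"
            service_account = "analytics"
            is_active       = true
            num_threads     = 4
            dataset         = "analytics"
            service_tokens  = [456]
          }
        ]
      }
    }
  }
}

# With this input, the derived for_each keys are:
#   semantic_layer_projects                   -> "analytical_layer"
#   semantic_layer_credentials_unique_keys    -> "analytical_layer-semantic layer analytical layer"
#   semantic_layer_service_tokens_unique_keys -> "analytical_layer-semantic layer analytical layer-456"
```

These keys match the resource addresses shown in the error output, so the key construction itself looks consistent.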
resource "dbtcloud_semantic_layer_configuration" "semantic_layer" {
for_each = local.semantic_layer_projects
project_id = dbtcloud_project.projects[each.key].id
environment_id = each.value.semantic_layer.environment_id
}
data "google_kms_secret" "semantic_layer_service_account_keys" {
for_each = local.semantic_layer_credentials_unique_keys
crypto_key = data.google_kms_crypto_key.kms.id
ciphertext = filebase64(
"${var.root_dir}/../../secrets/${var.env}/service_account_semantic_layer_${each.value.service_account}.enc"
)
}
resource "dbtcloud_bigquery_semantic_layer_credential" "semantic_layer" {
for_each = local.semantic_layer_credentials_unique_keys
configuration = {
project_id = dbtcloud_project.projects[each.value.project_key].id
name = each.value.name
adapter_version = "bigquery_v0" # static for now
}
credential = {
project_id = dbtcloud_project.projects[each.value.project_key].id
is_active = each.value.is_active
num_threads = each.value.num_threads
dataset = each.value.dataset
}
private_key_id = jsondecode(data.google_kms_secret.semantic_layer_service_account_keys[each.key].plaintext).private_key_id
private_key = jsondecode(data.google_kms_secret.semantic_layer_service_account_keys[each.key].plaintext).private_key
client_email = jsondecode(data.google_kms_secret.semantic_layer_service_account_keys[each.key].plaintext).client_email
client_id = jsondecode(data.google_kms_secret.semantic_layer_service_account_keys[each.key].plaintext).client_id
auth_uri = jsondecode(data.google_kms_secret.semantic_layer_service_account_keys[each.key].plaintext).auth_uri
token_uri = jsondecode(data.google_kms_secret.semantic_layer_service_account_keys[each.key].plaintext).token_uri
auth_provider_x509_cert_url = jsondecode(data.google_kms_secret.semantic_layer_service_account_keys[each.key].plaintext).auth_provider_x509_cert_url
client_x509_cert_url = jsondecode(data.google_kms_secret.semantic_layer_service_account_keys[each.key].plaintext).client_x509_cert_url
}
resource "dbtcloud_semantic_layer_credential_service_token_mapping" "semantic_layer" {
for_each = local.semantic_layer_service_tokens_unique_keys
semantic_layer_credential_id = dbtcloud_bigquery_semantic_layer_credential.semantic_layer[each.value.credential_key].id
service_token_id = each.value.service_token_id
project_id = dbtcloud_project.projects[each.value.project_name].id
}
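As a side note, unrelated to the bug: the repeated jsondecode calls in the credential resource could be collapsed by decoding each secret once in a local. A sketch (the local name is my own):

```hcl
locals {
  # Decode each service account key JSON once instead of once per attribute.
  semantic_layer_sa_keys = {
    for k, v in data.google_kms_secret.semantic_layer_service_account_keys :
    k => jsondecode(v.plaintext)
  }
}

# Then inside the credential resource, e.g.:
#   private_key_id = local.semantic_layer_sa_keys[each.key].private_key_id
#   private_key    = local.semantic_layer_sa_keys[each.key].private_key
```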
Expected behavior
The first terraform apply creates the resources; the second apply reports no changes and leaves the resources untouched.
Config (please complete the following information):
(the version can be retrieved by running terraform providers)
- dbt Cloud provider version 1.2.1
Additional comments
One thing I noticed is that this only happens when I execute the code in our CI/CD pipeline (it uses the same Terraform state). If I run the code multiple times on my local machine, there is no issue, but as soon as the CI/CD pipeline runs, the problem starts. This pipeline has been managing our dbt Cloud Terraform state for a while now, and nothing like this has happened before, so I assume it is not a general problem with our setup: all other Terraform resources work fine.
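Until this is fixed, one possible workaround is to adopt the already-created resources into state instead of letting Terraform re-create them, e.g. with Terraform 1.5+ import blocks. A sketch (the import ID formats below are placeholders, not verified against the provider docs):

```hcl
# Hypothetical workaround: import the resources that already exist in dbt Cloud
# so the next apply sees them in state. The exact import ID format is
# provider-specific; the values below are placeholders.
import {
  to = dbtcloud_semantic_layer_configuration.semantic_layer["analytical_layer"]
  id = "<configuration_id>" # placeholder, check the provider docs
}

import {
  to = dbtcloud_bigquery_semantic_layer_credential.semantic_layer["analytical_layer-semantic layer analytical layer"]
  id = "<credential_id>" # placeholder, check the provider docs
}
```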