
use persistent GCP cloud storage bucket for temporary SQL imports #1263

@stephencompall-DA

Description

Replace the GCP buckets defined on the fly with a single persistent bucket that can have narrower IAM grants applied to it, avoiding the gsutil mb/gsutil rb commands during SQL import.

We currently create the bucket here:

# create temporary bucket
echo "Creating temporary bucket $TMP_BUCKET"
gsutil mb --pap enforced -p "$PRIVATE_NETWORK_PROJECT" \
-l "$COMPUTE_REGION" "gs://$TMP_BUCKET"
# grant DB service account access to the bucket
echo "Granting CloudSQL DB access to $TMP_BUCKET"
gsutil iam ch "serviceAccount:$SERVICE_ACCOUNT_EMAIL:roles/storage.objectAdmin" \
"gs://$TMP_BUCKET"

and delete it here:

echo 'Cleaning up temporary GCS object and bucket'
gsutil rm "$GCS_URI" || true
gsutil rb "gs://$TMP_BUCKET" || true

Instead, we can:

  • define a gcp.storage.Bucket in the Pulumi code (bigQuery.ts)
  • pass its name in as an argument to the script:

const scriptArgs = pulumi.interpolate`\
--private-network-project="${privateNetwork.project}" \
--compute-region="${cloudsdkComputeRegion()}" \
--tmp-bucket="${sqlImportBucket.name}"`; // sqlImportBucket: hypothetical name for the persistent bucket resource

  • and in cleanup, revoke the IAM grant from $SERVICE_ACCOUNT_EMAIL instead of deleting the bucket.
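A minimal Pulumi sketch of what the persistent bucket and its narrowed IAM grant could look like. All resource names, the location, and the service-account email below are illustrative assumptions, not values from the repo:

```typescript
import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";

// Illustrative placeholder: in the real code this would come from the
// Cloud SQL instance resource (its serviceAccountEmailAddress output).
const serviceAccountEmail = pulumi.output(
    "db-instance@gcp-sa-cloud-sql.iam.gserviceaccount.com",
);

// Persistent bucket for temporary SQL import dumps; replaces the
// per-run `gsutil mb` / `gsutil rb`.
const sqlImportBucket = new gcp.storage.Bucket("sql-import-tmp", {
    location: "us-east1",               // should match $COMPUTE_REGION
    publicAccessPrevention: "enforced", // same effect as `gsutil mb --pap enforced`
    uniformBucketLevelAccess: true,
    lifecycleRules: [{
        // Safety net: auto-delete any dump a failed run left behind.
        action: { type: "Delete" },
        condition: { age: 1 }, // days
    }],
});

// Narrow grant: only the CloudSQL service account, only this bucket.
// The script's cleanup step can revoke this member instead of
// deleting the bucket.
new gcp.storage.BucketIAMMember("sql-import-object-admin", {
    bucket: sqlImportBucket.name,
    role: "roles/storage.objectAdmin",
    member: pulumi.interpolate`serviceAccount:${serviceAccountEmail}`,
});
```

With uniform bucket-level access enabled, the IAM grant is the only access path, so revoking the BucketIAMMember (or the equivalent `gsutil iam ch -d`) fully closes off the bucket after each import.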
