When I run the script, the following command fails with this error:
+ jobId=demo-dlp-deid-pipeline-20210908-064259
+ gcloud dataflow jobs run demo-dlp-deid-pipeline-20210908-064259 --gcs-location gs://dataflow-templates/latest/Stream_DLP_GCS_Text_to_BigQuery --parameters --region=us-central1,inputFilePattern=gs://<project_id>-demo2-demo-data/CCRecords_1564602825.csv,dlpProjectId=<project_id>,deidentifyTemplateName=projects/<project_id>/deidentifyTemplates/dlp-demo-deid-latest-1631083353071,inspectTemplateName=projects/<project_id>/inspectTemplates/dlp-demo-inspect-latest-1631083353071,datasetName=demo_dataset,batchSize=500
ERROR: (gcloud.dataflow.jobs.run) argument --parameters: expected one argument
Usage: gcloud dataflow jobs run JOB_NAME --gcs-location=GCS_LOCATION [optional flags]
optional flags may be --additional-experiments | --dataflow-kms-key |
--disable-public-ips | --enable-streaming-engine |
--help | --max-workers | --network | --num-workers |
--parameters | --region | --service-account-email |
--staging-location | --subnetwork |
--worker-machine-type | --worker-region |
--worker-zone | --zone
The error occurs because --parameters is immediately followed by --region=us-central1, so gcloud sees another flag instead of a value for --parameters and reports "expected one argument". Moving --region out to a top-level flag and passing the key=value pairs as a single quoted string fixes it. Corrected command that I have tried:
gcloud dataflow jobs run ${jobId} --gcs-location gs://dataflow-templates/latest/Stream_DLP_GCS_Text_to_BigQuery --region=us-central1 --parameters "inputFilePattern=gs://${DATA_STORAGE_BUCKET}/CCRecords_1564602825.csv,dlpProjectId=${PROJECT_ID},deidentifyTemplateName=${DEID_TEMPLATE_NAME},inspectTemplateName=${INSPECT_TEMPLATE_NAME},datasetName=${BQ_DATASET_NAME},batchSize=500"
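For anyone patching the script itself, here is a minimal sketch of how the fixed invocation could look in context. The variable values below are placeholders (my assumptions), not what the setup script actually exports; only the flag layout matters: --region is its own flag, and the pipeline options go to --parameters as one quoted argument.

# Placeholders (assumptions) standing in for values exported earlier in the script.
PROJECT_ID="my-project"
DATA_STORAGE_BUCKET="${PROJECT_ID}-demo2-demo-data"
DEID_TEMPLATE_NAME="projects/${PROJECT_ID}/deidentifyTemplates/dlp-demo-deid-latest-1631083353071"
INSPECT_TEMPLATE_NAME="projects/${PROJECT_ID}/inspectTemplates/dlp-demo-inspect-latest-1631083353071"
BQ_DATASET_NAME="demo_dataset"
jobId="demo-dlp-deid-pipeline-$(date +%Y%m%d-%H%M%S)"

# --region is a top-level flag; the key=value pairs are a single quoted --parameters value.
gcloud dataflow jobs run "${jobId}" \
  --gcs-location gs://dataflow-templates/latest/Stream_DLP_GCS_Text_to_BigQuery \
  --region us-central1 \
  --parameters "inputFilePattern=gs://${DATA_STORAGE_BUCKET}/CCRecords_1564602825.csv,dlpProjectId=${PROJECT_ID},deidentifyTemplateName=${DEID_TEMPLATE_NAME},inspectTemplateName=${INSPECT_TEMPLATE_NAME},datasetName=${BQ_DATASET_NAME},batchSize=500"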