@@ -456,8 +456,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location $REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -476,8 +475,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -546,10 +544,9 @@
},
"outputs": [],
"source": [
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectCreator $BUCKET_URI\n",
"\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectViewer $BUCKET_URI"
]
"! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectCreator\n", "\n",
# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.
! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectViewer ]
critical

The automated migration has introduced a syntax error that will corrupt the notebook file. The source field in a notebook cell must be an array of strings. In this change, the comment on line 548 and the shell command on line 549 are not enclosed in quotes, making the JSON invalid. Additionally, the closing bracket ] for the array has been moved to the end of line 549, which is also incorrect.

        "# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.",
        "! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectViewer"
      ]
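A quick way to catch this class of error before committing is to load the notebook and assert that every entry in each cell's source is a string. A minimal sketch (the notebook path below is a placeholder, not a file in this PR):

import json

# Placeholder path; point this at the notebook being edited.
NOTEBOOK_PATH = "notebook.ipynb"

# json.load raises JSONDecodeError if unquoted lines have made the file invalid JSON.
with open(NOTEBOOK_PATH) as f:
    nb = json.load(f)

for i, cell in enumerate(nb.get("cells", [])):
    source = cell.get("source", [])
    if isinstance(source, str):
        continue  # a single string is also a valid source value
    non_strings = [entry for entry in source if not isinstance(entry, str)]
    if non_strings:
        raise TypeError(f"cell {i}: non-string entries in source: {non_strings!r}")
print("all source arrays contain only strings")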

@gurusai-voleti Dec 11, 2025
[Resolved] fixed

},
{
"cell_type": "markdown",
@@ -678,8 +675,7 @@
"outputs": [],
"source": [
"GCS_AVRO_SCHEMA = BUCKET_URI + \"/gaming_schema.avsc\"\n",
"! gsutil cp gaming_schema.avsc $GCS_AVRO_SCHEMA\n",
"\n",
"! gcloud storage cp gaming_schema.avsc $GCS_AVRO_SCHEMA\n", "\n",
"GCS_FLEX_TEMPLATE_PATH = \"gs://dataflow-templates/latest/flex/File_Format_Conversion\"\n",
"GCS_CONVERT_IN = \"gs://dataflow-samples/game/5000_gaming_data.csv\"\n",
"GCS_CONVERT_OUT = BUCKET_URI + \"/parquet_out/\"\n",
@@ -757,8 +753,7 @@
"\n",
"pipeline.run()\n",
"\n",
"! gsutil ls $GCS_CONVERT_OUT\n",
"\n",
"! gcloud storage ls $GCS_CONVERT_OUT\n", "\n",
"! rm -f dataflow_file_conversion.yaml gaming_schema.avsc"
]
},
@@ -813,8 +808,7 @@
"delete_bucket = False\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -309,12 +309,10 @@
},
"outputs": [],
"source": [
"count = ! gsutil cat $DATASET_FILE | wc -l\n",
"print(\"Number of Examples\", int(count[0]))\n",
"count = ! gcloud storage cat $DATASET_FILE | wc -l\n", "print(\"Number of Examples\", int(count[0]))\n",
"\n",
"print(\"First 10 rows\")\n",
"! gsutil cat $DATASET_FILE | head"
]
"! gcloud storage cat $DATASET_FILE | head" ]
},
{
"cell_type": "markdown",
@@ -173,11 +173,9 @@
"if BUCKET_URI is None or BUCKET_URI.strip() == \"\" or BUCKET_URI == \"gs://\":\n",
" BUCKET_URI = f\"gs://{PROJECT_ID}-tmp-{now}-{str(uuid.uuid4())[:4]}\"\n",
" BUCKET_NAME = \"/\".join(BUCKET_URI.split(\"/\")[:3])\n",
" ! gsutil mb -l {REGION} {BUCKET_URI}\n",
"else:\n",
" ! gcloud storage buckets create --location={REGION} {BUCKET_URI}\n", "else:\n",
" assert BUCKET_URI.startswith(\"gs://\"), \"BUCKET_URI must start with `gs://`.\"\n",
" shell_output = ! gsutil ls -Lb {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n",
" bucket_region = shell_output[0].strip().lower()\n",
" shell_output = ! gcloud storage ls --full --buckets {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n", " bucket_region = shell_output[0].strip().lower()\n",
" if bucket_region != REGION:\n",
" raise ValueError(\n",
" \"Bucket region %s is different from notebook region %s\"\n",
@@ -201,8 +199,7 @@
"\n",
"\n",
"# Provision permissions to the SERVICE_ACCOUNT with the GCS bucket\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.admin $BUCKET_NAME\n",
"\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_NAME --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.admin\n", "\n",
"! gcloud config set project $PROJECT_ID\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/storage.admin\"\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/aiplatform.user\"\n",
@@ -300,8 +297,7 @@
"%cd ImageBind/.assets\n",
"! git reset --hard 95d27c7fd5a8362f3527e176c3a80ae5a4d880c0\n",
"\n",
"! gsutil cp -r . $DATA_BUCKET\n",
"\n",
"! gcloud storage cp --recursive . $DATA_BUCKET\n", "\n",
"%cd ../.."
]
},
@@ -571,8 +567,7 @@
"\n",
"delete_bucket = False # @param {type:\"boolean\"}\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
@@ -148,11 +148,10 @@
"if BUCKET_URI is None or BUCKET_URI.strip() == \"\" or BUCKET_URI == \"gs://\":\n",
" BUCKET_URI = f\"gs://{PROJECT_ID}-tmp-{now}-{str(uuid.uuid4())[:4]}\"\n",
" BUCKET_NAME = \"/\".join(BUCKET_URI.split(\"/\")[:3])\n",
" ! gsutil mb -l {REGION} {BUCKET_URI}\n",
"else:\n",
" ! gcloud storage buckets create --location={REGION} {BUCKET_URI}\n", "else:\n",
" assert BUCKET_URI.startswith(\"gs://\"), \"BUCKET_URI must start with `gs://`.\"\n",
" shell_output = ! gsutil ls -Lb {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n",
" bucket_region = shell_output[0].strip().lower()\n",
" # Note: The format of the full listing output is different. gcloud storage uses a title case for keys and will not display a field if its value is \"None\".\n",
" shell_output = ! gcloud storage ls --full --buckets {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n", " bucket_region = shell_output[0].strip().lower()\n",
" if bucket_region != REGION:\n",
" raise ValueError(\n",
" \"Bucket region %s is different from notebook region %s\"\n",
@@ -174,7 +173,8 @@
"\n",
"\n",
"# Provision permissions to the SERVICE_ACCOUNT with the GCS bucket\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.admin $BUCKET_NAME\n",
"# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
critical

This comment line is not a valid JSON string because it is not enclosed in quotes. Each element in the source array of a notebook cell must be a string. This will cause an error when loading the notebook.

Suggested change
# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.
"# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.",

[Resolved] fixed

"! gcloud storage buckets add-iam-policy-binding $BUCKET_NAME --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.admin\n",
"! gcloud config set project $PROJECT_ID\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/storage.admin\"\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/aiplatform.user\"\n",
@@ -583,8 +583,7 @@
"\n",
"delete_bucket = True # @param {type:\"boolean\"}\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_URI\n",
"\n"
" ! gcloud storage rm --recursive $BUCKET_URI\n", "\n"
]
}
],
@@ -287,8 +287,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l {LOCATION} -p {PROJECT_ID} {BUCKET_URI}"
]
"! gcloud storage buckets create --location={LOCATION} --project={PROJECT_ID} {BUCKET_URI}" ]
},
{
"cell_type": "markdown",
@@ -394,12 +393,10 @@
"else:\n",
" FILE = IMPORT_FILE\n",
"\n",
"count = ! gsutil cat $FILE | wc -l\n",
"print(\"Number of Examples\", int(count[0]))\n",
"count = ! gcloud storage cat $FILE | wc -l\n", "print(\"Number of Examples\", int(count[0]))\n",
"\n",
"print(\"First 10 rows\")\n",
"! gsutil cat $FILE | head"
]
"! gcloud storage cat $FILE | head" ]
},
{
"cell_type": "markdown",
@@ -580,8 +577,7 @@
},
"outputs": [],
"source": [
"test_items = !gsutil cat $IMPORT_FILE | head -n2\n",
"if len(str(test_items[0]).split(\",\")) == 3:\n",
"test_items = !gcloud storage cat $IMPORT_FILE | head -n2\n", "if len(str(test_items[0]).split(\",\")) == 3:\n",
" _, test_item_1, test_label_1 = str(test_items[0]).split(\",\")\n",
" _, test_item_2, test_label_2 = str(test_items[1]).split(\",\")\n",
"else:\n",
@@ -614,9 +610,7 @@
"file_1 = test_item_1.split(\"/\")[-1]\n",
"file_2 = test_item_2.split(\"/\")[-1]\n",
"\n",
"! gsutil cp $test_item_1 $BUCKET_URI/$file_1\n",
"! gsutil cp $test_item_2 $BUCKET_URI/$file_2\n",
"\n",
"! gcloud storage cp $test_item_1 $BUCKET_URI/$file_1\n", "! gcloud storage cp $test_item_2 $BUCKET_URI/$file_2\n", "\n",
"test_item_1 = BUCKET_URI + \"/\" + file_1\n",
"test_item_2 = BUCKET_URI + \"/\" + file_2"
]
@@ -659,8 +653,7 @@
" f.write(json.dumps(data) + \"\\n\")\n",
"\n",
"print(gcs_input_uri)\n",
"! gsutil cat $gcs_input_uri"
]
"! gcloud storage cat $gcs_input_uri" ]
},
{
"cell_type": "markdown",
@@ -803,8 +796,7 @@
"# Delete the cloud storage bucket\n",
"delete_bucket = False # set True for deletion\n",
"if delete_bucket:\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -297,8 +297,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $LOCATION $BUCKET_URI"
]
"! gcloud storage buckets create --location=$LOCATION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -386,7 +385,7 @@
"source": [
"#### Copying data between Google Cloud Storage Buckets\n",
"\n",
"In this step, you prevent access issues for the images in your original dataset. The code below extracts folder paths from image paths, constructs destination paths for Cloud Storage, copies images using gsutil commands, updates image paths in the DataFrame, and finally saves the modified DataFrame back to Cloud Storage as a CSV file."
"In this step, you prevent access issues for the images in your original dataset. The code below extracts folder paths from image paths, constructs destination paths for Cloud Storage, copies images using gcloud storage commands, updates image paths in the DataFrame, and finally saves the modified DataFrame back to Cloud Storage as a CSV file."
]
},
{
@@ -415,8 +414,7 @@
"\n",
"# Copy images using gsutil commands directly\n",
"for src, dest in zip(df.iloc[:, 0], df[\"destination_path\"]):\n",
" ! gsutil -m cp {src} {dest}\n",
"\n",
" ! gcloud storage cp {src} {dest}\n", "\n",
"print(f\"Files copied to {BUCKET_URI}\")"
]
},
@@ -496,12 +494,10 @@
"else:\n",
" FILE = IMPORT_FILE\n",
"\n",
"count = ! gsutil cat $FILE | wc -l\n",
"print(\"Number of Examples\", int(count[0]))\n",
"count = ! gcloud storage cat $FILE | wc -l\n", "print(\"Number of Examples\", int(count[0]))\n",
"\n",
"print(\"First 10 rows\")\n",
"! gsutil cat $FILE | head"
]
"! gcloud storage cat $FILE | head" ]
},
{
"cell_type": "markdown",
@@ -706,8 +702,7 @@
},
"outputs": [],
"source": [
"test_items = !gsutil cat $IMPORT_FILE | head -n1\n",
"cols = str(test_items[0]).split(\",\")\n",
"test_items = !gcloud storage cat $IMPORT_FILE | head -n1\n", "cols = str(test_items[0]).split(\",\")\n",
"if len(cols) == 11:\n",
" test_item = str(cols[1])\n",
" test_label = str(cols[2])\n",
@@ -834,8 +829,7 @@
"dag.delete()\n",
"\n",
"if delete_bucket:\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -304,8 +304,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l {LOCATION} -p {PROJECT_ID} {BUCKET_URI}"
]
"! gcloud storage buckets create {BUCKET_URI} --location={LOCATION} --project={PROJECT_ID}" ]
},
{
"cell_type": "markdown",
@@ -778,8 +777,7 @@
"# Delete Cloud Storage objects that were created\n",
"delete_bucket = False # set True for deletion\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -355,8 +355,7 @@
"):\n",
" BUCKET_URI = \"gs://\" + PROJECT_ID + \"-aip-\" + UUID\n",
"\n",
"! gsutil mb -l $LOCATION -p $PROJECT_ID $BUCKET_URI"
]
"! gcloud storage buckets create --location=$LOCATION --project=$PROJECT_ID $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -519,8 +518,7 @@
"outputs": [],
"source": [
"examples.to_json(\"evaluation_dataset.json\", orient=\"records\", lines=True)\n",
"! gsutil cp evaluation_dataset.json $BUCKET_URI/input/evaluation_dataset.json\n",
"DATASET = f\"{BUCKET_URI}/input/evaluation_dataset.json\""
"! gcloud storage cp evaluation_dataset.json $BUCKET_URI/input/evaluation_dataset.json\n", "DATASET = f\"{BUCKET_URI}/input/evaluation_dataset.json\""
]
},
{
@@ -765,8 +763,7 @@
"# Delete Cloud Storage objects that were created\n",
"delete_bucket = False\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -316,8 +316,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $LOCATION -p $PROJECT_ID $BUCKET_URI"
]
"! gcloud storage buckets create --location=$LOCATION --project=$PROJECT_ID $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -480,7 +479,7 @@
"outputs": [],
"source": [
"# Download the sample code\n",
"! gsutil cp gs://cloud-samples-data/ai-platform/hello-custom/hello-custom-sample-v1.tar.gz - | tar -xzv\n",
"! gcloud storage cp gs://cloud-samples-data/ai-platform/hello-custom/hello-custom-sample-v1.tar.gz - | tar -xzv\n",
"%cd hello-custom-sample/"
]
},
@@ -664,8 +663,7 @@
"outputs": [],
"source": [
"GCS_BUCKET_TRAINING = f\"{BUCKET_URI}/data/\"\n",
"! gsutil cp dist/hello-custom-training-3.0.tar.gz {GCS_BUCKET_TRAINING}"
]
"! gcloud storage cp dist/hello-custom-training-3.0.tar.gz {GCS_BUCKET_TRAINING}" ]
},
{
"cell_type": "markdown",
@@ -787,8 +785,7 @@
"# Delete GCS bucket.\n",
"delete_bucket = False\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_URI\n",
"\n",
" ! gcloud storage rm --recursive $BUCKET_URI\n", "\n",
"!rm -rf ../hello-custom-sample/"
]
}