@@ -445,8 +445,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -465,8 +464,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -1427,8 +1425,7 @@
"delete_bucket = False\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -450,8 +450,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -470,8 +469,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -496,8 +494,7 @@
"source": [
"# set GCS bucket object TTL to 7 days\n",
"! echo '{\"rule\":[{\"action\": {\"type\": \"Delete\"},\"condition\": {\"age\": 7}}]}' > gcs_lifecycle.tmp\n",
"! gsutil lifecycle set gcs_lifecycle.tmp {BUCKET_URI}\n",
"! rm gcs_lifecycle.tmp"
"! gcloud storage buckets update --lifecycle-file=gcs_lifecycle.tmp {BUCKET_URI}\n", "! rm gcs_lifecycle.tmp"
]
},
{
@@ -1047,8 +1044,7 @@
"\n",
"print(\"trained model without custom TF ops:\", model_artifact_no_custom_op)\n",
"\n",
"! gsutil ls {model_artifact_no_custom_op}\n",
"\n",
"! gcloud storage ls {model_artifact_no_custom_op}\n", "\n",
"model_evaluation = get_evaluation_metrics(pipeline_task_details)\n",
"print(\"Model evaluation artifacts:\", model_evaluation)"
]
@@ -1337,8 +1333,7 @@
"\n",
"delete_bucket = False\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -rf $BUCKET_URI"
]
" ! gcloud storage rm --recursive --continue-on-error $BUCKET_URI" ]
}
],
"metadata": {
@@ -450,8 +450,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -470,8 +469,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -541,10 +539,10 @@
},
"outputs": [],
"source": [
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectCreator $BUCKET_URI\n",
"\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectViewer $BUCKET_URI"
]
"! # Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectCreator\n", "\n",
# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.
Contributor comment (critical):
This line is a Python-style comment (# Note...) in a cell where shell commands are being executed (prefixed with !). This will cause a SyntaxError when the notebook cell is run. The comment should be a shell comment, prefixed with ! #. Additionally, it appears to be breaking the JSON format of the notebook file, as it's not a string literal within the source array.
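One way to keep that note while staying valid notebook JSON and valid shell is to make it a string literal in the source array and prefix it with ! # (a sketch; the note wording here is illustrative):

        "! # Note: Migrating scripts that use gsutil iam ch is more complex than get or set; replace the single iam ch command with gcloud storage buckets add-iam-policy-binding and/or remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",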

"! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectViewer" ]
},
{
"cell_type": "markdown",
@@ -1015,14 +1013,11 @@
" + \"/evaluation_metrics\"\n",
" )\n",
" if tf.io.gfile.exists(EXECUTE_OUTPUT):\n",
" ! gsutil cat $EXECUTE_OUTPUT\n",
" return EXECUTE_OUTPUT\n",
" ! gcloud storage cat $EXECUTE_OUTPUT\n", " return EXECUTE_OUTPUT\n",
" elif tf.io.gfile.exists(GCP_RESOURCES):\n",
" ! gsutil cat $GCP_RESOURCES\n",
" return GCP_RESOURCES\n",
" ! gcloud storage cat $GCP_RESOURCES\n", " return GCP_RESOURCES\n",
" elif tf.io.gfile.exists(EVAL_METRICS):\n",
" ! gsutil cat $EVAL_METRICS\n",
" return EVAL_METRICS\n",
" ! gcloud storage cat $EVAL_METRICS\n", " return EVAL_METRICS\n",
"\n",
" return None\n",
"\n",
@@ -1035,7 +1030,7 @@
"print(\"\\n\\n\")\n",
"print(\"model-upload\")\n",
"artifacts = print_pipeline_output(pipeline, \"model-upload\")\n",
"output = !gsutil cat $artifacts\n",
"output = !gcloud storage cat $artifacts\n",
"output = json.loads(output[0])\n",
"model_id = output[\"artifacts\"][\"model\"][\"artifacts\"][0][\"metadata\"][\"resourceName\"]\n",
"print(\"\\n\")\n",
@@ -1253,8 +1248,7 @@
"print(\"\\n\\n\")\n",
"print(\"model-upload\")\n",
"artifacts = print_pipeline_output(pipeline, \"model-upload\")\n",
"output = !gsutil cat $artifacts\n",
"output = json.loads(output[0])\n",
"output = !gcloud storage cat $artifacts\n", "output = json.loads(output[0])\n",
"model_id = output[\"artifacts\"][\"model\"][\"artifacts\"][0][\"metadata\"][\"resourceName\"]\n",
"print(\"\\n\")\n",
"print(\"MODEL ID\", model_id)\n",
@@ -1332,8 +1326,7 @@
"delete_bucket = False\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -458,8 +458,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -478,8 +477,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -1129,8 +1127,7 @@
},
"outputs": [],
"source": [
"! gsutil cp gs://cloud-ml-data/img/flower_photos/daisy/100080576_f52e8ee070_n.jpg test.jpg"
]
"! gcloud storage cp gs://cloud-ml-data/img/flower_photos/daisy/100080576_f52e8ee070_n.jpg test.jpg" ]
},
{
"cell_type": "code",
@@ -1314,8 +1311,7 @@
" print(e)\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -rf {BUCKET_URI}"
]
" ! gcloud storage rm --recursive --continue-on-error {BUCKET_URI}" ]
}
],
"metadata": {
@@ -171,11 +171,9 @@
"if BUCKET_URI is None or BUCKET_URI.strip() == \"\" or BUCKET_URI == \"gs://\":\n",
" BUCKET_URI = f\"gs://{PROJECT_ID}-tmp-{now}-{str(uuid.uuid4())[:4]}\"\n",
" BUCKET_NAME = \"/\".join(BUCKET_URI.split(\"/\")[:3])\n",
" ! gsutil mb -l {REGION} {BUCKET_URI}\n",
"else:\n",
" ! gcloud storage buckets create --location={REGION} {BUCKET_URI}\n", "else:\n",
" assert BUCKET_URI.startswith(\"gs://\"), \"BUCKET_URI must start with `gs://`.\"\n",
" shell_output = ! gsutil ls -Lb {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n",
" bucket_region = shell_output[0].strip().lower()\n",
" shell_output = ! gcloud storage ls --full --buckets {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n", " bucket_region = shell_output[0].strip().lower()\n",
" if bucket_region != REGION:\n",
" raise ValueError(\n",
" \"Bucket region %s is different from notebook region %s\"\n",
@@ -199,7 +197,8 @@
"\n",
"\n",
"# Provision permissions to the SERVICE_ACCOUNT with the GCS bucket\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.admin $BUCKET_NAME\n",
"# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_NAME --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.admin\n",
"\n",
"! gcloud config set project $PROJECT_ID\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/storage.admin\"\n",
@@ -216,8 +215,7 @@
"# Download config files.\n",
"CONFIG_DIR = os.path.join(BUCKET_URI, \"config\")\n",
"! wget https://raw.githubusercontent.com/tensorflow/models/master/official/vision/configs/experiments/semantic_segmentation/deeplabv3plus_resnet101_cityscapes_gpu_multiworker_mirrored.yaml\n",
"! gsutil cp deeplabv3plus_resnet101_cityscapes_gpu_multiworker_mirrored.yaml $CONFIG_DIR/"
]
"! gcloud storage cp deeplabv3plus_resnet101_cityscapes_gpu_multiworker_mirrored.yaml $CONFIG_DIR/" ]
},
{
"cell_type": "markdown",
@@ -358,8 +356,7 @@
" checkpoint_path = os.path.relpath(checkpoint_path, checkpoint_name)\n",
" break\n",
"\n",
" ! gsutil cp -r $checkpoint_name $CHECKPOINT_BUCKET/\n",
" checkpoint_uri = os.path.join(CHECKPOINT_BUCKET, checkpoint_name, checkpoint_path)\n",
" ! gcloud storage cp --recursive $checkpoint_name $CHECKPOINT_BUCKET/\n", " checkpoint_uri = os.path.join(CHECKPOINT_BUCKET, checkpoint_name, checkpoint_path)\n",
" print(\"Checkpoint uploaded to\", checkpoint_uri)\n",
" return checkpoint_uri\n",
"\n",
@@ -375,8 +372,7 @@
" \"\"\"\n",
" label_map_filename = os.path.basename(label_map_yaml_filepath)\n",
" subprocess.check_output(\n",
" [\"gsutil\", \"cp\", label_map_yaml_filepath, label_map_filename],\n",
" stderr=subprocess.STDOUT,\n",
" [\"gcloud\", \"storage\", \"cp\", label_map_yaml_filepath, label_map_filename],\n", " stderr=subprocess.STDOUT,\n",
" )\n",
" with open(label_map_filename, \"rb\") as input_file:\n",
" label_map = yaml.safe_load(input_file.read())[\"label_map\"]\n",
@@ -556,8 +552,7 @@
" current_trial_best_ckpt_evaluation_filepath = os.path.join(\n",
" current_trial_best_ckpt_dir, \"info.json\"\n",
" )\n",
" ! gsutil cp $current_trial_best_ckpt_evaluation_filepath .\n",
" with open(\"info.json\", \"r\") as f:\n",
" ! gcloud storage cp $current_trial_best_ckpt_evaluation_filepath .\n", " with open(\"info.json\", \"r\") as f:\n",
" eval_metric_results = json.load(f)\n",
" current_performance = eval_metric_results[evaluation_metric]\n",
" if current_performance > best_performance:\n",
@@ -739,8 +734,7 @@
" if new_width <= 0:\n",
" test_file = os.path.basename(test_filepath)\n",
" subprocess.check_output(\n",
" [\"gsutil\", \"cp\", test_filepath, test_file], stderr=subprocess.STDOUT\n",
" )\n",
" [\"gcloud\", \"storage\", \"cp\", test_filepath, test_file], stderr=subprocess.STDOUT\n", " )\n",
" with open(test_file, \"rb\") as input_file:\n",
" encoded_string = base64.b64encode(input_file.read()).decode(\"utf-8\")\n",
" else:\n",
@@ -975,8 +969,7 @@
"\n",
"delete_bucket = False # @param {type:\"boolean\"}\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_NAME\n",
"\n",
" ! gcloud storage rm --recursive $BUCKET_NAME\n", "\n",
"# Delete custom and hpt jobs.\n",
"if data_converter_custom_job.list(filter=f'display_name=\"{data_converter_job_name}\"'):\n",
" data_converter_custom_job.delete()\n",
@@ -321,8 +321,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $LOCATION $BUCKET_URI"
]
"! gcloud storage buckets create --location $LOCATION $BUCKET_URI" ]
Contributor comment (critical):
The gcloud storage buckets create command is missing an equals sign for the --location flag. It should be --location=$LOCATION.

        "! gcloud storage buckets create --location=$LOCATION $BUCKET_URI"

bhandarivijay-png replied on Dec 12, 2025:
"! gcloud storage buckets create --location $LOCATION $BUCKET_URI" command is correct.
eg.gcloud storage buckets create --location $LOCATION $BUCKET_URI
Creating gs://vijay-bucket-090909/...
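For reference, gcloud flags generally accept both the space-separated and the equals forms, so either spelling should create the bucket; the two invocations below are intended to be equivalent (region and bucket name are placeholder values):

        gcloud storage buckets create --location us-central1 gs://example-bucket-1234
        gcloud storage buckets create --location=us-central1 gs://example-bucket-1234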

},
{
"cell_type": "markdown",
@@ -1238,8 +1237,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_URI/trainer.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_URI/trainer.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -1451,8 +1449,7 @@
"delete_bucket = False\n",
"\n",
"if delete_bucket:\n",
" ! gsutil rm -rf {BUCKET_URI}"
]
" ! gcloud storage rm --recursive --continue-on-error {BUCKET_URI}" ]
}
],
"metadata": {
@@ -308,8 +308,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $LOCATION -p $PROJECT_ID $BUCKET_URI"
]
"! gcloud storage buckets create --location=$LOCATION --project=$PROJECT_ID $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -381,10 +380,8 @@
},
"outputs": [],
"source": [
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectCreator $BUCKET_URI\n",
"\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectViewer $BUCKET_URI"
]
"! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectCreator\n", "\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectViewer" ]
},
{
"cell_type": "markdown",
@@ -837,8 +834,7 @@
},
"outputs": [],
"source": [
"!gsutil cp -r python_package/dist/* $BUCKET_URI/training_package/"
]
"!gcloud storage cp --recursive python_package/dist/* $BUCKET_URI/training_package/" ]
},
{
"cell_type": "markdown",
@@ -1215,8 +1211,7 @@
"\n",
"# Delete Cloud Storage objects\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_URI\n",
"\n",
" ! gcloud storage rm --recursive $BUCKET_URI\n", "\n",
"! rm $PIPELINE_FILE_NAME"
]
}
@@ -324,8 +324,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $LOCATION -p $PROJECT_ID $BUCKET_URI"
]
"! gcloud storage buckets create --location $LOCATION --project $PROJECT_ID $BUCKET_URI" ]
Contributor comment (critical):
The gcloud storage buckets create command is missing equals signs for the --location and --project flags. It should be --location=$LOCATION and --project=$PROJECT_ID.

        "! gcloud storage buckets create --location=$LOCATION --project=$PROJECT_ID $BUCKET_URI"

Contributor reply:
"! gcloud storage buckets create --location $LOCATION --project $PROJECT_ID $BUCKET_URI" command is correct

},
{
"cell_type": "markdown",
@@ -595,8 +594,7 @@
},
"outputs": [],
"source": [
"!gsutil ls gs://$BUCKET_NAME/$GCS_PREFIX/data"
]
"!gcloud storage ls gs://$BUCKET_NAME/$GCS_PREFIX/data" ]
},
{
"cell_type": "markdown",
@@ -1359,8 +1357,7 @@
},
"outputs": [],
"source": [
"!gsutil ls $gcs_source_test_url"
]
"!gcloud storage ls $gcs_source_test_url" ]
},
{
"cell_type": "code",
@@ -1488,8 +1485,7 @@
"batch_predict_job.delete()\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {