Merged
Changes from 1 commit
@@ -467,8 +467,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location $REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -487,8 +486,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -711,8 +709,7 @@
"IMPORT_FILE = \"petfinder-tabular-classification-tabnet-with-header.csv\"\n",
"TRAINING_DATA_PATH = f\"{BUCKET_URI}/data/petfinder/train.csv\"\n",
"\n",
"! gsutil cp gs://cloud-samples-data/ai-platform-unified/datasets/tabular/{IMPORT_FILE} {TRAINING_DATA_PATH}"
]
"! gcloud storage cp gs://cloud-samples-data/ai-platform-unified/datasets/tabular/{IMPORT_FILE} {TRAINING_DATA_PATH}" ]
},
{
"cell_type": "markdown",
@@ -1422,8 +1419,7 @@
" print(e)\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -441,8 +441,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create $BUCKET_URI --location=$REGION" ]
},
{
"cell_type": "markdown",
@@ -461,8 +460,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -531,10 +529,12 @@
},
"outputs": [],
"source": [
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectCreator $BUCKET_URI\n",
"\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectViewer $BUCKET_URI"
]
"# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectCreator\n", "\n",
# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.
# Note: Conditions: gsutil iam ch does not support modifying IAM policies that contain conditions. gcloud storage commands do support conditions.
# Note: File Argument Order: In gsutil iam set, the policy file comes before the URLs. In gcloud storage bucket set-iam-policy, the policy file comes after the URL.
"! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectViewer" ]
Contributor review comment (critical):

The automated migration introduced a syntax error in this notebook cell. The comments on lines 534-536 and the command on line 537 are not enclosed in quotes, which makes the JSON for this cell's source array invalid. This will cause the notebook to fail to load or execute. All lines within the source array must be valid strings.

        "# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
        "# Note: Conditions: gsutil iam ch does not support modifying IAM policies that contain conditions. gcloud storage commands do support conditions.\n",
        "# Note: File Argument Order: In gsutil iam set, the policy file comes before the URLs. In gcloud storage bucket set-iam-policy, the policy file comes after the URL.\n",
        "! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectViewer"

},
{
"cell_type": "markdown",
@@ -1006,8 +1006,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_URI/trainer_horses_or_humans.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_URI/trainer_horses_or_humans.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -1433,14 +1432,11 @@
" + \"/evaluation_metrics\"\n",
" )\n",
" if tf.io.gfile.exists(EXECUTE_OUTPUT):\n",
" ! gsutil cat $EXECUTE_OUTPUT\n",
" return EXECUTE_OUTPUT\n",
" ! gcloud storage cat $EXECUTE_OUTPUT\n", " return EXECUTE_OUTPUT\n",
" elif tf.io.gfile.exists(GCP_RESOURCES):\n",
" ! gsutil cat $GCP_RESOURCES\n",
" return GCP_RESOURCES\n",
" ! gcloud storage cat $GCP_RESOURCES\n", " return GCP_RESOURCES\n",
" elif tf.io.gfile.exists(EVAL_METRICS):\n",
" ! gsutil cat $EVAL_METRICS\n",
" return EVAL_METRICS\n",
" ! gcloud storage cat $EVAL_METRICS\n", " return EVAL_METRICS\n",
"\n",
" return None\n",
"\n",
@@ -1454,8 +1450,7 @@
"print(\"getbesttrialop\")\n",
"artifacts = print_pipeline_output(pipeline, \"getbesttrialop\")\n",
"print(\"\\n\\n\")\n",
"output = !gsutil cat $artifacts\n",
"output = json.loads(output[0])\n",
"output = !gcloud storage cat $artifacts\n", "output = json.loads(output[0])\n",
"best_trial = json.loads(output[\"parameters\"][\"Output\"][\"stringValue\"])\n",
"model_id = best_trial[\"id\"]\n",
"print(\"BEST MODEL\", model_id)\n",
@@ -1529,8 +1524,7 @@
"delete_bucket = False\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -449,8 +449,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -469,8 +468,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -1374,7 +1372,7 @@
},
"outputs": [],
"source": [
"! gsutil cp gs://cloud-ml-data/img/flower_photos/daisy/100080576_f52e8ee070_n.jpg test.jpg\n",
"! gcloud storage cp gs://cloud-ml-data/img/flower_photos/daisy/100080576_f52e8ee070_n.jpg test.jpg\n",
"\n",
"import base64\n",
"\n",
@@ -1574,8 +1572,7 @@
"delete_bucket = False\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI\n",
"\n",
" ! gcloud storage rm --recursive $BUCKET_URI\n", "\n",
"!rm -f test.jpg"
]
}
@@ -455,8 +455,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location $REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -475,8 +474,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -927,8 +925,7 @@
" f.write(json.dumps(x.tolist()) + \"\\n\")\n",
"\n",
"# Upload to Cloud Storage bucket\n",
"! gsutil cp $BATCH_PREDICTION_INSTANCES_FILE $BATCH_PREDICTION_GCS_SOURCE\n",
"\n",
"! gcloud storage cp $BATCH_PREDICTION_INSTANCES_FILE $BATCH_PREDICTION_GCS_SOURCE\n", "\n",
"print(\"Uploaded instances to: \", BATCH_PREDICTION_GCS_SOURCE)"
]
},
@@ -1445,8 +1442,7 @@
" batch_prediction_job.delete()\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -rf {BUCKET_URI}"
]
" ! gcloud storage rm --recursive --continue-on-error {BUCKET_URI}" ]
}
],
"metadata": {
@@ -193,11 +193,9 @@
"if BUCKET_URI is None or BUCKET_URI.strip() == \"\" or BUCKET_URI == \"gs://\":\n",
" BUCKET_URI = f\"gs://{PROJECT_ID}-tmp-{now}-{str(uuid.uuid4())[:4]}\"\n",
" BUCKET_NAME = \"/\".join(BUCKET_URI.split(\"/\")[:3])\n",
" ! gsutil mb -l {REGION} {BUCKET_URI}\n",
"else:\n",
" ! gcloud storage buckets create --location={REGION} {BUCKET_URI}\n", "else:\n",
" assert BUCKET_URI.startswith(\"gs://\"), \"BUCKET_URI must start with `gs://`.\"\n",
" shell_output = ! gsutil ls -Lb {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n",
" bucket_region = shell_output[0].strip().lower()\n",
" shell_output = ! gcloud storage ls --full --buckets {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n", " bucket_region = shell_output[0].strip().lower()\n",
" if bucket_region != REGION:\n",
" raise ValueError(\n",
" \"Bucket region %s is different from notebook region %s\"\n",
@@ -221,8 +219,7 @@
"\n",
"\n",
"# Provision permissions to the SERVICE_ACCOUNT with the GCS bucket\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.admin $BUCKET_NAME\n",
"\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_NAME --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.admin\n", "\n",
"! gcloud config set project $PROJECT_ID\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/storage.admin\"\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/aiplatform.user\"\n",
@@ -358,9 +355,7 @@
"if dataset_validation_util.is_gcs_path(pretrained_model_id):\n",
" # Download tokenizer.\n",
" ! mkdir tokenizer\n",
" ! gsutil cp {pretrained_model_id}/tokenizer.json ./tokenizer\n",
" ! gsutil cp {pretrained_model_id}/config.json ./tokenizer\n",
" tokenizer_path = \"./tokenizer\"\n",
" ! gcloud storage cp {pretrained_model_id}/tokenizer.json ./tokenizer\n", " ! gcloud storage cp {pretrained_model_id}/config.json ./tokenizer\n", " tokenizer_path = \"./tokenizer\"\n",
" access_token = \"\"\n",
"else:\n",
" tokenizer_path = pretrained_model_id\n",
@@ -1064,8 +1059,7 @@
"\n",
"delete_bucket = False # @param {type:\"boolean\"}\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
@@ -201,11 +201,10 @@
"if BUCKET_URI is None or BUCKET_URI.strip() == \"\" or BUCKET_URI == \"gs://\":\n",
" BUCKET_URI = f\"gs://{PROJECT_ID}-tmp-{now}-{str(uuid.uuid4())[:4]}\"\n",
" BUCKET_NAME = \"/\".join(BUCKET_URI.split(\"/\")[:3])\n",
" ! gsutil mb -l {REGION} {BUCKET_URI}\n",
"else:\n",
" ! gcloud storage buckets create --location={REGION} {BUCKET_URI}\n", "else:\n",
" assert BUCKET_URI.startswith(\"gs://\"), \"BUCKET_URI must start with `gs://`.\"\n",
" shell_output = ! gsutil ls -Lb {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n",
" bucket_region = shell_output[0].strip().lower()\n",
"# Note: -L (Full Listing) Output: The format of the full listing output is different. gcloud storage uses a title case for keys and will not display a field if its value is \"None\".\n",
" shell_output = ! gcloud storage ls --full --buckets {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n", " bucket_region = shell_output[0].strip().lower()\n",
" if bucket_region != REGION:\n",
" raise ValueError(\n",
" \"Bucket region %s is different from notebook region %s\"\n",
@@ -229,8 +228,8 @@
"\n",
"\n",
"# Provision permissions to the SERVICE_ACCOUNT with the GCS bucket\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.admin $BUCKET_NAME\n",
"\n",
"# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_NAME --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.admin\n", "\n",
"! gcloud config set project $PROJECT_ID\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/storage.admin\"\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/aiplatform.user\"\n",
@@ -1215,8 +1214,7 @@
"\n",
"delete_bucket = False # @param {type:\"boolean\"}\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
@@ -185,11 +185,10 @@
"if BUCKET_URI is None or BUCKET_URI.strip() == \"\" or BUCKET_URI == \"gs://\":\n",
" BUCKET_URI = f\"gs://{PROJECT_ID}-tmp-{now}-{str(uuid.uuid4())[:4]}\"\n",
" BUCKET_NAME = \"/\".join(BUCKET_URI.split(\"/\")[:3])\n",
" ! gsutil mb -l {REGION} {BUCKET_URI}\n",
"else:\n",
" ! gcloud storage buckets create --location={REGION} {BUCKET_URI}\n", "else:\n",
" assert BUCKET_URI.startswith(\"gs://\"), \"BUCKET_URI must start with `gs://`.\"\n",
" shell_output = ! gsutil ls -Lb {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n",
" bucket_region = shell_output[0].strip().lower()\n",
# Note: The format of the full listing output is different. gcloud storage uses a title case for keys and will not display a field if its value is "None".
Contributor review comment (critical):

The automated migration introduced a syntax error in this notebook cell. This comment is not enclosed in quotes, which makes the JSON for this cell's source array invalid. This will cause the notebook to fail to load or execute. All lines within the source array must be valid strings.

Suggested change
# Note: The format of the full listing output is different. gcloud storage uses a title case for keys and will not display a field if its value is "None".
"# Note: The format of the full listing output is different. gcloud storage uses a title case for keys and will not display a field if its value is \"None\".\n",

" shell_output = ! gcloud storage ls --full --buckets {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n", " bucket_region = shell_output[0].strip().lower()\n",
" if bucket_region != REGION:\n",
" raise ValueError(\n",
" \"Bucket region %s is different from notebook region %s\"\n",
@@ -213,8 +212,8 @@
"\n",
"\n",
"# Provision permissions to the SERVICE_ACCOUNT with the GCS bucket\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.admin $BUCKET_NAME\n",
"\n",
"# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_NAME --member=\"serviceAccount:{SERVICE_ACCOUNT}\" --role=\"roles/storage.admin\"\n", "\n",
"! gcloud config set project $PROJECT_ID\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/storage.admin\"\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/aiplatform.user\""
@@ -395,9 +394,7 @@
"if dataset_validation_util.is_gcs_path(pretrained_model_id):\n",
" # Download tokenizer.\n",
" ! mkdir tokenizer\n",
" ! gsutil cp {pretrained_model_id}/tokenizer.json ./tokenizer\n",
" ! gsutil cp {pretrained_model_id}/config.json ./tokenizer\n",
" tokenizer_path = \"./tokenizer\"\n",
" ! gcloud storage cp {pretrained_model_id}/tokenizer.json ./tokenizer\n", " ! gcloud storage cp {pretrained_model_id}/config.json ./tokenizer\n", " tokenizer_path = \"./tokenizer\"\n",
" access_token = \"\"\n",
"else:\n",
" tokenizer_path = pretrained_model_id\n",
@@ -1084,8 +1081,7 @@
"\n",
"delete_bucket = False # @param {type:\"boolean\"}\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
@@ -496,8 +496,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION -p $PROJECT_ID $BUCKET_URI"
]
"! gcloud storage buckets create --location=$REGION --project=$PROJECT_ID $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -516,8 +515,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -687,9 +685,7 @@
"outputs": [],
"source": [
"%cd ..\n",
"!gsutil cp {LOCAL_MODEL_ARTIFACTS_DIR}/* {BUCKET_URI}/{MODEL_ARTIFACT_DIR}/\n",
"!gsutil ls {BUCKET_URI}/{MODEL_ARTIFACT_DIR}/"
]
"!gcloud storage cp {LOCAL_MODEL_ARTIFACTS_DIR}/* {BUCKET_URI}/{MODEL_ARTIFACT_DIR}/\n", "!gcloud storage ls {BUCKET_URI}/{MODEL_ARTIFACT_DIR}/" ]
},
{
"cell_type": "markdown",
@@ -1792,8 +1788,7 @@
"delete_bucket = False\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {