Merged (23 commits)
d84ae4f  Migrate gsutil usage to gcloud storage (googlyrahman, Oct 17, 2025)
5d897fd  Manual Changes (bhandarivijay-png, Dec 5, 2025)
71c777d  Manual Changes (bhandarivijay-png, Dec 5, 2025)
eed921b  Fix: Updated gcloud storage command without formatting (bhandarivijay-png, Dec 5, 2025)
fd8bb16  Manual Changes (bhandarivijay-png, Dec 9, 2025)
a7a7bda  Manual Changes (bhandarivijay-png, Dec 9, 2025)
3f68933  Revert "Manual Changes" (bhandarivijay-png, Dec 9, 2025)
3959c2b  Manual Changes (bhandarivijay-png, Dec 9, 2025)
5c2c3f9  Revert "Manual Changes" (bhandarivijay-png, Dec 9, 2025)
8bd0d77  Manual Changes (bhandarivijay-png, Dec 9, 2025)
501a589  Revert "Manual Changes" (bhandarivijay-png, Dec 9, 2025)
d44e3b1  Manual Changes (bhandarivijay-png, Dec 9, 2025)
46fb663  Manual changes (bhandarivijay-png, Dec 9, 2025)
ce16436  Changes for 4339 (bhandarivijay-png, Dec 15, 2025)
e86ab07  Changes for 4339 (bhandarivijay-png, Dec 15, 2025)
9e18216  Changes for 4339 (bhandarivijay-png, Dec 15, 2025)
3908d9e  Fix: Applied linter formatting and resolved style issues (bhandarivijay-png, Dec 16, 2025)
54ac7e9  Merge pull request #16 from bhandarivijay-png/ai-gsutil-migration-c51… (gurusai-voleti, Dec 22, 2025)
45891a6  gcloud to gsutilchanges for 4339 (bhandarivijay-png, Dec 22, 2025)
75be161  Merge pull request #80 from bhandarivijay-png/ai-gsutil-migration-c51… (gurusai-voleti, Dec 22, 2025)
8aba254  removed gsutil to gcloud migration (bhandarivijay-png, Dec 22, 2025)
b949266  Manual changes (bhandarivijay-png, Dec 22, 2025)
36d6835  Merge pull request #83 from bhandarivijay-png/ai-gsutil-migration-c51… (gurusai-voleti, Dec 22, 2025)
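The diffs below cover five Vertex AI sample notebooks; each hunk replaces a gsutil invocation with its gcloud storage equivalent.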
@@ -472,8 +472,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_NAME"
]
"! gcloud storage buckets create --location $REGION $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -492,8 +491,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -565,8 +563,7 @@
"outputs": [],
"source": [
"# Copy the sample data into your DATA_PATH\n",
"! gsutil cp \"gs://cloud-samples-data/vertex-ai/community-content/tf_agents_bandits_movie_recommendation_with_kfp_and_vertex_sdk/u.data\" $DATA_PATH"
]
"! gcloud storage cp \"gs://cloud-samples-data/vertex-ai/community-content/tf_agents_bandits_movie_recommendation_with_kfp_and_vertex_sdk/u.data\" $DATA_PATH" ]
},
{
"cell_type": "code",
@@ -1784,8 +1781,7 @@
"! gcloud ai models delete $model.name --quiet\n",
"\n",
"# Delete Cloud Storage objects that were created\n",
"! gsutil -m rm -r $ARTIFACTS_DIR"
]
"! gcloud storage rm --recursive $ARTIFACTS_DIR" ]
}
],
"metadata": {
(next file)
@@ -421,8 +421,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_NAME"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -441,8 +440,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -886,12 +884,10 @@
"else:\n",
" FILE = IMPORT_FILE\n",
"\n",
"count = ! gsutil cat $FILE | wc -l\n",
"print(\"Number of Examples\", int(count[0]))\n",
"count = ! gcloud storage cat $FILE | wc -l\n", "print(\"Number of Examples\", int(count[0]))\n",
"\n",
"print(\"First 10 rows\")\n",
"! gsutil cat $FILE | head"
]
"! gcloud storage cat $FILE | head" ]
},
{
"cell_type": "markdown",
@@ -1329,8 +1325,7 @@
},
"outputs": [],
"source": [
"test_items = !gsutil cat $IMPORT_FILE | head -n2\n",
"if len(str(test_items[0]).split(\",\")) == 3:\n",
"test_items = !gcloud storage cat $IMPORT_FILE | head -n2\n", "if len(str(test_items[0]).split(\",\")) == 3:\n",
" _, test_item_1, test_label_1 = str(test_items[0]).split(\",\")\n",
" _, test_item_2, test_label_2 = str(test_items[1]).split(\",\")\n",
"else:\n",
@@ -1363,9 +1358,7 @@
"file_1 = test_item_1.split(\"/\")[-1]\n",
"file_2 = test_item_2.split(\"/\")[-1]\n",
"\n",
"! gsutil cp $test_item_1 $BUCKET_NAME/$file_1\n",
"! gsutil cp $test_item_2 $BUCKET_NAME/$file_2\n",
"\n",
"! gcloud storage cp $test_item_1 $BUCKET_NAME/$file_1\n", "! gcloud storage cp $test_item_2 $BUCKET_NAME/$file_2\n", "\n",
"test_item_1 = BUCKET_NAME + \"/\" + file_1\n",
"test_item_2 = BUCKET_NAME + \"/\" + file_2"
]
@@ -1408,8 +1401,7 @@
" f.write(json.dumps(data) + \"\\n\")\n",
"\n",
"print(gcs_input_uri)\n",
"! gsutil cat $gcs_input_uri"
]
"! gcloud storage cat $gcs_input_uri" ]
},
{
"cell_type": "markdown",
@@ -1693,8 +1685,7 @@
"source": [
"def get_latest_predictions(gcs_out_dir):\n",
" \"\"\" Get the latest prediction subfolder using the timestamp in the subfolder name\"\"\"\n",
" folders = !gsutil ls $gcs_out_dir\n",
" latest = \"\"\n",
" folders = !gcloud storage ls $gcs_out_dir\n", " latest = \"\"\n",
" for folder in folders:\n",
" subfolder = folder.split(\"/\")[-2]\n",
" if subfolder.startswith(\"prediction-\"):\n",
@@ -1711,10 +1702,8 @@
" raise Exception(\"Batch Job Failed\")\n",
" else:\n",
" folder = get_latest_predictions(predictions)\n",
" ! gsutil ls $folder/prediction*.jsonl\n",
"\n",
" ! gsutil cat $folder/prediction*.jsonl\n",
" break\n",
" ! gcloud storage ls $folder/prediction*.jsonl\n", "\n",
" ! gcloud storage cat $folder/prediction*.jsonl\n", " break\n",
" time.sleep(60)"
]
},
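The polling cell works because batch prediction subfolders carry a timestamp in their prediction-* names, so the lexicographically greatest one is the newest; get_latest_predictions then globs its prediction*.jsonl shards. A shell sketch of the same selection, where $GCS_OUT_DIR is a hypothetical stand-in for the job's output directory:

# Pick the newest prediction-* subfolder; timestamped names sort lexicographically.
latest=$(gcloud storage ls $GCS_OUT_DIR | grep '/prediction-' | sort | tail -n 1)
gcloud storage ls ${latest}prediction*.jsonl
gcloud storage cat ${latest}prediction*.jsonl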
@@ -1808,8 +1797,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
(next file)
@@ -421,8 +421,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_NAME"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -441,8 +440,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -887,12 +885,10 @@
"else:\n",
" FILE = IMPORT_FILE\n",
"\n",
"count = ! gsutil cat $FILE | wc -l\n",
"print(\"Number of Examples\", int(count[0]))\n",
"count = ! gcloud storage cat $FILE | wc -l\n", "print(\"Number of Examples\", int(count[0]))\n",
"\n",
"print(\"First 10 rows\")\n",
"! gsutil cat $FILE | head"
]
"! gcloud storage cat $FILE | head" ]
},
{
"cell_type": "markdown",
@@ -1333,8 +1329,7 @@
},
"outputs": [],
"source": [
"test_items = !gsutil cat $IMPORT_FILE | head -n2\n",
"cols_1 = str(test_items[0]).split(\",\")\n",
"test_items = !gcloud storage cat $IMPORT_FILE | head -n2\n", "cols_1 = str(test_items[0]).split(\",\")\n",
"cols_2 = str(test_items[1]).split(\",\")\n",
"if len(cols_1) == 11:\n",
" test_item_1 = str(cols_1[1])\n",
@@ -1373,9 +1368,7 @@
"file_1 = test_item_1.split(\"/\")[-1]\n",
"file_2 = test_item_2.split(\"/\")[-1]\n",
"\n",
"! gsutil cp $test_item_1 $BUCKET_NAME/$file_1\n",
"! gsutil cp $test_item_2 $BUCKET_NAME/$file_2\n",
"\n",
"! gcloud storage cp $test_item_1 $BUCKET_NAME/$file_1\n", "! gcloud storage cp $test_item_2 $BUCKET_NAME/$file_2\n", "\n",
"test_item_1 = BUCKET_NAME + \"/\" + file_1\n",
"test_item_2 = BUCKET_NAME + \"/\" + file_2"
]
@@ -1418,8 +1411,7 @@
" f.write(json.dumps(data) + \"\\n\")\n",
"\n",
"print(gcs_input_uri)\n",
"! gsutil cat $gcs_input_uri"
]
"! gcloud storage cat $gcs_input_uri" ]
},
{
"cell_type": "markdown",
@@ -1705,8 +1697,7 @@
"source": [
"def get_latest_predictions(gcs_out_dir):\n",
" \"\"\" Get the latest prediction subfolder using the timestamp in the subfolder name\"\"\"\n",
" folders = !gsutil ls $gcs_out_dir\n",
" latest = \"\"\n",
" folders = !gcloud storage ls $gcs_out_dir\n", " latest = \"\"\n",
" for folder in folders:\n",
" subfolder = folder.split(\"/\")[-2]\n",
" if subfolder.startswith(\"prediction-\"):\n",
@@ -1723,10 +1714,8 @@
" raise Exception(\"Batch Job Failed\")\n",
" else:\n",
" folder = get_latest_predictions(predictions)\n",
" ! gsutil ls $folder/prediction*.jsonl\n",
"\n",
" ! gsutil cat $folder/prediction*.jsonl\n",
" break\n",
" ! gcloud storage ls $folder/prediction*.jsonl\n", "\n",
" ! gcloud storage cat $folder/prediction*.jsonl\n", " break\n",
" time.sleep(60)"
]
},
@@ -1820,8 +1809,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
(next file)
@@ -735,14 +735,11 @@
},
"outputs": [],
"source": [
"count = ! gsutil cat $IMPORT_FILE | wc -l\n",
"print(\"Number of Examples\", int(count[0]))\n",
"count = ! gcloud storage cat $IMPORT_FILE | wc -l\n", "print(\"Number of Examples\", int(count[0]))\n",
"\n",
"print(\"First 10 rows\")\n",
"! gsutil cat $IMPORT_FILE | head\n",
"\n",
"heading = ! gsutil cat $IMPORT_FILE | head -n1\n",
"label_column = str(heading).split(\",\")[-1].split(\"'\")[0]\n",
"! gcloud storage cat $IMPORT_FILE | head\n", "\n",
"heading = ! gcloud storage cat $IMPORT_FILE | head -n1\n", "label_column = str(heading).split(\",\")[-1].split(\"'\")[0]\n",
"print(\"Label Column Name\", label_column)\n",
"if label_column is None:\n",
" raise Exception(\"label column missing\")"
@@ -1820,8 +1817,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
(next file)
@@ -421,8 +421,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_NAME"
]
"! gcloud storage buckets create --location $REGION $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -441,8 +440,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -888,12 +886,10 @@
"else:\n",
" FILE = IMPORT_FILE\n",
"\n",
"count = ! gsutil cat $FILE | wc -l\n",
"print(\"Number of Examples\", int(count[0]))\n",
"count = ! gcloud storage cat $FILE | wc -l\n", "print(\"Number of Examples\", int(count[0]))\n",
"\n",
"print(\"First 10 rows\")\n",
"! gsutil cat $FILE | head"
]
"! gcloud storage cat $FILE | head" ]
},
{
"cell_type": "markdown",
@@ -1377,8 +1373,7 @@
" f.write(json.dumps(data) + \"\\n\")\n",
"\n",
"print(gcs_input_uri)\n",
"! gsutil cat $gcs_input_uri"
]
"! gcloud storage cat $gcs_input_uri" ]
},
{
"cell_type": "markdown",
@@ -1653,8 +1648,7 @@
"source": [
"def get_latest_predictions(gcs_out_dir):\n",
" \"\"\" Get the latest prediction subfolder using the timestamp in the subfolder name\"\"\"\n",
" folders = !gsutil ls $gcs_out_dir\n",
" latest = \"\"\n",
" folders = !gcloud storage ls $gcs_out_dir\n", " latest = \"\"\n",
" for folder in folders:\n",
" subfolder = folder.split(\"/\")[-2]\n",
" if subfolder.startswith(\"prediction-\"):\n",
@@ -1671,10 +1665,8 @@
" raise Exception(\"Batch Job Failed\")\n",
" else:\n",
" folder = get_latest_predictions(predictions)\n",
" ! gsutil ls $folder/prediction*.jsonl\n",
"\n",
" ! gsutil cat $folder/prediction*.jsonl\n",
" break\n",
" ! gcloud storage ls $folder/prediction*.jsonl\n", "\n",
" ! gcloud storage cat $folder/prediction*.jsonl\n", " break\n",
" time.sleep(60)"
]
},
@@ -1768,8 +1760,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {