
Conversation

@TJ-Liu (Contributor) commented on Apr 1, 2025

REQUIRED: Add a summary of your PR here, typically including why the change is needed and what was changed. Include any design alternatives for discussion purposes.


--- YOUR PR SUMMARY GOES HERE ---


REQUIRED: Fill out the below checklists or remove if irrelevant

  1. If you are opening a PR for Official Notebooks under the notebooks/official folder, follow this mandatory checklist:
  • Use the notebook template as a starting point.
  • Follow the style and grammar rules outlined in the above notebook template.
  • Verify the notebook runs successfully in Colab, since the automated tests cannot guarantee this even when they pass.
  • Passes all the required automated checks. You can locally test for formatting and linting with these instructions.
  • You have consulted with a tech writer to see if a tech writer review is necessary. If so, the notebook has been reviewed by a tech writer, and they have approved it.
  • This notebook has been added to the CODEOWNERS file under the Official Notebooks section, pointing to the author or the author's team.
  • The Jupyter notebook cleans up any artifacts it has created (datasets, ML models, endpoints, etc.) so that it does not consume unnecessary resources (a minimal cleanup sketch follows these checklists).

  2. If you are opening a PR for Community Notebooks under the notebooks/community folder:
  • This notebook has been added to the CODEOWNERS file under the Community Notebooks section, pointing to the author or the author's team.
  • Passes all the required formatting and linting checks. You can locally test with these instructions.

  3. If you are opening a PR for Community Content under the community-content folder:
  • Make sure your main Content Directory Name is descriptive and informative, and includes some of the key products and attributes of your content, so that it is distinguishable from other content.
  • The main content directory has been added to the CODEOWNERS file under the Community Content section, pointing to the author or the author's team.
  • Passes all the required formatting and linting checks. You can locally test with these instructions.
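As a point of reference for the cleanup item in the first checklist, here is a minimal sketch of what resource cleanup typically looks like with the Vertex AI SDK for Python; the project settings and resource IDs below are placeholders, not values from this PR.

from google.cloud import aiplatform

# Placeholders: substitute the project settings and resource IDs created in the notebook.
aiplatform.init(project="your-project-id", location="us-central1")
endpoint = aiplatform.Endpoint("ENDPOINT_ID")
model = aiplatform.Model("MODEL_ID")

endpoint.undeploy_all()  # remove all deployed models from the endpoint
endpoint.delete()        # delete the endpoint resource
model.delete()           # delete the uploaded model resource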

@TJ-Liu requested a review from a team as a code owner on April 1, 2025, 17:28
@gemini-code-assist (bot) left a comment:

Hello @TJ-Liu, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

Summary of Changes

This pull request introduces a new Colab notebook that guides users on how to get started with online prediction using dedicated endpoints in Vertex AI. The notebook covers creating a dedicated endpoint, deploying a TensorFlow model to it, making predictions using both the Python SDK and direct HTTP/gRPC requests, and exploring features like traffic splitting, custom timeouts, and request/response logging. The notebook also includes cleanup instructions to remove created resources.

Highlights

  • Dedicated Endpoints: The notebook demonstrates how to create and use dedicated endpoints for online prediction, highlighting benefits such as dedicated networking, optimized latency, larger payload support, longer timeouts, and Generative AI readiness (a minimal SDK sketch follows this list).
  • Prediction Methods: The notebook provides examples of making predictions using the Vertex AI SDK for Python, as well as direct HTTP/gRPC requests to the dedicated endpoint.
  • Feature Exploration: The notebook showcases advanced features such as traffic splitting between model versions, customizing inference timeouts, and enabling request/response logging for monitoring and debugging.
  • Chat Completion: The notebook shows how to use the OpenAI client library for chat completion.
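To make the flow above concrete, here is a minimal sketch using the Vertex AI SDK for Python. The display names, machine type, container image, and the availability of the dedicated_endpoint_enabled flag in the installed SDK version are assumptions for illustration, not details taken from the notebook.

from google.cloud import aiplatform

# Placeholder project settings; the notebook sets these from user input.
PROJECT_ID = "your-project-id"
LOCATION = "us-central1"
BUCKET_URI = "gs://your-staging-bucket"

aiplatform.init(project=PROJECT_ID, location=LOCATION, staging_bucket=BUCKET_URI)

# Create an endpoint backed by dedicated (non-shared) serving infrastructure.
endpoint = aiplatform.Endpoint.create(
    display_name="dedicated-endpoint-demo",  # hypothetical name
    dedicated_endpoint_enabled=True,  # assumes an SDK version that exposes this flag
)

# Upload a TensorFlow SavedModel and deploy it to the endpoint (URIs are placeholders).
model = aiplatform.Model.upload(
    display_name="tf-model-demo",
    artifact_uri=f"{BUCKET_URI}/model",
    serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/tf2-cpu.2-12:latest",
)
model.deploy(endpoint=endpoint, machine_type="n1-standard-4", traffic_percentage=100)

# Online prediction through the SDK.
prediction = endpoint.predict(instances=[[1.0, 2.0, 3.0]])
print(prediction.predictions)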

Changelog

  • notebooks/official/CODEOWNERS
    • Added an entry to the CODEOWNERS file, assigning ownership of the new notebook to @tianjiaoliu.
  • notebooks/official/prediction/get_started_with_dedicated_endpoint.ipynb
    • Created a new Colab notebook that guides users on how to get started with online prediction using dedicated endpoints in Vertex AI.
    • The notebook covers the following topics:
      • Installing the Vertex AI SDK and other required packages
      • Authenticating your notebook environment (Colab only)
      • Setting Google Cloud project information and initializing the Vertex AI SDK
      • Creating a dedicated endpoint
      • Deploying a TensorFlow model to the endpoint
      • Making predictions using the Python SDK and direct HTTP/gRPC requests (a hedged HTTP sketch follows this list)
      • Exploring features like traffic splitting, custom timeouts, and request/response logging
      • Cleaning up created resources
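As a rough illustration of the direct-HTTP path listed above, the following sketch uses google-auth and requests. The hostname, endpoint ID, and project values are placeholders; the exact dedicated-endpoint DNS is constructed in the notebook and is discussed in the review comments below.

import google.auth
import google.auth.transport.requests
import requests

# Placeholders; the notebook derives these values.
PROJECT_ID = "your-project-id"
LOCATION = "us-central1"
ENDPOINT_ID = "1234567890"
DEDICATED_ENDPOINT_DNS = "ENDPOINT-SPECIFIC-HOSTNAME"  # built in the notebook

# Obtain an access token from Application Default Credentials.
creds, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
creds.refresh(google.auth.transport.requests.Request())

url = (
    f"https://{DEDICATED_ENDPOINT_DNS}/v1/projects/{PROJECT_ID}"
    f"/locations/{LOCATION}/endpoints/{ENDPOINT_ID}:predict"
)
response = requests.post(
    url,
    headers={"Authorization": f"Bearer {creds.token}"},
    json={"instances": [[1.0, 2.0, 3.0]]},
)
print(response.json())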
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e., a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.

  • Code Review (/gemini review): Performs a code review for the current pull request in its current state.
  • Pull Request Summary (/gemini summary): Provides a summary of the current pull request in its current state.
  • Comment (@gemini-code-assist): Responds in comments when explicitly tagged, both in issue comments and review comments.
  • Help (/gemini help): Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder at the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.

You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.


A dedicated endpoint's grace,
For predictions, time, and space.
No noisy neighbors near,
Just focused service here,
In Vertex AI's embrace.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, and Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check its output and use code with caution.

@gemini-code-assist (bot) left a comment:

Code Review

This pull request introduces a new Colab notebook demonstrating how to use dedicated endpoints for online prediction with Vertex AI. The notebook covers creating endpoints, deploying models, making predictions using both the Python SDK and direct HTTP/gRPC requests, and managing traffic splits, custom timeouts, and request/response logging. Overall, the notebook provides a comprehensive guide to using dedicated endpoints. However, there are a few areas that could be improved for clarity and completeness.

Summary of Findings

  • Missing Project Number: The notebook uses PROJECT_ID but not PROJECT_NUMBER, which is needed for constructing the dedicated endpoint DNS. This could confuse users who are not familiar with the difference between the two.
  • Incomplete Chat Completion Example: The Chat Completion example is incomplete, with ... indicating missing code. A more complete example would be beneficial (one possible shape is sketched after this list).
  • Stream Raw Predict Issues: The stream_raw_predict method is not actually a method of the endpoint object. It should be endpoint.raw_predict with streaming enabled. Also, the code does not correctly iterate through the stream responses.
  • Inconsistent HTTP Request Examples: The HTTP request examples use ${DEDICATED_ENDPOINT}, which is not defined in the notebook. It should be replaced with the actual dedicated endpoint DNS.
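Regarding the incomplete Chat Completion example noted above, one possible shape of the call is sketched below with the OpenAI Python client. The base URL path and the use of an access token as the API key are assumptions about how the notebook wires the dedicated endpoint, not details confirmed by this PR.

import google.auth
import google.auth.transport.requests
import openai

# Use an Application Default Credentials access token as the bearer credential.
creds, _ = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
creds.refresh(google.auth.transport.requests.Request())

# Hypothetical base URL: assumes the deployed model exposes an OpenAI-compatible
# chat completions route behind the dedicated endpoint; the real URL is built in the notebook.
BASE_URL = (
    "https://DEDICATED-ENDPOINT-DNS/v1beta1/projects/PROJECT_ID"
    "/locations/LOCATION/endpoints/ENDPOINT_ID"
)

client = openai.OpenAI(base_url=BASE_URL, api_key=creds.token)
completion = client.chat.completions.create(
    model="",  # routing is determined by the endpoint deployment, not this field
    messages=[{"role": "user", "content": "Hello"}],
)
print(completion.choices[0].message.content)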

Merge Readiness

The pull request introduces a valuable new notebook. However, the issues identified above should be addressed before merging to ensure the notebook is accurate, complete, and easy to use. I am unable to directly approve this pull request, and recommend that others review and approve this code before merging. In particular, the issues related to the chat completion example and stream raw predict should be addressed before merging.

Comment on lines 338 to 339

Severity: high

It's important to note that the dedicated DNS requires the project number, not the project ID. Consider adding a step to retrieve the project number and use that in the DNS string. Otherwise, users may get confused when the notebook doesn't work for them.

# Get project number
PROJECT_NUMBER = !gcloud projects describe $PROJECT_ID --format='value(projectNumber)'
PROJECT_NUMBER = PROJECT_NUMBER[0]

# ...

# Dedicated endpoint DNS
dedicated_endpoint_dns = f"https://{endpoint.gca_resource.id}-{PROJECT_NUMBER}.{LOCATION}-aiplatform.googleapis.com"

Comment on lines +219 to +220

Severity: medium

Consider adding a check to see if the bucket already exists, and if so, skip the creation step. This will make the notebook more robust.

import subprocess

# Attempt to create the bucket; a CalledProcessError typically means it already exists
# (or another gsutil failure occurred), which is treated as non-fatal here.
try:
    subprocess.check_call(["gsutil", "mb", "-l", LOCATION, "-p", PROJECT_ID, BUCKET_URI])
except subprocess.CalledProcessError as e:
    print(f"Bucket {BUCKET_URI} already exists or another error occurred: {e}")

@gericdong (Contributor) commented:
@TJ-Liu: it looks like your notebook requires Python 3.10. Please add the following to the introduction section. Thanks.

NOTE: This notebook has been tested in the following environment:

Python version = 3.10
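For context, a small runtime check along these lines could accompany that note; this is a hypothetical cell, not something the reviewer asked for.

import sys

# Warn if the runtime differs from the environment the notebook was tested in (Python 3.10).
if sys.version_info[:2] != (3, 10):
    print(
        f"Warning: this notebook was tested on Python 3.10 "
        f"but is running on Python {sys.version_info.major}.{sys.version_info.minor}."
    )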

@gericdong added this pull request to the merge queue on May 15, 2025.
Merged via the queue into GoogleCloudPlatform:main with commit 2270937 on May 15, 2025.
5 checks passed