# An ADK agent with Vertex AI Search
This project demonstrates how to build a GenAI agent using the Gemini Enterprise App APIs and OAuth authentication to search existing data sources. The agent showcases:
- Secure OAuth Integration: Implements OAuth 2.0 authentication to access protected data sources (Google Drive, BigQuery, etc.)
- Vertex AI Search: Leverages Google's Discovery Engine API for AI-powered search and answer generation
- ADK Framework: Built on Google's Agent Development Kit (ADK) for agent deployment
This example serves as a reference implementation for organizations looking to build AI agents that can securely search and retrieve information from various enterprise data repositories while maintaining proper access controls through OAuth.
## Features

- Python 3.12+ with modern type hints
- `uv` for dependency management
- Vertex AI Search - Integrated search capabilities with OAuth authentication
- Cloud Logging - Automatic integration with Google Cloud Logging
- File-based Prompts - Easy-to-edit markdown prompt files
- Deployment scripts for Agent Engine and Agentspace
- Management scripts - List and unregister agents
- Modular architecture with Pydantic v2
## Prerequisites

- Python 3.12+
- `uv`
- Google Cloud SDK
## Setup

1. Install dependencies

   ```bash
   uv sync
   ```
2. Configure environment

   ```bash
   cp .env.example .env  # Edit .env with your configuration
   ```

   Required configuration:

   - `GOOGLE_CLOUD_PROJECT`: Your GCP project ID
   - `GOOGLE_CLOUD_LOCATION`: GCP region (e.g., `us-central1`)
   - `AS_APP`: Your Agentspace app ID
   - `AUTH_ID`: OAuth authorization ID
   - `OAUTH_CLIENT_ID`, `OAUTH_CLIENT_SECRET`: OAuth credentials
   - `VERTEX_SEARCH_ENGINE_ID`: Your Vertex AI Search engine ID (same as `AS_APP` - see Vertex AI Search Configuration below)

   Note on `STAGING_BUCKET`: By default, the deployment script will create a staging bucket for you. If you want to use an existing bucket, you can specify it in the `.env` file.
3. Configure Vertex AI Search (optional)

   If you want to enable search capabilities, configure your Vertex AI Search engine:

   ```bash
   # In .env file
   VERTEX_SEARCH_ENABLED=true
   VERTEX_SEARCH_ENGINE_ID=your-search-engine-id
   VERTEX_SEARCH_LOCATION=global
   VERTEX_SEARCH_COLLECTION_NAME=default_collection
   ```

   To disable search, set `VERTEX_SEARCH_ENABLED=false` or leave `VERTEX_SEARCH_ENGINE_ID` empty.
4. Deploy the agent

   ```bash
   ./scripts/deploy.sh
   ```
## Project Structure

```
oauth_search/
├── .env.example
├── .gitignore
├── CLOUD_LOGGING.md            # Cloud Logging integration guide
├── oauth_search/
│   ├── __init__.py
│   ├── agent.py
│   ├── global_instruction.md   # Global instruction prompt
│   ├── instruction.md          # Task-specific instruction prompt
│   ├── logging_config.py       # Logging with Cloud Logging support
│   ├── prompts.py              # Dynamic prompt loader
│   ├── search_tool.py          # Vertex AI Search integration
│   ├── settings.py             # Settings with Pydantic v2
│   └── tool.py
├── agentspace.yaml
├── deploy_to_agent_engine.py
├── Dockerfile
├── pyproject.toml
├── README.md
├── scripts/
│   ├── api_config.sh
│   ├── cleanup.sh
│   ├── create_authorization.sh
│   ├── create_or_patch_agent.sh
│   ├── deploy.sh
│   ├── list_agents.sh          # List registered agents
│   └── unregister_agent.sh     # Delete agents from Agentspace
└── tests/
    ├── __init__.py
    └── test_agent.py
```
## Prompts

The agent's prompts are stored in separate markdown files for easy editing:

- `oauth_search/global_instruction.md`: Global instructions that apply to all agent interactions
- `oauth_search/instruction.md`: Task-specific instructions for the agent
You can edit these files directly to customize the agent's behavior. The prompts support variable substitution using `{{VARIABLE_NAME}}` syntax (double curly braces). For example, `{{AGENT_VERSION}}` in `global_instruction.md` is automatically replaced with the actual agent version.

Note: After editing prompt files, restart the agent (e.g., restart `adk web`) for changes to take effect.
## Cloud Logging

The agent automatically integrates with Google Cloud Logging when deployed to Agent Engine:
Local Development:
- Logs appear in console (stdout/stderr)
- All logs prefixed with `###agent_name###`
Agent Engine:
- Logs sent to Cloud Logging automatically
- Structured logging with custom fields
- View in Cloud Logging Console
Usage:

```python
from oauth_search.logging_config import get_logger

logger = get_logger(__name__)

# Basic logging
logger.info("Processing request")

# Structured logging with custom fields
logger.info("Query processed", user_id="123", duration_ms=456)
```

See `CLOUD_LOGGING.md` for complete documentation.
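A `get_logger` that accepts custom fields as keyword arguments could be built on `logging.LoggerAdapter`, roughly as below. This is a sketch under stated assumptions: the prefix value and the local field formatting are guesses, and the project's real `logging_config.py` additionally routes to Cloud Logging on Agent Engine.

```python
import logging
import sys

AGENT_NAME = "oauth_search"  # assumption: the value used in the ###agent_name### prefix


class StructuredLogger(logging.LoggerAdapter):
    """Logger whose info() accepts custom fields as keyword arguments."""

    def info(self, msg, **fields):
        # Locally the fields are appended to the message text; on Agent
        # Engine they would instead be attached as structured fields.
        if fields:
            msg = msg + " | " + " ".join(f"{k}={v}" for k, v in fields.items())
        self.logger.info(f"###{AGENT_NAME}### {msg}")


def get_logger(name: str) -> StructuredLogger:
    """Return a console logger; the real module also wires Cloud Logging."""
    logger = logging.getLogger(name)
    if not logger.handlers:
        handler = logging.StreamHandler(sys.stdout)
        handler.setFormatter(logging.Formatter("%(levelname)s %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return StructuredLogger(logger, {})
```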
## Vertex AI Search Configuration

The agent includes integrated Vertex AI Search capabilities that use OAuth authentication for secure access to your search engine.
Setup:

1. Create a Vertex AI Search engine in your GCP project:

   - Go to the Vertex AI Search Console
   - Create a new search engine or use an existing one
   - Note the engine ID (the last part of the engine resource name)

2. Configure in `.env`:

   ```bash
   VERTEX_SEARCH_ENABLED=true
   VERTEX_SEARCH_ENGINE_ID=your-engine-id-here
   VERTEX_SEARCH_LOCATION=global
   VERTEX_SEARCH_COLLECTION_NAME=default_collection
   ```

3. OAuth authentication: The search tool automatically uses the OAuth credentials configured in your `.env` file (`AUTH_ID`, `OAUTH_CLIENT_ID`, etc.) to authenticate API requests.
How it works:

- When a user queries the agent, the search tool retrieves the OAuth access token from the tool context
- It then makes authenticated requests to the Discovery Engine API
- Finally, it formats and returns search results with titles, descriptions, snippets, and links
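That flow can be sketched in code as follows. This is a sketch only: the endpoint path layout, response shape, and both helper names are assumptions for illustration, not the actual contents of `search_tool.py`.

```python
from typing import Any


def build_search_request(
    engine_path: str, query: str, access_token: str, page_size: int = 5
) -> tuple[str, dict[str, str], dict[str, Any]]:
    """Assemble URL, headers, and body for a Discovery Engine search call.

    The OAuth token retrieved from the tool context is sent as a Bearer
    header. `engine_path` is assumed to be the serving config resource name.
    """
    url = f"https://discoveryengine.googleapis.com/v1/{engine_path}:search"
    headers = {"Authorization": f"Bearer {access_token}"}
    body = {"query": query, "pageSize": page_size}
    return url, headers, body


def format_results(response: dict[str, Any]) -> list[dict[str, str]]:
    """Flatten a search response into title/snippet/link records."""
    formatted = []
    for result in response.get("results", []):
        data = result.get("document", {}).get("derivedStructData", {})
        snippets = data.get("snippets") or [{}]
        formatted.append({
            "title": data.get("title", ""),
            "snippet": snippets[0].get("snippet", ""),
            "link": data.get("link", ""),
        })
    return formatted
```

Separating request construction from result formatting keeps the token-handling code away from the presentation logic and makes both halves easy to unit-test without network access.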
Disabling Search:

Set `VERTEX_SEARCH_ENABLED=false` or leave `VERTEX_SEARCH_ENGINE_ID` empty to disable search functionality.
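The enable/disable switch amounts to a simple guard when deciding whether to register the search tool. The helper below is a hypothetical sketch; the actual wiring lives in `agent.py` and `settings.py`.

```python
def search_enabled(enabled_flag: str, engine_id: str) -> bool:
    """Return True only when search is switched on and an engine ID is set.

    Treats both VERTEX_SEARCH_ENABLED=false and an empty
    VERTEX_SEARCH_ENGINE_ID as "search disabled".
    """
    return enabled_flag.strip().lower() == "true" and bool(engine_id.strip())
```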
## Adding Custom Tools

To add your own tools, create a new Python file in the `oauth_search` directory and define your tool using the `FunctionTool` class. For example:

```python
from google.adk.tools import FunctionTool


def my_custom_tool(param: str) -> str:
    """A custom tool that does something."""
    return f"You passed: {param}"


def create_my_custom_tool() -> FunctionTool:
    """Factory function to create the custom tool."""
    return FunctionTool(my_custom_tool)
```

Then add your tool to the agent in `oauth_search/agent.py`:
```python
from .my_custom_tool import create_my_custom_tool

# ...

# Build tools list
tools = []

# Add your custom tool
custom_tool = create_my_custom_tool()
tools.append(custom_tool)

# Add other tools...

agent = Agent(
    # ...
    tools=tools,
)
```

## Deployment Scripts

### deploy.sh

Main deployment workflow that handles the complete deployment process:
```bash
./scripts/deploy.sh
```

This script:

- Deploys the agent to Agent Engine (creates the reasoning engine)
- Creates the OAuth authorization (if configured)
- Registers the agent in Agentspace
- Links the agent to the authorization
### create_or_patch_agent.sh

Create a new agent or update an existing one in Agentspace:

```bash
./scripts/create_or_patch_agent.sh
```

This script:

- Checks if an agent with the same display name exists
- Creates a new agent if it doesn't exist
- Updates (patches) the existing agent if it does exist
- Links the agent to the reasoning engine from `.env`

Note: This is automatically called by `deploy.sh`, but you can run it manually to update agent configuration without redeploying the reasoning engine.
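The create-or-patch decision the script makes can be expressed compactly. This is illustrative: the `displayName` field name follows the usual Agentspace JSON convention but is an assumption here, and `decide_action` is not part of the project.

```python
def decide_action(existing_agents: list[dict], display_name: str) -> str:
    """Return "patch" if an agent with this display name exists, else "create"."""
    for agent in existing_agents:
        if agent.get("displayName") == display_name:
            return "patch"
    return "create"
```

Matching on display name (rather than agent ID) is what makes the script idempotent: re-running a deployment updates the existing agent instead of registering a duplicate.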
### list_agents.sh

List all registered agents in your Agentspace:

```bash
./scripts/list_agents.sh
```

This shows all agents with their IDs, display names, descriptions, and reasoning engines. Agents matching your current `REASONING_ENGINE` are highlighted.
### unregister_agent.sh

Interactively delete agents from Agentspace:

```bash
# Unregister agents for the current reasoning engine
./scripts/unregister_agent.sh

# Unregister agents for a specific reasoning engine
./scripts/unregister_agent.sh projects/123/locations/us-central1/reasoningEngines/456
```

The script will:

- List all agents for the specified reasoning engine
- Prompt you to select which agent to delete
- Ask for confirmation before deletion
### create_authorization.sh

Create the OAuth 2.0 authorization resource (if using OAuth):

```bash
./scripts/create_authorization.sh
```

This script creates the authorization resource needed for OAuth-enabled agents. It is automatically called by `deploy.sh` if OAuth is configured.
### cleanup.sh

Remove all deployed resources (agent, authorization, reasoning engine):

```bash
./scripts/cleanup.sh
```

This script:

- Deletes the agent from Agentspace
- Removes the OAuth authorization (if it exists)
- Deletes the reasoning engine from Agent Engine

Warning: This is destructive and cannot be undone. Use with caution.
## Troubleshooting

- Check your credentials: Make sure your `gcloud` CLI is authenticated and has the correct permissions.
- Check your environment variables: Ensure that all the required environment variables are set in your `.env` file.
- Check the logs: Look at the logs in the Google Cloud Console for more information about the error.
Cloud Logging issues:

- Verify `GOOGLE_CLOUD_PROJECT` and `GOOGLE_CLOUD_LOCATION` are set in `.env`
- Check that `google-cloud-logging>=3.11.0` is installed
- Logs only appear in Cloud Logging when deployed to Agent Engine (not locally)
- See `CLOUD_LOGGING.md` for troubleshooting steps
## Development

Run tests:

```bash
uv run pytest
```

Formatting:

```bash
uv run ruff format .
```

Linting:

```bash
uv run ruff check .
```

Auto-fix linting issues:

```bash
uv run ruff check --fix
```

This project follows modern Python best practices:

- Python 3.10+ type hints (`T | None` instead of `Optional[T]`)
- Pydantic v2 patterns with `ConfigDict`
- Comprehensive error handling with proper logging
- Security: path validation to prevent directory traversal
- Thread-safe logging configuration
- Module-level logger instantiation for performance
## License

This project is licensed under the Apache-2.0 License - see the LICENSE file for details.