A template implementation of a comprehensive deep research agent using LangGraph and GPT-4. The agent performs thorough research on any given topic by orchestrating multiple research phases, gathering information from a range of web sources, and compiling detailed reports with citations and structured analysis.
- Features
- Quick Start
- Prerequisites
- Installation
- Usage
- Project Structure
- Troubleshooting
- Contributing
- Support
- License
- Comprehensive multi-phase research methodology
- Automated research planning and outline generation
- Parallel processing of research sections for efficiency
- Web search integration via Tavily for up-to-date information
- Structured report generation with proper citations
- Configurable search depth and report formatting
- Built on LangGraph for sophisticated workflow orchestration
- Easy deployment and integration with Blaxel platform
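The research workflow itself is implemented as a LangGraph state graph (see src/agent/llmlogic.py). As a rough sketch of that pattern (the node names and state fields below are illustrative stand-ins, not the template's actual code), a minimal plan, research, and write graph looks like this:

```python
from typing import TypedDict
from langgraph.graph import StateGraph, START, END

# Hypothetical state schema; the template's real state lives in src/agent/llmlogic.py.
class ResearchState(TypedDict):
    topic: str
    outline: list[str]
    sections: list[str]
    report: str

def plan(state: ResearchState) -> dict:
    # Stand-in for the research-planning phase that drafts a report outline.
    return {"outline": [f"Background on {state['topic']}", "Key findings", "Conclusion"]}

def research(state: ResearchState) -> dict:
    # Stand-in for the phase that gathers material (e.g. Tavily searches) per section.
    return {"sections": [f"Notes for: {title}" for title in state["outline"]]}

def write(state: ResearchState) -> dict:
    # Stand-in for the phase that compiles the final, cited report.
    return {"report": "\n\n".join(state["sections"])}

graph = StateGraph(ResearchState)
graph.add_node("plan", plan)
graph.add_node("research", research)
graph.add_node("write", write)
graph.add_edge(START, "plan")
graph.add_edge("plan", "research")
graph.add_edge("research", "write")
graph.add_edge("write", END)

app = graph.compile()
# recursion_limit mirrors the parameter the agent accepts (see Usage below).
print(app.invoke({"topic": "NVIDIA annual revenue"}, config={"recursion_limit": 100})["report"])
```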
For those who want to get up and running quickly:
# Clone the repository
git clone https://github.com/blaxel-ai/template-deepresearch.git
# Navigate to the project directory
cd template-deepresearch
# Install dependencies
uv sync
# Set up your Tavily API key
cp .env-sample .env
# Edit .env and add your TAVILY_API_KEY
# Start the server
bl serve --hotreload
# In another terminal, test the agent
bl chat --local template-deepresearch
- Python: 3.12 or later
- UV: An extremely fast Python package and project manager, written in Rust
- Tavily API Key: Required for web search functionality
- Blaxel Platform Setup: Complete Blaxel setup by following the quickstart guide
- Blaxel CLI: Ensure you have the Blaxel CLI installed. If not, install it globally:
curl -fsSL https://raw.githubusercontent.com/blaxel-ai/toolkit/main/install.sh | BINDIR=/usr/local/bin sudo -E sh
- Blaxel Login: Log in to the Blaxel platform:
bl login YOUR-WORKSPACE
Clone the repository and install dependencies:
git clone https://github.com/blaxel-ai/template-deepresearch.git
cd template-deepresearch
uv sync
Set up environment variables:
cp .env-sample .env
Then update the following values with your own credentials:
- Tavily API Key: `TAVILY_API_KEY`
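To confirm the key is picked up before you start the server, you can load it the way most Python projects do. This is a minimal sketch, assuming python-dotenv and the tavily-python client are available; check pyproject.toml for the template's actual dependencies:

```python
import os

from dotenv import load_dotenv   # python-dotenv; assumed available for this check
from tavily import TavilyClient  # tavily-python web search client

load_dotenv()  # read variables from .env in the project root
api_key = os.environ["TAVILY_API_KEY"]  # raises KeyError if the key is missing

# Run a single search to verify the key is valid and active.
client = TavilyClient(api_key=api_key)
print(client.search("NVIDIA annual revenue"))
```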
Start the development server with hot reloading:
bl serve --hotreload
Note: This command starts the server and enables hot reload so that changes to the source code are automatically reflected.
You can test your deep research agent using the chat interface:
bl chat --local template-deepresearch
Example research query: "Do a report of annual revenue for the last 10 years of NVIDIA"
Or run it directly with specific parameters:
bl run agent template-deepresearch --local --data '{"input": "Do a report of annual revenue for the last 10 years of NVIDIA", "report_plan_depth": 20, "recursion_limit": 100}'
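Because the agent is served over HTTP (see src/server/router.py), you can also call it from any HTTP client. The address and route below are assumptions for illustration only; use the address printed by bl serve:

```python
import requests  # assumed installed; any HTTP client works

# Hypothetical local address; bl serve prints the actual host, port, and route.
url = "http://localhost:1338"
payload = {
    "input": "Do a report of annual revenue for the last 10 years of NVIDIA",
    "report_plan_depth": 20,   # depth of the generated report outline
    "recursion_limit": 100,    # LangGraph recursion limit for long research runs
}

# Deep research runs can take a while, so allow a generous timeout.
response = requests.post(url, json=payload, timeout=600)
response.raise_for_status()
print(response.text)
```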
When you are ready to deploy your application:
bl deploy
This command uses your code and the configuration files under the .blaxel directory to deploy your application.
- src/main.py - Application entry point
- src/agent/ - Core research agent implementation
  - agent.py - Main agent configuration and HTTP response streaming
  - llmlogic.py - Research workflow logic and LangGraph implementation
  - prompts.py - Research prompts and templates
- src/server/ - Server implementation and routing
  - router.py - API route definitions
  - middleware.py - Request/response middleware
  - error.py - Error handling utilities
- pyproject.toml - UV package manager configuration
- blaxel.toml - Blaxel deployment configuration
- .env-sample - Environment variables template
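As a rough idea of how router.py and agent.py fit together, here is a hedged sketch of a streaming research endpoint, assuming a FastAPI-style server; the names and route below are illustrative, not the template's actual code:

```python
from fastapi import APIRouter
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

router = APIRouter()

class ResearchRequest(BaseModel):
    input: str
    report_plan_depth: int = 10
    recursion_limit: int = 50

async def run_research(request: ResearchRequest):
    # Placeholder for the LangGraph workflow in llmlogic.py; it yields chunks
    # of the report as they are produced instead of one final blob.
    yield "Planning report outline...\n"
    yield f"Researching: {request.input}\n"
    yield "Final report with citations goes here.\n"

@router.post("/")
async def research(request: ResearchRequest):
    # Stream the report back chunk by chunk so long runs show progress early.
    return StreamingResponse(run_research(request), media_type="text/plain")
```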
- Blaxel Platform Issues:
  - Ensure you're logged in to your workspace: `bl login MY-WORKSPACE`
  - Verify models are available: `bl get models`
  - Check that functions exist: `bl get functions`
- Tavily API Issues:
  - Ensure your Tavily API key is valid and active
  - Check API usage limits and quotas
  - Verify network connectivity to Tavily services
- Research Depth Configuration:
  - Adjust `report_plan_depth` for more detailed outlines
  - Increase `recursion_limit` for complex research topics
  - Monitor processing time for large research projects
- Memory and Performance:
  - Large research topics may require significant processing time
  - Consider breaking down extremely broad topics
  - Monitor system resources during parallel processing
- Citation and Source Quality:
  - Review search terms and refine for better results
  - Check date ranges for time-sensitive research
  - Verify source credibility in generated reports
For more help, please submit an issue on GitHub.
Contributions are welcome! Here's how you can contribute:
- Fork the repository
- Create a feature branch:
git checkout -b feature/amazing-feature
- Commit your changes:
git commit -m 'Add amazing feature'
- Push to the branch:
git push origin feature/amazing-feature
- Submit a Pull Request
Please make sure to update tests as appropriate and follow the code style of the project.
If you need help with this template:
- Submit an issue for bug reports or feature requests
- Visit the Blaxel Documentation for platform guidance
- Check the LangGraph Documentation for workflow framework help
- Join our Discord Community for real-time assistance
This project is licensed under the MIT License. See the LICENSE file for more details.
Special thanks to: