A template implementation of a deep research agent using LangGraph and GPT-4. This agent performs comprehensive research on any given topic by:
- Generating an initial research plan and outline
- Breaking down the topic into logical sections
- Performing targeted web searches using Tavily for each section
- Writing detailed section content based on search results
- Compiling and refining the final report
The implementation uses LangGraph for orchestrating the research workflow, with parallel processing of sections for improved efficiency. It leverages GPT-4 for content generation and the Tavily search API for gathering relevant, up-to-date information from the web.
Key features:
- Automated research planning and execution
- Parallel processing of research sections
- Web search integration via Tavily
- Structured report generation with citations
- Configurable search depth and report formatting
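The parallel section processing described above can be pictured with a small, self-contained sketch. This is illustrative only, not the actual LangGraph implementation: the section titles and the `research_section` stub are placeholders for the real planning, Tavily search, and GPT-4 writing steps.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical section list; in the real agent these come from the planning step.
sections = ["Background", "Market position", "Recent developments"]

def research_section(title: str) -> str:
    # Placeholder for: targeted Tavily search + GPT-4 section writing.
    return f"## {title}\n(draft content based on search results)"

# Research each section in parallel; map() preserves the outline order.
with ThreadPoolExecutor(max_workers=4) as pool:
    drafts = list(pool.map(research_section, sections))

report = "\n\n".join(drafts)
```

In the real agent, LangGraph's orchestration takes the place of the thread pool, but the fan-out/fan-in shape is the same: independent section research in parallel, followed by a single compilation step.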
Special thanks to:
- MG: for creating this agent and an excellent tutorial
- LangChain: for an awesome implementation of deep research with LangGraph
- OpenAI: for showing everyone this feature
- Python: 3.12 or later.
- UV: An extremely fast Python package and project manager, written in Rust.
- Blaxel CLI: Ensure you have the Blaxel CLI installed. If not, install it globally:
curl -fsSL https://raw.githubusercontent.com/beamlit/toolkit/main/install.sh | BINDIR=$HOME/.local/bin sh
- Blaxel login: Login to Blaxel platform
bl login YOUR-WORKSPACE
- Clone the repository and install the dependencies:
  git clone https://github.com/beamlit/template-deepresearch.git
  cd template-deepresearch
  uv sync
- Environment Variables: Create a .env file with your configuration. You can begin by copying the sample file:
  cp .env-sample .env
  Then, update the following values with your own credentials:
- Tavily API key: TAVILY_API_KEY
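A small sketch of how a startup check for this configuration might look (illustrative only; `check_required_env` is a hypothetical helper, not part of the template):

```python
import os

def check_required_env(names):
    """Return the list of required environment variables that are missing or empty."""
    return [n for n in names if not os.environ.get(n)]

# The agent reads its credentials from the environment (populated from .env).
missing = check_required_env(["TAVILY_API_KEY"])
if missing:
    print("Missing configuration:", ", ".join(missing))
```

Failing fast with a clear message like this is easier to debug than a search call erroring deep inside the workflow.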
Start the development server with hot reloading using the Blaxel CLI command:
bl serve --hotreload
Note: This command starts the server and enables hot reload so that changes to the source code are automatically reflected.
bl chat --local template-deepresearch
Note: This command starts a chat interface. Example question: Do a report of annual revenue for the last 10 years of NVIDIA
or
bl run agent template-deepresearch --local --data '{"input": "Do a report of annual revenue for the last 10 years of NVIDIA", "report_plan_depth": 20, "recursion_limit": 100 }'
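If you invoke the agent from a script rather than the shell, you can build the `--data` payload programmatically instead of hand-writing JSON. A minimal sketch (the field names mirror the command above):

```python
import json

# Request payload for `bl run`: the research question plus tuning parameters.
payload = {
    "input": "Do a report of annual revenue for the last 10 years of NVIDIA",
    "report_plan_depth": 20,   # how detailed the generated outline should be
    "recursion_limit": 100,    # cap on workflow steps before the graph aborts
}
data_arg = json.dumps(payload)  # pass this string as the --data argument
```

Serializing with `json.dumps` avoids quoting mistakes that are easy to make when embedding JSON directly in a shell command.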
When you are ready to deploy your application, run:
bl deploy
This command uses your code and the configuration files under the .blaxel directory to deploy your application.
- src/main.py - This is your entrypoint
- src/agent
  - agent.py - Configures the chat agent, streams HTTP responses, and integrates conversational context.
  - llmlogic.py - Core research logic; where the magic happens.
  - prompts.py - Prompts given to the agents.
- src/server
  - router.py - Defines routes for your API.
  - middleware.py - Defines middleware for your API.
  - error.py - Handles request errors.
- pyproject.toml - UV package manager file.
- blaxel.toml - Configuration file for deployment on Blaxel.
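To give a feel for the prompts module listed above, here is an illustrative sketch of one way such a module can be organized: a mapping from workflow step to template. The step names and templates here are hypothetical; the actual src/agent/prompts.py is more elaborate.

```python
# Illustrative sketch: one prompt template per research step.
PROMPTS = {
    "plan": "Create a research outline for: {topic}",
    "write_section": "Write the '{section}' section using these sources:\n{sources}",
    "compile": "Combine the sections below into a polished report:\n{sections}",
}

def render(step: str, **kwargs) -> str:
    """Fill a step's template with the given values."""
    return PROMPTS[step].format(**kwargs)
```

Keeping prompts in one place like this makes them easy to review and tune without touching the workflow code.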
This project is licensed under the MIT License. See the LICENSE file for more details.