We're building an MCP-powered, multi-agent, multi-platform deep researcher. It performs deep web searches using Bright Data's Web MCP server, with agents orchestrated through CrewAI.
We use:
- Brightdata (Web MCP server)
- CrewAI (Agentic design)
- Ollama to locally serve LLM
- Streamlit to wrap the logic in an interactive UI
Follow these steps one by one:
Create a .env file in the root directory of your project with the following content:

```
OPENAI_API_KEY=<your_openai_api_key>
BRIGHT_DATA_API_TOKEN=<your_bright_data_api_token>
```

Download and install Ollama for your operating system. Ollama is used to run large language models locally.
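The app reads these keys at startup. As a minimal stdlib-only sketch of how such a file is consumed (illustrative only — the project may well use a library like python-dotenv instead; `load_env` is a hypothetical helper):

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: one KEY=VALUE per line, '#' starts a comment.
    Illustrative only -- the project may use python-dotenv instead."""
    env_file = Path(path)
    if not env_file.exists():
        return  # nothing to load; keys may already be exported in the shell
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        # don't clobber variables already set in the environment
        os.environ.setdefault(key.strip(), value.strip())
```

After calling `load_env()`, the keys are available via `os.environ["BRIGHT_DATA_API_TOKEN"]` and friends.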
For example, on Linux, you can use the following command:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Pull the required model:

```bash
ollama pull gpt-oss
```

Install the project dependencies and activate the virtual environment:

```bash
uv sync
source .venv/bin/activate
```

The `uv sync` command will install all the required dependencies for the project.
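Once installed and started, Ollama serves an HTTP API on localhost:11434 by default. A small stdlib-only sketch to confirm the server is up and the model was pulled (`ollama_models` is a hypothetical helper querying Ollama's `/api/tags` endpoint):

```python
import json
import urllib.request
from urllib.error import URLError

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address

def ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Return names of locally pulled models, or [] if the Ollama
    server is not reachable (e.g. it isn't running yet)."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (URLError, OSError, ValueError):
        return []

if __name__ == "__main__":
    models = ollama_models()
    # after `ollama pull gpt-oss`, a gpt-oss entry should appear in the list
    print(models if models else "Ollama server not reachable")
```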
To run the CrewAI flow, execute the following command:

```bash
python flow.py
```

Running this command starts the CrewAI agentic workflow, which handles the multi-agent orchestration for deep web research using Bright Data's Web MCP server.
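flow.py itself isn't reproduced here, but the overall shape of the workflow — search via the Web MCP server, then have LLM-backed agents turn the results into a report — can be sketched in plain Python (the state fields and step names below are hypothetical stand-ins; the real implementation uses CrewAI's Flow/Crew APIs and Bright Data's MCP tools):

```python
from dataclasses import dataclass, field

@dataclass
class ResearchState:
    """Shared state passed between pipeline steps (hypothetical)."""
    query: str
    raw_results: list[str] = field(default_factory=list)
    report: str = ""

def search_step(state: ResearchState) -> ResearchState:
    # in the real flow, Bright Data's Web MCP server performs the search
    state.raw_results = [f"placeholder result for {state.query!r}"]
    return state

def write_step(state: ResearchState) -> ResearchState:
    # in the real flow, an LLM-backed CrewAI agent writes the report
    state.report = "\n".join(state.raw_results)
    return state

def run_flow(query: str) -> ResearchState:
    """Run the steps in order, threading state through each one."""
    state = ResearchState(query=query)
    for step in (search_step, write_step):
        state = step(state)
    return state
```

The design point is simply that each step reads and enriches a shared state object, which is also how CrewAI flows pass data between steps.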
To run the Streamlit interface, execute the following command:

```bash
streamlit run app.py
```

This starts the Streamlit interface, letting you interact with the deep research application through a user-friendly web UI. Check the terminal output for the local URL (usually http://localhost:8501) and open it in your browser.
Get a FREE Data Science eBook 📖 with 150+ essential lessons in Data Science when you subscribe to our newsletter! Stay in the loop with the latest tutorials, insights, and exclusive resources. Subscribe now!
Contributions are welcome! Feel free to fork this repository and submit pull requests with your improvements.
