A locally run AI assistant that uses Ollama (WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B, or an LLM of your choice) and scrapes the web using Bright Data's Unlocker MCP tools. Everything is orchestrated in Python and Streamlit, with some Node.js under the hood. No cloud LLMs. No data sharing. Just local compute magic.
Local, Private, and Powerful: because your prompts are nobody else's business.
- Python + Asyncio
- LangChain
- Streamlit
- Bright Data MCP
- Ollama
- Terminal spinner magic courtesy of `itertools`
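The `itertools` spinner mentioned above boils down to `itertools.cycle`, which repeats a sequence of frames forever. A minimal sketch (the frame characters here are illustrative, not necessarily what the app uses):

```python
import itertools
import sys
import time

def spin(cycles: int, delay: float = 0.05) -> None:
    """Draw a simple terminal spinner for `cycles` frames."""
    frames = itertools.cycle("|/-\\")  # endless repetition of four frames
    for _ in range(cycles):
        sys.stdout.write("\r" + next(frames))
        sys.stdout.flush()
        time.sleep(delay)
    sys.stdout.write("\r")  # clear the spinner character

# cycle wraps around: the fifth frame repeats the first
frames = itertools.cycle("|/-\\")
first_five = [next(frames) for _ in range(5)]
```

In the app this would run alongside the async scraping call, so the user sees activity while waiting.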
- Go to brightdata.com
- Under Proxies and Scraping, create an `Unlocker_MCP` zone. Make sure to:
- Allow Admin Access
- Set token to never expire
- Paste the token into your terminal session:

```shell
export BRD_API_KEY=your_token_here
```

They offer free credits on signup (no credit card needed). Ignore the payment screens and click through.
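Before launching the app, you can sanity-check that the exported token is actually visible to Python. This small helper is not part of the repo, just a sketch; only the variable name `BRD_API_KEY` comes from the export above:

```python
import os

def get_brd_token(env=os.environ):
    """Return the Bright Data API token, or fail with a helpful message."""
    token = env.get("BRD_API_KEY")
    if not token:
        raise RuntimeError(
            "BRD_API_KEY is not set; run `export BRD_API_KEY=...` first"
        )
    return token
```

Run it in the same shell session where you did the export, since environment variables do not propagate between unrelated terminals.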
```shell
git clone https://github.com/drewesk/ai-mcp-py.git
cd ai-mcp-py
conda create -n mcp_env python=3.12
conda activate mcp_env
conda install nodejs
```

Make sure `node`, `npx`, and `conda` work from your terminal. You may need to update your `.zshrc` or `.bashrc`.

```shell
pip install -r requirements.txt
npx @brightdata/mcp API_TOKEN=$BRD_API_KEY
```

In a separate terminal:

```shell
ollama serve
ollama run WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B
```

Make sure the model runs smoothly on your machine (GPU preferred).
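To confirm the model responds outside of Streamlit, you can hit Ollama's default REST endpoint directly. This is a standard-library sketch against Ollama's `/api/generate` API; the app itself talks to the model through LangChain, so this is only a smoke test:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
MODEL = "WhiteRabbitNeo/WhiteRabbitNeo-2.5-Qwen-2.5-Coder-7B"

def build_payload(prompt: str, model: str = MODEL) -> dict:
    # Minimal request body for /api/generate; stream=False returns
    # a single JSON object instead of a stream of chunks
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt: str) -> str:
    """Send one prompt to the local Ollama server and return the reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

If `ask("Say hello")` hangs or errors, fix your Ollama setup before touching the Streamlit app.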
```shell
streamlit run mcp_app.py
```

- Paste URL: https://medium.com/ayuth/install-anaconda-on-macos-with-homebrew-c94437d63a37
- Prompt: "What does this article tell me to do?"
- Hit Submit and wait for the spinner to complete
- Voilà! You'll see a local LLM-generated summary right on screen
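Under the hood, the flow pairs the scraped page content with your question before handing both to the local LLM. A hypothetical prompt-builder (not the repo's actual code) might look like:

```python
def build_summary_prompt(page_text: str, question: str) -> str:
    # Hypothetical template: give the model the scraped page plus the
    # user's question, and constrain it to answer from the page only
    return (
        "You are given the text of a web page.\n\n"
        f"PAGE:\n{page_text}\n\n"
        f"QUESTION: {question}\n"
        "Answer using only the page content."
    )

prompt = build_summary_prompt(
    "Install Anaconda on macOS with Homebrew...",
    "What does this article tell me to do?",
)
```

Keeping the scraped text and the question in one prompt is what lets a fully local model answer about pages it has never seen.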
Open PRs, make suggestions, or fork at your leisure.