Give your AI agent the ability to scrape any website.
A Model Context Protocol (MCP) server that routes web requests through proxies with built-in anti-detection. Works with Claude Desktop, Cursor, GitHub Copilot, and any MCP-compatible client.
AI agents need to access web content, but direct requests get blocked. proxy-mcp solves this by routing every request through your proxy infrastructure with realistic browser fingerprints, automatic header rotation, and stealth defaults -- so your agent can scrape anything without getting flagged.
Install globally:

```bash
npm install -g @birdproxies/proxy-mcp
```

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "proxy-mcp": {
      "command": "npx",
      "args": ["-y", "@birdproxies/proxy-mcp"],
      "env": {
        "PROXY_URL": "http://user:pass@proxy.example.com:8080"
      }
    }
  }
}
```

Add to `.cursor/mcp.json` in your project root:
```json
{
  "mcpServers": {
    "proxy-mcp": {
      "command": "npx",
      "args": ["-y", "@birdproxies/proxy-mcp"],
      "env": {
        "PROXY_URL": "http://user:pass@proxy.example.com:8080"
      }
    }
  }
}
```

Add to your `.vscode/mcp.json`:
```json
{
  "servers": {
    "proxy-mcp": {
      "command": "npx",
      "args": ["-y", "@birdproxies/proxy-mcp"],
      "env": {
        "PROXY_URL": "http://user:pass@proxy.example.com:8080"
      }
    }
  }
}
```

Fetch any URL through a proxy and return clean content. Best for static pages and APIs.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `url` | string | required | The URL to scrape |
| `format` | `"markdown"` \| `"text"` \| `"html"` | `"markdown"` | Output format |
| `headers` | object | `{}` | Additional HTTP headers |
| `timeout` | number | `30000` | Request timeout in ms |
| `follow_redirects` | boolean | `true` | Follow HTTP redirects |
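Under the hood, an MCP client invokes this tool with a JSON-RPC `tools/call` request. A sketch of such a request follows; the tool name `scrape` is an assumption based on the description above, so check the server's advertised tool list for the actual name:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "scrape",
    "arguments": {
      "url": "https://example.com/pricing",
      "format": "markdown",
      "timeout": 30000
    }
  }
}
```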
Scrape JavaScript-rendered pages using a headless browser. Requires `puppeteer` as a peer dependency (`npm install puppeteer`).

| Parameter | Type | Default | Description |
|---|---|---|---|
| `url` | string | required | The URL to scrape |
| `format` | `"markdown"` \| `"text"` \| `"html"` | `"markdown"` | Output format |
| `wait_for` | string | - | CSS selector to wait for before extracting |
| `timeout` | number | `60000` | Page load timeout in ms |
| `screenshot` | boolean | `false` | Capture page screenshot |
| `javascript` | boolean | `true` | Execute JavaScript on page |
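A hypothetical `tools/call` request for this tool, waiting for a selector before extraction (the tool name `browser_scrape` is an assumption, not confirmed by the source):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "browser_scrape",
    "arguments": {
      "url": "https://example.com/dashboard",
      "wait_for": "#content",
      "screenshot": true
    }
  }
}
```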
Verify proxy IP address and geolocation. Use before scraping to confirm your proxy is in the right region.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `service` | `"ipinfo"` \| `"ipapi"` | `"ipinfo"` | IP lookup service |
Search Google through a proxy and get structured results with titles, URLs, and snippets.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `query` | string | required | Search query |
| `num_results` | number | `10` | Results to return (max 100) |
| `language` | string | `"en"` | Language code |
| `country` | string | `"us"` | Country code |
Manage a round-robin proxy pool. Add proxies, rotate between them, and monitor performance stats.
| Parameter | Type | Default | Description |
|---|---|---|---|
| `action` | `"next"` \| `"add"` \| `"remove"` \| `"stats"` | `"next"` | Pool action |
| `proxy_url` | string | - | Proxy URL (required for `add`/`remove`) |
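The round-robin behavior described above can be sketched in TypeScript. This is a minimal illustration of the rotation logic, not the server's internal API; class and method names are invented:

```typescript
// Minimal round-robin proxy pool, mirroring the tool's
// next/add/remove/stats actions.
class ProxyPool {
  private proxies: string[] = [];
  private index = 0;
  private hits = new Map<string, number>();

  add(url: string): void {
    // Ignore duplicates so a proxy never appears twice in the rotation.
    if (!this.proxies.includes(url)) this.proxies.push(url);
  }

  remove(url: string): void {
    this.proxies = this.proxies.filter((p) => p !== url);
    this.index = 0; // reset cursor so it never points past the end
  }

  next(): string {
    if (this.proxies.length === 0) throw new Error("pool is empty");
    const proxy = this.proxies[this.index % this.proxies.length];
    this.index = (this.index + 1) % this.proxies.length;
    this.hits.set(proxy, (this.hits.get(proxy) ?? 0) + 1);
    return proxy;
  }

  stats(): Record<string, number> {
    // Per-proxy request counts, usable for monitoring.
    return Object.fromEntries(this.hits);
  }
}
```

Each `next()` call hands out the following proxy in insertion order and wraps around, so requests spread evenly across the pool.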
All configuration is done through environment variables:
| Variable | Required | Description |
|---|---|---|
| `PROXY_URL` | Yes | Proxy URL in standard format |

Supported formats:

```
http://host:port
http://user:pass@host:port
https://user:pass@host:port
socks5://user:pass@host:port
socks4://host:port
host:port:user:pass
```
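The last form, `host:port:user:pass`, is not a URL, so it has to be normalized before being handed to an HTTP agent. A minimal sketch of that normalization, assuming the scheme defaults to `http` when omitted (the server's actual parsing may differ):

```typescript
// Normalize a proxy spec into a standard URL string.
// Handles the colon-separated host:port:user:pass convention as well as
// plain URLs; scheme defaults to http when none is given (an assumption).
function normalizeProxy(spec: string): string {
  if (spec.includes("://")) return spec; // already a URL, pass through
  const parts = spec.split(":");
  if (parts.length === 4) {
    const [host, port, user, pass] = parts;
    return `http://${encodeURIComponent(user)}:${encodeURIComponent(pass)}@${host}:${port}`;
  }
  if (parts.length === 2) {
    return `http://${spec}`; // bare host:port
  }
  throw new Error(`unrecognized proxy format: ${spec}`);
}
```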
Once connected, just talk to your AI agent naturally:

- "Scrape the pricing page at https://example.com/pricing and summarize the plans."
- "Search Google for 'best restaurants in Tokyo' and get me the top 5 results."
- "Check what IP address and country my proxy is using."
- "Scrape https://example.com/dashboard using the browser tool -- it needs JavaScript to render."
- "Add a second proxy and rotate between them for the next few requests."
- "Get the HTML from https://example.com/api/products and parse the JSON response."
Every request includes built-in stealth measures:
- Realistic browser headers -- rotates through Chrome versions, platforms, and language preferences
- TLS fingerprint matching -- uses standard Node.js/Chromium TLS settings
- Sec-CH-UA headers -- includes Client Hints that match the User-Agent
- Proper header ordering -- mimics the order real browsers send headers
- CAPTCHA detection -- detects and reports when a target shows a CAPTCHA
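Two of these measures interact: Client Hints that disagree with the User-Agent are an easy detection signal, so both headers must be derived from the same browser version. A minimal sketch of that idea, with illustrative Chrome versions and header values (not the server's actual rotation list):

```typescript
// Derive Sec-CH-UA from the same Chrome major version used in the
// User-Agent string, so the two headers never contradict each other.
const CHROME_VERSIONS = [122, 123, 124]; // illustrative, not the real list

function stealthHeaders(): Record<string, string> {
  const major = CHROME_VERSIONS[Math.floor(Math.random() * CHROME_VERSIONS.length)];
  return {
    "User-Agent": `Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/${major}.0.0.0 Safari/537.36`,
    "Sec-CH-UA": `"Chromium";v="${major}", "Google Chrome";v="${major}", "Not-A.Brand";v="99"`,
    "Sec-CH-UA-Mobile": "?0",
    "Sec-CH-UA-Platform": '"Windows"',
    "Accept-Language": "en-US,en;q=0.9",
  };
}
```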
For local development:

```bash
git clone https://github.com/birdproxies/proxy-mcp.git
cd proxy-mcp
npm install
npm run build
PROXY_URL=http://user:pass@localhost:8080 npm start
```

MIT -- Built by BirdProxies - Premium Proxy Provider