Unified MCP Client Library


🌐 MCP Client is the open-source way to connect any LLM to any MCP server in TypeScript/Node.js, letting you build custom agents with tool access without closed-source dependencies.

πŸ’‘ Let developers easily connect any LLM via LangChain.js to tools like web browsing, file operations, 3D modeling, and more.


✨ Key Features

  • 🔄 Ease of use: Create an MCP-capable agent in just a few lines of TypeScript.
  • 🤖 LLM Flexibility: Works with any LangChain.js-supported LLM that supports tool calling.
  • 🌐 HTTP Support: Direct SSE/HTTP connection to MCP servers (a configuration sketch follows this list).
  • ⚙️ Dynamic Server Selection: Agents select the right MCP server from a pool on the fly.
  • 🧩 Multi-Server Support: Use multiple MCP servers in one agent.
  • 🛡️ Tool Restrictions: Restrict unsafe tools such as the filesystem or network.
  • 🔧 Custom Agents: Build your own agents with the LangChain.js adapter, or implement new adapters.

πŸš€ Quick Start

Requirements

  • Node.js 22.0.0 or higher
  • npm, yarn, or pnpm (the examples below use npm)

Installation

# Install from npm
npm install mcp-use
# Install LangChain.js and your LLM provider (e.g., OpenAI)
npm install langchain @langchain/openai dotenv

Create a .env:

OPENAI_API_KEY=your_api_key
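
The examples below load this file with dotenv. If you prefer to pass the key explicitly rather than rely on the environment, recent versions of @langchain/openai accept an apiKey option; a small sketch:

import { ChatOpenAI } from '@langchain/openai'
import 'dotenv/config'

// Reads OPENAI_API_KEY from .env and hands it to the model explicitly
const llm = new ChatOpenAI({ modelName: 'gpt-4o', apiKey: process.env.OPENAI_API_KEY })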

Basic Usage

import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'
import 'dotenv/config'

async function main() {
  // 1. Configure MCP servers
  const config = {
    mcpServers: {
      playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
    }
  }
  const client = MCPClient.fromDict(config)

  // 2. Create LLM
  const llm = new ChatOpenAI({ modelName: 'gpt-4o' })

  // 3. Instantiate agent
  const agent = new MCPAgent({ llm, client, maxSteps: 20 })

  // 4. Run query
  const result = await agent.run('Find the best restaurant in Tokyo using Google Search')
  console.log('Result:', result)
}

main().catch(console.error)

πŸ“‚ Configuration File

You can store servers in a JSON file:

{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}

Load it:

import { MCPClient } from 'mcp-use'
const client = MCPClient.fromConfigFile('./mcp-config.json')
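
The loaded client drops into the same flow as Basic Usage; only the server definitions move into mcp-config.json (the query string here is just a placeholder):

import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'
import 'dotenv/config'

// Servers come from mcp-config.json instead of an inline object
const client = MCPClient.fromConfigFile('./mcp-config.json')
const llm = new ChatOpenAI({ modelName: 'gpt-4o' })
const agent = new MCPAgent({ llm, client, maxSteps: 20 })

console.log(await agent.run('Open example.com and describe what you see'))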

πŸ”„ Multi-Server Example

import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'
import 'dotenv/config'

const config = {
  mcpServers: {
    airbnb: { command: 'npx', args: ['@openbnb/mcp-server-airbnb'] },
    playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
  }
}
const client = MCPClient.fromDict(config)
const llm = new ChatOpenAI({ modelName: 'gpt-4o' })

// With useServerManager enabled, the agent picks the appropriate server for each step
const agent = new MCPAgent({ llm, client, useServerManager: true })
await agent.run('Search Airbnb in Barcelona, then Google restaurants nearby')

πŸ”’ Tool Access Control

const agent = new MCPAgent({
  llm,
  client,
  // Tools listed here are hidden from the agent entirely
  disallowedTools: ['file_system', 'network']
})

πŸ‘₯ Contributors

Zane

πŸ“œ License

MIT Β© Zane
