This repository contains a set of use cases demonstrating how to build AI-featured applications.
This demo shows how to create a simple AI agent using LangGraph and integrate it into a Next.js application. LangGraph is a robust framework for building agent and multi-agent workflows. It provides flexibility to build complex logic and has great tooling for debugging (LangGraph Studio) and monitoring (LangSmith). Next.js is a popular framework for building web applications.
The demo includes the following capabilities:
- Streaming. The agent streams LLM tokens to the client application.
- Generative UI. Components are rendered based on agent state, such as a weather widget.
- Human in the loop. The agent can ask users for clarification before proceeding with a task, such as confirming a reminder before creating it.
- Persistence. LangGraph has a built-in persistence layer that can persist agent state between sessions. In the demo app, state is persisted in memory; see LangGraph Persistence for how to use PostgreSQL or MongoDB.
- Replay and fork. The agent can be replayed or forked from any checkpoint.
- Agent state replication. Agent state is fully replicated on the client side based on the graph checkpoints.
- Error handling. The app displays global agent errors, such as when the agent is not accessible, as well as errors that occur at the graph node level.
- Stop agent. Agent execution can be stopped and resumed later.
- No dependencies. The integration has no third-party library dependencies, so you can adjust it to your needs.
- Clean UI. The app is based on shadcn components and supports dark and light themes.
Some features are not implemented yet:
- Graph interruption (human in the loop) in parallel nodes.
- Sending custom events from the same node running in parallel. For example, when checking the weather for multiple cities at the same time, it is not possible to distinguish between them on the client side.
This demo shows how to create and use the Model Context Protocol (MCP) in your application. The Model Context Protocol is a method for integrating external data sources or services into your LLM application. The demo includes the following:
- TypeScript and Python MCP server implementations
- STDIO and SSE transport protocols
- Integration of MCP servers with the LangGraph agent server
You can use this project as the starting point for your own projects:
- Clone the repository
- Adjust the AI agent logic in the `graph.py` file or create a brand new one
- Adjust the agent state in the `agent-types.ts` file
- In the client app, call the agent using the `useLangGraphAgent` hook in your components
Add a `.env` file to the `/agent` directory and set your `OPENAI_API_KEY` (see `.env.example`).
```bash
cd agent/
poetry install
poetry run server
```
To run the AI server with MCP tools using the SSE protocol, first start the MCP servers. MCP servers using the STDIO protocol run automatically.
- Start the Booking MCP demo server:

  ```bash
  cd mcp-servers/booking-mcp
  bun install
  npm start
  ```
- Start the Calendar MCP demo server:

  ```bash
  cd mcp-servers/calandar-mcp
  uv sync
  uv run python calendar-mcp-server.py sse
  ```
- Edit the MCP servers configuration in `graph.py`
- Run the agent server with the `--mcp` flag:

  ```bash
  cd agent/
  poetry install
  poetry run server --mcp
  ```
```bash
cd client/
npm install
npm run dev
```
The application will start at http://localhost:3000 by default.
Get expert support with Akveo's AI development services.