TDC 2025 Workshop - AI Agent with MCP Server

A comprehensive AI workshop project demonstrating how to build and deploy an AI-powered security agent using LangGraph, Model Context Protocol (MCP), and Azion Edge Computing. This repository contains three interconnected projects that work together to create a full-stack AI chatbot application.

πŸ—οΈ Architecture Overview

This workshop consists of three main components:

  1. Backend - AI Agent API powered by LangGraph and Hono
  2. Frontend - Vue3 chatbot widget interface
  3. MCP Server - Model Context Protocol server for extending AI capabilities
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   Frontend      β”‚
β”‚   (Vue3)        β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜
         β”‚
         β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”      β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   Backend       │◄────►│   MCP Server    β”‚
β”‚   (LangGraph)   β”‚      β”‚   (Tools)       β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜      β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
         β”‚
         β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   Azion         β”‚
β”‚ (SQL Database)  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

πŸ“‹ Prerequisites

Before starting, ensure you have:

  • Node.js (v18 or higher) and Yarn installed
  • Azion CLI installed (see the Azion CLI installation guide):
    brew install azion   # For macOS
  • API keys: an Azion token and an OpenAI API key (plus an optional LangSmith key for tracing)

πŸš€ Quick Start

1. Clone the Repository

git clone <repository-url>
cd tdc-2025-workshop-ai

2. Setup Each Project

Follow the setup instructions for each component in order:

  1. Backend Setup
  2. MCP Server Setup
  3. Frontend Setup

πŸ”§ Backend Setup

The backend is an AI agent built with the LangGraph framework and deployed on Azion Edge Computing.

Installation

cd backend
yarn install

Environment Configuration

Create a .env file in the backend directory with the following variables:

# Required
AZION_TOKEN=your_azion_token
OPENAI_API_KEY=your_openai_api_key

# Optional - Model Configuration
OPENAI_MODEL=gpt-4o
EMBEDDING_MODEL=text-embedding-3-small

# Optional - LangSmith Tracing
LANGSMITH_API_KEY=your_langsmith_key
LANGCHAIN_PROJECT=your_project_name
LANGCHAIN_TRACING_V2=false

# Database Configuration
MESSAGE_STORE_DB_NAME=your_messagestore_db
MESSAGE_STORE_TABLE_NAME=messages
VECTOR_STORE_DB_NAME=your_vectorstore_db
VECTOR_STORE_TABLE_NAME=vectors

Database Setup

  1. Initialize the database:

    yarn setup-db
  2. Upload documents for RAG:

    yarn upload-docs

    This will process files from migrations/files/ (supports PDF, MD, JSON, TXT).
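Before embedding, long documents are usually split into overlapping chunks so each vector stays within the embedding model's input limit. A minimal sketch of that step (the function name, chunk size, and overlap here are illustrative assumptions, not the script's actual implementation):

```typescript
// Illustrative chunking step for the upload-docs pipeline.
// Chunk size and overlap are assumptions, not the script's real settings.
function splitIntoChunks(text: string, size = 500, overlap = 100): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last chunk reached
  }
  return chunks;
}
```

Each chunk would then be embedded (e.g. with `text-embedding-3-small`, per the env config above) and stored in the vector table.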

Local Development

# Build the project
azion build

# Run locally
azion dev

The server will start at http://localhost:3333

Testing

# Test the agent locally
curl 'http://localhost:3333/' \
  -H 'Content-Type: application/json' \
  --data-raw '{"messages":[{"role":"user","content":"Hello"}],"stream":false}'
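The same request can be issued from Node.js 18+ (which ships a global `fetch`). The payload shape mirrors the curl call above; the helper names are just for illustration:

```typescript
// Build the JSON payload used by the curl test above (helper names are illustrative).
function buildAgentPayload(content: string, stream = false): string {
  return JSON.stringify({ messages: [{ role: "user", content }], stream });
}

// Send a message to the locally running agent using Node 18+'s built-in fetch.
async function askAgent(content: string): Promise<string> {
  const res = await fetch("http://localhost:3333/", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildAgentPayload(content),
  });
  return res.text();
}
```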

Deployment

azion deploy

After deployment, test with:

curl 'https://<your-domain>/' \
  -H 'Content-Type: application/json' \
  --data-raw '{"messages":[{"role":"user","content":"Hello"}],"stream":false}'

Available Scripts

  • yarn upload-docs - Upload documents to vector database
  • yarn setup-db - Initialize database schema
  • yarn lint - Run ESLint
  • yarn lint:fix - Fix linting issues
  • yarn type-check - TypeScript type checking

πŸ”Œ MCP Server Setup

The MCP (Model Context Protocol) server provides additional tools and capabilities for the AI agent.

Installation

cd mcp-server
yarn install

Local Development

# Build the server
azion build

# Run locally
azion dev

Available Tools

  • Calculator - Perform basic arithmetic operations (add, subtract, multiply, divide)

Extending the Server

To add new tools, edit src/core/tools.ts and follow the MCP protocol specifications.
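For orientation, here is what the logic behind the existing Calculator tool might look like as a plain handler function. The actual registration goes through the MCP SDK in src/core/tools.ts; the type and function names below are illustrative, not the repository's real code:

```typescript
// Sketch of a Calculator-style tool handler (names are illustrative).
// In src/core/tools.ts this logic would be registered with the MCP SDK,
// typically with a Zod schema validating op, a, and b.
type Op = "add" | "subtract" | "multiply" | "divide";

function calculate(op: Op, a: number, b: number): number {
  switch (op) {
    case "add": return a + b;
    case "subtract": return a - b;
    case "multiply": return a * b;
    case "divide":
      if (b === 0) throw new Error("division by zero");
      return a / b;
  }
}
```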

Deployment

azion deploy

🎨 Frontend Setup

A Vue3-based chatbot widget that provides the user interface for interacting with the AI agent.

Installation

cd frontend
yarn install

Environment Configuration

Create a .env file in the frontend directory based on .env.example:

# Backend Configuration
VITE_BACKEND_COPILOT_ENDPOINT_TESTAGENTV3=https://your-backend-url

# UI Configuration
VITE_THEME=dark
VITE_TITLE=Security Agent + MCP Server
VITE_SUBTITLE=Make your Agent with MCP Server
VITE_PREVIEW_TEXT=Hello
VITE_FOOTER_DISCLAIMER=

# Suggestion Options (customize as needed)
VITE_SUGGESTION_1_TITLE=List the latest HTTP events for my application.
VITE_SUGGESTION_1_CONTEXT=List the latest HTTP events for my application.

Local Development

# Start development server
yarn dev

The application will be available at http://localhost:5173

Build for Production

yarn build

Deployment

azion deploy

Available Scripts

  • yarn dev - Start development server
  • yarn build - Build for production
  • yarn preview - Preview production build

πŸ“š Project Structure

tdc-2025-workshop-ai/
β”œβ”€β”€ backend/                 # AI Agent Backend
β”‚   β”œβ”€β”€ src/                # Source code
β”‚   β”œβ”€β”€ migrations/         # Database migrations and setup
β”‚   β”œβ”€β”€ azion.config.ts     # Azion configuration
β”‚   └── package.json
β”‚
β”œβ”€β”€ frontend/               # Vue3 Chatbot Widget
β”‚   β”œβ”€β”€ src/               # Vue components and logic
β”‚   β”œβ”€β”€ public/            # Static assets
β”‚   β”œβ”€β”€ azion.config.mjs   # Azion configuration
β”‚   └── package.json
β”‚
β”œβ”€β”€ mcp-server/            # Model Context Protocol Server
β”‚   β”œβ”€β”€ src/              # MCP server implementation
β”‚   β”œβ”€β”€ azion.config.ts   # Azion configuration
β”‚   └── package.json
β”‚
└── README.md             # This file

πŸ” Monitoring & Evaluation

Request Tracing with LangSmith

Monitor your AI agent's requests and responses through LangSmith:

  1. Set up LangSmith environment variables in backend .env
  2. Enable tracing: LANGCHAIN_TRACING_V2=true
  3. View traces in your LangSmith dashboard

RAG Evaluation

The backend includes capabilities for evaluating:

  1. Retrieval Quality - Document relevance and retriever performance
  2. Hallucination Detection - LLM adherence to source material
  3. Answer Relevance - Response quality to questions
  4. Reference Answers - Comparison against ground truth

For detailed evaluation guides, see LangSmith Evaluation Tutorials.
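As a concrete example of the first item, retrieval quality is often summarized with a precision@k metric over documents labeled relevant. A self-contained sketch (the metric choice and names are illustrative, not the backend's built-in evaluator):

```typescript
// Precision@k: fraction of the top-k retrieved document IDs that are labeled
// relevant. Illustrative metric, not the backend's built-in evaluator.
function precisionAtK(retrieved: string[], relevant: Set<string>, k: number): number {
  const topK = retrieved.slice(0, k);
  if (topK.length === 0) return 0;
  const hits = topK.filter((id) => relevant.has(id)).length;
  return hits / topK.length;
}
```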


πŸ› οΈ Tech Stack

Backend

  • LangGraph - AI agent framework
  • Hono - Web framework
  • OpenAI - Language models
  • Azion EdgeSQL - Vector database
  • TypeScript - Programming language

Frontend

  • Vue 3 - Frontend framework
  • PrimeVue - UI components
  • TailwindCSS - Styling
  • Vite - Build tool
  • Axios - HTTP client

MCP Server

  • Model Context Protocol SDK - MCP implementation
  • Hono - Web framework
  • Zod - Schema validation

πŸ“– Additional Resources


🀝 Contributing

This is a workshop project. Feel free to experiment and extend the functionality!


πŸ“ License

ISC License - See individual project package.json files for details.


πŸ†˜ Troubleshooting

Common Issues

  1. Database connection errors

    • Verify AZION_TOKEN is set correctly
    • Ensure databases are created with yarn setup-db
  2. OpenAI API errors

    • Check OPENAI_API_KEY is valid
    • Verify you have sufficient API credits
  3. Build failures

    • Clear node_modules and reinstall: rm -rf node_modules && yarn install
    • Ensure you're using Node.js v18 or higher
  4. Frontend can't connect to backend

    • Verify VITE_BACKEND_COPILOT_ENDPOINT_TESTAGENTV3 points to correct backend URL
    • Check CORS settings if deploying to different domains

πŸ“§ Support

For issues related to:


Happy Building! πŸš€
