mcp-memory-bank

A powerful, production-ready context management system for Large Language Models (LLMs). Built with ChromaDB and modern embedding technologies, it provides persistent, project-specific memory capabilities that enhance your AI's understanding and response quality.


🐳 Running with Docker

This project is fully Docker-ready for easy deployment and local development. The provided Dockerfile and docker-compose.yml set up both the main application and its required ChromaDB vector database.
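If you want to see roughly what Compose is doing under the hood, the equivalent manual setup looks like the sketch below. The network name is an illustrative assumption, and the ChromaDB persist path inside the container varies by image version; treat docker-compose.yml as the authoritative source.

# Create a shared network so the app can reach ChromaDB by hostname (network name is an assumption)
docker network create membank-net

# Run ChromaDB with a named volume; the persist path shown is typical but version-dependent
docker run -d --name chromadb --network membank-net -p 8000:8000 \
  -v chromadb-data:/chroma/chroma chromadb/chroma

# Build and run the app, pointing it at ChromaDB over the shared network
docker build -t mcp-memory-bank .
docker run -d --name ts-app --network membank-net -p 3000:3000 \
  -e CHROMADB_URL=http://chromadb:8000 mcp-memory-bank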

Requirements

  • Docker (latest stable)
  • Docker Compose (v2+ recommended)

Environment Variables

The following environment variables are set by default and can be overridden in your shell environment or in docker-compose.yml:

CHROMADB_URL=http://chromadb:8000
TRANSPORT=http
HTTP_PORT=3000
MCP_MEMBANK_EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2
NODE_ENV=production
NODE_OPTIONS=--max-old-space-size=4096
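For a local (non-Docker) run against an already-running ChromaDB instance, the same variables can be exported in the shell first. The dist/index.js entry point below is a placeholder assumption; substitute the project's actual start command.

export CHROMADB_URL=http://localhost:8000
export TRANSPORT=http
export HTTP_PORT=3000
export MCP_MEMBANK_EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2
node dist/index.js   # placeholder entry point; use the project's real start script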

Build & Run

To build and start all services:

docker-compose up --build -d

This will:

  • Build the main TypeScript application (Node.js v22.13.1-slim base image)
  • Start the app as ts-app (listening on port 3000)
  • Start ChromaDB as chromadb (listening on port 8000)
  • Create a persistent volume for ChromaDB data
  • Set up a shared Docker network for inter-service communication
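Once the stack is up, a quick smoke test might look like this. The heartbeat path shown for ChromaDB is used by recent releases but may differ for your version.

# Confirm both containers are running
docker-compose ps

# Tail application logs
docker-compose logs -f ts-app

# Check that ChromaDB is answering (endpoint path may vary by version)
curl http://localhost:8000/api/v1/heartbeat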

Ports

  • 3000: Main application HTTP API (ts-app)
  • 8000: ChromaDB vector database (chromadb)

Data Persistence

  • ChromaDB data is persisted in the named Docker volume chromadb-data.
  • Application data directory (/app/data) is created and owned by a non-root user inside the container.
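Because the vector data lives in a named volume, it survives container rebuilds. As a rough sketch, the volume can be inspected or backed up with standard Docker commands; note that Compose may prefix the volume name with the project name, and the archive name here is arbitrary.

# Inspect the volume's location and metadata
docker volume inspect chromadb-data

# Back up the volume contents to a tarball in the current directory
docker run --rm -v chromadb-data:/data -v "$(pwd)":/backup alpine \
  tar czf /backup/chromadb-data.tar.gz -C /data .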

Special Notes

  • The application requires ChromaDB to be available at the URL specified by CHROMADB_URL (default: http://chromadb:8000).
  • The embedding model can be changed via the MCP_MEMBANK_EMBEDDING_MODEL environment variable.
  • If you need to customize environment variables, edit docker-compose.yml or supply a .env file (see the example below).
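As an illustrative example, a .env file placed next to docker-compose.yml could override a couple of defaults. Whether each value takes effect depends on how docker-compose.yml references it (an env_file entry or ${VAR} interpolation), and the values below are assumptions, not recommendations.

# .env — example overrides (illustrative values)
HTTP_PORT=3100
MCP_MEMBANK_EMBEDDING_MODEL=Xenova/all-MiniLM-L12-v2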
