Your data, remembered. Your questions, answered.
MemorAIs is a full-fledged chat app for interacting with a capable language model. It employs a Retrieval-Augmented Generation (RAG) pipeline to provide more accurate and contextually relevant responses. In addition, memorAIs acts as a Model Context Protocol (MCP) host, enabling users' agents to access their chat history with the language model. This AI-native application emphasizes user privacy and data security, ensuring that sensitive information is handled with care.
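The RAG flow, in rough terms: embed the user's question, retrieve the most relevant entries from the stored chat history, and prepend them to the prompt before querying the language model. The sketch below is purely illustrative and uses a toy bag-of-words similarity; the names and data structures are assumptions for the example, not memorAIs' actual code.

```python
# Minimal, illustrative RAG sketch (not the memorAIs implementation):
# rank stored chat-history entries by similarity to the question,
# then include the top matches as context in the prompt.
from dataclasses import dataclass
import math


@dataclass
class Message:
    role: str
    text: str


def embed(text: str) -> dict[str, float]:
    # Toy bag-of-words "embedding"; a real pipeline would use a proper embedding model.
    counts: dict[str, float] = {}
    for token in text.lower().split():
        counts[token] = counts.get(token, 0.0) + 1.0
    return counts


def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    dot = sum(v * b.get(k, 0.0) for k, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def build_prompt(history: list[Message], question: str, k: int = 3) -> str:
    # Retrieve the k history entries most similar to the question.
    q_vec = embed(question)
    ranked = sorted(history, key=lambda m: cosine(q_vec, embed(m.text)), reverse=True)
    context = "\n".join(f"{m.role}: {m.text}" for m in ranked[:k])
    return f"Context from earlier chats:\n{context}\n\nQuestion: {question}"


if __name__ == "__main__":
    history = [
        Message("user", "My favourite colour is teal."),
        Message("user", "Remind me that the meeting is on Friday."),
    ]
    # The resulting prompt would then be sent to the language model.
    print(build_prompt(history, "What is my favourite colour?"))
```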
The service is located in the service directory and runs with Docker Compose.
To start the service, run docker-compose up --build -d.
The checker is located in the checker directory and is also run with Docker Compose. Start it with docker-compose up --build -d.
For detailed documentation, please refer to the documentation directory, which describes the application's functionality, gives a technical overview, and documents its vulnerabilities.