UBi is an agentic AI-powered assistant for the Universitätsbibliothek Mannheim, built with Chainlit and LangChain. It combines large language models (LLMs) with data from the library website to deliver context-aware answers.
- **Agentic Router**: Dynamically detects the user's language, augments the query, and intelligently routes it to the most suitable tool
- **Semantic Augmentation**: Enhances questions with context to optimize semantic search and retrieval
- **Tool Selector**: Routes queries to one of three specialized tools:
  - **RAG Pipeline**: Retrieval-Augmented Generation using OpenAI embeddings, OpenAI inference, and an OpenAI cloud-based vectorstore
  - **Library News Fetcher**: Retrieves the latest updates directly from the UB Mannheim blog
  - **Real-time Seat Availability**: Displays real-time information on study space availability at the library
- **Multilingual Support**: Detects and processes user input in multiple languages
- **Feedback Collection**: Stores user questions, answers, and satisfaction ratings for continuous improvement
- **Terms of Use Popup**: Ensures users accept the terms before interaction
- **Optional Login System**: Supports password-protected access for restricted deployments
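As a rough illustration, the augment-and-route step above could look like the following pure-Python sketch. The function names and keyword heuristics are hypothetical, not UBi's actual implementation (which drives routing through an LLM via LangChain):

```python
def augment(question: str) -> str:
    """Enrich the question with library context so semantic search
    has more to match on (illustrative augmentation only)."""
    return f"UB Mannheim library question: {question}"


def route(question: str) -> str:
    """Pick one of the three tools from simple intent cues.
    The real router delegates this decision to an LLM."""
    q = question.lower()
    if any(w in q for w in ("news", "blog", "update")):
        return "news_fetcher"
    if any(w in q for w in ("seat", "study space", "study place")):
        return "seat_availability"
    # Default: answer from the indexed library website content.
    return "rag_pipeline"


print(route("Are there free seats in the library?"))  # seat_availability
print(route("What's new on the blog?"))               # news_fetcher
print(route("How do I borrow a book?"))               # rag_pipeline
```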
| Component | Technology |
|---|---|
| Frontend UI | Chainlit |
| Backend Logic | Python + LangChain |
| LLMs | OpenAI |
| Embeddings | OpenAI |
| Vector Database | OpenAI |
| Deployment | Docker + Docker Compose |
Clone the repository and enter the code directory:

```
git clone https://github.com/UB-Mannheim/UBi.git
cd UBi/code
```

Set your OpenAI API key as an environment variable:

```
OPENAI_API_KEY=sk-...
```

Install the dependencies:

```
pip install -r requirements.txt
```

You can choose between two ways of running the app:
- Running the RAG pipeline locally
  - This option embeds all documents locally using the OpenAI embedding model `text-embedding-ada-002` and creates a `chromadb` vectorstore.
- Running the RAG pipeline with an OpenAI vectorstore
  - This option creates and uploads all documents to an OpenAI vectorstore.
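How the choice between the two modes might be wired up is sketched below. The `USE_OPENAI_VECTORSTORE` variable name comes from this README; the helper function itself is hypothetical:

```python
import os


def use_openai_vectorstore() -> bool:
    """Interpret the USE_OPENAI_VECTORSTORE flag (hypothetical helper).

    Accepts values like True, 'True', or "true", and falls back to the
    local chromadb pipeline when the variable is unset.
    """
    raw = os.getenv("USE_OPENAI_VECTORSTORE", "False")
    return raw.strip("'\"").lower() == "true"


backend = "OpenAI vectorstore" if use_openai_vectorstore() else "chromadb"
print(f"Vectorstore backend: {backend}")
```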
```
chainlit run app.py
```

Open http://localhost:8000 in a browser.
To use the OpenAI vectorstore instead, set:

```
OPENAI_API_KEY=sk-...
USE_OPENAI_VECTORSTORE='True'
```

Then run:

```
chainlit run app.py
```

Open http://localhost:8000 in a browser.
```
OPENAI_API_KEY=sk-...
USE_OPENAI_VECTORSTORE='True'  # Optional (for use with OpenAI vectorstore)
```

- Optionally, set the exposed TCP port using the environment variable `PORT` (default: 8000).
```
docker-compose up --build
```

Open http://localhost:8000 in a browser.
All chats and feedback are stored in the database `data/feedback.db`:
| Field | Description |
|---|---|
| session_id | Random session ID |
| question | User input |
| augmented_question | Augmented user input |
| answer | LLM-generated response |
| timestamp | UTC datetime |
| feedback | Score + optional comment |
You can view or export this data to improve the bot.
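For example, assuming `data/feedback.db` is a SQLite file with a `feedback` table matching the fields above (the table and column names are guesses based on this README, not confirmed from the code), an export could look like:

```python
import sqlite3


def export_feedback(db_path: str = "data/feedback.db") -> list[tuple]:
    """Read all stored feedback rows, oldest first (hypothetical schema)."""
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT session_id, question, answer, timestamp, feedback "
            "FROM feedback ORDER BY timestamp"
        ).fetchall()
    finally:
        con.close()
    return rows
```

The same connection could just as easily feed a pandas DataFrame or a CSV writer for offline analysis.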
This work is licensed under the MIT license (for the code) and the Creative Commons Attribution 4.0 International license (for everything else). You are free to share and adapt the material for any purpose, even commercially, as long as you provide attribution.