Computational-Imaging-LAB/RAG
RAG-Agent Chat

A Chainlit-based chat interface for Ollama models with RAG (Retrieval Augmented Generation) capabilities.

Features

  • Chat with multiple Ollama models
  • Upload and query documents using RAG
  • Persistent chat history throughout the conversation
  • Streaming responses for a more natural conversation flow
  • Model selection via dropdown menu

Prerequisites

  • Python 3.8+
  • Ollama installed and running locally (https://ollama.ai)
  • Models pulled into Ollama (e.g., ollama pull llama3.2:latest)

Installation

  1. Clone this repository:

     ```bash
     git clone <repository-url>
     cd <repository-directory>
     ```

  2. Install dependencies:

     ```bash
     pip install -r requirements.txt
     ```

Usage

  1. Start the Chainlit app:

     ```bash
     chainlit run app.py
     ```

  2. Open your browser and navigate to http://localhost:8000

  3. Select an Ollama model from the dropdown menu

  4. Upload documents using the upload button

  5. Start chatting!

How It Works

This application:

  1. Uses the OllamaChat class from the RAG-Agent codebase for chat interactions
  2. Uses the Loader_Local class for document loading and retrieval
  3. Maintains chat history in the same format as the original code
  4. Retrieves relevant document chunks when answering questions
  5. Preserves chat context in the format: "User: {question}\n{answer}"
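The history format in step 5 can be sketched as plain string accumulation. The `append_turn` helper below is hypothetical, for illustration only; it is not a function from this codebase:

```python
def append_turn(history: str, question: str, answer: str) -> str:
    """Append one exchange in the "User: {question}\n{answer}" format."""
    turn = f"User: {question}\n{answer}"
    return f"{history}\n{turn}" if history else turn

# Build up context turn by turn, as the app does across a conversation
history = ""
history = append_turn(history, "What is RAG?", "Retrieval Augmented Generation.")
history = append_turn(history, "Why use it?", "It grounds answers in your documents.")
```

Each new question is answered with this accumulated string supplied as context, so the model can resolve follow-up references like "it" or "that document".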

Configuration

You can modify the chainlit.config.toml file to customize the UI and behavior.
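As an illustration only (the keys shown are assumptions; consult the Chainlit documentation for the exact schema in your installed version), a minimal UI customization might look like:

```toml
[UI]
# Name shown in the chat header (hypothetical value)
name = "RAG-Agent Chat"
```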

Troubleshooting

  • Ensure Ollama is running and accessible
  • Check that you have pulled the necessary models into Ollama
  • If document retrieval is not working, check the format of your documents
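A quick sanity check for the first two points, sketched below; it lists locally pulled models when the Ollama CLI is installed and prints a notice otherwise:

```shell
# Check that the Ollama CLI is on PATH and list locally pulled models;
# fall back to a notice rather than failing if it is not installed.
if command -v ollama >/dev/null 2>&1; then
    ollama list
else
    echo "ollama not found on PATH -- install it from https://ollama.ai"
fi
```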
