
Ollama Chat UI

A comprehensive conversational user interface for Ollama with integrated model management, built with PyQt/PySide using the qtpy compatibility layer.

✨ Features

💬 Chat Interface

  • Clean Chat Interface: Modern chat bubble interface similar to popular messaging apps
  • Model Selection: Dropdown to select from available Ollama models
  • Streaming Responses: Real-time streaming of AI responses (see the sketch after this list)
  • Stop Generation: Ability to stop response generation mid-stream
  • Chat History: Maintains conversation context
  • Clear Chat: Option to clear conversation history
  • Enhanced Controls: Temperature and token limit controls
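
The streaming behaviour is easiest to understand from the underlying HTTP call. The following is a minimal sketch of how a client can stream a reply from Ollama's /api/chat endpoint and honour the Stop button; stream_chat and stop_flag are illustrative names, not necessarily what this project's OllamaClient uses.

  # Minimal sketch of streaming a reply from Ollama's chat endpoint.
  # stream_chat() and stop_flag are illustrative names, not the project's API.
  import json
  import threading
  import requests

  OLLAMA_URL = "http://localhost:11434"    # default Ollama endpoint
  stop_flag = threading.Event()            # set when the Stop button is pressed

  def stream_chat(model, messages):
      """Yield assistant text chunks as Ollama streams them."""
      payload = {"model": model, "messages": messages, "stream": True}
      with requests.post(f"{OLLAMA_URL}/api/chat", json=payload, stream=True) as resp:
          resp.raise_for_status()
          for line in resp.iter_lines():
              if stop_flag.is_set():       # abort generation mid-stream
                  break
              if not line:
                  continue
              chunk = json.loads(line)
              yield chunk.get("message", {}).get("content", "")
              if chunk.get("done"):
                  break

Each yielded chunk can be appended to the current AI bubble, which is what makes the response appear to type in real time.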

🔧 Integrated Model Manager

  • Service Management: Start and stop Ollama service directly from the UI
  • Model Download: Easy download interface with popular model suggestions
  • Model Deletion: Safe model removal with confirmation
  • Real-time Status: Service status monitoring with visual indicators
  • Non-blocking Operations: All operations run in background threads (see the worker sketch after this list)
  • Auto-detection: Automatic Ollama installation detection
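
A model download is a long-running, streaming operation, so it has to run off the UI thread. The sketch below shows one way to do that with a qtpy QThread and Ollama's /api/pull endpoint; PullWorker and its signal names are hypothetical, and the exact request field names can vary between Ollama versions.

  # Hedged sketch of a background model download using a QThread worker and
  # Ollama's streaming /api/pull endpoint. Class and signal names are
  # illustrative, not necessarily the ones used in this repository.
  import json
  import requests
  from qtpy.QtCore import QThread, Signal

  class PullWorker(QThread):
      progress = Signal(str)     # e.g. "pulling manifest", "verifying sha256"
      finished_ok = Signal()
      failed = Signal(str)

      def __init__(self, model_name, base_url="http://localhost:11434"):
          super().__init__()
          self.model_name = model_name
          self.base_url = base_url

      def run(self):
          try:
              payload = {"model": self.model_name, "stream": True}
              url = f"{self.base_url}/api/pull"
              with requests.post(url, json=payload, stream=True) as resp:
                  resp.raise_for_status()
                  for line in resp.iter_lines():
                      if line:
                          self.progress.emit(json.loads(line).get("status", ""))
              self.finished_ok.emit()
          except Exception as exc:          # surface the error to the UI thread
              self.failed.emit(str(exc))

The UI can connect progress to a status label and keep handling events while the download runs, for example: worker = PullWorker("llama2"); worker.progress.connect(status_label.setText); worker.start().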

🎨 User Experience

  • Tabbed Interface: Organized chat and management in separate tabs
  • Model Synchronization: Models automatically sync between tabs
  • Responsive UI: Non-blocking interface with smooth interactions
  • Error Handling: Comprehensive error handling and user feedback
  • Cross-platform: Windows, macOS, and Linux support

📋 Requirements

  • Python 3.7+
  • Ollama server (automatically managed by the UI)
  • qtpy (compatibility layer for PyQt5/PySide2)
  • requests
  • psutil (for service management)
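
If you want to install the dependencies by hand instead of from the repository's requirements.txt, an equivalent file would look roughly like this (the repository may pin specific versions):

  qtpy
  PyQt5      # or PySide2 – qtpy uses whichever backend is installed
  requests
  psutil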

🚀 Installation

  1. Install the required Python packages:

    pip install -r requirements.txt
  2. The application will help you set up Ollama if not already installed

🎯 Usage

  1. Start the Ollama server if it's not already running (you can also do this from the Model Manager tab)
  2. Run the application:
    python main.py
  3. Select a model from the dropdown (click "Refresh Models" if needed)
  4. Start chatting!

Interface Components

Main Window

  • Model Selection: Dropdown to choose which Ollama model to use
  • Refresh Models: Button to reload available models from Ollama
  • Clear Chat: Button to clear the conversation history
  • Chat Area: Scrollable area displaying conversation bubbles
  • Input Field: Text field for typing messages
  • Send Button: Send the current message
  • Stop Button: Stop the current response generation
  • Status Bar: Shows current application status

Chat Bubbles

  • User Messages: Blue bubbles on the right side
  • AI Responses: Gray bubbles on the left side (a minimal bubble sketch follows this list)
  • Auto-scrolling: Automatically scrolls to show latest messages
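
The left/right placement and colouring come down to a styled QLabel inside a horizontal layout with a stretch on the opposite side. The snippet below is an illustrative sketch of that idea, not the project's actual bubble widget:

  # Illustrative sketch of a chat bubble built with qtpy widgets.
  from qtpy.QtWidgets import QHBoxLayout, QLabel, QWidget

  def make_bubble(text, is_user):
      label = QLabel(text)
      label.setWordWrap(True)
      # Blue bubble for the user, gray for the assistant
      color = "#2f80ed" if is_user else "#e0e0e0"
      fg = "white" if is_user else "black"
      label.setStyleSheet(
          f"background-color: {color}; color: {fg}; "
          "border-radius: 10px; padding: 8px;"
      )
      row = QWidget()
      layout = QHBoxLayout(row)
      # Push user bubbles to the right, AI bubbles to the left
      if is_user:
          layout.addStretch()
          layout.addWidget(label)
      else:
          layout.addWidget(label)
          layout.addStretch()
      return row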

Configuration

The application connects to Ollama at http://localhost:11434 by default. You can modify this in the OllamaClient class initialization if your Ollama server is running on a different host or port.
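
For example, pointing the client at a remote machine might look like the following; base_url is an assumed parameter name, so check the OllamaClient definition in the source for the exact argument:

  # Hypothetical – the constructor argument name may differ in this project.
  client = OllamaClient(base_url="http://192.168.1.50:11434")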

Troubleshooting

No Models Found

  • Make sure Ollama is running: ollama serve
  • Verify you have models installed: ollama list
  • Try pulling a model: ollama pull llama2

Connection Errors

  • Check if Ollama is running on the correct port
  • Verify firewall settings aren't blocking the connection
  • Try restarting the Ollama service
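
A quick way to verify connectivity from Python (assuming the default host and port) is to query the model list endpoint directly:

  # Should print a JSON object with a "models" list if Ollama is reachable.
  import requests
  print(requests.get("http://localhost:11434/api/tags", timeout=5).json())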

UI Issues

  • Make sure qtpy and a Qt backend (PyQt5/PySide2) are installed
  • Try running with different Qt backends if issues persist
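
qtpy chooses its backend from the QT_API environment variable, so you can force a specific binding before any Qt import, for example:

  # Valid values include "pyqt5", "pyside2", "pyqt6" and "pyside6".
  import os
  os.environ["QT_API"] = "pyside2"

Exporting QT_API in the shell before running python main.py has the same effect.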

Contributing

Feel free to submit issues and enhancement requests!

License

This project is open source and available under the MIT License.
