ChatGPUI

A blazingly fast, GPU-accelerated native LLM chat client built with GPUI — the same rendering engine that powers Zed.


Features • Installation • Configuration • Development • License


Features

  • GPU-Accelerated Rendering - Leverages Metal (macOS) / Vulkan for a smooth 120 fps UI with minimal CPU usage
  • Streaming Responses - Real-time streaming output from LLM providers
  • Multi-Provider Support - Connect to various LLM providers:
    • Anthropic (Claude)
    • OpenAI (GPT-4, GPT-4o)
    • Azure OpenAI
    • DeepSeek
    • Google AI (Gemini)
    • Groq
    • Mistral
    • Ollama (Local)
    • OpenRouter
  • Modern UI - Clean, macOS-native interface with smooth animations
  • Collapsible Sidebar - Chat history sidebar with animated toggle
  • Persistent Settings - Configuration saved locally
  • i18n Support - Internationalization ready
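The streaming feature above boils down to rendering each token as it arrives rather than waiting for the full reply. A minimal Rust sketch of that pattern is below; the types and names are illustrative, not ChatGPUI's actual API, and a `Vec` of chunks stands in for a real HTTP/SSE stream from a provider.

```rust
/// Accumulate streamed chunks into the full reply, yielding each chunk to a
/// callback as it arrives so the UI can render partial output immediately.
/// (Sketch only: a real client reads chunks from a network stream.)
fn collect_stream<F: FnMut(&str)>(chunks: &[&str], mut on_chunk: F) -> String {
    let mut reply = String::new();
    for chunk in chunks {
        on_chunk(chunk); // e.g. append to the visible chat bubble
        reply.push_str(chunk);
    }
    reply
}

fn main() {
    let chunks = ["Streaming ", "keeps the UI ", "responsive."];
    let reply = collect_stream(&chunks, |c| print!("{c}"));
    println!();
    assert_eq!(reply, "Streaming keeps the UI responsive.");
}
```

The callback is where a GPU-accelerated UI pays off: each incremental repaint stays cheap, so partial output appears with no perceptible lag.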

Tech Stack

  • Rust - Application logic and provider integrations
  • GPUI - GPU-accelerated UI framework from the Zed team

License

This project is dual-licensed:

For commercial licensing inquiries, please contact license@aprilnea.com.


Made with ❤️ by AprilNEA