A chat interface built with Next.js for use with an Ollama server. It provides a ChatGPT-style interface to the locally hosted API, letting users chat with large language models (LLMs) with Markdown rendering support. The model you want to use must be running locally.
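For example, assuming you have Ollama installed, you can pull and serve a model before starting the app (the model name `llama3` here is only illustrative; use whichever model you want the UI to talk to):

```shell
# Pull a model locally (name is illustrative)
ollama pull llama3

# Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve
```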
The frontend was mostly 'vibe coded', so just raise any bugs directly with the LLMs :P
First, run the development server:

```bash
npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev
```
Open [http://localhost:3000](http://localhost:3000) with your browser to see the result.
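If the UI loads but can't reach a model, a quick sanity check is to query the Ollama REST API directly (it listens on port 11434 by default) and confirm your model appears in the list:

```shell
# List the models available to the local Ollama server
curl http://localhost:11434/api/tags
```

An empty `models` list in the response means no model has been pulled yet.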