A chat interface built with Next.js for use with an Ollama server. It provides a ChatGPT-style interface to the locally hosted API, letting users interact with large language models (LLMs), with Markdown rendering support. The model you want to use must already be running locally.

sidhu18/fireside-chat


The frontend was mostly 'vibe coded', so please raise any bugs directly with the LLMs :P
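Under the hood, a chat UI like this talks to Ollama's local REST API. Here is a minimal, non-authoritative sketch of a non-streaming call to Ollama's documented /api/chat endpoint; the default port 11434 and the model name "llama3" are assumptions — substitute whatever model you have pulled:

```typescript
// Sketch of a non-streaming chat request to a local Ollama server.
// Assumes Ollama's default endpoint http://localhost:11434 and an
// example model "llama3" — both are assumptions, not project config.

interface ChatMessage {
  role: "user" | "assistant" | "system";
  content: string;
}

// Build the JSON body that Ollama's /api/chat endpoint expects.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages, stream: false };
}

async function chat(messages: ChatMessage[], model = "llama3"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  // With stream: false, Ollama returns one JSON object whose
  // message.content field holds the full assistant reply.
  const data = await res.json();
  return data.message.content as string;
}
```

With `stream: false` the server buffers the whole reply into a single response, which keeps the sketch simple; a responsive UI would normally stream instead.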

Getting Started

First, run the development server:

npm run dev
# or
yarn dev
# or
pnpm dev
# or
bun dev

Open http://localhost:3000 with your browser to see the result.
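For a ChatGPT-style interface, replies are usually rendered token by token. Ollama streams its chat responses as newline-delimited JSON, one object per line. A hedged sketch of consuming that stream (again assuming the default localhost:11434 endpoint and an example model name; the `parseChunk` helper is hypothetical, not part of this repo):

```typescript
// Extract the text delta from one NDJSON line of an Ollama
// streaming chat response. Blank lines yield the empty string.
function parseChunk(line: string): string {
  if (!line.trim()) return "";
  const obj = JSON.parse(line);
  return obj.message?.content ?? "";
}

// Accumulate a full streamed reply from Ollama's /api/chat endpoint.
async function streamChat(prompt: string, model = "llama3"): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buf = "";
  let reply = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // Buffer partial lines: a JSON object may be split across chunks.
    buf += decoder.decode(value, { stream: true });
    const lines = buf.split("\n");
    buf = lines.pop()!; // keep the trailing partial line for next round
    for (const line of lines) reply += parseChunk(line);
  }
  reply += parseChunk(buf); // flush any final complete line
  return reply;
}
```

In the actual UI you would append each `parseChunk` result to the rendered message as it arrives rather than waiting for the loop to finish.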
