An open-source, multi-model AI chat playground built with Next.js App Router. Switch between providers and models, compare outputs side-by-side, and use optional web search and image attachments.
- Multiple providers: Gemini, OpenRouter (DeepSeek R1, Llama 3.3, Qwen, Mistral, Moonshot, Reka, Sarvam, etc.)
- Selectable model catalog: choose up to 5 models to run
- Web search toggle per message
- Image attachment support (Gemini)
- Clean UI: keyboard submit, streaming-friendly API normalization
- Next.js 14 (App Router, TypeScript)
- Tailwind CSS
- API routes for provider calls
- Docker containerization support
- Install deps

```bash
npm i
```

- Configure environment

Copy the example environment file and add your API keys:

```bash
cp .env.example .env
```

Then edit `.env` with your API keys. For the default models to work, you need:
```env
# REQUIRED: Unstable Provider (GPT-5 Chat, Sonnet 4, Grok 4)
INFERENCE_API_KEY=kf-aP6qQ7rR8sS9tT0uUv1wX2xC3yZ4b # Already provided

# REQUIRED: Google Gemini (Gemini 2.5 Pro)
GEMINI_API_KEY=your_key_from_https://aistudio.google.com/app/apikey

# REQUIRED: Pollinations (Evil Uncensored) - WORKING TOKEN PROVIDED
OPEN_PROVIDER_API_KEY=tQ14HuL-wtewmt1H # Already provided
```

Optional providers for additional models:

```env
# OpenRouter (for additional free models)
OPENROUTER_API_KEY=your_key_from_https://openrouter.ai/keys

# Mistral AI (for Mistral models)
MISTRAL_API_KEY=your_key_from_https://console.mistral.ai
```

- Run dev server
```bash
npm run dev
# open http://localhost:3000
```

- Build and run with Docker Compose (recommended for development):

```bash
npm run docker:dev
# or
docker-compose up ai_fiesta_dev
```

- For production build with Docker:
```bash
npm run docker:build
npm run docker:run
# or
docker-compose up ai_fiesta
```

To use Docker directly:

```bash
# Build the image
docker build -t ai_fiesta .

# Run the container
docker run -p 3000:3000 -e OPENROUTER_API_KEY=your_key_here ai_fiesta

# Run with environment file
docker run -p 3000:3000 --env-file .env.local ai_fiesta
```

- `OPENROUTER_API_KEY`: API key from https://openrouter.ai (required for OpenRouter models)
- `GOOGLE_GENERATIVE_AI_API_KEY`: API key from Google AI Studio (required for Gemini models)
You can also provide an API key at runtime in the UI's Settings panel.
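A route can honor that runtime override by preferring a key sent with the request over the server-side environment variable. The helper below is only a sketch of that precedence — the function and parameter names are illustrative, not the project's actual code:

```typescript
// Sketch: choose the API key for a provider call. A key supplied at runtime
// (e.g. entered in the UI's Settings panel and forwarded in the request body)
// takes precedence over the key configured in the environment.
export function resolveApiKey(
  runtimeKey: string | undefined,
  envKey: string | undefined,
): string {
  // Treat empty/whitespace-only keys as absent.
  const key = runtimeKey?.trim() || envKey?.trim();
  if (!key) {
    throw new Error("No API key: set one in Settings or in .env");
  }
  return key;
}
```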
This project includes comprehensive Docker support for both development and production:
- Hot reload enabled for instant code changes
- Volume mounting for live code updates
- Includes all development dependencies
- Multi-stage build for optimized image size (~100MB)
- Proper security practices with non-root user
- Environment variable configuration support
- `npm run docker:build` – Build production Docker image
- `npm run docker:run` – Run production container
- `npm run docker:dev` – Start development environment with Docker Compose
- `npm run docker:prod` – Start production environment with Docker Compose
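Based on the commands shown above, the corresponding `package.json` scripts might look roughly like the fragment below — a sketch inferred from this README, not necessarily the repo's actual file:

```json
{
  "scripts": {
    "docker:build": "docker build -t ai_fiesta .",
    "docker:run": "docker run -p 3000:3000 --env-file .env ai_fiesta",
    "docker:dev": "docker-compose up ai_fiesta_dev",
    "docker:prod": "docker-compose up ai_fiesta"
  }
}
```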
- `app/` – UI and API routes
  - `api/openrouter/route.ts` – normalizes responses across OpenRouter models; strips reasoning, cleans up DeepSeek R1 to plain text
  - `api/gemini/route.ts`, `api/gemini-pro/route.ts` – Gemini provider routes
- `components/` – UI components (chat box, model selector, etc.)
- `lib/` – model catalog and client helpers
- `Dockerfile` – Production container definition
- `Dockerfile.dev` – Development container definition
- `docker-compose.yml` – Multi-container setup
- `.dockerignore` – Files to exclude from Docker builds
Open-Fiesta post-processes DeepSeek R1 outputs to remove reasoning tags and convert Markdown to plain text for readability while preserving content.
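That cleanup step can be sketched as a small helper. The function name and the exact tag/Markdown rules below are illustrative assumptions — the real normalization lives in `api/openrouter/route.ts` and may differ:

```typescript
// Illustrative sketch: strip <think>…</think> reasoning blocks and basic
// Markdown markup from a DeepSeek R1 completion, leaving plain text.
export function cleanDeepSeekOutput(raw: string): string {
  return raw
    // Drop reasoning blocks emitted as <think>…</think>
    .replace(/<think>[\s\S]*?<\/think>/g, "")
    // Drop heading markers at line start (### Title -> Title)
    .replace(/^#{1,6}\s+/gm, "")
    // Unwrap bold/italic emphasis (**text** or *text* -> text)
    .replace(/\*{1,2}([^*]+)\*{1,2}/g, "$1")
    // Unwrap inline code spans (`code` -> code)
    .replace(/`([^`]+)`/g, "$1")
    .trim();
}
```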
We welcome contributions of all kinds: bug fixes, features, docs, and examples.
- Set up
  - Fork this repo and clone your fork.
  - Start the dev server with `npm run dev`.
- Branching
  - Create a feature branch from `main`: `feat/<short-name>` or `fix/<short-name>`.
- Coding standards
  - TypeScript, Next.js App Router.
  - Run linters and build locally: `npm run lint` and `npm run build`.
  - Keep changes focused and small. Prefer clear names and minimal dependencies.
- UI/UX
  - Reuse components in `components/` where possible.
  - Keep props typed and avoid unnecessary state.
- APIs & models
  - OpenRouter logic lives in `app/api/openrouter/`.
  - Gemini logic lives in `app/api/gemini/` and `app/api/gemini-pro/`.
  - If adding models/providers, update `lib/models.ts` or `lib/customModels.ts` and ensure the UI reflects new options.
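When adding a model to the catalog, a new entry might look like the sketch below. The `Model` interface and its field names here are assumptions for illustration only — check the actual types in `lib/models.ts` before copying:

```typescript
// Hypothetical catalog entry shape — the real interface in lib/models.ts
// may use different field names.
interface Model {
  id: string;       // provider-side model identifier
  label: string;    // name shown in the model selector UI
  provider: "openrouter" | "gemini";
  free?: boolean;   // whether the model is free-tier on the provider
}

// Example entry for a hypothetical OpenRouter model.
const exampleModel: Model = {
  id: "deepseek/deepseek-r1:free",
  label: "DeepSeek R1",
  provider: "openrouter",
  free: true,
};
```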
- Docker changes
  - When modifying dependencies, ensure both `Dockerfile` and `Dockerfile.dev` are updated if needed.
  - Test both development and production Docker builds.
- Commit & PR
  - Write descriptive commits (imperative mood): `fix: …`, `feat: …`, `docs: …`.
  - Open a PR to `main` with:
    - What/why, screenshots if UI changes, and testing notes.
    - A checklist confirming `npm run lint` and `npm run build` pass.
    - Confirmation that both traditional and Docker setups were tested, if applicable.
    - Links to related issues, if any.
- Issue reporting
Thank you for helping improve Open‑Fiesta!
This project is licensed under the MIT License. See LICENSE for details.
- Model access via OpenRouter and Google
