
Ollama UI

A simple Streamlit web app and FastAPI server for communicating with an Ollama backend.

Setup

  1. Update config.json for the app
  2. Create and activate a new Python environment with conda (on macOS: brew install --cask miniconda)
  3. cd into client and run pip install -r requirements.txt
  4. cd into server and run pip install -r requirements.txt
  5. Start the Ollama server (it listens on localhost:11434 by default)
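The README does not document the schema of config.json; a plausible sketch, assuming the app needs to know where the Ollama backend and the FastAPI server live (every key and value below is an assumption, not the repo's actual schema):

```json
{
  "ollama_host": "http://localhost:11434",
  "server_host": "http://localhost:8000",
  "model": "llama2"
}
```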

Run Client

streamlit run client.py

Run Server

python server.py
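The README does not show how server.py talks to Ollama, but Ollama's REST API accepts a POST to /api/generate with a JSON body of model, prompt, and stream fields. A minimal sketch of building such a request with only the standard library (the helper name and defaults are illustrative, not taken from this repo's server.py):

```python
import json
import urllib.request

# Ollama's default generate endpoint on a local install.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(prompt, model="llama2", stream=False):
    """Build an urllib Request for Ollama's /api/generate endpoint.

    Hypothetical helper for illustration -- not code from this repo.
    """
    payload = {"model": model, "prompt": prompt, "stream": stream}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Inspect the request without sending it.
req = build_generate_request("Why is the sky blue?")
body = json.loads(req.data.decode("utf-8"))
```

Sending the request with urllib.request.urlopen(req) returns Ollama's JSON response; with stream=True the endpoint instead streams newline-delimited JSON chunks, which is what a UI like this one would typically consume.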

Web Browser

Open http://localhost:8501/ (Streamlit serves plain HTTP on port 8501 by default)
