Status: Closed
Labels: bug (Something isn't working)

Description
Problem Description
Users report that popular LLM frontends cannot integrate with shimmy despite working with Ollama.
User Report from Issue #66
User: @barseghyanartur
Status: cURL works but frontends fail
Working:
✅ Direct cURL requests to the shimmy API
✅ Both frontends work seamlessly with Ollama
Not Working:
❌ Open WebUI cannot detect/use shimmy
❌ AnythingLLM cannot detect/use shimmy
Expected Behavior
Popular LLM frontends should work with shimmy's OpenAI-compatible API just like they do with Ollama.
Actual Behavior
- Open WebUI: Cannot detect shimmy as model provider
- AnythingLLM: Refuses to automatically detect shimmy model provider
- Both tools work immediately with Ollama on same system
Technical Investigation Required
- API Response Format: Are shimmy's responses matching OpenAI specification exactly?
- Models Endpoint: Does /v1/models return the format these frontends expect?
- Chat Completions: Are response fields complete and properly formatted?
- CORS/Authentication: Are there connection issues preventing frontend access?
- Model Detection: How do these frontends discover available models?
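The /v1/models check in the list above can be sketched as a local validation of the response shape. The field set below follows the public OpenAI API reference; exactly which fields Open WebUI and AnythingLLM require before they detect a provider is an assumption to verify against their source.

```python
# Sketch: validate a /v1/models response body against the shape the
# OpenAI API reference documents. Frontends that auto-detect providers
# typically parse this endpoint first, so a shape mismatch here would
# explain silent detection failures.

def validate_models_response(body: dict) -> list[str]:
    """Return a list of problems found; an empty list means the shape matches."""
    problems = []
    if body.get("object") != "list":
        problems.append('top-level "object" should be "list"')
    models = body.get("data")
    if not isinstance(models, list):
        problems.append('"data" should be a list of model objects')
        return problems
    for i, m in enumerate(models):
        for field in ("id", "object", "created", "owned_by"):
            if field not in m:
                problems.append(f'data[{i}] missing "{field}"')
        if m.get("object") != "model":
            problems.append(f'data[{i}] "object" should be "model"')
    return problems

# Hypothetical payload in the documented OpenAI format (model name invented):
ok_body = {
    "object": "list",
    "data": [
        {"id": "llama-3.2-1b", "object": "model", "created": 1726000000,
         "owned_by": "shimmy"},
    ],
}
print(validate_models_response(ok_body))  # → []
```

Running this against shimmy's live /v1/models output and against Ollama's would show any structural difference directly.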
Compatibility Gap Analysis
Since both frontends work with Ollama but not shimmy, this suggests shimmy's OpenAI compatibility implementation has gaps that prevent real-world frontend integration.
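One way to make that gap analysis concrete is to diff a chat-completion body against the fields the OpenAI reference documents. A minimal sketch follows; the required-field list is taken from the public spec, and whether a missing `usage` block or `finish_reason` is what actually trips these frontends is an assumption to confirm.

```python
# Sketch: report which documented chat-completion fields are absent from
# a response body, so shimmy's output can be diffed against Ollama's.

REQUIRED_TOP = ("id", "object", "created", "model", "choices")
REQUIRED_CHOICE = ("index", "message", "finish_reason")

def missing_fields(body: dict) -> list[str]:
    missing = [f for f in REQUIRED_TOP if f not in body]
    for i, choice in enumerate(body.get("choices", [])):
        missing += [f"choices[{i}].{f}" for f in REQUIRED_CHOICE if f not in choice]
        msg = choice.get("message", {})
        if "role" not in msg or "content" not in msg:
            missing.append(f"choices[{i}].message.role/content")
    if "usage" not in body:
        missing.append("usage")  # some clients read token counts from this
    return missing

# Hypothetical trimmed response, as a provider with gaps might return it:
partial = {
    "object": "chat.completion",
    "model": "llama-3.2-1b",
    "choices": [{"message": {"role": "assistant", "content": "hi"}}],
}
print(missing_fields(partial))
# → ['id', 'created', 'choices[0].index', 'choices[0].finish_reason', 'usage']
```

Feeding both shimmy's and Ollama's live responses through this check would pinpoint which fields differ on the same prompt.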
Impact
This blocks users from using shimmy with popular LLM interfaces, limiting adoption compared to Ollama.
Affected Users
- @barseghyanartur (original reporter)
Related Issues
- Issue [Question]: How to use it with Open WebUI? #66: Original Open WebUI integration question
- Issue [Feature]: Open WebUI, AnythingLLM suppprt #70: Feature implementation tracking
Labels
- bug
- compatibility
- api
- frontend-integration