Open WebUI and AnythingLLM integration compatibility issues #113

@Michael-A-Kuykendall

Description

Problem Description

Users report that popular LLM frontends cannot integrate with shimmy despite working with Ollama.

User Report from Issue #66

User: @barseghyanartur
Status: curl works but both frontends fail

Working:
✅ Direct curl requests to the shimmy API
✅ Both frontends work seamlessly with Ollama

Not Working:
❌ Open WebUI cannot detect/use shimmy
❌ AnythingLLM cannot detect/use shimmy
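To separate "server unreachable" from "server reachable but response rejected", it helps to reproduce the first thing these frontends do at detection time: a GET to `/v1/models`. A minimal sketch, assuming shimmy exposes an OpenAI-style base URL; the port in the example is a placeholder, not shimmy's documented default.

```python
import json
import urllib.error
import urllib.request


def probe_models_endpoint(base_url: str, timeout: float = 3.0):
    """GET {base_url}/v1/models, mirroring the discovery request that
    OpenAI-compatible frontends issue. Returns the parsed JSON payload,
    or None if the server is unreachable or the body is not valid JSON."""
    try:
        with urllib.request.urlopen(f"{base_url}/v1/models", timeout=timeout) as resp:
            return json.loads(resp.read().decode("utf-8"))
    except (urllib.error.URLError, json.JSONDecodeError, TimeoutError):
        return None


if __name__ == "__main__":
    # Placeholder address: substitute the host/port shimmy is actually bound to.
    payload = probe_models_endpoint("http://127.0.0.1:11435")
    print("reachable" if payload is not None else "unreachable")
```

If this probe returns a payload while the frontends still fail, the problem is in the payload's shape or in browser-side restrictions, not connectivity.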

Expected Behavior

Popular LLM frontends should work with shimmy's OpenAI-compatible API just like they do with Ollama.

Actual Behavior

  • Open WebUI: Does not detect shimmy as a model provider
  • AnythingLLM: Does not automatically detect shimmy as a model provider
  • Both tools work immediately with Ollama on the same system

Technical Investigation Required

  1. API Response Format: Are shimmy's responses matching OpenAI specification exactly?
  2. Models Endpoint: Does /v1/models return the format these frontends expect?
  3. Chat Completions: Are response fields complete and properly formatted?
  4. CORS/Authentication: Are there connection issues preventing frontend access?
  5. Model Detection: How do these frontends discover available models?
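Items 1–3 can be narrowed down by diffing shimmy's actual responses against the minimal field sets the OpenAI API reference defines for the two endpoints frontends rely on. A sketch of such shape checks; extra fields are tolerated, missing ones are reported:

```python
def missing_model_fields(payload: dict) -> list[str]:
    """Check a GET /v1/models response: {"object": "list", "data": [...]},
    where each model entry carries id, object, created, and owned_by."""
    problems = []
    if payload.get("object") != "list":
        problems.append('top-level "object" should be "list"')
    for i, model in enumerate(payload.get("data", [])):
        for field in ("id", "object", "created", "owned_by"):
            if field not in model:
                problems.append(f'data[{i}] missing "{field}"')
    return problems


def missing_chat_fields(payload: dict) -> list[str]:
    """Check a POST /v1/chat/completions response for the fields most
    clients read: id, object, created, model, choices, usage, and the
    per-choice index, message, and finish_reason."""
    problems = [
        f'missing "{field}"'
        for field in ("id", "object", "created", "model", "choices", "usage")
        if field not in payload
    ]
    for i, choice in enumerate(payload.get("choices", [])):
        for field in ("index", "message", "finish_reason"):
            if field not in choice:
                problems.append(f'choices[{i}] missing "{field}"')
    return problems
```

Running these against captured shimmy responses would point directly at any field a strict frontend might choke on, e.g. a missing `owned_by` or `finish_reason`.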

Compatibility Gap Analysis

Since both frontends work with Ollama but not shimmy, this suggests shimmy's OpenAI compatibility implementation has gaps that prevent real-world frontend integration.
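One gap that would produce exactly this symptom (curl works, browser-based frontends fail) is CORS, since curl ignores CORS entirely while browsers enforce it on preflight. Whether this applies depends on how each frontend connects (server-side proxying bypasses CORS), so this is a hypothesis to test, not a diagnosis. A sketch that flags common preflight gaps given a captured set of response headers:

```python
def cors_problems(response_headers: dict, origin: str = "http://localhost:3000") -> list[str]:
    """Given response headers (name -> value) from an OPTIONS preflight,
    report CORS gaps that would make a browser-based client silently fail.
    The default origin is a placeholder for wherever the frontend is served."""
    headers = {name.lower(): value for name, value in response_headers.items()}
    problems = []
    if headers.get("access-control-allow-origin") not in ("*", origin):
        problems.append(f"Access-Control-Allow-Origin does not permit {origin}")
    if "POST" not in headers.get("access-control-allow-methods", "").upper():
        problems.append("Access-Control-Allow-Methods does not include POST")
    allow_headers = headers.get("access-control-allow-headers", "").lower()
    if allow_headers != "*":
        for needed in ("content-type", "authorization"):
            if needed not in allow_headers:
                problems.append(f"Access-Control-Allow-Headers does not include {needed}")
    return problems
```

If shimmy returns no `Access-Control-Allow-*` headers at all, a browser frontend's requests would fail even though the same endpoints answer curl perfectly, matching the report above.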

Impact

This blocks users from using shimmy with popular LLM interfaces, limiting adoption compared to Ollama.

Affected Users

Related Issues

Labels

  • bug
  • compatibility
  • api
  • frontend-integration
