
Support for self-hosted LLMs #2

Open
@RustyKobie

Description

Problem Statement:

At present, Wordflow cannot use self-hosted large language models (LLMs), which limits users' flexibility to choose or swap models according to their specific needs or preferences.

Proposed Solution:

Integrating support for self-hosted LLMs would significantly enhance Wordflow's functionality and utility. Users could draw on a diverse range of models, adapt as requirements evolve, and potentially improve results by selecting models tailored to specific prompt tasks.

Benefits:

- **Flexibility:** Users can seamlessly switch between models or select a specific model based on task requirements.
- **Customization:** Users can fine-tune their experience by integrating custom or specialized models.
- **Scalability:** New models appear rapidly; users can readily integrate them into their workflow.
- **Performance Optimization:** Users can choose models optimized for specific tasks, potentially improving overall quality and accuracy.

An easy win would be to integrate with something like Ollama, which exposes a standard API for a whole range of different models:
https://github.com/ollama/ollama
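To illustrate how little glue such an integration might need: a minimal sketch (not Wordflow code; the function names are illustrative) that talks to Ollama's documented `/api/generate` REST endpoint on its default local port, using only the Python standard library:

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes `ollama serve` is running).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama model."""
    payload = json.dumps({
        "model": model,     # e.g. "llama3" — any model pulled via `ollama pull`
        "prompt": prompt,
        "stream": False,    # return one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def run_prompt(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return the completion."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Because the same endpoint shape works for every model Ollama serves, Wordflow's existing prompt-dispatch code would only need a configurable base URL and model name to cover the whole range.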

