[feature] LocalAI endpoint #92

Open
@nalbion

Description

I don't have any experience with it, but LocalAI might be more attractive for people working in environments where sending source code out to the interwebs is frowned upon.

LocalAI is a drop-in replacement REST API that’s compatible with OpenAI API specifications for local inferencing. It allows you to run LLMs (and not only) locally or on-prem with consumer grade hardware, supporting multiple model families that are compatible with the ggml format. Does not require GPU.

  • Text generation with llama.cpp, gpt4all.cpp and more
  • OpenAI functions
  • Embeddings generation for vector databases
  • Download models directly from Hugging Face
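
Because LocalAI mirrors the OpenAI REST surface, integrating it could be as simple as pointing the existing client at a local base URL. A minimal sketch using only the Python standard library (the port follows LocalAI's documented default of 8080; the model name is a placeholder for whatever ggml model is installed):

```python
import json
from urllib import request

# Assumed local deployment; adjust host/port to your LocalAI instance.
BASE_URL = "http://localhost:8080/v1"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }


def chat(model: str, prompt: str) -> str:
    """POST the payload to the OpenAI-compatible endpoint and return the reply."""
    payload = json.dumps(build_chat_request(model, prompt)).encode()
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    # Same response shape as OpenAI: first choice, assistant message content.
    return body["choices"][0]["message"]["content"]
```

For example, `chat("ggml-gpt4all-j", "Summarize this file.")` would work against a LocalAI instance with that model loaded; no API key is required for a default local deployment, and no source code leaves the machine.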

See also #69 by @mrgoonie
