This demonstration shows how to use embeddings with existing data in LocalAI. We use the `llama-index` library to facilitate the embedding and querying processes, with the `Weaviate` client as the embedding source.
## Prerequisites

Before proceeding, make sure you have the following installed:

- Weaviate client
- LocalAI and its dependencies
- llama_index and its dependencies
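The prerequisites above can be installed roughly as follows. The PyPI package names and the LocalAI container image tag are assumptions — verify them against each project's documentation:

```shell
# Python-side dependencies (weaviate-client is the pip name of the Weaviate client)
pip install weaviate-client llama-index

# LocalAI is typically run as a container; the image tag below is an
# assumption -- check the LocalAI docs for the current one
docker run -p 8080:8080 quay.io/go-skynet/local-ai:latest
```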
## Getting Started

1. Clone this repository and navigate to this directory
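Once LocalAI is running, embeddings are served over its OpenAI-compatible `/v1/embeddings` endpoint, which is what the tooling in this demo talks to under the hood. Below is a minimal sketch using only the standard library; the base URL, port, and model name are assumptions — adjust them to your LocalAI configuration:

```python
import json
import urllib.request

# Assumption: LocalAI listening on its default port with an OpenAI-compatible API
LOCALAI_URL = "http://localhost:8080/v1"


def build_embedding_request(text: str, model: str) -> bytes:
    """Build the JSON body for an OpenAI-style /embeddings request."""
    return json.dumps({"input": text, "model": model}).encode("utf-8")


def embed(text: str, model: str = "text-embedding-ada-002") -> list:
    """POST the text to LocalAI and return the embedding vector.

    The model name is an assumption; use whichever embedding model
    your LocalAI instance is configured to serve.
    """
    req = urllib.request.Request(
        f"{LOCALAI_URL}/embeddings",
        data=build_embedding_request(text, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"][0]["embedding"]
```

Calling `embed("some text")` returns the embedding vector, which can then be stored in Weaviate and used for similarity queries.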
## About LocalAI

LocalAI is a community-driven project that aims to make AI accessible to everyone. Created by Ettore Di Giacinto, it provides AI-related features such as text generation with GPTs, text to audio, audio to text, image generation, and more. The project is constantly growing and evolving, with a roadmap for future improvements; anyone is welcome to contribute, provide feedback, and submit pull requests to help make LocalAI better.