
Chat with Any Docs using LlamaIndex, Streamlit, and any LLM Model (including DeepSeek)

Upload your own set of documents and chat with them using a model of your choice. You can use it with the open-source DeepSeek R1 model as well, just specify the details in the settings.

This app is a modified version of the Streamlit + LlamaIndex example in which you can actually upload your own files (PDF, TXT, DOCX, CSV, etc.) and chat with them; the original version uses a hardcoded folder. I also added the option to use open-source models, not only GPT-4o.

Overview of the App

  • Takes user queries via Streamlit's st.chat_input and displays both user queries and model responses with st.chat_message
  • Uses LlamaIndex to load and index data and create a chat engine that will retrieve context from that data to respond to each user query
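The flow described above can be sketched in plain Python, with a stub standing in for LlamaIndex's chat engine (in the real app, `st.chat_input` and `st.chat_message` replace the plain list and `print`; the class and function names here are illustrative, not the app's actual API):

```python
# Minimal sketch of the chat loop's state handling. A stub plays the role of
# the retrieval-backed chat engine that LlamaIndex builds from your documents.

class StubChatEngine:
    """Stands in for the engine returned by index.as_chat_engine()."""
    def chat(self, prompt: str) -> str:
        # The real engine retrieves context from the indexed documents first.
        return f"(answer grounded in your documents) You asked: {prompt}"

def handle_turn(messages, prompt, engine):
    # On each turn the app appends the user query, asks the engine,
    # then appends the assistant response to the displayed history.
    messages.append({"role": "user", "content": prompt})
    reply = engine.chat(prompt)
    messages.append({"role": "assistant", "content": reply})
    return reply

history = [{"role": "assistant", "content": "Ask me about your documents!"}]
reply = handle_turn(history, "What is this repo?", StubChatEngine())
print(len(history))  # 3
```

In the Streamlit version, `history` lives in `st.session_state` so it survives the rerun that Streamlit triggers after every user interaction.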

Demo App

TBA

Get an OpenAI API key

You can get your own OpenAI API key by following these steps:

  1. Go to https://platform.openai.com/account/api-keys.
  2. Click on the + Create new secret key button.
  3. Next, enter an identifier name (optional) and click on the Create secret key button.
  4. Add your API key to your environment variables as OPENAI_KEY. Alternatively, you can add it to your .streamlit/secrets.toml file (rename the sample file after you add your key).
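If you go the secrets-file route, .streamlit/secrets.toml is a plain TOML file. A minimal example is below; the exact key name is an assumption here, so match whatever the sample secrets file in the repo uses:

```toml
# .streamlit/secrets.toml — keep this file out of version control
openai_key = "sk-..."
```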

Caution

Don't commit your secrets file to your GitHub repository. The .gitignore file in this repo includes .streamlit/secrets.toml and secrets.toml.

To launch the app

  1. Clone the repo in your terminal (or using a service like Koyeb):
git clone git@github.com:streamlit/llamaindex-chat-with-streamlit-docs.git
  2. Change to the llamaindex-chat-with-streamlit-docs directory and install the dependencies:
pip install -r requirements.txt

Before that, you might want to switch to a fresh Python environment so you don't affect your current one. For instance, if you're using Conda:

conda create -n streamlit python
conda activate streamlit
  3. Once the dependencies are installed, run the app:
streamlit run streamlit_app.py

The app will be available at http://localhost:8501/

Once the app is loaded, use the sidebar on the left to upload your own library, click Reindex, and after the indexing is done, ask a question using the chat.

Note that if you're using OpenAI's GPT-4o model (the default one), it may be quite expensive for large libraries. So either reduce the number of documents or use another model.

About

A chatbot for your own documents, powered by LlamaIndex and Streamlit
