Question Answering with Custom Files using LLMs


DocQA 🤖


DocQA 🤖 is a web application built with Streamlit 🔥 and the LangChain 🦜🔗 framework that lets users leverage the power of LLMs for generative question answering over their own documents. 🌟

Read More Here 👉 https://ai.plainenglish.io/️-langchain-streamlit-llama-bringing-conversational-ai-to-your-local-machine-a1736252b172

Installation

To run the LangChain web application locally, follow these steps:

  1. Clone this repository 🔗

     git clone https://github.com/afaqueumer/DocQA.git

  2. Create a virtual environment and install the required dependencies ⚙️

     Run ➡️ setup_env.bat

  3. Launch the Streamlit app 🚀

     Run ➡️ run_app.bat

Usage

Once you have the Streamlit web application up and running, you can perform the following steps:

  1. Upload a text file.
  2. Once the file has been loaded into the vector store database, a success alert "Document is Loaded" will appear.
  3. Enter your question in the "Ask" textbox and submit it for the LLM to generate an answer.
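Under the hood, step 2 works by splitting the uploaded text into chunks, embedding each chunk, and then retrieving the chunks most similar to your question before the LLM generates an answer. The sketch below illustrates that retrieval idea only, using a toy bag-of-words cosine similarity in place of real embeddings; all function names and the sample text are illustrative, not the app's actual LangChain code.

```python
from collections import Counter
from math import sqrt

def embed(text):
    # Toy "embedding": bag-of-words term counts (the real app uses LLM embeddings).
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def build_store(document, chunk_size=7):
    # Split the document into fixed-size word chunks and embed each one.
    words = document.split()
    chunks = [" ".join(words[i:i + chunk_size]) for i in range(0, len(words), chunk_size)]
    return [(chunk, embed(chunk)) for chunk in chunks]

def retrieve(store, question, k=1):
    # Return the k stored chunks most similar to the question.
    q = embed(question)
    ranked = sorted(store, key=lambda item: cosine(q, item[1]), reverse=True)
    return [chunk for chunk, _ in ranked[:k]]

store = build_store("Streamlit turns Python scripts into web apps. "
                    "LangChain chains LLM calls together for question answering.")
print(retrieve(store, "What does Streamlit do?"))
# → ['Streamlit turns Python scripts into web apps.']
```

In the real app, `embed` is an LLM embedding model and the chunk vectors live in a vector store database, but the retrieval step is the same nearest-neighbor lookup shown here.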

Contributing

Contributions to this app are welcome! If you have ideas, suggestions, or bug fixes, please feel free to open an issue or submit a pull request.

License

This project is licensed under the MIT License.

🎉 Thank you 🤗 Happy question answering! 🌟
