This Q&A application leverages LangChain, Groq, Hugging Face, and Streamlit to provide intelligent document-based question answering. Users can upload PDF files, and the application uses Retrieval-Augmented Generation (RAG) to answer queries based on the uploaded documents.
- LangChain
- Groq (hosted `llama-3.1-70b-versatile` LLM for inference)
- Hugging Face (embeddings)
- FAISS (vector DB)
- Streamlit
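
The sketch below shows one way these components typically fit together in a LangChain RAG pipeline: the PDF is loaded and split into chunks, the chunks are embedded with a Hugging Face model and indexed in FAISS, and the Groq-hosted LLM answers questions over the retrieved context. The embedding model name, chunking parameters, prompt, and chain wiring are illustrative assumptions, not necessarily what `app.py` does.

```python
# Illustrative RAG pipeline sketch (assumed wiring; app.py may differ).
from langchain_community.document_loaders import PyPDFLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_huggingface import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS
from langchain_groq import ChatGroq
from langchain_core.prompts import ChatPromptTemplate
from langchain.chains.combine_documents import create_stuff_documents_chain
from langchain.chains import create_retrieval_chain

# 1. Load the PDF and split it into manageable chunks.
docs = PyPDFLoader("example.pdf").load()   # path is a placeholder
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=200     # assumed chunking parameters
).split_documents(docs)

# 2. Embed the chunks with a Hugging Face model and index them in FAISS.
embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-MiniLM-L6-v2"  # assumed embedding model
)
retriever = FAISS.from_documents(chunks, embeddings).as_retriever()

# 3. Answer questions with the Groq-hosted LLM over the retrieved context.
llm = ChatGroq(model="llama-3.1-70b-versatile")
prompt = ChatPromptTemplate.from_messages([
    ("system", "Answer the question using only this context:\n\n{context}"),
    ("human", "{input}"),
])
chain = create_retrieval_chain(retriever, create_stuff_documents_chain(llm, prompt))

print(chain.invoke({"input": "What is this document about?"})["answer"])
```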
Install the dependencies with `pip install -r requirements.txt`, then generate the required API keys:
- Groq, for inference with the hosted LLM model.
- Hugging Face, for the embeddings used to embed the documents.
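
A common way to supply these keys is through environment variables loaded from a `.env` file. The variable names below (`GROQ_API_KEY`, `HF_TOKEN`) are the usual conventions for these libraries, but how `app.py` actually reads them is an assumption.

```python
# Minimal key-loading sketch (variable names are assumed conventions).
import os
from dotenv import load_dotenv  # requires the python-dotenv package

load_dotenv()  # reads keys from a local .env file, if present

groq_api_key = os.environ["GROQ_API_KEY"]  # used by the Groq chat model
hf_token = os.environ.get("HF_TOKEN")      # used when downloading Hugging Face models
```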
Start the app with `streamlit run app.py`. Once the application is running, navigate to the local URL shown in the terminal to interact with the Document Q&A system.
- Upload a PDF file: Select a PDF document using the file uploader in the sidebar. The app will split the document into manageable chunks for efficient search and retrieval.
- Ask a question: Once the PDF is processed, type your query in the chat input field. The app will generate a response based on the document's content and chat history.
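
For reference, below is a minimal sketch of this interaction pattern in Streamlit: a sidebar file uploader, a chat input, and a session-state message list that preserves chat history across reruns. The widget labels and the stubbed `answer_question` helper are hypothetical; in the real app the answer would come from the retrieval chain built over the uploaded PDF (see the pipeline sketch above).

```python
# Minimal Streamlit chat sketch (labels and helper are hypothetical; app.py may differ).
import streamlit as st

def answer_question(question: str) -> str:
    # Placeholder: the real app would run the retrieval chain
    # built from the uploaded PDF instead of echoing the question.
    return f"(stub answer) You asked: {question}"

st.title("Document Q&A")

uploaded_pdf = st.sidebar.file_uploader("Upload a PDF", type="pdf")

if "messages" not in st.session_state:
    st.session_state.messages = []  # chat history survives Streamlit reruns

# Replay the conversation so far.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

if question := st.chat_input("Ask a question about the document"):
    if uploaded_pdf is None:
        st.warning("Please upload a PDF first.")
    else:
        st.session_state.messages.append({"role": "user", "content": question})
        with st.chat_message("user"):
            st.markdown(question)

        reply = answer_question(question)
        st.session_state.messages.append({"role": "assistant", "content": reply})
        with st.chat_message("assistant"):
            st.markdown(reply)
```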


