
# 📄 PDF Chat Server

Chat with your own PDFs using Ollama, LangChain, and Streamlit – all local, open source, and fast.

PDF Chat Server is an AI-powered web app that lets you upload any PDF and chat with it in natural language.
Powered by LangChain, FAISS, and Ollama LLMs, it extracts, chunks, and embeds your documents, then retrieves relevant answers in real time.
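
In outline, the flow is: load the PDF, split it into chunks, embed the chunks, index them in FAISS, and retrieve the best matches for each question. A minimal sketch of that pipeline (illustrative only – the import paths are standard LangChain ones, and the model name and chunk sizes are assumptions, not the app's actual settings):

```python
# Hypothetical sketch of the indexing/retrieval flow described above;
# main.py is the source of truth for the real implementation.
from langchain_community.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain_community.embeddings import OllamaEmbeddings
from langchain_community.vectorstores import FAISS

docs = PyPDFLoader("example.pdf").load()                  # 1. extract pages
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100                    # assumed sizes
).split_documents(docs)                                   # 2. chunk
db = FAISS.from_documents(
    chunks, OllamaEmbeddings(model="llama3")              # 3. embed + index
)
hits = db.similarity_search("What is this PDF about?")    # 4. retrieve
```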


## 🧠 Features

- 📄 Upload and index any PDF (scanned or digital)
- 🤖 Ask natural-language questions and get relevant answers
- 🧩 Embeds using OllamaEmbeddings or HuggingFaceEmbeddings
- ⚡ Fast semantic search via a FAISS vector store
- 📈 Optional system monitoring (CPU/RAM/GPU)
- 🧠 Sidebar chat memory and history
- 🕒 Timers for indexing and question answering
- 🐳 Fully containerized with Docker

## 🚀 Quick Start

### 1. Clone the Repository

```bash
git clone https://github.com/Jacko88888/pdfchat-server.git
cd pdfchat-server
```
### 2. Install Ollama (if not already installed)

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

Start your preferred model (e.g. LLaMA 3):

```bash
ollama run llama3
```

💡 You can swap in other models such as llama3, qwen, or mistral.

### 3. Build & Run with Docker Compose

```bash
docker compose up --build
```

Then open your browser to:

```
http://localhost:7860
```
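
For orientation, the compose file wires an `ollama` service to the app roughly like the sketch below (service names, ports, and the `OLLAMA_BASE_URL` value match those referenced in this README; the image tag and volume layout are assumptions – see the repo's docker-compose.yml for the real configuration):

```yaml
# Hypothetical sketch, not the repo's actual docker-compose.yml.
services:
  ollama:
    image: ollama/ollama            # serves the LLM API
    ports:
      - "11434:11434"               # also reachable from the host
    volumes:
      - ollama:/root/.ollama        # persist pulled models
  app:
    build: .                        # uses the repo's Dockerfile
    ports:
      - "7860:7860"                 # UI at http://localhost:7860
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama:
```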
## 🛠 Tech Stack

- **Backend:** Python, LangChain, FAISS, Ollama
- **Frontend:** Streamlit
- **Vector store:** FAISS (persisted under data/db)
- **Embeddings:** OllamaEmbeddings (default) or HuggingFaceEmbeddings
- **LLM API:** Ollama (http://ollama:11434)
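
Switching between the two embedding backends is typically a one-line change in LangChain. A sketch (the class names are real langchain_community ones, but the model names are illustrative, and the app's actual wiring may differ):

```python
# Default: embeddings served by the Ollama container.
from langchain_community.embeddings import OllamaEmbeddings
embeddings = OllamaEmbeddings(model="llama3", base_url="http://ollama:11434")

# Alternative: a local HuggingFace sentence-transformers model.
# from langchain_community.embeddings import HuggingFaceEmbeddings
# embeddings = HuggingFaceEmbeddings(
#     model_name="sentence-transformers/all-MiniLM-L6-v2"
# )
```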

## 🗂 Project Structure

```
pdfchat-server/
├── main.py                 # Streamlit app logic
├── Dockerfile              # Python runtime environment
├── docker-compose.yml      # Ollama + app orchestration
├── requirements.txt        # Python dependencies
├── data/                   # PDF chunks + FAISS index
└── README.md               # This file
```
## 📦 Environment Variables (Optional)

Defaults used by the app:

- `OLLAMA_BASE_URL`: http://ollama:11434
- `DB_PATH`: ./data/db

You can override them in main.py or export your own values before launching.
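
The usual pattern for picking up such overrides looks like this (the variable names and defaults match the ones above; whether main.py does exactly this is an assumption):

```python
import os

# Fall back to the compose-internal URL and local path when unset.
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://ollama:11434")
DB_PATH = os.getenv("DB_PATH", "./data/db")
```

For example, `export OLLAMA_BASE_URL=http://localhost:11434` points the app at a host-local Ollama instead of the compose service.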

## 🧪 Development Mode (No Docker)

```bash
pip install -r requirements.txt
streamlit run main.py
```

✅ Make sure Ollama is running locally before launching.

## 🧰 Troubleshooting

**Ollama not responding?**

- Run: `ollama run llama3`
- Check that it is reachable at http://localhost:11434
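
A quick reachability check (`/api/tags` is Ollama's endpoint for listing locally pulled models):

```bash
# Should return JSON listing your pulled models if Ollama is up.
curl http://localhost:11434/api/tags
```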

**No answers or inaccurate responses?**

- Re-upload the PDF
- Try smaller files or more specific questions

## 📜 License
This project is licensed under the MIT License.

