
📚 Study Buddy

A simple learning project for exploring AI integration and working with LLMs. The app demonstrates how to connect a local LLM to a web application by building a PDF analysis tool with FastAPI and Ollama.

🎯 Learning Purpose

This project was built to understand:

  • How to integrate LLMs into web applications
  • Working with local AI models using Ollama
  • Building API endpoints that interact with AI
  • Processing documents and providing contextual AI responses
  • Session management for conversational AI interfaces

⚡ What It Does

  • Upload PDF documents for AI analysis
  • Generate summaries, questions, flashcards, and outlines
  • Ask follow-up questions about uploaded documents
  • Demonstrates different prompt engineering techniques
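
As a rough illustration of the per-task prompt engineering involved, the sketch below maps each analysis type to a template that gets filled with the extracted PDF text. The dictionary and template wording are illustrative, not copied from main.py.

    TASK_PROMPTS = {
        "summary": "Summarize the following document in a few short paragraphs:\n\n{text}",
        "questions": "Write five study questions based on this document:\n\n{text}",
        "flashcards": "Create question/answer flashcards covering this document:\n\n{text}",
        "outline": "Produce a hierarchical outline of this document:\n\n{text}",
    }

    def build_prompt(task: str, document_text: str) -> str:
        """Fill the template for the chosen analysis type with the extracted PDF text."""
        return TASK_PROMPTS[task].format(text=document_text)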

🛠 Tech Stack

  • FastAPI - Web framework
  • Ollama + Llama3 - Local LLM processing
  • PyMuPDF - PDF text extraction
  • HTML/CSS/JS - Simple frontend
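
To show roughly how these pieces fit together, here is a minimal sketch that extracts text with PyMuPDF and sends it to Llama3 through the ollama Python client. The function name and prompt wording are illustrative, not taken from main.py.

    import fitz  # PyMuPDF
    import ollama

    def summarize_pdf(path: str) -> str:
        """Extract all text from a PDF and ask the local Llama3 model for a summary."""
        doc = fitz.open(path)
        text = "".join(page.get_text() for page in doc)
        response = ollama.chat(
            model="llama3",
            messages=[{"role": "user", "content": f"Summarize this document:\n\n{text}"}],
        )
        return response["message"]["content"]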

🚀 Quick Start

  1. Install Ollama and pull Llama3

    # Install from https://ollama.ai/
    ollama pull llama3
  2. Install Python dependencies

    pip install fastapi uvicorn PyMuPDF ollama python-multipart
  3. Run the application

    python main.py
  4. Open upload_form.html in your browser
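
If main.py only defines the FastAPI app without starting a server itself, step 3 can also be done with the uvicorn CLI (this assumes the FastAPI instance in main.py is named app):

    uvicorn main:app --reload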

🧠 Key Learning Areas

  • LLM Integration: How to connect FastAPI with local AI models
  • Prompt Engineering: Different prompts for various analysis types
  • Session Management: Maintaining context across AI conversations (see the sketch after this list)
  • Document Processing: Extracting and chunking text for AI analysis
  • API Design: Building endpoints that work with AI workflows
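
The session-management point can be illustrated with a minimal sketch: keep an in-memory, per-session message history and replay it to the model on every follow-up question. The store and function names below are illustrative, not taken from main.py.

    import uuid
    import ollama

    # In-memory store: session id -> list of chat messages (system/user/assistant)
    sessions: dict[str, list[dict]] = {}

    def start_session(document_text: str) -> str:
        """Create a new session seeded with the document text as system context."""
        session_id = str(uuid.uuid4())
        sessions[session_id] = [
            {"role": "system", "content": f"Answer questions about this document:\n\n{document_text}"}
        ]
        return session_id

    def ask(session_id: str, question: str) -> str:
        """Append the question, call Llama3 with the full history, and store the reply."""
        history = sessions[session_id]
        history.append({"role": "user", "content": question})
        reply = ollama.chat(model="llama3", messages=history)["message"]["content"]
        history.append({"role": "assistant", "content": reply})
        return reply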

📁 Key Files

  • main.py - FastAPI app with AI integration logic
  • upload_form.html - Simple frontend interface
  • Sample PDFs and test scripts for experimentation
