An AI-powered backend service designed to provide university professors with deep, actionable insights into their lectures.
This system:
- Transcribes lecture audio
- Uses Large Language Models (LLMs) to analyze and structure the content
- Cross-references that content against the official course syllabus to track progress and coverage
- Exposes a clean, fast API to power a frontend dashboard.
Upload lecture audio/video files and get an immediate response. All heavy processing (transcription, AI analysis) happens in the background.
- Transcription: Uses OpenAI’s Whisper-large-v3 for highly accurate lecture transcriptions.
- Dashboard Analytics: An LLM generates structured JSON for key metrics: topics covered, key points, questions asked, examples used, and a high-level summary.
- Pedagogical Notes: A second, more detailed LLM call generates rich, publication-quality class notes in JSON.
- Syllabus Parsing: Upload a course syllabus (PDF or DOCX) and Llama 3.3 reads and parses it into a day-by-day JSON “course roadmap”.
- Coverage Tracking: Automatically compares the actual topics from lectures against the planned topics from the syllabus and outputs a detailed coverage report (e.g., 90% covered, 2 topics missing).
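The comparison step can be illustrated with a minimal sketch. The real report is produced by the LLM pipeline; the function name and the simple normalized-string matching below are assumptions for illustration only.

```python
# Illustrative coverage comparison: planned topics (from the parsed
# syllabus) vs. topics extracted from lecture transcripts.
def coverage_report(planned: list[str], covered: list[str]) -> dict:
    planned_set = {t.strip().lower() for t in planned}
    covered_set = {t.strip().lower() for t in covered}
    missing = sorted(planned_set - covered_set)
    pct = (
        100 * (len(planned_set) - len(missing)) / len(planned_set)
        if planned_set else 100.0
    )
    return {"percent_covered": round(pct, 1), "missing_topics": missing}

report = coverage_report(
    planned=["Gradient Descent", "Backpropagation", "CNNs", "RNNs"],
    covered=["gradient descent", "backpropagation", "cnns"],
)
# e.g. {"percent_covered": 75.0, "missing_topics": ["rnns"]}
```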
- API: Full suite of API endpoints to power the frontend dashboard: /analytics/dashboard, /analytics/questions, and more.
- Topic Modeling: Includes Gensim/NLTK to perform LDA topic modeling on transcripts as an alternative analysis layer.
| Layer | Technologies |
|---|---|
| Backend | FastAPI, Uvicorn |
| Database | SQLAlchemy, SQLite |
| Validation | Pydantic |
| AI & ML | OpenAI (Whisper, Llama 3.3), Gensim, NLTK |
| File Parsing | PyPDF2, python-docx |
| Async | BackgroundTasks (FastAPI) |
```bash
# 1. Clone the repository
git clone https://github.com/sibi-seeni/professors-dash.git
cd professors-dash

# 2. Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate
# On Windows:
# .venv\Scripts\activate

# 3. Install dependencies
pip install -r requirements.txt
```

Create a file named `.env` in the root directory and add your keys:

```
OPENAI_API_KEY="sk-..."
OPENAI_API_BASE="https://your_proxy_url.com"
```

Run the server:

```bash
uvicorn main:app --reload
```

Your API will be live at:
👉 http://127.0.0.1:8000

Interactive Docs:
👉 http://127.0.0.1:8000/docs
| Method | Endpoint | Description |
|---|---|---|
| POST | /upload/ | Upload a lecture audio/video file (triggers background processing) |
| POST | /upload_syllabus/ | Upload a syllabus PDF/DOCX (triggers AI parsing & coverage) |
| GET | /lecture/{lecture_id} | Retrieve lecture data (transcript, analytics, notes, etc.) |
| GET | /lecture/{lecture_id}/notes | Get clean JSON of pedagogical class notes |
| GET | /analytics/dashboard | Get combined analytics for the dashboard UI |
| GET | /syllabus_result/ | Get latest syllabus coverage report with AI-generated roadmap |
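A frontend or script can consume these endpoints with any HTTP client. A standard-library sketch, assuming the server from the setup section is running locally:

```python
# Hypothetical client snippet: fetch the combined dashboard analytics.
import json
from urllib.request import urlopen

def get_dashboard(base_url: str = "http://127.0.0.1:8000") -> dict:
    # GET /analytics/dashboard and decode the JSON payload.
    with urlopen(f"{base_url}/analytics/dashboard") as resp:
        return json.loads(resp.read().decode("utf-8"))
```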
The current workflow requires manual uploads. The next phase integrates with the Canvas API to make the process fully automated — turning this into a hands-free Teaching Assistant.
- Add `canvasapi` to `requirements.txt`.
- Add to `.env`:
  ```
  CANVAS_API_URL="https://ufl.instructure.com"
  CANVAS_API_KEY="your_key_here"
  ```
- Create a helper function to initialize the Canvas object.
- New endpoint: `POST /sync_syllabus/{course_id}`
  - Uses `canvas.get_course(course_id)` to locate the syllabus file.
  - Downloads it and passes it to `syllabus_tracker.process_syllabus_file`.
- New endpoint: `POST /sync_lectures/{course_id}`
  - Lists files in a Canvas folder (e.g., “Lecture Recordings”).
  - Detects new files and queues them for background processing with `processing.process_lecture_file`.
- After background tasks complete:
  - Format `notes_json` into HTML, then use `course.create_page()` to publish new “Lecture Notes” pages for students.
  - Convert `quiz_json` into `QuizQuestion` format, then use `course.create_quiz()` to build new unpublished quizzes for professors to review.
  - Use `syllabus_tracker`’s coverage data to create or update a private “Instructor Dashboard” page (`published=False`) inside Canvas.
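The notes-publishing step could look roughly like this. The `notes_json` field names (`title`, `sections`, `heading`, `key_points`) are assumptions about its shape; `create_page` is canvasapi's real method for creating wiki pages on a `Course` object.

```python
# Hypothetical sketch: render notes_json to HTML for a Canvas page.
from html import escape

def notes_to_html(notes: dict) -> str:
    # Assumed notes_json shape: {"title": ..., "sections": [{"heading": ...,
    # "key_points": [...]}]}. escape() guards against stray HTML in the notes.
    parts = [f"<h2>{escape(notes['title'])}</h2>"]
    for section in notes.get("sections", []):
        parts.append(f"<h3>{escape(section['heading'])}</h3>")
        items = "".join(
            f"<li>{escape(p)}</li>" for p in section.get("key_points", [])
        )
        parts.append(f"<ul>{items}</ul>")
    return "\n".join(parts)

def publish_notes(course, notes: dict):
    # course is a canvasapi Course; create_page takes a wiki_page payload.
    return course.create_page(wiki_page={
        "title": notes["title"],
        "body": notes_to_html(notes),
        "published": True,
    })
```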
By integrating with Canvas, this project evolves from a manual dashboard to an autonomous analytics companion for educators — tracking lecture content, syllabus progress, and learning outcomes seamlessly.