Snortstamper uses a locally installed AI model to generate timestamps for YouTube transcripts of any length.
Ensure you have:
- Python 3.8+
- Ollama running with the Mistral model: `ollama pull mistral`
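If you are unsure whether Ollama is set up correctly, a quick check like the following can help. This is only a sketch: it assumes Ollama's default local API on port 11434 and uses its `/api/tags` endpoint to list pulled models.

```python
# Optional sanity check (sketch): is Ollama reachable and is Mistral pulled?
# Assumes Ollama's default API address http://localhost:11434.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    models = [m["name"] for m in json.load(resp).get("models", [])]

if any(name.startswith("mistral") for name in models):
    print("Mistral is available:", models)
else:
    print("Mistral not found - run: ollama pull mistral")
```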
Get the transcript
- For the correct timestamp format, you need the VidIQ plugin: https://app.vidiq.com/
- Go to any YouTube video
- Go to the description -> transcript
- You will now see the VidIQ plugin button to copy the transcript
- Press the dropdown button and copy the transcript with timestamps
- Paste the whole transcript into the `transcript.txt` file (an example of the expected format is shown below)
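For reference, the pasted file is expected to look roughly like this (made-up excerpt; the exact text depends on the video, but each line starts with a timestamp):

```
[0:00] hey everyone, welcome back to the channel
[0:42] today we're setting up a local AI model
[1:30] first, install the tooling
```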
Windows:
- Double-click `Run Snortstamper.lnk`
- Wait for the browser to prompt
- Open http://localhost:5000
macOS/Linux:
- Open a terminal in the project folder
- Run: `chmod +x startup.sh && ./startup.sh`
- Wait for Flask to start
- Open http://localhost:5000
The startup script will:
- Create virtual environment (if needed)
- Activate it
- Check for Ollama and Mistral model
- Start Ollama server
- Start Flask server
- Open the UI
If you prefer to start everything manually instead of using the startup script:

Step 1: Activate Virtual Environment
Windows:
`venv\Scripts\activate`

macOS/Linux:
`source venv/bin/activate`

Step 2: Start Ollama (in one terminal)
`ollama serve`

Step 3: Start Flask Server (in another terminal)
`python app.py`

Step 4: Open in Browser
Go to: http://localhost:5000
- Click the upload area to select your transcript file (`.txt`)
- The file should contain timestamps in `[M:SS]` or `[H:MM:SS]` format (see the parsing sketch after this list)
- Click "Generate Timestamps"
- Wait for processing (progress is printed in the terminal)
- View the generated chapters in the preview
- Click "Download Chapters" to save the result as `youtube_chapters.txt`
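For clarity, here is a rough illustration of how lines in that `[M:SS]` / `[H:MM:SS]` format can be parsed. This is only a sketch to show the expected input shape; it is not necessarily the logic inside `snortstamper_core`.

```python
# Illustrative only: parse "[M:SS] text" or "[H:MM:SS] text" transcript lines.
# Not necessarily how snortstamper_core does it internally.
import re

LINE_RE = re.compile(r"^\[(?:(\d+):)?(\d{1,2}):(\d{2})\]\s*(.*)$")

def parse_line(line):
    """Return (seconds, text) for a timestamped line, or None if it doesn't match."""
    m = LINE_RE.match(line.strip())
    if not m:
        return None
    hours, minutes, seconds, text = m.groups()
    total = int(hours or 0) * 3600 + int(minutes) * 60 + int(seconds)
    return total, text

print(parse_line("[1:30] Another Chapter"))  # (90, 'Another Chapter')
print(parse_line("[1:02:05] Deep dive"))     # (3725, 'Deep dive')
```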
If you want to use the script from the command line without the web UI:
```python
from snortstamper_core import ChapterGenerator
import json

with open('transcript.txt', 'r', encoding='utf-8') as f:
    transcript = f.read()

generator = ChapterGenerator(model="mistral")
chapters = generator.generate_chapters(transcript)
formatted = generator.format_chapters(chapters)
print(formatted)

# Optionally save
with open('youtube_chapters.txt', 'w', encoding='utf-8') as f:
    f.write(formatted)

with open('chapters.json', 'w', encoding='utf-8') as f:
    json.dump(chapters, f, indent=2, ensure_ascii=False)
```

Error: "Connection refused" on port 5000
- Flask server isn't running. Run `python app.py`
Error: "Could not connect to Ollama"
- Ollama isn't running. Run `ollama serve` in another terminal
- Make sure Mistral is pulled: `ollama pull mistral`
Error: "No transcript file provided"
- Make sure you selected a `.txt` file before clicking Generate
Long processing time
- This is normal! LLM processing takes time. Check the terminal for progress logs.
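If either of the connection errors above persists, a quick port check can show which server is actually listening. This is a sketch; it assumes Flask on port 5000 as used in this README and Ollama on its default port 11434.

```python
# Sketch: check whether the Flask and Ollama servers are reachable locally.
# Assumes default ports (5000 for Flask as used here, 11434 for Ollama).
import socket

for name, port in [("Flask", 5000), ("Ollama", 11434)]:
    with socket.socket() as s:
        s.settimeout(1)
        reachable = s.connect_ex(("localhost", port)) == 0
    print(f"{name} on port {port}: {'reachable' if reachable else 'NOT reachable'}")
```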
POST /api/generate-timestamps
Request:
- Form data with a file field named "transcript" (multipart/form-data)

Response:

```json
{
  "timestamps": "[0:00] Chapter Title\n[1:30] Another Chapter\n..."
}
```

Error Response:

```json
{
  "error": "Error message"
}
```