Seekr is an open-source AI interviewer powered by open-source models. Simulate realistic mock interviews and run everything locally: no proprietary APIs, full control.

Seekr is an open-source AI interviewer built with open-source LLMs via Ollama. It simulates realistic mock interviews to help users practice, prepare, and improve their interview performance—no proprietary APIs or cloud dependencies required.

Features

  • AI-driven, dynamic interview sessions

  • Powered by local open-source models (via Ollama)

  • Role- and domain-specific question support

  • Easy to customize and extend

  • Full control and data privacy (runs locally)

  • Supports 10+ roles (e.g., intern, senior, CTO)

  • Covers 60+ topics (e.g., React, TypeScript, AWS)

Supported Models

  • Llama 3.3
  • Llama 3.2
  • Gemma 3
  • Phi-4
  • Mistral
  • DeepSeek
  • Many more available in the Ollama Model Library

Getting Started

To run Seekr, you'll need to set up both the backend and frontend. You can run it manually or using Docker.

Prerequisites

Make sure you have the following installed:

  • Python ≥ 3.9
  • Node.js ≥ 24 (recommended)
  • Ollama with your desired model installed (e.g., phi4)

Run ollama run phi4 to make sure the model is working locally.
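Before wiring up the backend, you can also confirm over HTTP that Ollama is serving and your model is installed. The sketch below queries Ollama's /api/tags endpoint (its documented model-listing API at the default port 11434); the helper name is ours, not part of the repo:

```python
import json
import urllib.request

OLLAMA_BASE_URL = "http://localhost:11434"  # Ollama's default address

def model_available(tags: dict, name: str) -> bool:
    """Check whether a model (e.g. 'phi4') appears in Ollama's /api/tags listing."""
    # Ollama reports installed models as {"models": [{"name": "phi4:latest", ...}, ...]}
    return any(m["name"].split(":")[0] == name for m in tags.get("models", []))

if __name__ == "__main__":
    with urllib.request.urlopen(f"{OLLAMA_BASE_URL}/api/tags") as resp:
        tags = json.load(resp)
    print("phi4 installed:", model_available(tags, "phi4"))
```

If this prints False, pull the model first with ollama pull phi4.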

Environment Setup

1. Frontend

Navigate to the frontend folder and create a .env file with:

# Base URL of the backend API
VITE_API_BASE_URL=http://localhost:8000

2. Backend

Navigate to the backend folder and create a .env file with:

# The name of the model to use with Ollama (e.g., llama3, mistral, phi4)
OLLAMA_MODEL=phi4
# Where the backend can reach Ollama:
# - Backend in Docker on macOS/Windows: http://host.docker.internal:11434
# - Backend in Docker on Linux: http://host.docker.internal:11434 (run with --add-host=host.docker.internal:host-gateway)
# - Backend running directly on the same machine as Ollama: http://localhost:11434
OLLAMA_BASE_URL=http://host.docker.internal:11434
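For reference, these two variables are all the backend needs to talk to Ollama. Below is a minimal sketch (illustrative, not the repo's actual code) of how they might be read and turned into a non-streaming request for Ollama's documented /api/generate endpoint:

```python
import os

# Defaults mirror the .env example above; real values come from the environment.
OLLAMA_MODEL = os.getenv("OLLAMA_MODEL", "phi4")
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")

def generate_request(prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON payload for a non-streaming Ollama /api/generate call."""
    url = f"{OLLAMA_BASE_URL.rstrip('/')}/api/generate"
    payload = {"model": OLLAMA_MODEL, "prompt": prompt, "stream": False}
    return url, payload
```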

Running the app

Run with Docker (Recommended)

From the root directory, run:

docker-compose -f docker-compose.dev.yml up --build

Once running, open your browser and navigate to:

http://localhost:5173

Run Manually

You can also run Seekr manually without Docker. Follow these steps:

Start the Backend

# Navigate to the backend folder
cd backend

# (Optional) Create and activate a virtual environment
python3 -m venv venv
source venv/bin/activate  # Use `venv\Scripts\activate` on Windows

# Install dependencies
pip install -r requirements.txt

# Start the backend server
uvicorn main:app --reload
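Once uvicorn is running, you can smoke-test the server from Python before starting the frontend. This sketch assumes only the default port 8000 (matching VITE_API_BASE_URL above); the helper name is ours:

```python
import urllib.error
import urllib.request

API_BASE_URL = "http://localhost:8000"  # matches VITE_API_BASE_URL in the frontend .env

def is_backend_up(base_url: str = API_BASE_URL, timeout: float = 2.0) -> bool:
    """Return True if anything answers HTTP at base_url."""
    try:
        urllib.request.urlopen(base_url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # server responded, even if with a 404/405
    except OSError:
        return False  # connection refused or timeout: server not running

if __name__ == "__main__":
    print("backend up:", is_backend_up())
```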

Start the Frontend

In a new terminal window/tab, run:

# Navigate to the frontend folder
cd frontend

# Install dependencies
npm install

# Start the frontend dev server
npm run dev

Once running, open your browser and navigate to:

http://localhost:5173

Demo

A typical session looks like this:

  • Landing: the home screen where the user starts their session.

  • Topic: the user selects the specific topic they want to be tested on.

  • Level: the user chooses a difficulty for their questions, such as Easy, Medium, or Hard.

  • Loading: a loading screen appears while relevant questions are generated for the selected topic and difficulty.

  • Question: a question is presented to the user, with input fields for the answer.

  • Review: once the user submits an answer, the system evaluates the response.

  • Evaluation: after completing all questions, the user receives a summary of their performance, including score, strengths, and areas for improvement.

demo.mp4

Contributing

If you would like to contribute to this web application, please open an issue on GitHub to discuss your ideas or proposed changes. Pull requests are also welcome.

License

Seekr is available under the MIT License. You are free to use, modify, and distribute this project as you see fit.

