A modern web application that helps developers master Data Structures and Algorithms through AI-powered solution generation and smart revision tracking.
### 🚀 AI-Powered Solution Generation
- Get brute force, better, and optimal solutions for any DSA problem
- Powered by Qubrid AI with multi-tier caching (Redis → MongoDB)
- Supports multiple programming languages
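The Redis → MongoDB tiering can be pictured as a read-through cache that falls back to the AI provider on a full miss. A minimal sketch, where `redisGet`, `mongoGet`, and `generateWithAI` are hypothetical stand-ins for the app's real data-access helpers:

```typescript
type Fetcher = (key: string) => Promise<string | null>;

// Read-through lookup: Redis first, then MongoDB, then the AI provider.
async function getCachedSolution(
  key: string,
  redisGet: Fetcher,
  mongoGet: Fetcher,
  generateWithAI: (key: string) => Promise<string>
): Promise<string> {
  // Tier 1: Redis (fastest, optional)
  const fromRedis = await redisGet(key);
  if (fromRedis !== null) return fromRedis;

  // Tier 2: MongoDB (persistent cache)
  const fromMongo = await mongoGet(key);
  if (fromMongo !== null) return fromMongo;

  // Tier 3: generate fresh via the AI provider; callers can backfill both caches
  return generateWithAI(key);
}
```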
### 📚 Smart Revision Tracking
- Save and organize your solved problems
- Track patterns and categories
- Review time/space complexity
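The tracked fields above might look like the following record shape, with a small helper to group saved problems by pattern for review. This is an illustrative sketch; the real Mongoose schema in `models/` may differ:

```typescript
interface SavedQuestion {
  title: string;
  pattern: string;        // e.g. "sliding-window", "two-pointers"
  category: string;       // e.g. "arrays", "graphs"
  timeComplexity: string; // e.g. "O(n log n)"
  spaceComplexity: string;
}

// Group saved questions by pattern so revision sessions can focus on one technique.
function groupByPattern(questions: SavedQuestion[]): Map<string, SavedQuestion[]> {
  const groups = new Map<string, SavedQuestion[]>();
  for (const q of questions) {
    const bucket = groups.get(q.pattern) ?? [];
    bucket.push(q);
    groups.set(q.pattern, bucket);
  }
  return groups;
}
```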
### 🔐 Secure Authentication
- Email verification with OTP
- Password reset functionality
- JWT-based session management
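JWT-based sessions boil down to an HMAC-signed token the server can verify statelessly. A minimal HS256 sketch using only Node's `crypto` module; the app itself may use a library such as `jsonwebtoken`, so treat this as an illustration of the mechanism, not the repo's actual code:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

const b64url = (data: string | Buffer): string =>
  Buffer.from(data).toString("base64url");

// Produce header.payload.signature, signed with HMAC-SHA256.
function signToken(payload: object, secret: string): string {
  const header = b64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const body = b64url(JSON.stringify(payload));
  const sig = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest("base64url");
  return `${header}.${body}.${sig}`;
}

// Returns the decoded payload, or null if the signature does not match.
function verifyToken(token: string, secret: string): object | null {
  const [header, body, sig] = token.split(".");
  if (!header || !body || !sig) return null;
  const expected = createHmac("sha256", secret)
    .update(`${header}.${body}`)
    .digest("base64url");
  const a = Buffer.from(sig);
  const b = Buffer.from(expected);
  // Constant-time comparison to avoid leaking signature bytes via timing.
  if (a.length !== b.length || !timingSafeEqual(a, b)) return null;
  return JSON.parse(Buffer.from(body, "base64url").toString());
}
```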
### 📊 Admin Dashboard
- User analytics
- Cache management
- Solution statistics
Frontend:
- React 18 + TypeScript
- Vite
- TailwindCSS
Backend (Serverless):
- Vercel Serverless Functions
- MongoDB (Mongoose)
- Upstash Redis
AI:
- Qubrid AI (Qwen3-Coder)
- Node.js 18+
- MongoDB database
- Upstash Redis (optional, for faster caching)
- Qubrid API key
1. Clone the repository

   ```bash
   git clone https://github.com/WillyEverGreen/ReCode.git
   cd ReCode
   ```

2. Install dependencies

   ```bash
   npm install
   ```

3. Configure environment variables

   ```bash
   cp .env.example .env
   ```

   Fill in your `.env` file:

   ```env
   # MongoDB
   MONGO_URI=mongodb+srv://...

   # JWT Secret
   JWT_SECRET=your-secret-key

   # Qubrid AI
   QUBRID_API_KEY=your-qubrid-api-key

   # Admin
   ADMIN_PASSWORD=your-admin-password

   # Redis (optional - for faster caching)
   UPSTASH_REDIS_REST_URL=https://...
   UPSTASH_REDIS_REST_TOKEN=...
   ```

4. Run the development server

   ```bash
   # Using Vercel CLI (recommended - tests serverless functions)
   vercel dev

   # Or using Vite only (frontend only)
   npm run dev
   ```
If you want to run local LLMs (Ollama) for offline AI generation during development, follow these steps:

1. Install and run Ollama separately and ensure the models are available (e.g. `qwen2.5-coder:7b`, `qwen2.5:7b`, `deepseek-r1:7b`).

2. In your local `.env`, set:

   ```env
   AI_PROVIDER=ollama
   OLLAMA_BASE_URL=http://127.0.0.1:11434/v1
   OLLAMA_MODEL_CODING=qwen2.5-coder:7b
   OLLAMA_MODEL_EXPLANATION=qwen2.5:7b
   OLLAMA_MODEL_REASONING=deepseek-r1:7b
   VITE_API_URL=http://localhost:5000
   ```

3. Start the backend and frontend in separate terminals:

   ```bash
   npm run server   # starts Express API on port 5000
   npm run dev      # starts Vite frontend on port 3000
   ```

The local Ollama integration is opt-in: by default the app uses the existing remote AI provider (Qubrid), so your production deployment on Vercel is unaffected.
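Because `OLLAMA_BASE_URL` points at Ollama's OpenAI-compatible endpoint, requests use the standard chat-completions shape. A hedged sketch of building such a request; the system prompt and helper name are illustrative, not the app's actual prompt:

```typescript
interface ChatMessage {
  role: "system" | "user";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  stream: boolean;
}

// Build a chat-completions payload for the local Ollama server.
// The default model matches the OLLAMA_MODEL_CODING value from the .env example.
function buildOllamaRequest(problem: string, model = "qwen2.5-coder:7b"): ChatRequest {
  return {
    model,
    messages: [
      { role: "system", content: "You are a DSA tutor. Return brute force, better, and optimal solutions." },
      { role: "user", content: problem },
    ],
    stream: false,
  };
}

// At runtime this payload would be POSTed to `${OLLAMA_BASE_URL}/chat/completions`.
```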
- Push code to GitHub
- Import project in Vercel
- Add environment variables in Vercel dashboard
- Deploy!
Note: The repository changes are backward compatible: if `AI_PROVIDER` is not set to `ollama`, the deployed app will continue to use the existing remote AI provider. Do not commit your `.env` file; keep secrets in Vercel's dashboard.
| Variable | Required | Description |
|---|---|---|
| `MONGO_URI` | ✓ | MongoDB connection string |
| `JWT_SECRET` | ✓ | Secret for JWT tokens |
| `QUBRID_API_KEY` | ✓ | Qubrid AI API key |
| `ADMIN_PASSWORD` | ✓ | Admin panel password |
| `UPSTASH_REDIS_REST_URL` | | Upstash Redis URL |
| `UPSTASH_REDIS_REST_TOKEN` | | Upstash Redis token |
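A startup check for the required variables in the table above could look like this. The repo may not include such a helper; this is an illustrative sketch:

```typescript
// The four variables marked ✓ in the table above.
const REQUIRED_VARS = ["MONGO_URI", "JWT_SECRET", "QUBRID_API_KEY", "ADMIN_PASSWORD"];

// Return the names of required variables that are missing or empty,
// so the server can fail fast with a clear message instead of at first use.
function missingEnvVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Example: missingEnvVars(process.env) at startup; a non-empty result
// means the process should log the names and exit.
```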
| Endpoint | Method | Description |
|---|---|---|
| `/api/health` | GET | Health check |
| `/api/auth/signup` | POST | User registration |
| `/api/auth/login` | POST | User login |
| `/api/auth/verify-email` | POST | Verify email OTP |
| `/api/auth/forgot-password` | POST | Request password reset |
| `/api/auth/reset-password` | POST | Reset password |
| `/api/solution` | POST | Generate DSA solution |
| `/api/questions` | GET/POST | User's saved questions |
| `/api/admin/stats` | GET | Admin statistics |
```
ReCode/
├── api/          # Vercel Serverless Functions
│   ├── _lib/     # Shared utilities
│   ├── admin/    # Admin endpoints
│   ├── auth/     # Authentication endpoints
│   ├── questions/# Questions CRUD
│   └── solution/ # Solution generation
├── components/   # React components
├── models/       # MongoDB models
├── services/     # Frontend services
├── config/       # Configuration
└── types/        # TypeScript types
```
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
This project is licensed under the MIT License.
Made with ❤️ by WillyEverGreen