A modern web application for sharing and discovering cooking recipes through video content. Users can upload cooking videos which are automatically processed to extract frames, analyze content, and generate detailed recipe information.
Dinner People is a React application that leverages Supabase for backend services and AI models for video content analysis. The app allows users to:
- Sign up and authenticate
- Upload cooking videos
- Process videos into step-by-step recipes with AI assistance
- Browse, save, and like recipes from other users
- Manage their own recipe collection
Dinner People uses Supabase for authentication, database operations, and storage.
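The snippets below all import a shared `supabase` client. A minimal sketch of how that client is presumably created in `src/lib/supabase.ts` (the exact file contents may differ from this assumption):

```ts
// src/lib/supabase.ts -- assumed setup, adjust to match the actual project
import { createClient } from '@supabase/supabase-js';

const supabaseUrl = import.meta.env.VITE_SUPABASE_URL;
const supabaseAnonKey = import.meta.env.VITE_SUPABASE_ANON_KEY;

if (!supabaseUrl || !supabaseAnonKey) {
  throw new Error('Missing VITE_SUPABASE_URL or VITE_SUPABASE_ANON_KEY');
}

export const supabase = createClient(supabaseUrl, supabaseAnonKey);
```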
The app uses Supabase Auth for user management with a custom Zustand store:
```ts
// src/store/authStore.ts
import { create } from 'zustand';
import { supabase } from '../lib/supabase';

// Auth store with Zustand
export const useAuthStore = create<AuthState>((set) => ({
  // ...existing code...
  signIn: async (email, password) => {
    try {
      set({ loading: true, error: null });
      const { data, error } = await supabase.auth.signInWithPassword({
        email,
        password,
      });
      // ...error handling and state updates
    } catch (error: any) {
      set({ error: error.message });
      throw error;
    } finally {
      set({ loading: false });
    }
  },
  // ...other auth methods
}));
```
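The `AuthState` type referenced above isn't shown in the excerpt. A plausible minimal shape (a sketch, not the project's actual interface, which may track more fields such as a session or profile):

```ts
// Hypothetical AuthState shape for the store above
interface AuthState {
  user: { id: string; email: string } | null;
  loading: boolean;
  error: string | null;
  signIn: (email: string, password: string) => Promise<void>;
  signOut: () => Promise<void>;
}

// Assumed initial state before any auth call resolves
const initialState: Pick<AuthState, 'user' | 'loading' | 'error'> = {
  user: null,
  loading: false,
  error: null,
};
```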
Supabase is used for storing and querying recipe data:
```ts
// Example from src/lib/storage.ts
export async function uploadVideo(file: File): Promise<UploadResult> {
  const userId = (await supabase.auth.getUser()).data.user?.id;
  if (!userId) {
    throw new Error('User not authenticated');
  }

  // Assumed: generate an id for the new recipe (not shown in the original excerpt)
  const recipeId = crypto.randomUUID();

  // Create recipe entry with a temporary title
  const { error: recipeError } = await supabase
    .from('recipes')
    .insert({
      id: recipeId,
      user_id: userId,
      status: 'draft',
      title: `Untitled Recipe ${new Date().toLocaleDateString()}`,
      description: 'Recipe details will be added after processing'
    });

  // ...additional code
}
```
Supabase Storage is used for video and image storage:
```ts
// Example from src/lib/video.ts
export async function uploadFrames(frames: { timestamp: number, blob: Blob }[], recipeId: string) {
  const uploadedFrames: { timestamp: number, imageUrl: string }[] = [];
  for (const frame of frames) {
    const path = `${recipeId}/${frame.timestamp}.jpg`;
    const { data: uploadData, error: uploadError } = await supabase.storage
      .from('frames')
      .upload(path, frame.blob);
    // ...additional code
  }
  return uploadedFrames;
}
```
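The per-frame storage path used above follows a simple `recipeId/timestamp.jpg` scheme. Pulled out into a small helper (for illustration; the project inlines it as shown), the naming convention is easy to unit-test:

```ts
// Path scheme used by uploadFrames: one JPEG per extracted timestamp
function framePath(recipeId: string, timestamp: number): string {
  return `${recipeId}/${timestamp}.jpg`;
}

console.log(framePath('abc123', 12)); // "abc123/12.jpg"
```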
The app uses Ollama in local development and OpenAI in production for video frame analysis.

> Note: to test the OpenAI path locally, make `isLocalEnvironment()` return `false` (see the comment in the snippet below).
```ts
// src/lib/ai.ts
import { ollama } from './ollama';
import * as openai from './openai';

class AIService {
  private isLocalEnvironment(): boolean {
    // To force OpenAI while testing locally, return false here instead
    return window.location.hostname === 'localhost' ||
      window.location.hostname === '127.0.0.1';
  }

  async analyzeFrame(imageUrl: string): Promise<string> {
    // Use Ollama for local development, OpenAI for production
    return this.isLocalEnvironment()
      ? ollama.analyzeFrame(imageUrl)
      : openai.analyzeFrame(imageUrl);
  }

  // ...other methods
}

export const ai = new AIService();
```
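The dispatch logic boils down to a hostname check. Extracted as pure functions (a sketch for illustration; the class above reads `window.location` directly), it becomes testable outside the browser:

```ts
// Pure version of the environment check used by AIService
function isLocalHostname(hostname: string): boolean {
  return hostname === 'localhost' || hostname === '127.0.0.1';
}

// Which backend the AIService would pick for a given hostname
function pickBackend(hostname: string): 'ollama' | 'openai' {
  return isLocalHostname(hostname) ? 'ollama' : 'openai';
}

console.log(pickBackend('localhost'));        // "ollama"
console.log(pickBackend('dinnerpeople.app')); // "openai" (hypothetical production domain)
```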
```ts
// src/lib/openai.ts
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: import.meta.env.VITE_OPENAI_API_KEY,
  dangerouslyAllowBrowser: true // Note: in production, API calls should be made from a backend
});

export async function analyzeFrame(imageUrl: string): Promise<string> {
  try {
    const response = await openai.chat.completions.create({
      model: "gpt-4-vision-preview",
      messages: [
        {
          role: "user",
          content: [
            {
              type: "text",
              text: "Describe this cooking step in detail, focusing on the ingredients, techniques, and any important details visible in the frame. Keep it concise but informative."
            },
            {
              type: "image_url",
              image_url: { url: imageUrl }
            }
          ]
        }
      ],
      max_tokens: 150
    });
    return response.choices[0]?.message?.content || '';
  } catch (error) {
    console.error('Error analyzing frame:', error);
    throw error;
  }
}
```
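The vision request body pairs one text part with one image part. A small builder that mirrors that structure (a sketch for illustration; the project inlines the payload as shown above):

```ts
// Content parts accepted by the vision chat API
type VisionContent =
  | { type: 'text'; text: string }
  | { type: 'image_url'; image_url: { url: string } };

// Assemble the messages array for a single-frame analysis request
function buildVisionMessages(prompt: string, imageUrl: string) {
  const content: VisionContent[] = [
    { type: 'text', text: prompt },
    { type: 'image_url', image_url: { url: imageUrl } },
  ];
  return [{ role: 'user' as const, content }];
}

const msgs = buildVisionMessages('Describe this cooking step.', 'https://example.com/frame.jpg');
console.log(msgs[0].content.length); // 2
```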
```ts
// src/lib/ollama.ts
import { supabase } from './supabase';

const OLLAMA_BASE_URL = 'http://localhost:11434';

class OllamaAPI {
  private baseUrl: string;
  private model: string;

  constructor(baseUrl: string = OLLAMA_BASE_URL, model: string = 'llama2-vision') {
    this.baseUrl = baseUrl;
    this.model = model;
  }

  async analyzeFrame(imageUrl: string): Promise<string> {
    if (!this.isLocalEnvironment()) {
      throw new Error('Ollama can only be used in local development environment');
    }
    const prompt = `You are a culinary expert. Analyze this cooking image and provide a detailed description of what you see.`;
    // ...additional code (fetch the frame, base64-encode it into imageData)
    return await this.generateImageCompletion(prompt, [imageData]);
  }

  // ...other methods (isLocalEnvironment, generateImageCompletion, ...)
}

export const ollama = new OllamaAPI();
```
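`generateImageCompletion` isn't shown above, but Ollama's `/api/generate` endpoint accepts a JSON body with `model`, `prompt`, and a base64-encoded `images` array. A hedged sketch of how the request might be assembled (the builder function is hypothetical, not the project's actual code):

```ts
// Hypothetical request builder for Ollama's /api/generate endpoint.
// `images` entries are base64-encoded image data (no data: URL prefix).
function buildOllamaRequest(model: string, prompt: string, images: string[]) {
  return {
    model,
    prompt,
    images,
    stream: false, // request a single JSON response instead of a token stream
  };
}

// The actual call inside generateImageCompletion would look roughly like:
// const res = await fetch(`${this.baseUrl}/api/generate`, {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildOllamaRequest(this.model, prompt, images)),
// });
// const { response } = await res.json();
```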
Follow these steps to run Dinner People locally:
Prerequisites:

- Node.js 18+ and npm
- A Supabase account (free tier available)
- For local AI processing: Ollama installed with the llama2-vision model
- Optional: an OpenAI API key for a production-like environment

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/dinnerpeople.git
   cd dinnerpeople
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Set up environment variables. Create a `.env` file in the root directory with the following:

   ```bash
   VITE_SUPABASE_URL=your_supabase_project_url
   VITE_SUPABASE_ANON_KEY=your_supabase_anon_key
   VITE_OPENAI_API_KEY=your_openai_api_key  # optional for local dev
   ```

4. Set up Supabase:
   - Create a new Supabase project
   - Run the migration scripts in the `supabase/migrations` folder
   - Set up the storage buckets (`videos`, `thumbnails`, `frames`)

5. For local AI processing with Ollama:
   - Install Ollama from https://ollama.ai/
   - Pull the llama2-vision model:

     ```bash
     ollama pull llama2-vision
     ```

6. Start the development server:

   ```bash
   npm run dev
   ```

   The application should now be running at http://localhost:5173.
| Variable | Description | Required |
|---|---|---|
| `VITE_SUPABASE_URL` | URL for your Supabase project | Yes |
| `VITE_SUPABASE_ANON_KEY` | Anonymous key for Supabase | Yes |
| `VITE_OPENAI_API_KEY` | OpenAI API key | Optional for local dev |
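A small helper can fail fast at startup when required variables are missing (a sketch; the project may not include such a check). Taking the env object as a parameter keeps it usable with both Vite's `import.meta.env` and plain objects in tests:

```ts
// Hypothetical env validation helper
function missingRequiredEnv(env: Record<string, string | undefined>): string[] {
  const required = ['VITE_SUPABASE_URL', 'VITE_SUPABASE_ANON_KEY'];
  return required.filter((key) => !env[key]);
}

console.log(missingRequiredEnv({ VITE_SUPABASE_URL: 'https://x.supabase.co' }));
// ["VITE_SUPABASE_ANON_KEY"]
```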
For production deployment, we recommend:
- Deploying the frontend to Vercel, Netlify, or similar
- Ensuring your Supabase project has appropriate RLS policies
- Setting up proper edge functions for video processing
- Using OpenAI for production AI processing
See the migration files in `/supabase/migrations` for the complete database schema, including:
- User authentication
- Recipes
- Video frames
- Processing queue
- Recipe interactions
Prerequisites:

- Supabase CLI:

  ```bash
  brew install supabase/tap/supabase
  ```

- Docker Desktop:

  ```bash
  brew install --cask docker
  ```

1. Start the local Supabase instance:

   ```bash
   supabase start
   ```

   This launches all required services (PostgreSQL, API, Auth, etc.).

2. View the local Supabase Studio. After starting, the CLI outputs a Studio URL (typically http://localhost:54323).

3. Generate a timestamped migration file:

   ```bash
   supabase migration new your_migration_name
   ```

4. Edit the generated SQL file in `supabase/migrations/[timestamp]_your_migration_name.sql`.

5. Apply all pending migrations:

   ```bash
   supabase migration up
   ```

6. Verify the migration was applied:

   ```bash
   supabase db execute "SELECT * FROM supabase_migrations.schema_migrations ORDER BY version DESC LIMIT 5;"
   ```

If you need to reset your local database:

```bash
supabase db reset
```

This drops all data and reapplies every migration from scratch.
To apply migrations to your production Supabase instance:
1. Link your local project to your Supabase project (first time only):

   ```bash
   supabase link --project-ref your-project-ref
   ```

2. Push migrations to production (use with caution!):

   ```bash
   supabase db push
   ```
1. Start the local Supabase instance:

   ```bash
   supabase start
   ```

   This launches all Supabase services in Docker containers.

2. The CLI will output details including:
   - Studio URL: http://localhost:54323
   - API URL: http://localhost:54321
   - DB URL: postgresql://postgres:postgres@localhost:54322/postgres
   - anon key, service_role key, etc.

3. Apply migrations to your local instance:

   ```bash
   supabase migration up
   ```

4. Test your changes in the local environment.

5. Create new migrations when needed:

   ```bash
   supabase migration new my_migration_name
   ```

   This creates a timestamped migration file in `supabase/migrations/`.

6. Edit your migration file, then apply it locally:

   ```bash
   supabase migration up
   ```

7. Reset the local database if needed:

   ```bash
   supabase db reset
   ```

   This wipes your local database and reapplies all migrations.
Once you've tested locally and are ready to deploy:
1. Link to your remote project (if not already done):

   ```bash
   supabase link --project-ref your-project-ref
   ```

2. Push migration changes to production:

   ```bash
   supabase db push
   ```

   Or apply specific migrations:

   ```bash
   supabase migration up --db-url "postgresql://postgres:[PASSWORD]@db.[PROJECT_REF].supabase.co:5432/postgres"
   ```
Remember to always test migrations locally before applying them to production.