```sh
cp .env.example .env
```

```
PORT=3000
NODE_ENV=development
DATABASE_PATH=./data/studiobot.db
```

The database file will be automatically created on first run.

```
# Temporary storage for downloaded videos
TEMP_VIDEO_DIR=./temp/videos

# Output directories
OUTPUT_CLIPS_DIR=./output/clips
OUTPUT_SHORTS_DIR=./output/shorts
OUTPUT_THUMBNAILS_DIR=./output/thumbnails
```

Directories are created automatically if they don't exist.

```
LOG_LEVEL=debug    # debug, info, warn, error
LOG_FILE=./logs/app.log
```

- Go to Google Cloud Console
- Create new project (StudioBot.ai)
- Enable YouTube Data API v3
- Create OAuth 2.0 credentials (Authorized redirect URI: http://localhost:3000/auth/youtube/callback)
- Add API key and credentials to .env:
```
YOUTUBE_API_KEY=your_api_key_here
YOUTUBE_CLIENT_ID=your_client_id.apps.googleusercontent.com
YOUTUBE_CLIENT_SECRET=your_client_secret_here
```

- Go to Twitch Developer Console
- Click "Create Application"
- Select "Application Type: Public"
- Accept terms and create
- Generate OAuth Token
- Add to .env:
```
TWITCH_CLIENT_ID=your_client_id_here
TWITCH_ACCESS_TOKEN=your_access_token_here
```

- Visit Rumble Creator Studio
- API Settings → Generate API Key
- Add to .env:
```
RUMBLE_API_KEY=your_api_key_here
RUMBLE_CHANNEL_ID=your_channel_id_here
```

```
JWT_SECRET=your_jwt_secret_key_here
SESSION_SECRET=your_session_secret_here
```

Generate random secrets:

```sh
# On macOS/Linux
openssl rand -base64 32

# Or use Node.js
node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"
```

StudioBot.ai uses FFmpeg for video processing. Install it before running:
```sh
# macOS
brew install ffmpeg

# Debian/Ubuntu
sudo apt-get install ffmpeg
```

On Windows:

- Download from ffmpeg.org
- Extract and add to PATH

Verify the installation:

```sh
ffmpeg -version
```
If using Docker, FFmpeg is included in the base image.
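Since a missing FFmpeg binary only surfaces once the first video job runs, the server can check for it at startup instead. A minimal sketch using Node's built-in `child_process` (the `checkFfmpeg` name is ours, not part of the project):

```typescript
import { spawn } from "node:child_process";

// Resolves true if `ffmpeg -version` exits successfully, false otherwise.
// Never throws, so it is safe to call unconditionally during startup.
function checkFfmpeg(): Promise<boolean> {
  return new Promise((resolve) => {
    const proc = spawn("ffmpeg", ["-version"], { stdio: "ignore" });
    proc.on("error", () => resolve(false)); // binary not found or not executable
    proc.on("close", (code) => resolve(code === 0));
  });
}

checkFfmpeg().then((ok) => {
  if (!ok) {
    console.error("FFmpeg not found in PATH -- install it before starting the server.");
  }
});
```

Failing fast here gives a clear error instead of a cryptic `ffmpeg command not found` deep inside a processing job (see Troubleshooting below).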
The database is automatically initialized on server startup. The schema includes:
- Users table
- Videos table
- Clips table
- Shorts table
- Thumbnails table
- Platforms table
- Distributions table
Database file location: ./data/studiobot.db
Back up or reset the database:

```sh
# Back up
cp ./data/studiobot.db ./data/studiobot.db.backup

# Reset
rm ./data/studiobot.db
# Restart server to reinitialize
```

To tune video processing, edit the clip and short pipelines in their respective service files:
For Clip Creation (src/services/clip.service.ts):
```ts
// Customize FFmpeg parameters
const command = `ffmpeg -i "${inputPath}" -ss ${startTime} -to ${endTime} -c:v libx264 -preset fast "${outputPath}"`;
```

For Short Conversion (src/services/short.service.ts):
```ts
// Custom scaling and padding for vertical format
const command = `ffmpeg -i "${inputPath}" -vf "scale=1080:1920:force_original_aspect_ratio=decrease,pad=1080:1920:-1:-1:color=black" "${outputPath}"`;
```

Modify log levels per module by editing src/utils/logger.ts:
```ts
class Logger {
  private currentLogLevel: LogLevel;

  constructor() {
    const envLevel = (process.env.LOG_LEVEL || 'info').toUpperCase();
    this.currentLogLevel = LogLevel[envLevel as keyof typeof LogLevel] as LogLevel;
  }
}
```

For production deployments:

- Environment: Set `NODE_ENV=production`
- Security:
  - Generate strong `JWT_SECRET` and `SESSION_SECRET`
  - Use environment variables for all secrets
  - Enable HTTPS
- Logging: Set `LOG_LEVEL=info` (reduces verbosity)
- Database: Consider PostgreSQL for production
- Storage: Use cloud storage (AWS S3, Google Cloud Storage) for media files
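The secrets rules above can be enforced with a small startup guard. A minimal sketch — the function name, the `your_` placeholder heuristic, and the 16-character minimum are our assumptions, not project conventions:

```typescript
// Returns the keys whose values are missing, still placeholders, or too short.
// The "your_" check and 16-character minimum are illustrative heuristics.
function findWeakSecrets(
  env: Record<string, string | undefined>,
  keys: string[]
): string[] {
  return keys.filter((key) => {
    const value = env[key];
    return !value || value.includes("your_") || value.length < 16;
  });
}

const weak = findWeakSecrets(process.env, ["JWT_SECRET", "SESSION_SECRET"]);
if (weak.length > 0) {
  console.error(`Weak or missing secrets: ${weak.join(", ")}`);
  // a real startup path would call process.exit(1) here
}
```

This catches the common mistake of shipping the `.env.example` placeholders to production.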
Create Dockerfile:
```dockerfile
FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY dist ./dist
EXPOSE 3000
CMD ["node", "dist/index.js"]
```

Create docker-compose.yml:
```yaml
version: '3.8'
services:
  studiobot:
    build: .
    ports:
      - "3000:3000"
    environment:
      NODE_ENV: production
      DATABASE_PATH: /app/data/studiobot.db
    volumes:
      - ./data:/app/data
      - ./output:/app/output
      - ./logs:/app/logs
```

Run with Docker:
```sh
docker-compose up
```

Optionally, add database indexes:

```sql
-- Create indexes for faster queries
CREATE INDEX idx_videos_user ON videos(user_id);
CREATE INDEX idx_clips_video ON clips(video_id);
CREATE INDEX idx_clips_user ON clips(user_id);
CREATE INDEX idx_shorts_user ON shorts(user_id);
CREATE INDEX idx_distributions_content ON distributions(content_id);
CREATE INDEX idx_distributions_platform ON distributions(platform_name);
```

For large video processing, set the Node.js memory limit:
```sh
node --max-old-space-size=4096 dist/index.js
```

Adjust FFmpeg presets in the services for speed vs. quality:

- `ultrafast` - speed priority
- `fast` - balanced (default)
- `slow` - quality priority
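Rather than editing the services by hand, the tradeoff can be exposed as a config option. A small sketch — the mode names and function are our invention, though the preset values themselves are FFmpeg's:

```typescript
type QualityMode = "speed" | "balanced" | "quality";

// Maps a user-facing quality mode to an x264 -preset flag.
const PRESETS: Record<QualityMode, string> = {
  speed: "ultrafast",
  balanced: "fast", // matches the default used in clip.service.ts
  quality: "slow",
};

function presetFlag(mode: QualityMode = "balanced"): string {
  return `-preset ${PRESETS[mode]}`;
}

console.log(presetFlag("speed")); // → "-preset ultrafast"
```

The flag slots directly into the FFmpeg command strings shown earlier in place of the hard-coded `-preset fast`.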
Problem: `SQLITE_CANTOPEN` error
Solution: Run `mkdir -p ./data` before starting the server.

Problem: `ffmpeg: command not found`
Solution: Install FFmpeg and verify it is in PATH: `which ffmpeg`

Problem: `EADDRINUSE :::3000`
Solution: Change `PORT` in .env or kill the process: `lsof -ti:3000 | xargs kill -9`

Problem: Cannot reach http://localhost:3000
Solution: Verify the server is running: `curl http://localhost:3000/health`
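The last check can be scripted instead of run by hand. A sketch of a health probe using Node's built-in `http` module — the URL assumes the default `PORT=3000`, and `probeHealth` is our name for it:

```typescript
import * as http from "node:http";

// Resolves true when the endpoint answers 200 with {"status":"ok"}.
// Never rejects: connection errors, timeouts, and bad JSON all yield false.
function probeHealth(url: string, timeoutMs = 2000): Promise<boolean> {
  return new Promise((resolve) => {
    const req = http.get(url, { timeout: timeoutMs }, (res) => {
      let body = "";
      res.on("data", (chunk) => (body += chunk));
      res.on("end", () => {
        try {
          resolve(res.statusCode === 200 && JSON.parse(body).status === "ok");
        } catch {
          resolve(false); // non-JSON response
        }
      });
    });
    req.on("timeout", () => req.destroy()); // destroy triggers the error handler
    req.on("error", () => resolve(false));
  });
}

probeHealth("http://localhost:3000/health").then((up) => {
  console.log(up ? "server is up" : "server is down");
});
```

The same probe works as a Docker `HEALTHCHECK` or a cron-driven uptime monitor.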
- Never commit `.env` - add it to `.gitignore`
- Rotate secrets - periodically regenerate API keys
- Check logs - review `./logs/app.log` for errors
- Validate inputs - the API validates all inputs
- Use HTTPS - required for production
- Secure the database - use strong file permissions:

```sh
chmod 600 ./data/studiobot.db
```
Check server health:

```sh
curl http://localhost:3000/health
```

Expected response:

```json
{
  "status": "ok",
  "timestamp": "2024-02-08T10:30:00.000Z"
}
```

Tail the logs:

```sh
tail -f ./logs/app.log
```

Check the database file:

```sh
ls -lh ./data/studiobot.db
```

Backup script:

```sh
#!/bin/bash
BACKUP_DIR="./backups"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p $BACKUP_DIR
cp ./data/studiobot.db $BACKUP_DIR/studiobot_$TIMESTAMP.db
```

Restore from a backup:

```sh
cp ./backups/studiobot_YYYYMMDD_HHMMSS.db ./data/studiobot.db
```

- ✅ Configure environment variables
- ✅ Install FFmpeg
- ✅ Start the server
- ✅ Create a test user account
- ✅ Upload a test video
- ✅ Test the API workflow