Reverse-engineered API and client for controlling some of the best-selling ~$50 "toy" drones on Amazon from a computer, replacing the closed-source mobile apps they come with.
Nowadays, there are incredibly cheap "toy" drones available on Amazon that are basically pared-down clones of the early DJI Mavic. For only ~$50 you get a 1080p camera for FPV and recording, a tiny downward-facing optical-flow sensor for position and altitude hold, and a well-tuned flight profile out of the box. The only problem with drones like this is that they run closed-source firmware and are locked to a custom mobile app for control.
TurboDrone frees these drones from their "jail" by reverse-engineering how the mobile apps work, providing an open-source API and client for accessing the video feed and sending control commands. This transforms a highly capable $50 "toy" drone into something that can be programmatically controlled and used for all sorts of applications and experiments.
For beginners or first-time setup:
```bash
# Run the automated setup helper
python tools/setup_helper.py
```

This will:
- ✅ Check your Python version
- 🔍 Scan for drone networks automatically
- 📦 Install all dependencies (Python + Node.js)
- 🌐 Test connectivity to your drone
- ⚙️ Generate configuration files
- 📋 Provide next steps
If the auto-setup works, skip to Running the Application.
- Backend (FastAPI):
  - MJPEG at `/mjpeg`, WebSocket controls at `/ws`, SSE detections at `/detections`.
  - Status telemetry at `/status` (fps, queues, connection, recording, drone type).
  - Recordings API at `/recordings` (+ per-file download).
  - Protocols: S2X (S20/S29) and WiFi-UAV families for RC + video.
- Frontend (React + Vite):
  - Live video preview, keyboard/gamepad/TrackPoint controls.
  - Profile selector, headless toggle, rate control (S2X).
  - Autopilot demos: Square, Circle, Stop.
  - Detection overlay (optional YOLO), status overlay, recordings panel.
- Tooling:
  - Setup helper and protocol analyzer for onboarding and reverse engineering.
  - Tests: Backend pytest (9 passing), frontend unit tests (Vitest), E2E smoke (Playwright).
  - CI: GitHub Actions for backend and frontend (unit + E2E).
WiFi Camera Drone (ranked in order of recommendation):

| Brand | Model Number(s) | Compatibility | Purchase Link | Notes |
| --- | --- | --- | --- | --- |
| Loiley | S29 | ✅ Tested | Amazon | Best build quality, has servo for tilting camera (not implemented in API yet) |
| Hiturbo | S20 | ✅ Tested | Amazon, Alternate Amazon Listing | Original test platform, great build quality |
| FlyVista | V88 | ✅ Tested | Amazon | WiFi UAV protocol |
| ? | D16/GT3/V66 | ✅ Tested | Cheapest on AliExpress, Amazon | 20% smaller DJI Neo clone. Only really suitable for indoor flight. |
| Several Brands | E58 | ⚠️ Tested* | Amazon | Only the video feed has been physically tested with this drone; full support is very likely. |
| Karuisrc | K417 | ⚠️ Tested* | Amazon | Brushless motors! Similar body to the X29 but better build quality. |
| Several Brands | E88/E88 Pro | 🔍 Suspected | Amazon | |
| Several Brands | E99/E99 Pro | 🔍 Suspected | Amazon | |
| Swifsen | A35 | 🔍 Suspected | Amazon | Very small "toy" drone |
| Unknown | LSRC-S1S | 🔍 Suspected | | Mentioned in another reverse-engineering effort for the WiFi UAV app |
| Velcase | S101 | 🚧 TODO | Amazon | Lower-quality build, smaller battery and props than the S29 & S20 |
| Redrie | X29 | 🚧 TODO | Amazon | Working on this one now |

Legend:
- ✅ Tested: Drone has been physically run with TurboDrone to ensure compatibility
- ⚠️ Tested*: Partial testing completed, full compatibility likely
- 🔍 Suspected: APK analysis shows same libraries as tested drones
- 🚧 TODO: Different protocols, requires new implementation
- WiFi Dongle (an ALFA Network AWUS036ACM or similar is recommended)
  - The drone broadcasts its own WiFi network, so your computer must connect to it
  - Built-in WiFi adapters work, but a dedicated adapter is recommended for better range/stability
```bash
git clone https://github.com/yourusername/turbodrone.git
cd turbodrone

cd backend
python -m venv venv
# On Linux/macOS:
source venv/bin/activate
# On Windows:
venv\Scripts\activate
pip install -r requirements.txt
```

Windows users only:

```bash
pip install windows-curses
```

```bash
cd ../frontend
npm install
```

- Power on your drone and wait for the status lights to stabilize
- Connect to the drone's WiFi network
  - Network name format: `BRAND-MODEL-XXXXXX` (e.g., `S29-ABC123`)
  - Usually no password required
- Verify connection: your computer should get an IP on the drone's network
Create a `.env` file in the `backend/` directory.

For S2X drones (S20, S29):

```env
DRONE_TYPE=s2x
DRONE_IP=172.16.10.1
CONTROL_PORT=8080
VIDEO_PORT=8888
```

For WiFi UAV drones (V88, D16, E58):

```env
DRONE_TYPE=wifi_uav
DRONE_IP=192.168.169.1
CONTROL_PORT=8800
VIDEO_PORT=8800
```

Terminal 1 - Backend Server:
```bash
cd backend
uvicorn web_server:app
```

Terminal 2 - Frontend Development Server:

```bash
cd frontend
npm run dev
```

Open your browser to: http://localhost:5173
You should see:
- 📹 Live video feed from the drone
- 🎮 Control interface (Keyboard/Gamepad/TrackPoint)
- 🎛️ Profile selector (Normal/Precise/Aggressive)
- 📊 Connection status and telemetry
- ⏺️ Record button to capture sessions
- 🧭 Headless toggle and Rate selector (S2X)
- 🤖 Autopilot: Square and Circle maneuvers (open-loop)
Enable real-time detection overlays and a /detections SSE stream:
```bash
# Install extras (GPU recommended; CPU works too)
pip install ultralytics torch torchvision

# In a shell running the backend
export ENABLE_DETECTION=1
uvicorn web_server:app
```

In the web UI you'll see bounding boxes rendered over the video. A profile selector lets you switch sensitivity presets.
Detection data also streams over Server-Sent Events:
```
GET http://localhost:8000/detections  // text/event-stream
data: {"width":1280,"height":720,"detections":[
  {"class":0,"label":"person","confidence":0.93,"bbox":[x1,y1,x2,y2]}
]}
```
Bounding boxes are automatically scaled to match the "contain"-fitted video preview.
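The stream above can also be consumed from plain Python. Below is a minimal stdlib-only sketch, assuming a backend on the default port; the helper names (`parse_sse_data`, `scale_bbox`, `stream_detections`) are illustrative, while the endpoint path and payload shape come from the example above:

```python
"""Minimal consumer for the /detections SSE stream (stdlib only)."""
import json
import urllib.request


def parse_sse_data(line: str):
    """Return the JSON payload of an SSE 'data:' line, or None for other lines."""
    if not line.startswith("data:"):
        return None
    return json.loads(line[len("data:"):].strip())


def scale_bbox(bbox, src_w, src_h, dst_w, dst_h):
    """Map [x1, y1, x2, y2] from frame coordinates onto a 'contain'-fitted
    preview of size dst_w x dst_h (aspect ratio kept, letterbox centered)."""
    scale = min(dst_w / src_w, dst_h / src_h)
    off_x = (dst_w - src_w * scale) / 2
    off_y = (dst_h - src_h * scale) / 2
    x1, y1, x2, y2 = bbox
    return [x1 * scale + off_x, y1 * scale + off_y,
            x2 * scale + off_x, y2 * scale + off_y]


def stream_detections(url="http://localhost:8000/detections"):
    """Yield detection events from a running backend (blocking generator)."""
    with urllib.request.urlopen(url) as resp:
        for raw in resp:
            event = parse_sse_data(raw.decode("utf-8"))
            if event is not None:
                yield event
```

With the backend running, `for ev in stream_detections(): print(ev["detections"])` prints live detections; `scale_bbox` reproduces the "contain" letterbox math by scaling with the smaller axis ratio and centering with equal offsets.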
Create a local virtual environment (uses virtualenv to avoid system Python restrictions), install dev deps, and run pytest:
```bash
cd backend
python3 -m pip install --user --break-system-packages virtualenv
python3 -m virtualenv .venv
./.venv/bin/pip install -r requirements-dev.txt
./.venv/bin/pytest -q
```

What's covered:
- Control model scaling and profile switching
- Incremental control dynamics (S2X)
- Flight controller loop sends control packets
- Object detection service (using a stubbed YOLO model)
- FastAPI app imports and queue wiring
To preview the frontend locally:
```bash
cd frontend
npm install
npm run dev
```

Open http://localhost:5173.
Run headless smoke tests against the built preview server:
```bash
cd frontend
npm install
npx playwright install
npm run build
npm run test:e2e
```

What's covered:
- App loads and renders primary controls (Keyboard/Gamepad/TrackPoint)
- Command buttons (Takeoff/Land/Record)
- Video element present
```bash
cd backend
python main.py --drone-type s2x --with-video       # For S2X drones
python main.py --drone-type wifi_uav --with-video  # For WiFi UAV drones
```

CLI Controls:

- WASD: Pitch/Roll movement
- Space/Shift: Throttle up/down
- Q/E: Yaw left/right
- T: Takeoff
- L: Land
- R: Emergency stop
- Connect any standard gamepad (Xbox, PlayStation, generic)
- Move sticks to auto-detect
- Toggle between keyboard/gamepad with the control scheme button
- Movement: `WASD` for pitch/roll
- Throttle: `Space` (up) / `Shift` (down)
- Yaw: `Q` (left) / `E` (right)
- Commands: `T` (takeoff) / `L` (land) / `R` (emergency stop)
Switch between sensitivity profiles:
- Normal: Balanced response (default)
- Precise: Lower sensitivity, more stable
- Aggressive: Higher sensitivity, faster response
You can record sessions directly from the web UI.
- Click Record to start/stop. Files save under `backend/recordings/` with a timestamped name (MP4 if available, otherwise AVI/MJPEG fallback).
- The `RECORDINGS_DIR` env var controls the output location.
Backend WebSocket messages (used by the UI):
{ "type": "start_recording" }
{ "type": "stop_recording" }
{ "type": "toggle_recording" }- Toggle from the UI checkbox in the top-right panel.
- Backend WS messages:
{ "type": "set_headless", "enabled": true }
{ "type": "toggle_headless" }S2X and WiFi-UAV both support a headless bit; implementation mirrors the reverse-engineered apps.
- UI Rate select controls the S2X `speed` byte (Slow=10, Normal=20, Fast=30).
- Backend WS message:

```json
{ "type": "set_speed", "value": 20 }
```

Simple demo maneuvers that drive the sticks over time:
- UI buttons: Square, Circle, Stop.
- Backend WS messages:
{ "type": "auto_square", "leg_sec": 2.0, "speed": 0.3, "yaw_spd": 0.6 }
{ "type": "auto_circle", "duration": 12.0, "forward": 0.25, "yaw": 0.15, "roll": 0.0 }
{ "type": "auto_stop" }Note: These are open-loop patterns (no GPS/IMU feedback), intended for short, careful indoor demos.
- Axis tuning: per-axis expo, dual rates, deadzone, and slew-rate limiting.
- Throttle assist: hover trim with quick-set to current throttle.
- Soft landing: gentle controlled descent with automatic stick neutralization.
- Course lock and heading hold: maintain heading with zero yaw when centered.
- Dynamic braking: faster deceleration when reversing direction on pitch/roll.
- Path demos: figure-8, zig-zag, timed pans, with UI for amplitude/duration.
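As an illustration of the axis-tuning item above, a deadzone, RC-style expo, and a dual-rate scale can all live in one shaping function. The name and curve choice are illustrative, not the project's implementation:

```python
def shape_axis(x: float, deadzone: float = 0.05,
               expo: float = 0.3, rate: float = 1.0) -> float:
    """Shape a raw stick input in [-1, 1].

    - deadzone: ignore tiny stick noise around center
    - expo: blend linear and cubic response (0 = linear, 1 = full cubic)
    - rate: overall scale ("dual rates" switch between e.g. 0.5 and 1.0)
    """
    x = max(-1.0, min(1.0, x))
    if abs(x) < deadzone:
        return 0.0
    sign = 1.0 if x > 0 else -1.0
    # Re-map so output starts at 0 just past the deadzone edge.
    x = (abs(x) - deadzone) / (1.0 - deadzone)
    # Classic RC expo curve: softer near center, full authority at the ends.
    y = (1.0 - expo) * x + expo * x ** 3
    return sign * rate * y
```

Full deflection always reaches `rate` regardless of `expo`, so tuning feel near center never costs maximum authority.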
```bash
# Test drone connectivity
ping 172.16.10.1    # For S2X drones
ping 192.168.169.1  # For WiFi UAV drones

# Check if you're on the drone network
ip addr show  # Linux
ifconfig      # macOS
ipconfig      # Windows
```

- No video: Check that the drone's camera is enabled and not obstructed
- Choppy video: Move closer to drone, check WiFi interference
- Connection drops: Use dedicated WiFi adapter, minimize interference
Two GitHub Actions workflows are included:
- `.github/workflows/backend.yml`: Installs backend dev deps and runs `pytest`.
- `.github/workflows/frontend.yml`: Installs Node and Playwright browsers, builds the app, and runs Playwright smoke tests.
```bash
cd frontend
npm install
npm run test  # runs vitest unit tests
```

What's covered:
- Command buttons invoke callbacks
- Control scheme toggle disables Gamepad when disconnected
- Profile selector sends `set_profile` and persists to localStorage
- Endpoint: `GET http://localhost:8000/recordings` returns `{ directory, files: [{ name, size, mtime, url }] }`
- Download: Open `http://localhost:8000/recordings/<filename>` to stream or save the file.
- UI: Use the "Show Recordings" button to view and download existing captures.
- Endpoint: `GET http://localhost:8000/status` returns live telemetry:
  - `fps`, `connected`, `last_frame_age_ms`
  - Queue depths: `raw_queue`, `frame_queue`, `detection_queue`
  - `recording` (bool), `drone_type`
- UI: A small overlay shows connectivity, FPS, and queue sizes.
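Both the recordings and status endpoints are plain JSON over HTTP, so a stdlib-only client fits in a few lines. The helper names below are illustrative; the paths and payload keys are the ones documented above:

```python
"""Tiny stdlib client for the backend's /recordings and /status endpoints."""
import json
import urllib.request

BASE = "http://localhost:8000"


def get_json(path: str, base: str = BASE):
    """GET an endpoint and parse its JSON body."""
    with urllib.request.urlopen(f"{base}{path}") as resp:
        return json.load(resp)


def recording_url(name: str, base: str = BASE) -> str:
    """Direct download URL for one recording file."""
    return f"{base}/recordings/{name}"


def summarize_status(status: dict) -> str:
    """One-line summary suitable for logging during a flight session."""
    return (f"{status.get('drone_type', '?')}: connected={status.get('connected')} "
            f"fps={status.get('fps')} frame_queue={status.get('frame_queue')}")
```

With the backend running: `print(summarize_status(get_json("/status")))`, or iterate `get_json("/recordings")["files"]` and fetch each `recording_url(f["name"])`.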
- Unresponsive: Verify drone is in WiFi mode, not Bluetooth
- Erratic movement: Check control profile, recalibrate if needed
- Won't takeoff: Ensure drone is on level surface, check battery
```bash
# Backend with verbose logging
cd backend
uvicorn web_server:app --log-level debug

# Capture protocol data for analysis
python tools/protocol_analyzer.py --interactive
```

```bash
python tools/setup_helper.py
```

- Auto-detects drone type and network
- Installs dependencies automatically
- Generates configuration files
- Tests connectivity
```bash
python tools/protocol_analyzer.py --interactive
```

- Captures and analyzes drone network traffic
- Identifies packet patterns and structures
- Exports analysis for further study
- Perfect for adding support for new drones
- Video Processing: Object pooling, async JPEG encoding
- Network Layer: Connection pooling, batch processing
- Memory Management: Pre-allocated buffers, optimized GC
1. Scan for your drone:

   ```bash
   python tools/setup_helper.py --scan-only
   ```

2. Connect to drone WiFi and capture traffic:

   ```bash
   python tools/protocol_analyzer.py --interactive
   ```

3. Analyze patterns in the output:
   - Look for repeating packet structures
   - Note packet sizes (control vs video)
   - Identify sync patterns and headers

4. Create an experimental implementation:
   - Add it to the `experimental/` directory
   - Use existing protocols as templates
Tools you'll need: jadx (APK decompiler), Wireshark, and `tools/protocol_analyzer.py`.
Process:
- Download the drone's mobile app APK
- Decompile with jadx to analyze Java code
- Use Wireshark + protocol_analyzer.py to capture packets
- Look for patterns:
  - Control packets: Usually 10-50 bytes, sent at 80 Hz
  - Video packets: Variable size, contain JPEG frame slices
  - Headers: Often 2-8 bytes with sync patterns
- Create protocol adapter based on findings
- Test in experimental directory before integration
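The size and header heuristics above can be turned into a first-pass classifier for captured UDP payloads. This is a sketch with illustrative names and thresholds (the video-size cutoff is a guess; tune it against your own captures):

```python
"""First-pass triage of captured drone UDP payloads (illustrative)."""
from collections import Counter


def classify_packet(payload: bytes) -> str:
    """Guess a payload's role by size: control packets are small and roughly
    fixed-size, video packets are large JPEG frame slices."""
    if 10 <= len(payload) <= 50:
        return "control?"
    if len(payload) > 500:  # illustrative cutoff for frame slices
        return "video?"
    return "unknown"


def likely_sync_pattern(payloads, prefix_len=4):
    """Most common packet prefix across a capture: a candidate sync header."""
    counts = Counter(p[:prefix_len] for p in payloads if len(p) >= prefix_len)
    return counts.most_common(1)[0][0] if counts else None
```

Feeding it payloads exported from Wireshark or `protocol_analyzer.py` quickly separates the two traffic classes and surfaces repeated header bytes worth decoding first.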
Watch this reverse engineering walkthrough video for the process used with the Hiturbo S20.
Current State:
- ✅ Video Feed: Solid with automatic reconnection
- ✅ Controls: Excellent via web interface
- ✅ Web Client: Gamepad, keyboard, TrackPoint support
- ✅ Reverse Engineering: Automated tools for new drones
- 🔧 Performance: Optimized for low-latency streaming
Active Development:
- Adding support for more drones from Amazon's best-selling list
- Performance optimizations for video processing
- Enhanced reverse engineering tools
Method 1: Use Our Tools (Recommended)
```bash
# Step 1: Detect and analyze
python tools/setup_helper.py --scan-only
python tools/protocol_analyzer.py --interactive

# Step 2: Create experimental implementation
# Add your findings to the experimental/ directory

# Step 3: Submit findings
# Create a GitHub issue with analysis results
```

Method 2: Manual Analysis
- Download the drone's APK from APKMirror or similar
- Decompile with jadx, look for network protocols
- Use Wireshark to capture actual traffic
- Implement protocol adapter based on patterns
- Submit pull request with implementation
```bash
# Clone repo
git clone https://github.com/yourusername/turbodrone.git
cd turbodrone

# Setup development environment
python tools/setup_helper.py --skip-deps  # if you want manual control

# Make changes and test
# ... your changes ...

# Run tests (if available)
cd backend && python -m pytest
cd frontend && npm test

# Submit pull request
```

- Architecture Overview: Detailed system architecture
- CLAUDE.md: Development guidelines and patterns
- Tools Documentation: Reverse engineering tools
- Research Notes: Protocol analysis findings
Flight Safety:
- Always fly in open areas away from people and property
- Respect local drone regulations and airspace restrictions
- Check battery levels before flight
- Land immediately if you lose video feed or control
Legal Notice:
- This project is for educational and research purposes
- Ensure compliance with local laws regarding drone operation
- Respect manufacturer warranties and terms of service
- Use responsibly and at your own risk
For drones with limited or experimental support, see the experimental/ directory. This includes early-stage implementations that haven't been fully integrated into the main architecture.
Having Issues?
- Check the troubleshooting section
- Run diagnostics: `python tools/setup_helper.py --scan-only`
- Search existing GitHub issues
- Create a new issue with:
- Your drone model
- Operating system
- Error messages
- Output from protocol analyzer if available
Want to Help?
- Test with new drone models
- Improve documentation
- Add performance optimizations
- Contribute reverse engineering findings
TurboDrone - Liberating toy drones from their proprietary mobile apps since 2024! 🚁✨
