TurboDrone

Reverse-engineered API and client for controlling some of the best-selling ~$50 "toy" drones on Amazon from a computer, replacing the closed-source mobile apps they come with.

S20 Drone Short Clip

Introduction

Nowadays, there are incredibly cheap "toy" drones available on Amazon that are essentially pared-down clones of the early DJI Mavic. For only ~$50 you get a 1080p camera for FPV and recording, a tiny downward-facing optical-flow sensor for position and altitude hold, and a well-tuned flight profile out of the box. The only problem with drones like this is that they run closed-source firmware and can only be controlled by a proprietary mobile app.

TurboDrone frees these drones from their "jail" by reverse-engineering how the mobile apps work, providing an open-source API and client for accessing the video feed and sending control commands. This transforms a highly capable $50 "toy" drone into something that can be programmatically controlled and used for all sorts of applications and experiments.

🚀 Quick Start (New!)

For beginners or first-time setup:

# Run the automated setup helper
python tools/setup_helper.py

This will:

  • ✅ Check your Python version
  • 🔍 Scan for drone networks automatically
  • 📦 Install all dependencies (Python + Node.js)
  • 🌐 Test connectivity to your drone
  • ⚙️ Generate configuration files
  • 📋 Provide next steps

If the auto-setup works, skip to Running the Application.

Project Status

  • Backend (FastAPI):
    • MJPEG at /mjpeg, WebSocket controls at /ws, SSE detections at /detections.
    • Status telemetry at /status (fps, queues, connection, recording, drone type).
    • Recordings API at /recordings (+ per-file download).
    • Protocols: S2X (S20/S29) and WiFi-UAV families for RC + video.
  • Frontend (React + Vite):
    • Live video preview, keyboard/gamepad/TrackPoint controls.
    • Profile selector, headless toggle, rate control (S2X).
    • Autopilot demos: Square, Circle, Stop.
    • Detection overlay (optional YOLO), status overlay, recordings panel.
  • Tooling:
    • Setup helper and protocol analyzer for onboarding and reverse engineering.
    • Tests: Backend pytest (9 passing), Frontend unit tests (Vitest), E2E smoke (Playwright).
    • CI: GitHub Actions for backend and frontend (unit + E2E).

Hardware

Supported Drones

  • WiFi Camera Drone (ranked in order of recommendation):

    | Brand | Model Number(s) | Compatibility | Purchase Link | Notes |
    |---|---|---|---|---|
    | Loiley | S29 | ✅ Tested | Amazon | Best build quality; has a servo for tilting the camera (not implemented in the API yet) |
    | Hiturbo | S20 | ✅ Tested | Amazon, Alternate Amazon Listing | Original test platform, great build quality |
    | FlyVista | V88 | ✅ Tested | Amazon | WiFi UAV protocol |
    | ? | D16/GT3/V66 | ✅ Tested | cheapest on AliExpress, Amazon | 20% smaller DJI Neo clone; really only suitable for indoor flight |
    | Several Brands | E58 | ⚠️ Tested* | Amazon | Video feed has been tested physically; full support very likely |
    | Karuisrc | K417 | ⚠️ Tested* | Amazon | Brushless motors! Similar body to the X29 but better build quality |
    | Several Brands | E88/E88 Pro | 🔍 Suspected | Amazon | |
    | Several Brands | E99/E99 Pro | 🔍 Suspected | Amazon | |
    | Swifsen | A35 | 🔍 Suspected | Amazon | Very small "toy" drone |
    | Unknown | LSRC-S1S | 🔍 Suspected | | Mentioned in another reverse-engineering effort for the WiFi UAV app |
    | Velcase | S101 | 🚧 TODO | Amazon | Lower-quality build; smaller battery and props than the S29 & S20 |
    | Redrie | X29 | 🚧 TODO | Amazon | Working on this one now |

    Legend:

    • ✅ Tested: Drone has been physically run with TurboDrone to confirm compatibility
    • ⚠️ Tested*: Partial testing completed, full compatibility likely
    • 🔍 Suspected: APK analysis shows same libraries as tested drones
    • 🚧 TODO: Different protocols, requires new implementation

Required Hardware

  • WiFi Dongle (recommend ALFA Network AWUS036ACM or similar)
    • Drone broadcasts its own WiFi network so your computer will have to connect to it
    • Built-in WiFi adapters work but dedicated adapter recommended for better range/stability

Manual Setup

Prerequisites: Python 3, Node.js with npm, and git installed.

1. Clone and Navigate

git clone https://github.com/yourusername/turbodrone.git
cd turbodrone

2. Backend Setup

cd backend
python -m venv venv

# On Linux/macOS:
source venv/bin/activate

# On Windows:
venv\Scripts\activate

pip install -r requirements.txt

Windows users only:

pip install windows-curses

3. Frontend Setup

cd ../frontend
npm install

4. Hardware Connection

  1. Power on your drone and wait for the status lights to stabilize
  2. Connect to the drone's WiFi network
    • Network name format: BRAND-MODEL-XXXXXX (e.g., S29-ABC123)
    • Usually no password required
  3. Verify connection: Your computer should get an IP in the drone's network

5. Configuration

Create a .env file in the backend/ directory:

For S2X drones (S20, S29):

DRONE_TYPE=s2x
DRONE_IP=172.16.10.1
CONTROL_PORT=8080
VIDEO_PORT=8888

For WiFi UAV drones (V88, D16, E58):

DRONE_TYPE=wifi_uav
DRONE_IP=192.168.169.1
CONTROL_PORT=8800
VIDEO_PORT=8800
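If you want to sanity-check a `.env` file from Python before launching the backend, here is a minimal sketch. The parser below is illustrative only; the backend uses its own configuration loader.

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    return config

example = """\
DRONE_TYPE=s2x
DRONE_IP=172.16.10.1
CONTROL_PORT=8080
VIDEO_PORT=8888
"""
cfg = parse_env(example)
print(cfg["DRONE_IP"])  # 172.16.10.1
```

Reading the file with `parse_env(open("backend/.env").read())` lets you verify the values (and, say, `ping` the `DRONE_IP`) before starting the servers.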

Running the Application

Method 1: Web Interface (Recommended)

Terminal 1 - Backend Server:

cd backend
uvicorn web_server:app

Terminal 2 - Frontend Development Server:

cd frontend
npm run dev

Open your browser to: http://localhost:5173

You should see:

  • 📹 Live video feed from the drone
  • 🎮 Control interface (Keyboard/Gamepad/TrackPoint)
  • 🎛️ Profile selector (Normal/Precise/Aggressive)
  • 📊 Connection status and telemetry
  • ⏺️ Record button to capture sessions
  • 🧭 Headless toggle and Rate selector (S2X)
  • 🤖 Autopilot: Square and Circle maneuvers (open-loop)

Optional: Object Detection (Experimental)

Enable real-time detection overlays and a /detections SSE stream:

# Install extras (GPU recommended; CPU works too)
pip install ultralytics torch torchvision

# In a shell running the backend
export ENABLE_DETECTION=1
uvicorn web_server:app

In the web UI you’ll see bounding boxes rendered over the video. A profile selector lets you switch sensitivity presets.

Detection data also streams over Server-Sent Events:

GET http://localhost:8000/detections  // text/event-stream

data: {"width":1280,"height":720,"detections":[
  {"class":0,"label":"person","confidence":0.93,"bbox":[x1,y1,x2,y2]}
]}

Bounding boxes are automatically scaled to match the "contain"-fitted video preview.
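A sketch of consuming one SSE `data:` line and rescaling a detection box for a "contain"-fitted preview. The field names follow the example payload above; the viewport size is an arbitrary illustration.

```python
import json

def scale_bbox(bbox, src_w, src_h, view_w, view_h):
    """Scale a [x1, y1, x2, y2] box from source-frame coordinates into a
    'contain'-fitted viewport (aspect ratio preserved, letterboxed)."""
    scale = min(view_w / src_w, view_h / src_h)
    off_x = (view_w - src_w * scale) / 2   # horizontal letterbox offset
    off_y = (view_h - src_h * scale) / 2   # vertical letterbox offset
    x1, y1, x2, y2 = bbox
    return (x1 * scale + off_x, y1 * scale + off_y,
            x2 * scale + off_x, y2 * scale + off_y)

# One event line as it arrives on the /detections stream
line = ('data: {"width":1280,"height":720,"detections":'
        '[{"class":0,"label":"person","confidence":0.93,"bbox":[100,100,300,400]}]}')
payload = json.loads(line[len("data: "):])
det = payload["detections"][0]
print(scale_bbox(det["bbox"], payload["width"], payload["height"], 640, 480))
# (50.0, 110.0, 150.0, 260.0) -- 1280x720 contained in 640x480 leaves 60px bars
```

This mirrors the math the UI overlay has to do: a 1280×720 frame shown "contain"-fitted in a 640×480 element is scaled by 0.5 and offset 60 px vertically.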

Testing

Backend unit tests

Create a local virtual environment (uses virtualenv to avoid system Python restrictions), install dev deps, and run pytest:

cd backend
python3 -m pip install --user --break-system-packages virtualenv
python3 -m virtualenv .venv
./.venv/bin/pip install -r requirements-dev.txt
./.venv/bin/pytest -q

What’s covered:

  • Control model scaling and profile switching
  • Incremental control dynamics (S2X)
  • Flight controller loop sends control packets
  • Object detection service (using a stubbed YOLO model)
  • FastAPI app imports and queue wiring

Frontend

Frontend unit tests run separately with Vitest (see the Frontend Unit Tests section). To preview the app locally:

cd frontend
npm install
npm run dev

Open http://localhost:5173.

Frontend E2E (Playwright)

Run headless smoke tests against the built preview server:

cd frontend
npm install
npx playwright install
npm run build
npm run test:e2e

What’s covered:

  • App loads and renders primary controls (Keyboard/Gamepad/TrackPoint)
  • Command buttons (Takeoff/Land/Record)
  • Video element present

Method 2: CLI Interface

cd backend
python main.py --drone-type s2x --with-video        # For S2X drones
python main.py --drone-type wifi_uav --with-video   # For WiFi UAV drones

CLI Controls:

  • WASD: Pitch/Roll movement
  • Space/Shift: Throttle up/down
  • Q/E: Yaw left/right
  • T: Takeoff
  • L: Land
  • R: Emergency stop

🎮 Control Options

Gamepad Support

  1. Connect any standard gamepad (Xbox, PlayStation, generic)
  2. Move sticks to auto-detect
  3. Toggle between keyboard/gamepad with the control scheme button

Keyboard Controls

  • Movement: WASD for pitch/roll
  • Throttle: Space (up) / Shift (down)
  • Yaw: Q (left) / E (right)
  • Commands: T (takeoff) / L (land) / R (emergency stop)

Control Profiles

Switch between sensitivity profiles:

  • Normal: Balanced response (default)
  • Precise: Lower sensitivity, more stable
  • Aggressive: Higher sensitivity, faster response
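Conceptually, a profile is just a sensitivity multiplier applied to normalized stick input. The multipliers below are hypothetical round numbers for illustration; the real presets live in the backend's control model.

```python
# Hypothetical sensitivity multipliers -- not the backend's actual values.
PROFILES = {"normal": 1.0, "precise": 0.5, "aggressive": 1.5}

def apply_profile(stick: float, profile: str) -> float:
    """Scale a normalized stick input (-1..1) and clamp back into range."""
    scaled = stick * PROFILES[profile]
    return max(-1.0, min(1.0, scaled))

print(apply_profile(0.8, "precise"))     # 0.4
print(apply_profile(0.8, "aggressive"))  # 1.0 (clamped from 1.2)
```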

Recording

You can record sessions directly from the web UI.

  • Click Record to start/stop. Files save under backend/recordings/ with a timestamped name (MP4 if available, otherwise AVI/MJPEG fallback).
  • Env var RECORDINGS_DIR controls output location.

Backend WebSocket messages (used by the UI):

{ "type": "start_recording" }
{ "type": "stop_recording" }
{ "type": "toggle_recording" }
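These messages are plain JSON objects, so a small helper can serialize any of them. The helper below is a sketch; the `/ws` endpoint path comes from the Project Status section, while the use of the third-party `websockets` package in the comment is an assumption about how you might connect, not part of this repo.

```python
import json

def ws_message(msg_type: str, **fields) -> str:
    """Serialize a control message in the backend's JSON shape."""
    return json.dumps({"type": msg_type, **fields})

print(ws_message("toggle_recording"))        # {"type": "toggle_recording"}
print(ws_message("set_speed", value=20))     # {"type": "set_speed", "value": 20}

# A client could then push these over the control socket, e.g. with the
# `websockets` package (assumed, not bundled with this repo):
#   async with websockets.connect("ws://localhost:8000/ws") as ws:
#       await ws.send(ws_message("start_recording"))
```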

Movement Features

Headless Mode (Drone-side)

  • Toggle from the UI checkbox in the top-right panel.
  • Backend WS messages:
{ "type": "set_headless", "enabled": true }
{ "type": "toggle_headless" }

S2X and WiFi-UAV both support a headless bit; implementation mirrors the reverse-engineered apps.

Rate/Speed (S2X only)

  • UI Rate select controls the S2X speed byte (Slow=10, Normal=20, Fast=30).
  • Backend WS message:
{ "type": "set_speed", "value": 20 }

Autopilot Maneuvers (Open-loop)

Simple demo maneuvers that drive sticks over time:

  • UI buttons: Square, Circle, Stop.
  • Backend WS messages:
{ "type": "auto_square", "leg_sec": 2.0, "speed": 0.3, "yaw_spd": 0.6 }
{ "type": "auto_circle", "duration": 12.0, "forward": 0.25, "yaw": 0.15, "roll": 0.0 }
{ "type": "auto_stop" }

Note: These are open-loop patterns (no GPS/IMU feedback), intended for short, careful indoor demos.
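To make "open-loop" concrete, here is a sketch of how a square could be planned as a timed sequence of stick setpoints. The parameter names mirror the `auto_square` message above, but the plan structure and the 90-degree timing heuristic are hypothetical, not the backend's actual implementation.

```python
def square_plan(leg_sec: float, speed: float, yaw_spd: float):
    """Build an open-loop plan: fly a leg, then yaw ~90 degrees, four times.
    Each step is (duration_sec, pitch, roll, yaw) with sticks in -1..1."""
    turn_sec = 0.25 / max(yaw_spd, 1e-6)  # crude 90-degree estimate; no IMU feedback
    plan = []
    for _ in range(4):
        plan.append((leg_sec, speed, 0.0, 0.0))     # forward leg
        plan.append((turn_sec, 0.0, 0.0, yaw_spd))  # yaw in place
    return plan

plan = square_plan(leg_sec=2.0, speed=0.3, yaw_spd=0.6)
print(len(plan))  # 8 steps: 4 legs + 4 turns
```

Because each step is purely time-based, any drift (wind, trim, yaw-rate error) accumulates over the whole pattern, which is why these demos are only suitable for short, supervised indoor flights.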

Roadmap (Planned Movement Features)

  • Axis tuning: per-axis expo, dual rates, deadzone, and slew-rate limiting.
  • Throttle assist: hover trim with quick-set to current throttle.
  • Soft landing: gentle controlled descent with automatic stick neutralization.
  • Course lock and heading hold: maintain heading with zero yaw when centered.
  • Dynamic braking: faster deceleration when reversing direction on pitch/roll.
  • Path demos: figure-8, zig-zag, timed pans, with UI for amplitude/duration.

🔧 Troubleshooting

Network Issues

# Test drone connectivity
ping 172.16.10.1    # For S2X drones
ping 192.168.169.1  # For WiFi UAV drones

# Check if you're on the drone network
ip addr show        # Linux
ifconfig            # macOS
ipconfig            # Windows

Video Feed Problems

  1. No video: Check drone camera is enabled and not obstructed
  2. Choppy video: Move closer to drone, check WiFi interference
  3. Connection drops: Use dedicated WiFi adapter, minimize interference

Continuous Integration

Two GitHub Actions workflows are included:

  • .github/workflows/backend.yml: Installs backend dev deps and runs pytest.
  • .github/workflows/frontend.yml: Installs Node and Playwright browsers, builds the app, and runs Playwright smoke tests.

Frontend Unit Tests (Vitest)

cd frontend
npm install
npm run test  # runs vitest unit tests

What’s covered:

  • Command buttons invoke callbacks
  • Control scheme toggle disables Gamepad when disconnected
  • Profile selector sends set_profile and persists to localStorage

Recordings List & Download

  • Endpoint: GET http://localhost:8000/recordings returns { directory, files: [{ name, size, mtime, url }] }
  • Download: Open http://localhost:8000/recordings/<filename> to stream or save the file.
  • UI: Use the “Show Recordings” button to view and download existing captures.

Status/Telemetry Endpoint

  • Endpoint: GET http://localhost:8000/status returns live telemetry:
    • fps, connected, last_frame_age_ms
    • queue depths: raw_queue, frame_queue, detection_queue
    • recording (bool), drone_type
  • UI: A small overlay shows connectivity, FPS and queue sizes.
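A small sketch of turning one `/status` payload into a health summary, as a monitoring script might. The field names follow the list above; the 1-second staleness threshold and the summary format are illustrative choices, not the backend's.

```python
import json

def summarize_status(payload: dict) -> str:
    """One-line health summary from the /status fields listed above.
    The staleness threshold is an illustrative choice."""
    stale = payload["last_frame_age_ms"] > 1000
    state = "OK" if payload["connected"] and not stale else "DEGRADED"
    return f"{state} fps={payload['fps']} raw_q={payload['raw_queue']}"

sample = json.loads(
    '{"fps": 28.5, "connected": true, "last_frame_age_ms": 35,'
    ' "raw_queue": 2, "frame_queue": 1, "detection_queue": 0,'
    ' "recording": false, "drone_type": "s2x"}'
)
print(summarize_status(sample))  # OK fps=28.5 raw_q=2
```

In practice you would fetch the payload with an HTTP GET to `http://localhost:8000/status` and feed the decoded JSON straight into a function like this.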

Control Issues

  1. Unresponsive: Verify drone is in WiFi mode, not Bluetooth
  2. Erratic movement: Check control profile, recalibrate if needed
  3. Won't take off: Ensure the drone is on a level surface, check battery

Get Debug Information

# Backend with verbose logging
cd backend
uvicorn web_server:app --log-level debug

# Capture protocol data for analysis
python tools/protocol_analyzer.py --interactive

🚀 New Features & Tools

Automated Setup Helper

python tools/setup_helper.py
  • Auto-detects drone type and network
  • Installs dependencies automatically
  • Generates configuration files
  • Tests connectivity

Protocol Analyzer (For Reverse Engineering)

python tools/protocol_analyzer.py --interactive
  • Captures and analyzes drone network traffic
  • Identifies packet patterns and structures
  • Exports analysis for further study
  • Perfect for adding support for new drones

Performance Optimizations

  • Video Processing: Object pooling, async JPEG encoding
  • Network Layer: Connection pooling, batch processing
  • Memory Management: Pre-allocated buffers, optimized GC

🧑‍💻 Reverse Engineering New Drones

Quick Start for Beginners

  1. Scan for your drone:

    python tools/setup_helper.py --scan-only
  2. Connect to drone WiFi and capture traffic:

    python tools/protocol_analyzer.py --interactive
  3. Analyze patterns in the output:

    • Look for repeating packet structures
    • Note packet sizes (control vs video)
    • Identify sync patterns and headers
  4. Create experimental implementation:

    • Add to experimental/ directory
    • Use existing protocols as templates

Advanced Reverse Engineering

Tools you'll need: jadx (APK decompiler), Wireshark (packet capture), and tools/protocol_analyzer.py from this repo.

Process:

  1. Download the drone's mobile app APK
  2. Decompile with jadx to analyze Java code
  3. Use Wireshark + protocol_analyzer.py to capture packets
  4. Look for patterns:
    • Control packets: Usually 10-50 bytes, sent at 80Hz
    • Video packets: Variable size, contain JPEG frame slices
    • Headers: Often 2-8 bytes with sync patterns
  5. Create protocol adapter based on findings
  6. Test in experimental directory before integration
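The packet characteristics above can be sketched in code. Everything about the layout below (sync byte, field order, XOR checksum, trailer) is entirely hypothetical; a real layout has to come out of your jadx/Wireshark analysis.

```python
def build_control_packet(throttle: int, yaw: int, pitch: int, roll: int) -> bytes:
    """Hypothetical 7-byte control packet: sync byte, four axis bytes
    (0-255, 128 = stick centered), XOR checksum, trailer byte."""
    body = bytes([throttle, yaw, pitch, roll])
    checksum = 0
    for b in body:
        checksum ^= b
    return b"\x66" + body + bytes([checksum]) + b"\x99"

pkt = build_control_packet(128, 128, 200, 128)  # pitch forward, rest centered
print(pkt.hex())

# A real client would stream packets like this over UDP at ~80 Hz:
#   sock.sendto(pkt, (DRONE_IP, CONTROL_PORT)); time.sleep(1 / 80)
```

Building and checksumming packets as a pure function like this makes it easy to compare your output byte-for-byte against captures of the stock app.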

Watch this reverse engineering walkthrough video for the process used with the Hiturbo S20.

📊 Status

Current State:

  • Video Feed: Solid with automatic reconnection
  • Controls: Excellent via web interface
  • Web Client: Gamepad, keyboard, TrackPoint support
  • Reverse Engineering: Automated tools for new drones
  • Performance: Optimized for low-latency streaming

Active Development:

  • Adding support for more drones from Amazon's best-selling list
  • Performance optimizations for video processing
  • Enhanced reverse engineering tools

🤝 Contributing

Adding Support for New Drones

Method 1: Use Our Tools (Recommended)

# Step 1: Detect and analyze
python tools/setup_helper.py --scan-only
python tools/protocol_analyzer.py --interactive

# Step 2: Create experimental implementation
# Add your findings to experimental/ directory

# Step 3: Submit findings
# Create GitHub issue with analysis results

Method 2: Manual Analysis

  1. Download the drone's APK from APKMirror or similar
  2. Decompile with jadx, look for network protocols
  3. Use Wireshark to capture actual traffic
  4. Implement protocol adapter based on patterns
  5. Submit pull request with implementation

Development Setup

# Clone repo
git clone https://github.com/yourusername/turbodrone.git
cd turbodrone

# Setup development environment
python tools/setup_helper.py --skip-deps  # if you want manual control

# Make changes and test
# ... your changes ...

# Run tests (if available)
cd backend && python -m pytest
cd ../frontend && npm test

# Submit pull request

📚 Documentation

⚠️ Safety and Legal

Flight Safety:

  • Always fly in open areas away from people and property
  • Respect local drone regulations and airspace restrictions
  • Check battery levels before flight
  • Land immediately if you lose video feed or control

Legal Notice:

  • This project is for educational and research purposes
  • Ensure compliance with local laws regarding drone operation
  • Respect manufacturer warranties and terms of service
  • Use responsibly and at your own risk

🏆 Experimental Support

For drones with limited or experimental support, see the experimental/ directory. This includes early-stage implementations that haven't been fully integrated into the main architecture.

📞 Support

Having Issues?

  1. Check the troubleshooting section
  2. Run diagnostics: python tools/setup_helper.py --scan-only
  3. Search existing GitHub issues
  4. Create a new issue with:
    • Your drone model
    • Operating system
    • Error messages
    • Output from protocol analyzer if available

Want to Help?

  • Test with new drone models
  • Improve documentation
  • Add performance optimizations
  • Contribute reverse engineering findings

TurboDrone - Liberating toy drones from their proprietary mobile apps since 2024! 🚁✨
