
Autonomous Search and Rescue Drone System

License: MIT · Python · PyBullet · Stable Baselines3

Overview

This project implements an autonomous multi-drone system for search and rescue operations using reinforcement learning (RL). The system trains drone swarms to efficiently search areas, detect victims, and optimize rescue routes while maintaining stable flight patterns. The project consists of:

  • A real-time drone UI for monitoring and controlling drones
  • Reinforcement learning models to optimize drone search efficiency
  • Simulation-based training with PyBullet for drone behavior learning
  • Autonomous victim detection and rescue path optimization

Key Features

  • Autonomous Drone Operations: Drones navigate and search independently
  • Multi-sensor Integration: Uses position, IMU, and FOV data
  • Victim Detection & Rescue Planning: AI-driven decision-making for optimal rescue routes
  • Real-time UI Dashboard: Monitor drone behavior and RL training
  • Reinforcement Learning: PPO-based training for optimized navigation
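
The multi-sensor observation (position, IMU, and FOV data) is typically flattened into a single vector before being fed to an RL policy. The sketch below illustrates one plausible layout; the exact shapes, ordering, and field choices are assumptions, not the project's actual observation format:

```python
import numpy as np

def build_observation(position, imu, fov_grid):
    """Flatten position (x, y, z), IMU readings (roll, pitch, yaw and
    angular rates), and a binary FOV coverage grid into one vector.
    Shapes here are illustrative, not the project's actual layout."""
    return np.concatenate([
        np.asarray(position, dtype=np.float32),          # 3 values
        np.asarray(imu, dtype=np.float32),               # 6 values
        np.asarray(fov_grid, dtype=np.float32).ravel(),  # H*W values
    ])

obs = build_observation(
    [1.0, 2.0, 0.5],                  # drone position
    [0.0, 0.0, 0.1, 0.0, 0.0, 0.0],   # orientation + angular rates
    np.zeros((4, 4)),                 # 4x4 FOV coverage grid
)
```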

RL Dashboard

System Components

1. Drone UI

A React + Vite dashboard for real-time visualization and control.

2. Backend (Drone-UI)

A Python-based backend for processing drone input and managing communication.
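
The backend's core job is tracking the latest state reported by each drone and exposing it to the dashboard. As a minimal sketch (the class and field names are hypothetical, not taken from `backend.py`), an in-memory telemetry store might look like:

```python
import json
import time

class TelemetryStore:
    """Hypothetical in-memory store a backend like backend.py might use
    to track the latest state reported by each drone."""

    def __init__(self):
        self._latest = {}

    def update(self, drone_id, payload):
        """Record the newest report from a drone, stamped with receipt time."""
        self._latest[drone_id] = {"ts": time.time(), **payload}

    def snapshot(self):
        """JSON payload the dashboard could poll for live display."""
        return json.dumps(self._latest)

store = TelemetryStore()
store.update("drone-1", {"position": [1.0, 2.0, 0.5], "battery": 0.87})
```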

3. Reinforcement Learning Module (Drone-RL-Processing)

Trains drones with Proximal Policy Optimization (PPO, via Stable Baselines3) inside a PyBullet physics simulation.
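
The reward structure such a search environment typically uses rewards covering new area and penalizes revisits. The toy grid-world below is a stand-in to illustrate that idea only; the actual environments (`singleDroneWithFOV.py` and friends) add physics, FOV sensing, and multi-drone coordination, and all names and constants here are invented:

```python
import numpy as np

class ToySearchEnv:
    """Toy stand-in for the PyBullet search environment: a drone moves on
    a grid, earning reward for covering new cells and a small penalty for
    revisits. Reward values and grid size are illustrative only."""

    def __init__(self, size=8):
        self.size = size
        self.visited = np.zeros((size, size), dtype=bool)
        self.pos = np.array([0, 0])

    def step(self, action):
        # action: 0=up, 1=down, 2=left, 3=right
        moves = {0: (-1, 0), 1: (1, 0), 2: (0, -1), 3: (0, 1)}
        self.pos = np.clip(self.pos + moves[action], 0, self.size - 1)
        r, c = self.pos
        reward = 1.0 if not self.visited[r, c] else -0.1  # revisit penalty
        self.visited[r, c] = True
        done = bool(self.visited.all())  # episode ends at full coverage
        return self.pos.copy(), reward, done

env = ToySearchEnv()
obs, reward, done = env.step(3)  # move right onto an unvisited cell
```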


Setup & Installation

1. Clone the repository

```shell
git clone https://github.com/Prtm2110/SAR-DroneAI
cd SAR-DroneAI
```

2. Backend Setup (Drone-UI)

```shell
cd Drone-UI
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -r requirements.txt
python backend.py
```

3. Frontend Setup (Drone-Dashboard-Frontend)

```shell
cd ../Drone-Dashboard-Frontend
npm install
npm run dev
```

Navigate to http://localhost:5173 to view the UI.

4. Running the Basic UI (Static Page)

To serve the basic UI from Drone-UI/index.html:

```shell
cd ../Drone-UI
python -m http.server
```

Navigate to http://localhost:8000/index.html to view the UI.


Basic UI

Running the RL Simulation

1. Setup the RL Environment

```shell
cd ../Drone-RL-Processing
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate
pip install -r requirements.txt
```

2. Start RL Training & Simulation

```shell
# Run the single drone simulation with FOV visualization
python singleDroneWithFOV.py

# Optimized single drone simulation
python singleDrone_final.py

# Multi-drone simulation
python fourDrones_final.py
```

3. Start RL Dashboard

```shell
cd RL_Training_FrontendReport
npm install
npm start
```

Navigate to http://localhost:3000 to view training metrics.


Performance Metrics

  • Search Efficiency: Evaluates coverage per unit time
  • Victim Detection Rate: Measures success in locating victims
  • Revisit Rate: Tracks unnecessary redundant searches
  • Flight Stability: Ensures stable drone movement during searches
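
These metrics can be computed from a per-cell visit-count grid accumulated during a mission. The formulas below are one plausible reading of the metric names above, not necessarily the project's exact definitions:

```python
import numpy as np

def search_metrics(visit_counts, flight_time_s):
    """Compute coverage, search efficiency, and revisit rate from a grid
    of per-cell visit counts. Formulas are illustrative; the project may
    define these metrics differently."""
    cells = visit_counts.size
    covered = np.count_nonzero(visit_counts)
    # Every visit beyond the first to a cell counts as a redundant pass.
    redundant = int(visit_counts[visit_counts > 1].sum()
                    - np.count_nonzero(visit_counts > 1))
    return {
        "coverage": covered / cells,                   # fraction of area seen
        "search_efficiency": covered / flight_time_s,  # cells per second
        "revisit_rate": redundant / max(covered, 1),   # redundant passes per covered cell
    }

m = search_metrics(np.array([[2, 1, 0], [1, 0, 0]]), flight_time_s=30.0)
```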

Email Notification System

Mission progress updates are automatically sent via email:

  • Search coverage statistics
  • Victim detection details
  • Optimized rescue paths
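
A mission-report email with those three fields can be assembled with the standard library's `email.message` module. The sketch below is illustrative: the addresses, SMTP host, and report wording are placeholders, not the project's actual notification code:

```python
from email.message import EmailMessage

def build_mission_report(coverage, victims, path):
    """Assemble a mission-progress email. Field names, wording, and
    addresses are placeholders, not the project's exact report format."""
    msg = EmailMessage()
    msg["Subject"] = f"SAR mission update: {coverage:.0%} area covered"
    msg["From"] = "sar-drone@example.com"   # placeholder addresses
    msg["To"] = "operator@example.com"
    msg.set_content(
        f"Search coverage: {coverage:.1%}\n"
        f"Victims detected: {victims}\n"
        f"Optimized rescue path: {' -> '.join(path)}\n"
    )
    return msg

report = build_mission_report(0.72, 2, ["base", "zone-3", "victim-1"])
# To send: smtplib.SMTP("smtp.example.com").send_message(report)
```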

License

This project is open-source and available under the MIT License.
