Helmet Detection on Camera with LED

1. Overview.

The Helmet Detection demo showcases the edge AI capabilities of the Arduino® UNO Q using a trained model from Edge Impulse. The application performs real-time helmet recognition on a live video feed captured by a USB webcam and drives the board's LED matrix to reflect the detection status.

  • 📷 Live Helmet Detection: Continuously captures frames from a USB camera and detects helmets using a pre-trained AI model.
  • 🧠 AI-Powered Processing: Utilizes the video_objectdetection Brick to analyze video frames and identify helmets.
  • 🖼️ Real-Time Visualization: Draws helmet labels around detected objects directly on the video feed.
  • 🌐 Web-Based Interface: Managed through a web interface for seamless control and monitoring.
  • 💡 LED Matrix Behavior Mapping: Changes LED patterns and animations based on detection results.

⚠️ Important: This demo must be run in Network Mode or SBC mode within the Arduino App Lab, and it requires the hardware and software listed in the Requirements section below.

This demonstration highlights how the Arduino UNO Q can be paired with a USB webcam to perform edge AI tasks such as helmet recognition. It exemplifies the integration of Edge Impulse models with Arduino hardware for intelligent, real-time computer vision applications.
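The LED matrix behavior mapping described above can be sketched in plain Python. Note that the names here (Detection, choose_led_pattern, the pattern strings, and the 0.6 confidence threshold) are illustrative assumptions for this sketch, not the actual App Lab API:

```python
# Hypothetical sketch of the detection-to-LED mapping; names and
# threshold are assumptions, not the App Lab or Brick API.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # class name reported by the Edge Impulse model
    confidence: float

CONFIDENCE_THRESHOLD = 0.6  # assumed cut-off; tune for your model

def choose_led_pattern(detections: list[Detection]) -> str:
    """Map one frame's detections to an LED-matrix pattern name."""
    helmets = [d for d in detections
               if d.label == "helmet" and d.confidence >= CONFIDENCE_THRESHOLD]
    if helmets:
        return "helmet_ok"   # e.g. a positive/green animation
    if detections:
        return "no_helmet"   # objects detected, but no confident helmet
    return "idle"            # nothing in frame

print(choose_led_pattern([Detection("helmet", 0.9)]))  # helmet_ok
```

In the real application the chosen pattern name would be forwarded to the sketch, which renders the corresponding LED matrix animation.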


2. Requirements.

2.1 Hardware.

  • Arduino® UNO Q
  • USB camera (x1)
  • USB-C® hub adapter with external power (x1)
  • A power supply (5 V, 3 A) for the USB hub (e.g., a phone charger)
  • Personal computer (x86/AMD64) with internet access (x1)

2.2 Software.

The required software tools are covered in the setup instructions below: Visual Studio Code, Arduino App Lab, Arduino Flasher CLI, and an Edge Impulse account.

3. Helmet Detection Workflow.

```mermaid
flowchart LR
    A[Start] --> B[Setting Up Arduino UNO-Q Device]
    B --> C[Download Required Models]
    C --> D[Create the Helmet Detection Application]
    D --> E[Run the Helmet Detection Application]
    E --> F[✅ Helmet Detection Active]
```

4. Setup Instructions.

Before proceeding further, please ensure that all the setup steps outlined below are completed in the specified order. These instructions are essential for configuring the various tools required to successfully run the application.

Each section provides a reference to internal documentation for detailed guidance. Please follow them carefully to avoid any setup issues later in the process.

4.1 Setting Up Visual Studio Code (VS Code).

Visual Studio Code is the recommended IDE for editing, debugging, and managing the project’s source code. It provides essential extensions and integrations that streamline development workflows. Please follow the setup instructions carefully to ensure compatibility with the project environment.

For detailed steps, refer to the internal documentation: Set up VS Code

4.2. Setting Up Arduino App Lab.

Arduino App Lab enables you to create and deploy Apps directly on the Arduino® UNO Q board, which integrates both a microcontroller and a Linux-based microprocessor. The App Lab runs seamlessly on personal computers (Windows, macOS, Linux) and comes pre-installed on the UNO Q, with automatic updates. Please follow the setup instructions carefully to ensure smooth development and deployment of Apps.

For detailed steps, refer to the documentation: Set up Arduino App Lab

4.3. Setting Up Arduino Flasher CLI.

Arduino Flasher CLI provides a streamlined way to flash Linux images onto your Arduino UNO Q board. Please follow the setup instructions carefully to avoid flashing errors and ensure proper board initialization.

For detailed steps, refer to the documentation: Arduino Flasher CLI

4.4. Setting Up Arduino UNO-Q Device.

Arduino UNO-Q must be properly configured to ensure reliable communication with the host system and accurate sensor data acquisition. Please follow the setup instructions carefully to avoid hardware conflicts and ensure seamless integration with the software stack.

For detailed steps, refer to the documentation: Set up Arduino UNO-Q.

5. Get the Model from Edge Impulse.

Edge Impulse empowers you to build datasets, train machine learning models, and optimize libraries for deployment directly on-device.

Click here to learn more about Edge Impulse

5.1 Setup an Edge Impulse Account.

An Edge Impulse account is required to access the platform’s full suite of tools for building, training, and deploying machine learning models on the Arduino UNO Q. Please follow the setup instructions carefully to ensure proper integration with your device and development workflow.

Follow the instructions to sign up: Signup Instructions

5.2 Clone the Edge Impulse Project.

Cloning an Edge Impulse project allows you to replicate existing machine learning workflows, datasets, and configurations for customization or deployment on the Arduino UNO Q. Please follow the setup instructions carefully to ensure proper synchronization and compatibility with your device.

Clone the Helmet Detection Project

For detailed steps, refer to the documentation: Clone the Repository

5.3 Build and Download Deployable Model.

Edge Impulse allows you to build optimized machine learning models tailored for deployment on the Arduino UNO Q. Once trained, models can be compiled into efficient libraries and downloaded for direct integration with your device. Please follow the setup instructions carefully to ensure the model is compatible with your hardware and application requirements.

Mandatory step:

  1. Select Arduino UNO Q Hardware when configuring your deployment at the Deployment stage.
  2. Build the model (the build automatically downloads the deployable model).


For detailed steps, refer to the documentation: Build and Deploy Model

6. Prepare the Application.

This section will guide you on how to create a new application from existing examples, configure Edge Impulse models, set up the application parameters, and build the final App for deployment on the Arduino UNO Q. Starting from pre-built examples is recommended for first-time users to better understand the structure and workflow.

6.1 Copy existing Video Detection on camera Application.

Arduino App Lab provides a ready-to-use Video Detection on Camera application that can be copied and customized for your specific use case. This section will guide you through duplicating the existing application, modifying its components, integrating Edge Impulse models, and tailoring the detection logic to suit your deployment on the Arduino UNO Q.

In this example, the Video Detection on Camera application serves as the starting point for helmet detection.


For detailed steps, refer to the documentation: Copy and Edit existing sample

6.2 Copy sketch files from existing Air quality on LED matrix Application.

We also want to control the LED matrix on the Arduino UNO Q board. Arduino App Lab provides a ready-to-use LED matrix control example, the Air Quality on LED Matrix application, whose sketch can be copied and customized for your specific use case. This section guides you through copying that sketch into the helmet detection application.

In this example we copy the LED matrix control sketch from the Air Quality on LED Matrix application:

cd /home/arduino/.local/share/arduino-app-cli/examples/air-quality-monitoring

cp -r sketch/* /home/arduino/ArduinoApps/helmet-detection-on-camera-with-led/sketch

6.3 Upload Model to the Device.

Once the deployable model is built in Edge Impulse, it must be uploaded to the Arduino UNO Q to enable real-time inference and application integration. This section will guide you through transferring the compiled model to the device, verifying compatibility, and preparing it for execution within your App Lab application.

This step uses the model downloaded from Edge Impulse in the previous step. Build and Deploy Model

Upload location: Make sure to upload the model file to /home/arduino/.arduino-bricks/ei-models/helmet-detection-linux-aarch64-v8.eim
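Before running the app, it can help to sanity-check the uploaded file name. The helper below is a hypothetical convenience for this guide, not part of App Lab; it only checks that the file name matches what the Brick expects (an Edge Impulse Linux .eim build targeting the UNO Q's 64-bit Arm Linux side):

```python
# Hypothetical sanity check (not part of App Lab): verify the model file
# name targets the right format and architecture for the UNO Q.
from pathlib import Path

EI_MODELS_DIR = Path("/home/arduino/.arduino-bricks/ei-models")
MODEL_NAME = "helmet-detection-linux-aarch64-v8.eim"

def model_path_ok(path: Path) -> bool:
    # An Edge Impulse Linux deployment is a single .eim file, and its
    # name encodes the target architecture (aarch64 for the UNO Q).
    return path.suffix == ".eim" and "aarch64" in path.name

print(model_path_ok(EI_MODELS_DIR / MODEL_NAME))  # True
```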

For detailed steps, refer to the documentation: Upload Model

6.4 Modify the Configuration file.

Steps

  1. Create your working directory:

    mkdir my_working_directory
    cd my_working_directory
  2. Download Your Application:

    git clone -n --depth=1 --filter=tree:0 https://github.com/qualcomm/Startup-Demos.git
    cd Startup-Demos
    git sparse-checkout set --no-cone /CV_VR/IoT-Robotics/helmet-detection-on-camera-with-led/
    git checkout
  3. Navigate to Application Directory:

    cd ./CV_VR/IoT-Robotics/helmet-detection-on-camera-with-led/

The app.yaml file defines the structure, behavior, and dependencies of your Arduino App Lab application. Modifying this configuration allows you to customize how your app interacts with hardware, integrates Edge Impulse models, and launches on the Arduino UNO Q. This section will guide you through editing key parameters such as bricks, model paths, and runtime settings. Please follow the setup instructions carefully to ensure your application runs as expected.
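As a rough sketch of what such a configuration can look like; every key name below is an assumption for illustration, so treat the app.yaml copied from the example application as the authoritative reference:

```yaml
# Illustrative sketch only: these keys are assumptions, not the verified
# App Lab schema. Compare against the example app's app.yaml.
name: helmet-detection-on-camera-with-led
bricks:
  - video_objectdetection:
      model: /home/arduino/.arduino-bricks/ei-models/helmet-detection-linux-aarch64-v8.eim
```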

cp app.yaml /home/arduino/ArduinoApps/helmet-detection-on-camera-with-led/

6.5 Modify the sketch file.

The sketch.ino file contains the main program logic for your Arduino App Lab project. It initializes hardware, communicates with bricks defined in app.yaml, and runs the primary control loop. Use this file to implement custom behaviors, sensor reading, actuator control, and model inference on the Arduino UNO Q.

cp sketch.ino /home/arduino/ArduinoApps/helmet-detection-on-camera-with-led/sketch/

6.6 Modify the main Python file.

The main.py file contains the core Python logic for your Arduino App Lab application. It handles communication with connected bricks, runs Edge Impulse model inference, and processes events coming from the App Lab runtime. Use this file to define custom behaviors, manage data flow, and implement high‑level control logic for your application.

cp main.py /home/arduino/ArduinoApps/helmet-detection-on-camera-with-led/python/
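The control flow that main.py implements can be sketched with the detector and the LED bridge stubbed out. Here run_loop, the pattern names, and the callback shape are illustrative assumptions, not the App Lab API; the real code would use the video_objectdetection Brick and the bridge to the sketch instead:

```python
# Self-contained sketch of main.py's control flow: poll detections, map
# them to an LED pattern, and notify the sketch only on state changes.
# The detector and LED sender are stubbed; names are assumptions.

def run_loop(frames, send_to_sketch):
    """frames: iterable of per-frame label lists; send_to_sketch: callback."""
    last_pattern = None
    for labels in frames:
        pattern = "helmet_ok" if "helmet" in labels else "idle"
        if pattern != last_pattern:      # only notify on state transitions
            send_to_sketch(pattern)
            last_pattern = pattern

sent = []
run_loop([["helmet"], ["helmet"], [], ["helmet"]], sent.append)
print(sent)  # ['helmet_ok', 'idle', 'helmet_ok']
```

Only sending updates on state transitions keeps traffic to the sketch low and avoids restarting the LED matrix animation on every frame.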

7. Run the Helmet Detection with LED application.

Once your application is configured and built in Arduino App Lab, it can be deployed and executed directly on the Arduino UNO Q. This section will guide you through launching the application, verifying the camera input, and observing real-time results.


For detailed steps, refer to the documentation: Run Application

7.1 Demo Output.

[Demo output screenshots]