- 1. Overview
- 2. Requirements
- 3. Helmet Detection Workflow
- 4. Setup Instructions
- 5. Get the Model from Edge Impulse
- 6. Prepare the Application
- 7. Run the Helmet Detection with LED application
The Helmet Detection demo showcases the edge AI capabilities of the Arduino® UNO Q using a trained model from Edge Impulse. This application performs real-time helmet recognition on a live video feed captured by a USB webcam and drives the LED matrix to reflect the detection status.
- 📷 Live Helmet Detection: Continuously captures frames from a USB camera and detects helmets using a pre-trained AI model.
- 🧠 AI-Powered Processing: Utilizes the video_objectdetection Brick to analyze video frames and identify helmets.
- 🖼️ Real-Time Visualization: Displays helmet labels around detected helmets directly on the video feed.
- 🌐 Web-Based Interface: Managed through a web interface for seamless control and monitoring.
- 💡 LED Matrix Behavior Mapping: Changes LED patterns and animations based on detection results.
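The LED matrix behavior mapping in the last bullet can be sketched as a small lookup from detection results to a pattern name. The sketch below is illustrative only: the labels, pattern names, and the `choose_pattern` helper are our assumptions, not identifiers from the App Lab Bricks API.

```python
# Hypothetical sketch: map one frame of object-detection results to an
# LED matrix pattern name. Detection dicts follow a typical Edge Impulse
# shape ({"label": ..., "value": confidence}); this is an assumption.

def choose_pattern(detections, threshold=0.6):
    """Return the LED pattern name for one frame of detections."""
    confident = [d for d in detections if d["value"] >= threshold]
    if any(d["label"] == "helmet" for d in confident):
        return "helmet_detected"   # e.g. a green check animation
    if confident:
        return "no_helmet"         # something was seen, but no helmet
    return "idle"                  # nothing detected above threshold

print(choose_pattern([{"label": "helmet", "value": 0.87}]))  # helmet_detected
```

The actual mapping lives in the sketch and Python files copied later in this guide; this snippet only shows the decision shape.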
⚠️ Important: This demo must be run in Network Mode or SBC mode within Arduino App Lab. It requires the hardware listed under Requirements below.
This demonstration highlights how the Arduino UNO Q can be paired with a USB webcam to perform edge AI tasks such as helmet recognition. It exemplifies the integration of Edge Impulse models with Arduino hardware for intelligent, real-time computer vision applications.
- Arduino® UNO Q
- USB camera (x1)
- USB-C® hub adapter with external power (x1)
- A power supply (5 V, 3 A) for the USB hub (e.g., a phone charger)
- Personal computer (x86/AMD64) with internet access

# Helmet Detection Setup Workflow
```mermaid
flowchart LR
    A[Start] --> B[Setting Up Arduino UNO Q Device]
    B --> C[Download Required Models]
    C --> D[Create the Helmet Detection Application]
    D --> E[Run the Helmet Detection Application]
    E --> F[✅ Helmet Detection Active]
```
Before proceeding further, please ensure that all the setup steps outlined below are completed in the specified order. These instructions are essential for configuring the various tools required to successfully run the application.
Each section provides a reference to internal documentation for detailed guidance. Please follow them carefully to avoid any setup issues later in the process.
Visual Studio Code is the recommended IDE for editing, debugging, and managing the project’s source code. It provides essential extensions and integrations that streamline development workflows. Please follow the setup instructions carefully to ensure compatibility with the project environment.
For detailed steps, refer to the internal documentation: Set up VS Code
Arduino App Lab enables you to create and deploy Apps directly on the Arduino® UNO Q board, which integrates both a microcontroller and a Linux-based microprocessor. The App Lab runs seamlessly on personal computers (Windows, macOS, Linux) and comes pre-installed on the UNO Q, with automatic updates. Please follow the setup instructions carefully to ensure smooth development and deployment of Apps.
For detailed steps, refer to the documentation: Set up Arduino App Lab
Arduino Flasher CLI provides a streamlined way to flash Linux images onto your Arduino UNO Q board. Please follow the setup instructions carefully to avoid flashing errors and ensure proper board initialization.
For detailed steps, refer to the documentation: Arduino Flasher CLI
The Arduino UNO Q must be properly configured to ensure reliable communication with the host system and accurate sensor data acquisition. Please follow the setup instructions carefully to avoid hardware conflicts and ensure seamless integration with the software stack.
For detailed steps, refer to the documentation: Set up Arduino UNO-Q.
Edge Impulse empowers you to build datasets, train machine learning models, and optimize libraries for deployment directly on-device.
Click here to learn more about Edge Impulse
An Edge Impulse account is required to access the platform’s full suite of tools for building, training, and deploying machine learning models on the Arduino UNO Q. Please follow the setup instructions carefully to ensure proper integration with your device and development workflow.
Follow the instructions to sign up: Signup Instructions
Cloning an Edge Impulse project allows you to replicate existing machine learning workflows, datasets, and configurations for customization or deployment on the Arduino UNO Q. Please follow the setup instructions carefully to ensure proper synchronization and compatibility with your device.
Clone the Helmet Detection Project
For detailed steps, refer to the documentation: Clone the Repository
Edge Impulse allows you to build optimized machine learning models tailored for deployment on the Arduino UNO Q. Once trained, models can be compiled into efficient libraries and downloaded for direct integration with your device. Please follow the setup instructions carefully to ensure the model is compatible with your hardware and application requirements.
Mandatory steps:
- Select Arduino UNO Q hardware when configuring your deployment at the Deployment stage.
- Build the model (the deployable model downloads automatically).
For detailed steps, refer to the documentation: Build and Deploy Model
This section will guide you through creating a new application from existing examples, configuring Edge Impulse models, setting up the application parameters, and building the final App for deployment on the Arduino UNO Q. Starting from pre-built examples is recommended for first-time users to better understand the structure and workflow.
Arduino App Lab provides a ready-to-use Video Detection on Camera application that can be copied and customized for your specific use case. This section will guide you through duplicating the existing application, modifying its components, integrating Edge Impulse models, and tailoring the detection logic to suit your deployment on the Arduino UNO Q.
In this example, we use the Video Detection on Camera application as the starting point for helmet detection.
For detailed steps, refer to the documentation: Copy and Edit existing sample
We also want to control the LED matrix on the Arduino UNO Q board. Arduino App Lab provides a ready-to-use LED matrix control example, the Air quality on LED matrix application, whose sketch can be copied and customized for your specific use case.
In this example, we copy the LED matrix control sketch from the Air quality on LED matrix application:
```bash
cd /home/arduino/.local/share/arduino-app-cli/examples/air-quality-monitoring
cp -r sketch/* /home/arduino/ArduinoApps/helmet-detection-on-camera-with-led/sketch
```

Once the deployable model is built in Edge Impulse, it must be uploaded to the Arduino UNO Q to enable real-time inference and application integration. This section will guide you through transferring the compiled model to the device, verifying compatibility, and preparing it for execution within your App Lab application.
This step uses the model downloaded from Edge Impulse in the previous step (see Build and Deploy Model).
Upload location: make sure to upload the model file to /home/arduino/.arduino-bricks/ei-models/helmet-detection-linux-aarch64-v8.eim
For detailed steps, refer to the documentation: Upload Model
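As a sanity check before running the App, it can help to confirm the model file landed at the expected path. The helper below is our own illustration; only the directory and file name come from the upload step above.

```python
from pathlib import Path

# Bricks look for Edge Impulse models in this directory (see upload step above).
EI_MODELS_DIR = Path("/home/arduino/.arduino-bricks/ei-models")

def model_path(name: str, target: str = "linux-aarch64-v8") -> Path:
    """Build the expected .eim path for a model deployed for the UNO Q."""
    return EI_MODELS_DIR / f"{name}-{target}.eim"

expected = model_path("helmet-detection")
# On the board, `expected.is_file()` should return True after the upload.
print(expected)
```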
1. Create your working directory:

   ```bash
   mkdir my_working_directory
   cd my_working_directory
   ```

2. Download Your Application:

   ```bash
   git clone -n --depth=1 --filter=tree:0 https://github.com/qualcomm/Startup-Demos.git
   cd Startup-Demos
   git sparse-checkout set --no-cone /CV_VR/IoT-Robotics/helmet-detection-on-camera-with-led/
   git checkout
   ```

3. Navigate to Application Directory:

   ```bash
   cd ./CV_VR/IoT-Robotics/helmet-detection-on-camera-with-led/
   ```
The app.yaml file defines the structure, behavior, and dependencies of your Arduino App Lab application. Modifying this configuration allows you to customize how your app interacts with hardware, integrates Edge Impulse models, and launches on the Arduino UNO Q. This section will guide you through editing key parameters such as bricks, model paths, and runtime settings. Please follow the setup instructions carefully to ensure your application runs as expected.
```bash
cp app.yaml /home/arduino/ArduinoApps/helmet-detection-on-camera-with-led/
```

The sketch.ino file contains the main program logic for your Arduino App Lab project. It initializes hardware, communicates with bricks defined in app.yaml, and runs the primary control loop. Use this file to implement custom behaviors, sensor reading, actuator control, and model inference on the Arduino UNO Q.
```bash
cp sketch.ino /home/arduino/ArduinoApps/helmet-detection-on-camera-with-led/sketch/
```

The main.py file contains the core Python logic for your Arduino App Lab application. It handles communication with connected bricks, runs Edge Impulse model inference, and processes events coming from the App Lab runtime. Use this file to define custom behaviors, manage data flow, and implement high‑level control logic for your application.
```bash
cp main.py /home/arduino/ArduinoApps/helmet-detection-on-camera-with-led/python/
```

Once your application is configured and built in Arduino App Lab, it can be deployed and executed directly on the Arduino UNO Q. This section will guide you through launching the application, verifying sensor input from the camera, and observing real-time results.
For detailed steps, refer to the documentation: Run Application
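While verifying real-time results, a simple per-label counter over a batch of frames can help confirm that the camera feed and model are producing detections. This is a standalone sketch: the detection dict shape is an assumption based on typical Edge Impulse output and is not tied to the Brick's actual API.

```python
from collections import Counter

def summarize(frames):
    """Count detected labels across a batch of frames.

    `frames` is a list of frames; each frame is a list of detection
    dicts like {"label": "helmet", "value": 0.91}.
    """
    counts = Counter()
    for detections in frames:
        counts.update(d["label"] for d in detections)
    return dict(counts)

frames = [
    [{"label": "helmet", "value": 0.91}],
    [{"label": "helmet", "value": 0.88}, {"label": "person", "value": 0.75}],
    [],  # a frame with no detections
]
print(summarize(frames))  # {'helmet': 2, 'person': 1}
```

If the counts stay at zero while the App is running, re-check the camera connection and the model upload path from the earlier steps.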






