A Little Friend for a Big Focus
Nomo is an interactive desk companion robot built to encourage healthier work habits through computer vision, gesture recognition, and playful nudges. Designed as a capstone project at NYU Abu Dhabi, Nomo blends real-time software with expressive hardware to promote mindful breaks and deep focus—making your workflow a little more human.
interactive-media, robotics, robot-companion, machine-learning, google-mediapipe, raspberry-pi-5, arduino, wellbeing, capstone-project, pomodoro
- Real-time gesture recognition (wave, closed fist, thumbs-up)
- LED feedback to guide breaks and focus sessions
- Raspberry Pi + Arduino integration
- Pettable with responsive purring vibration
- Location-based behavior (work and break zones)
- Designed with routines, attention, and well-being in mind
- MediaPipe – real-time hand tracking and gesture recognition
- OpenCV – image capture, annotation, and video streaming
- Flask – lightweight web framework for video streaming (see the streaming sketch after this list)
- Picamera2 – camera interface for Raspberry Pi
- Arduino – microcontroller used for motion, sound, and feedback
- Python – control logic, CV processing, and server orchestration
- Pyserial – serial communication for Pi-to-Arduino command flow
- Raspberry Pi 5 – main compute unit for processing gestures, state transitions, and control logic
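To show how the camera and streaming pieces above fit together, here is a minimal sketch of an MJPEG stream built with Picamera2, OpenCV, and Flask. The resolution, route name, and port are illustrative assumptions, not Nomo's actual configuration.

```python
import cv2
from flask import Flask, Response
from picamera2 import Picamera2

app = Flask(__name__)

# Hypothetical capture settings; Nomo's actual configuration may differ.
picam2 = Picamera2()
picam2.configure(picam2.create_video_configuration(main={"size": (640, 480), "format": "RGB888"}))
picam2.start()

def mjpeg_frames():
    """Yield JPEG-encoded camera frames for a multipart MJPEG stream."""
    while True:
        frame = picam2.capture_array()           # latest frame as a numpy array
        ok, jpeg = cv2.imencode(".jpg", frame)   # compress it for the browser
        if not ok:
            continue
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + jpeg.tobytes() + b"\r\n")

@app.route("/stream")                            # hypothetical route name
def stream():
    return Response(mjpeg_frames(), mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)           # port is an assumption
```

With something like this running, opening the Pi's address and port in a browser would show the live camera feed.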
Nomo runs on a Raspberry Pi using Python for vision and logic, paired with an Arduino for motor and sensor control. When placed in a predefined location, it can detect gestures and trigger break/focus states accordingly. Each mode is designed to reflect a balance between productivity and self-care.
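As a rough illustration of the gesture step, the sketch below classifies a single camera frame with MediaPipe's GestureRecognizer task, whose built-in labels include "Closed_Fist" and "Thumb_Up". Nomo's actual pipeline may differ; the model path is an assumption, and a wave would need motion tracked across several frames rather than one call like this.

```python
import cv2
import mediapipe as mp
from mediapipe.tasks import python as mp_tasks
from mediapipe.tasks.python import vision

# The pre-trained model file must be downloaded separately; the path is an assumption.
options = vision.GestureRecognizerOptions(
    base_options=mp_tasks.BaseOptions(model_asset_path="gesture_recognizer.task"),
    num_hands=1,
)
recognizer = vision.GestureRecognizer.create_from_options(options)

def classify(frame_bgr):
    """Return the top gesture label for one OpenCV BGR frame, or None."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    mp_image = mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb)
    result = recognizer.recognize(mp_image)
    if result.gestures and result.gestures[0]:
        # Built-in labels include "Closed_Fist", "Thumb_Up", "Open_Palm", ...
        return result.gestures[0][0].category_name
    return None
```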
For a full overview of modes, gestures, and setup, see the Product Manual.
For the complete set of states and the transitions between them, see the State Machine Diagram.
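The snippet below is only an illustrative skeleton of how a recognized gesture might switch Nomo between focus and break modes and notify the Arduino over serial. The real transitions are documented in the State Machine Diagram; the gesture-to-mode mapping, serial port, and single-byte commands here are hypothetical.

```python
import time
import serial

# Serial port, baud rate, and command bytes are assumptions, not Nomo's actual protocol.
arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=1)
time.sleep(2)  # give the Arduino a moment to reset after the port opens

GESTURE_TO_MODE = {      # hypothetical mapping; see the State Machine Diagram for the real one
    "Thumb_Up": "focus",
    "Closed_Fist": "break",
}
MODE_COMMANDS = {"focus": b"F", "break": b"B"}  # single-byte commands the Arduino sketch would parse

state = "idle"

def handle_gesture(label):
    """Switch modes on a recognized gesture and tell the Arduino which behaviour to play."""
    global state
    mode = GESTURE_TO_MODE.get(label)
    if mode and mode != state:
        state = mode
        arduino.write(MODE_COMMANDS[mode])  # Arduino drives LEDs, servos, and sounds for this mode
```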
nomo/
├── pi/ # Raspberry Pi scripts (Python: gesture recognition, control logic)
├── arduino/ # Arduino sketches (servo, vibration, LEDs, sounds)
├── assets/ # Gesture samples, hero images, mockups
├── docs/ # Product manual
└── README.md
This project was made possible with the support and guidance of:
- Professor Nimrah Syed – Capstone Advisor: provided continuous mentorship and feedback from the ideation stage through final execution.
- Professor Michael Shiloh – Technical Mentor: offered invaluable assistance in hardware design, debugging, and user testing throughout development.
- Daniel Nivia – NYU Abu Dhabi Alumnus: contributed custom audio cues, lending Nomo its personality and expressive audio interactions.
We deeply appreciate their time, expertise, and encouragement in bringing Nomo to life.
We also extend our heartfelt thanks to the Interactive Media faculty, instructors, lab staff, and the NYUAD Art Gallery for their continuous support, resources, and encouragement throughout the capstone journey.
Nomo was developed as part of a Capstone Project at NYU Abu Dhabi (2025).
The code and associated materials are made available under the terms of the MIT License.
They may be freely used, modified, and distributed for personal, academic, and non-commercial purposes.
Commercial use, resale, or distribution of derivative works for profit is not permitted without prior written consent of both project authors.