A full-stack rehabilitation system integrating a 3D-printed, Arduino-enabled elbow exoskeleton with computer-vision-driven motion tracking to gamify physical therapy. Developed during the Axxess Hackathon in February 2025, this system achieved 95% gesture classification accuracy and demonstrated measurable clinical benefits in joint flexibility.
The platform merges hardware fabrication, computer vision, and analytics to create a complete rehabilitation feedback loop.
Core System Components:
- 3D-Printed Exoskeleton Hardware – Ergonomically designed to support elbow movement while integrating the sensing hardware.
- Arduino Firmware – Captures joint-angle data and streams it to the backend in real time.
- Computer Vision Pipeline – MediaPipe- and OpenCV-based gesture recognition, achieving 95% accuracy in movement classification (a sketch of this pipeline follows the list).
- Data Aggregation & Analysis – CSV-based logging with Pandas preprocessing for daily, weekly, and monthly progress tracking.
- Analytical Visualizations – Seaborn-powered dashboards mapping over 10,000 datapoints for clinical insights.
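A minimal sketch of how the vision component can be wired together with MediaPipe Pose and OpenCV: it reads webcam frames, estimates the elbow angle from the shoulder, elbow, and wrist landmarks, and labels the movement with a simple threshold rule. The camera index, the choice of the right arm, and the 40°/160° thresholds are illustrative assumptions; the classifier that reached 95% accuracy in the hackathon build is not reproduced here.

```python
# Sketch of the vision pipeline: track the elbow with MediaPipe Pose, derive
# the joint angle from shoulder-elbow-wrist landmarks, and classify the
# movement with a simple threshold rule (a stand-in for the real classifier).
import math

import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
LM = mp_pose.PoseLandmark


def elbow_angle(shoulder, elbow, wrist) -> float:
    """Angle at the elbow (degrees) from three 2D landmark positions."""
    a1 = math.atan2(wrist.y - elbow.y, wrist.x - elbow.x)
    a2 = math.atan2(shoulder.y - elbow.y, shoulder.x - elbow.x)
    angle = abs(math.degrees(a1 - a2))
    return 360.0 - angle if angle > 180.0 else angle


cap = cv2.VideoCapture(0)  # assumed webcam index
with mp_pose.Pose(min_detection_confidence=0.5, min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            angle = elbow_angle(lm[LM.RIGHT_SHOULDER], lm[LM.RIGHT_ELBOW], lm[LM.RIGHT_WRIST])
            # Illustrative threshold labels for the rehab game loop.
            label = "extended" if angle > 160 else "flexed" if angle < 40 else "moving"
            cv2.putText(frame, f"{angle:5.1f} deg  {label}", (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
        cv2.imshow("Elbow tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()
```

The same per-frame angle stream is the kind of signal that the logging and analysis steps described below can operate on.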
Backend Development
Designed and implemented the MediaPipe-powered motion tracking pipeline. Developed CSV-based log aggregation to quantify joint movement over multiple time intervals.
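A sketch of the log-aggregation step, assuming the CSV schema is a timestamp column plus an elbow_angle column (both names and the file path are hypothetical): Pandas resampling produces the daily, weekly, and monthly summaries referenced above.

```python
# Sketch of the CSV log aggregation. Column names ("timestamp", "elbow_angle")
# and the path are assumptions for illustration; the real logs may differ.
import pandas as pd


def aggregate_motion_log(path: str = "logs/elbow_log.csv") -> dict[str, pd.DataFrame]:
    df = pd.read_csv(path, parse_dates=["timestamp"]).set_index("timestamp")

    def summarize(freq: str) -> pd.DataFrame:
        # Range of motion per interval (max minus min angle) plus basic stats.
        grouped = df["elbow_angle"].resample(freq)
        return pd.DataFrame({
            "range_of_motion": grouped.max() - grouped.min(),
            "mean_angle": grouped.mean(),
            "samples": grouped.count(),
        })

    return {"daily": summarize("D"), "weekly": summarize("W"), "monthly": summarize("MS")}
```

Range of motion per interval is one natural progress metric for flexion work; the hackathon pipeline may track additional statistics on top of it.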
Clinical Data Visualization
Built interactive Seaborn/OpenCV dashboards to visualize motion data, enabling clinicians to adjust treatment plans in real time. Processed over 10,000 datapoints for accuracy and insight.
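A simplified sketch of the Seaborn plotting layer, reusing the assumed schema from the aggregation sketch above (raw samples plus a weekly summary); the actual dashboard layout and metrics may differ.

```python
# Illustrative clinician-facing plots: a weekly range-of-motion trend and a
# distribution of recorded elbow angles. Assumes the aggregation output above.
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns


def plot_progress(raw: pd.DataFrame, weekly: pd.DataFrame) -> None:
    sns.set_theme(style="whitegrid")
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
    # Left: weekly range of motion, the headline progress metric.
    sns.lineplot(x=weekly.index, y=weekly["range_of_motion"], marker="o", ax=ax1)
    ax1.set(title="Weekly range of motion", ylabel="degrees")
    # Right: distribution of all logged elbow angles across sessions.
    sns.histplot(data=raw, x="elbow_angle", bins=30, kde=True, ax=ax2)
    ax2.set(title="Elbow-angle distribution", xlabel="degrees")
    fig.tight_layout()
    plt.show()
```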
Full-System Integration
Unified the frontend, backend, and Arduino firmware into a cohesive system. Ensured all components worked seamlessly together through agile sprints.
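One way the firmware-to-backend link can look on the Python side, assuming the Arduino prints comma-separated "millis,angle" lines over USB serial at 115200 baud; the port name, baud rate, and line format are all assumptions, not details from the original build.

```python
# Sketch of the Arduino-to-backend bridge: read joint-angle samples streamed
# over USB serial and append them to the CSV log consumed by the analysis code
# above. Assumes the log file already has a "timestamp,elbow_angle" header.
import csv
from datetime import datetime

import serial  # pyserial


def stream_to_csv(port: str = "/dev/ttyACM0", baud: int = 115200,
                  out_path: str = "logs/elbow_log.csv") -> None:
    with serial.Serial(port, baud, timeout=1) as link, open(out_path, "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            line = link.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue  # read timeout or empty line; keep polling
            try:
                _millis, angle = line.split(",")
                writer.writerow([datetime.now().isoformat(), float(angle)])
            except ValueError:
                continue  # skip malformed frames
```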
Performance Outcomes
Achieved 95% gesture recognition accuracy. Recorded a 30% improvement in joint-flexion capacity and 40% faster recovery cycles in testing scenarios.