A computer-vision audio-visual tool that lets you control visuals and audio effects with your hands in real time!
Handstrument leverages MediaPipe for hand tracking and Tone.js for audio effects, enabling interactive DJ-like music and visual experiences through hand gestures.
- Hand Tracking: Detects the presence and movements of both hands.
- Velocity Calculation: Measures hand velocity to influence audio playback.
- Audio Integration: Uses Tone.js to map hand movements to pitch and feedback effects.
- Interactive UI: Displays feedback to users about their hand movements and actions.
- Mobile Compatibility: Ensures functionality on mobile devices, including a loading screen for better UX.
- Import MediaPipe:
- Include the vision library via CDN: `@mediapipe/tasks-vision`.
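A browser-side sketch of the CDN setup (the pinned version is an example, not the project's actual pin; the model URL is the one published in MediaPipe's documentation):

```javascript
// Load the hand-tracking classes from the tasks-vision bundle (browser only).
import { FilesetResolver, HandLandmarker } from
  "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.14";

// Resolve the WASM assets, then create a two-hand landmarker for video frames.
const vision = await FilesetResolver.forVisionTasks(
  "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.14/wasm"
);
const handLandmarker = await HandLandmarker.createFromOptions(vision, {
  baseOptions: {
    modelAssetPath:
      "https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task",
  },
  numHands: 2,
  runningMode: "VIDEO",
});
```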
- Hand Tracking:
- Log finger values in the console.
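A sketch of pulling fingertip values out of a MediaPipe-style 21-point landmark array for logging (indices 4, 8, 12, 16, 20 are the thumb-through-pinky tips; the finger names here are our own labels):

```javascript
// MediaPipe hand landmarks: 21 points per hand; these indices are the tips.
const FINGERTIPS = { thumb: 4, index: 8, middle: 12, ring: 16, pinky: 20 };

// Collect the (x, y) of each fingertip from one hand's landmark array.
function fingertipValues(landmarks) {
  const out = {};
  for (const [name, i] of Object.entries(FINGERTIPS)) {
    const { x, y } = landmarks[i];
    out[name] = { x, y };
  }
  return out;
}

// In the detection loop you would log, e.g.:
// console.log(fingertipValues(result.landmarks[0]));
```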
- Gesture Calculations:
- Track hand X, Y coordinates.
- Calculate hand velocity.
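The velocity step can be sketched as a difference of successive positions over the frame interval, with optional smoothing to tame jitter (the smoothing factor is a tuning assumption):

```javascript
// Velocity from two successive positions (normalized coords) and dt in seconds.
function handVelocity(prev, curr, dt) {
  const dx = curr.x - prev.x;
  const dy = curr.y - prev.y;
  return Math.hypot(dx, dy) / dt; // normalized units per second
}

// Exponential smoothing to reduce frame-to-frame jitter; alpha is a guess.
function smooth(prevVel, rawVel, alpha = 0.3) {
  return alpha * rawVel + (1 - alpha) * prevVel;
}
```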
- Audio Integration:
- Import Tone.js or another music library.
- Use hand movements to control the wet/dry mix ("wetness") of audio effects.
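One way to map velocity onto a wet/dry mix is a clamped linear scaling; `maxVelocity` is a tuning assumption, and with Tone.js the result could be applied via an effect's `wet` parameter:

```javascript
// Map a hand velocity onto a wet value in [0, 1], saturating at maxVelocity.
function velocityToWet(velocity, maxVelocity = 2.0) {
  return Math.min(1, Math.max(0, velocity / maxVelocity));
}

// With Tone.js you might then do, e.g.:
// feedbackDelay.wet.value = velocityToWet(currentVelocity);
```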
- User Interface:
- Build a UI to indicate hand-triggered actions.
- Allow each hand to trigger a different note.
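Per-hand notes can key off MediaPipe's handedness label ("Left"/"Right"); the notes chosen below are arbitrary defaults, not the app's actual mapping:

```javascript
// Assign a note per hand based on the handedness label from MediaPipe.
const HAND_NOTES = { Left: "C4", Right: "G4" };

function noteForHand(handedness) {
  return HAND_NOTES[handedness] ?? "C4"; // fall back if the label is missing
}

// With Tone.js you might then trigger:
// synth.triggerAttackRelease(noteForHand("Left"), "8n");
```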
- Deployment:
- Upload the project to Netlify.
- Debug and ensure functionality in the production environment, including custom drawing functions.
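For a static deploy, a minimal `netlify.toml` might look like this (the publish directory is an assumption about the project's build layout):

```toml
[build]
  publish = "dist"
```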
- Mobile Support:
- Add a loading screen for mobile devices.
- Verify that the tool works correctly on mobile.
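Deciding when to show the mobile loading screen can be done with a user-agent heuristic (this regex is a common but non-exhaustive check, not the app's actual detection logic):

```javascript
// Heuristic: treat common mobile tokens in the user-agent string as mobile.
function isMobileUserAgent(ua) {
  return /Android|iPhone|iPad|iPod|Mobile/i.test(ua);
}

// In the browser:
// if (isMobileUserAgent(navigator.userAgent)) showLoadingScreen();
```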
- Configuration Management:
- Create a `config.json` file to store threshold values for the app.
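The file might hold values like these (the key names and numbers are illustrative, not the app's actual thresholds):

```json
{
  "velocityThreshold": 0.8,
  "maxVelocity": 2.0,
  "smoothingAlpha": 0.3
}
```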
- Code Refactoring:
- Consolidate scattered references into a single state object so child functions can apply mutations more simply.
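A sketch of that consolidation: one shared state object that child functions mutate in place, instead of many separate references (the field names here are assumptions about what the app tracks):

```javascript
// Single source of truth for per-hand tracking data and app flags.
const state = {
  leftHand: { x: 0, y: 0, velocity: 0 },
  rightHand: { x: 0, y: 0, velocity: 0 },
  audioStarted: false,
};

// Child functions receive the same object and mutate it in place.
function updateHand(state, hand, x, y, velocity) {
  Object.assign(state[hand], { x, y, velocity });
}
```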
- iOS Compatibility:
- Audio will not work on iOS devices if the phone is in silent mode.