A web application that uses the trained model I built in asamyuta-hastas-detection-using-mediapipe to recognize 28 unique hand signs, known as the Asamyuta Hastas, from the Indian classical dance form, Bharatanatyam. This is meant to be an interactive experience and a fun game to test dancers' skills.
Here's what you can do on this platform:
- Detection: Simply show a hasta to the camera, and the model will detect it.
- Single Player Game: Test your skills as you try to get as many hand gestures right as possible within a 10-second window.
- Two Player Game: Compete with a friend and see who can nail the most hand gestures in 10 seconds.
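Under the hood, detection feeds MediaPipe hand landmarks into the keypoint classifier trained in asamyuta-hastas-detection-using-mediapipe. The sketch below shows the typical preprocessing convention from the referenced hand-gesture-recognition-mediapipe repository (wrist-relative coordinates, flattened and scaled); the function name is illustrative and not the actual helper in this repo:

```python
def preprocess_landmarks(landmarks):
    """Convert MediaPipe (x, y) hand landmarks into the flat,
    normalized feature vector a keypoint classifier expects.

    `landmarks` is a list of (x, y) tuples (21 per hand from
    MediaPipe). Illustrative sketch, not the app's actual code.
    """
    # Landmark 0 is the wrist; express every point relative to it
    base_x, base_y = landmarks[0]
    relative = [(x - base_x, y - base_y) for x, y in landmarks]

    # Flatten to a single vector (42 values for 21 landmarks)
    flat = [v for point in relative for v in point]

    # Scale by the largest absolute value so features lie in [-1, 1]
    max_abs = max(map(abs, flat)) or 1.0
    return [v / max_abs for v in flat]
```

Normalizing this way makes the classifier robust to where the hand appears in the frame and how large it looks to the camera.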
```
asamyuta-hastas-webapp
│   app.py
│   helper.py
│   leaderboard.db
│
├───model
│       keypoint_classifier.tflite
│
├───my-app
│   ├───src
│   │   ├───components
│   │   │       App.css
│   │   │       GamePage.js
│   │   │       HandDetection.js
│   │   │       HowToPlay.css
│   │   │       HowToPlay.js
│   │   │       LandingPage.js
│   │   │       MainApp.js
│   │   │       manymandalas.jpg
│   │   │       MPGamePage.css
│   │   │       MPGamePage.js
│   │   └───... (other directories and files in src)
│   └───... (other directories and files in my-app)
│
└───__pycache__
        helper.cpython-39.pyc
```
- Backend: Flask
- Frontend: React
How I trained the model to recognize these unique hand gestures is elaborated upon in asamyuta-hastas-detection-using-mediapipe, which takes reference from the hand-gesture-recognition-mediapipe repository. Significant modifications were made to cater to the specific needs of recognizing Asamyuta Hastas, including changes to the data collection methodology and model architecture.
