This repo contains scripts for generating data, calculating features, finding the best SVM model, and using the gesture recognition algorithm with the LILYGO TTGO T-Watch. The data collected while working on the project is also included.
This project was used for the preparation of the Wearables app (https://github.com/lukawitek000/wearables-mobile-app) for the European Project Semester 2021 at the Universitat Politècnica de València.
Requirements
Using the existing gestures
Creating gesture recognition algorithm with your own gestures
Example of generating a feature array
Pre-defined gestures
In order to use the files from the repo you will need:
- Python with Pandas, Numpy, micromlgen and scikit-learn packages installed
- MATLAB/Octave
- PlatformIO
- Arduino IDE
- Bluetooth serial terminal
If you want to use the ready-to-go gesture recognition algorithm trained on the ten gestures defined by us, follow the instructions below. To see the gestures, go to the Pre-defined gestures section. The accuracy we obtained with this algorithm was 93%. If you want to train the algorithm with your own gestures instead, proceed to the next section.
- Using PlatformIO, open the ttgo folder as a project.
- Compile and upload the project to the watch.
- Connect the watch to your computer from a Bluetooth Serial Terminal.
- Double-tap the screen of the watch and perform a gesture after you feel the vibration. The number of the recognized gesture will be sent to the Bluetooth Serial Terminal.
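The watch sends the recognized class over the serial link as a plain number. Since the classifier returns values in the range 0-9 while the gestures are numbered 1-10 (see the note in the Pre-defined gestures section), a minimal Python sketch for turning a received line into a 1-based gesture number could look like this (the function name is illustrative, and reading from the actual serial port is left out):

```python
# Convert one line received from the Bluetooth Serial Terminal into the
# 1-based gesture number. The 0-9 -> 1-10 shift follows the note in the
# Pre-defined gestures section.
def gesture_number(serial_line):
    label = int(serial_line.strip())  # the watch prints the class as text
    return label + 1                  # classes 0-9 map to gestures 1-10

print(gesture_number("3\n"))  # class 3 -> gesture 4
```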
- Open the gesture-data folder.
- Delete any existing files from the folder. Create folders named Gesture n, where n is the number of the gesture, starting with n = 1 and going up to the number of gestures you wish to recognize.
- In each of the folders, create a noFiles.txt file. The file should contain the number of samples recorded for the gesture, in this case 0.
- Configure the Arduino IDE for the LILYGO TTGO T-Watch.
- Open the AccGenerateTrainingData folder, open the file of the same name in the Arduino IDE, then compile and upload it to the watch.
- Open Bluetooth settings on your computer and pair with TTGO Watch.
- Open the generateData file in MATLAB and run the script. If the connection has been established correctly, you should see an Enter the number of gesture: prompt. Enter the number, press Enter, and perform the gesture with the watch (double-tap the screen and perform the gesture). Follow the instructions in the terminal. In case of an error, re-upload the Arduino file; errors that occur while connecting to the watch can usually also be fixed by re-uploading it.
- After all the data has been collected, use the computeFeatures MATLAB script to compute a set of features for a pre-recorded sample. Iterate to build an array in which each row is one sample generated by the script, and append to each row a column with the gesture number corresponding to that sample. You can use the importTrainingExample script to load a specific sample of a specific gesture. For more information, see the Example of generating a feature array section and the comments and function definitions in the files mentioned.
- Shuffle the array and split it into train and test sets. Save the sets as trainData.txt and evaluateData.txt CSV files in the Python folder.
- Run the findSVMParameters Python script from the Python folder. The script prints the fitting results for different parameters, as well as the best parameters, to the console.
- Using the micromlgen Python library, generate an SVM model C file. Save it as a predict_class.h header file.
- Place the predict_class.h file in the ttgo/src directory, overwriting the existing file.
- Using PlatformIO, open the ttgo folder as a project.
- Compile and upload the project to the watch.
- Connect the watch to your computer from a Bluetooth Serial Terminal.
- Double-tap the screen of the watch and perform a gesture after you feel the vibration. The number of the recognized gesture will be sent to the Bluetooth Serial Terminal.
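The folder-preparation steps above (one Gesture n folder per gesture, each with a noFiles.txt counter set to 0) can also be scripted instead of done by hand. A minimal Python sketch, assuming it is run from the directory that contains the gesture-data folder (the function name is illustrative):

```python
from pathlib import Path

def prepare_gesture_folders(root, n_gestures):
    """Create Gesture 1..n_gestures folders, each holding a noFiles.txt
    that stores the number of recorded samples (initially 0)."""
    for n in range(1, n_gestures + 1):
        folder = Path(root) / f"Gesture {n}"
        folder.mkdir(parents=True, exist_ok=True)
        (folder / "noFiles.txt").write_text("0")

# Prepare folders for the ten gestures used in this project.
prepare_gesture_folders("gesture-data", 10)
```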
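The shuffle-and-split step can be done in Python as well as in MATLAB. A sketch using NumPy, with a random matrix standing in for the real feature array (the sizes are illustrative; the last column holds the gesture number, as in the repo's data layout):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the real feature array: 1000 samples,
# 24 features plus a final column with the gesture number.
data = rng.normal(size=(1000, 25))
data[:, -1] = rng.integers(1, 11, size=1000)

rng.shuffle(data, axis=0)        # shuffle the rows
split = int(0.8 * len(data))     # 80/20 train/test split
train, evaluate = data[:split], data[split:]

np.savetxt("trainData.txt", train, delimiter=",")
np.savetxt("evaluateData.txt", evaluate, delimiter=",")
```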
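The parameter search and the micromlgen export can be sketched with scikit-learn. The synthetic data below stands in for trainData.txt, and the searched grid is an assumption; the repo's findSVMParameters script defines the actual one:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for the features loaded from trainData.txt.
X, y = make_classification(n_samples=300, n_features=24, n_informative=10,
                           n_classes=10, random_state=0)

# Grid of candidate SVM parameters (illustrative values).
params = {"C": [0.1, 1, 10], "gamma": ["scale", 0.01, 0.1], "kernel": ["rbf"]}
search = GridSearchCV(SVC(), params, cv=3)
search.fit(X, y)
print("best parameters:", search.best_params_)

# Export the fitted model as a C header for the watch firmware.
try:
    from micromlgen import port
    with open("predict_class.h", "w") as f:
        f.write(port(search.best_estimator_))
except ImportError:
    pass  # micromlgen not installed; skip the export
```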
This is a MATLAB example of generating an array of features for the purpose of training a new model.
% Build the feature array: one row per sample, with the gesture
% number appended as the last column.
data = [];
for g = 1:10        % 10 gestures
    for i = 1:100   % 100 samples per gesture
        data = [data; computeFeatures(importTrainingExample(g, i)), g];
    end
end
This section contains the labels of the gestures and GIFs showing them. Note that the algorithm returns a number in the range 0-9, not 1-10.