This repository contains the Unity project that facilitated the data collection for our research project published at ISMAR 2022, "Multimodal Interaction with Gaze and Controller Gesture". We would be happy to hear if this project helps anyone with future development of this or other research projects :)
Features included:
- A Unity scene that comprises the target, the dragging path, and event listeners for the different metrics described in the publication's test scenarios.
- User interaction through a combination of input modes (eye gaze, controller, and hand gesture).
- Targets and a dragging path that are interactable via the above-mentioned input modes (shown in the gif below).
- A full demo of the task and input interaction modes is available here: https://youtu.be/zxWkTnVsHIM
- Easily control user testing and data collection settings
- Collected data is consolidated into an Excel sheet in the "StreamingAssets" folder (a minimal logging sketch follows this list)
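
For orientation, here is a minimal, hypothetical sketch of how per-trial metrics could be appended to a consolidated file under StreamingAssets; the actual logger script, file name, and columns used in this project may differ.

```csharp
using System.IO;
using UnityEngine;

// Hypothetical sketch: append per-trial metrics to a consolidated CSV under
// the StreamingAssets folder. The project's real logger, file name, and
// columns may differ.
public class TrialLogger : MonoBehaviour
{
    private string _filePath;

    private void Awake()
    {
        _filePath = Path.Combine(Application.streamingAssetsPath, "results.csv");
        if (!File.Exists(_filePath))
        {
            // Header row, written once on first run.
            File.WriteAllText(_filePath,
                "participantId,inputMode,taskType,completionTimeSec,errorCount\n");
        }
    }

    public void LogTrial(string participantId, string inputMode, string taskType,
                         float completionTimeSec, int errorCount)
    {
        File.AppendAllText(_filePath,
            $"{participantId},{inputMode},{taskType},{completionTimeSec:F3},{errorCount}\n");
    }
}
```

Note that StreamingAssets is writable in the editor and on desktop builds, which suits a lab data-collection setup.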
To be improved:
- Eye jitter: During the research, we realized that eye jitter was an issue for participants and resulted in a high target-dragging error rate. One proposed solution is to build a dead zone around the target so that small jitter-induced gaze movements do not move the target. This proposal could be explored further in the future; a minimal sketch of the idea follows.
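
The sketch below illustrates the proposed dead zone (all names and the threshold value are illustrative, not part of the original project): the dragged target only follows the gaze point once it moves beyond a radius threshold, suppressing small jitter-induced movements.

```csharp
using UnityEngine;

// Illustrative sketch of the proposed dead zone: the target only follows the
// gaze point once it leaves a radius around the last accepted position.
public class GazeDeadZoneFilter
{
    private readonly float _deadZoneRadius; // world-space radius, e.g. metres
    private Vector3 _anchor;                // last accepted gaze position

    public GazeDeadZoneFilter(float deadZoneRadius, Vector3 initialPosition)
    {
        _deadZoneRadius = deadZoneRadius;
        _anchor = initialPosition;
    }

    // Returns the filtered position the dragged target should move to.
    public Vector3 Filter(Vector3 rawGazePosition)
    {
        // Ignore gaze movements smaller than the dead zone radius.
        if (Vector3.Distance(rawGazePosition, _anchor) > _deadZoneRadius)
        {
            _anchor = rawGazePosition;
        }
        return _anchor;
    }
}
```
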
Hardware and software used:
- HTC's Vive Pro Eye (https://business.vive.com/sea/product/vive-pro-eye/)
- Unity 2019.4.14f1
  - Refer here for more info: https://unity.com/releases/editor/whats-new/2019.4.14
- Tobii XR SDK
  - The HTC Vive Pro Eye headset was used in this project. Refer here for more info on the setup: https://developer.tobii.com/xr/develop/unity/getting-started/vive-pro-eye/
  - This project uses the Tobii eye-gaze API for most of the eye-gaze-based interaction, so the scripts should also be compatible with other headsets supported by the Tobii XR SDK (refer here for other compatible headsets: https://developer.tobii.com/xr/develop/unity/). A minimal gaze-reading sketch follows this list.
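
For reference, this is a minimal sketch of reading a world-space gaze ray with the Tobii XR SDK and raycasting it into the scene. It mirrors the SDK's getting-started pattern rather than this project's exact scripts, and assumes TobiiXR has already been initialized in the scene.

```csharp
using Tobii.XR;
using UnityEngine;

// Minimal sketch of consuming Tobii XR gaze data, following the SDK's
// getting-started pattern; not necessarily this project's exact code.
public class GazeRaycaster : MonoBehaviour
{
    private void Update()
    {
        // Latest eye tracking data in world space.
        var eyeData = TobiiXR.GetEyeTrackingData(TobiiXR_TrackingSpace.World);
        if (!eyeData.GazeRay.IsValid) return;

        // Cast the gaze ray into the scene to find what the user is looking at.
        if (Physics.Raycast(eyeData.GazeRay.Origin, eyeData.GazeRay.Direction,
                            out var hit, 10f))
        {
            Debug.Log($"Gaze hit: {hit.collider.name}");
        }
    }
}
```
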
To conduct user testing:
- Navigate to the main test scene: Assets/Scenes/Main Testbed Final.unity
- Change the experiment's conditions using the following fields (a hypothetical sketch of these scripts follows the steps):
  - Under the TaskControllers game object:
    - Experiment Controller Script (to change the metadata):
      - Task type: either Primary task only or Primary and Rotation task
      - Input mode: ControllerClick / GazeDwell / GazeClick
      - Participant ID
      - Toggle data collection between enabled (for actual data collection) and disabled (for dry runs)
    - Primary Task Controller Script (to change the test condition):
      - Toggle between a randomized test condition and a non-randomized one (which uses the settings below)
      - Change object size
      - Change target distance
      - Change path direction
- Run the project
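
For context, here is a hypothetical sketch of how the inspector fields above could map onto the two scripts. The actual field names and types in the project may differ, and Unity expects each MonoBehaviour in its own file matching the class name; they are shown together here for brevity.

```csharp
using UnityEngine;

// Hypothetical sketch of the inspector fields described above; the project's
// actual scripts may use different names and types.
public enum TaskType { PrimaryOnly, PrimaryAndRotation }
public enum InputMode { ControllerClick, GazeDwell, GazeClick }

public class ExperimentController : MonoBehaviour
{
    [Header("Metadata")]
    public TaskType taskType = TaskType.PrimaryOnly;
    public InputMode inputMode = InputMode.ControllerClick;
    public string participantId = "P01";
    public bool enableDataCollection = false; // keep disabled for dry runs
}

public class PrimaryTaskController : MonoBehaviour
{
    [Header("Test condition")]
    public bool randomizeTestCondition = true; // when false, the fields below apply
    public float objectSize = 0.1f;            // illustrative world-space units
    public float targetDistance = 1.0f;
    public Vector3 pathDirection = Vector3.right;
}
```
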