Adversarial VR: An Open-Source Testbed for Evaluating Adversarial Robustness of VR Cybersickness Detection and Mitigation
Adversarial-VR is a real-time VR testbed for evaluating deep learning (DL)-based cybersickness detection and mitigation strategies under adversarial conditions. It integrates:
- A Python Flask backend for deep learning-based cybersickness detection and adversarial attack generation
- A Unity-based VR maze frontend for real-time simulation, sensor data collection, and adaptive mitigation
- Introduction
- System Overview
- Experimental Data & Models
- Backend: Flask API
- Frontend: Unity VR
- Builds & Setup
- Tool Usage
- Contacts
- License
- Acknowledgements
- Known Issues
Adversarial-VR is a real-time VR testbed for evaluating DL-based automatic cybersickness detection and mitigation strategies under adversarial attack conditions. The backend hosts a DL model trained on the MazeSick Dataset, exposes a REST API for severity prediction, and supports three state-of-the-art adversarial attacks (MI-FGSM, PGD, and C&W). The Unity frontend streams VR sensor data (e.g., eye tracking and head tracking), receives cybersickness severity predictions (none, low, medium, or high), and applies an automatic cybersickness mitigation technique (e.g., dynamic field of view).
- Backend: Python Flask server with a trained .keras DL model for cybersickness severity classification. Supports adversarial perturbation of inputs.
- Frontend: Unity VR maze simulation (PCVR/Android), real-time eye/head tracking, dynamic vignette mitigation based on backend prediction.
- Hardware: HTC Vive Pro Eye (recommended for full feature support).
- Dataset: Trained using MazeSick (open-source, see below).
- Model Training Data: MazeSick Dataset. Request access or see the publication for download instructions.
- Trained Model: Provided as a .keras file (e.g., a Transformer model).
- Feature List: See the code and paper; be sure to maintain the feature order and normalization.
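Since prediction quality depends on preserving the exact feature order and normalization used during training, a client-side preprocessing step might look like the following sketch. The feature names and value ranges here are placeholders, not the actual MazeSick feature list — consult the paper and code for the real ones:

```python
# Sketch: assembling a feature vector with a fixed order and min-max
# normalization. Feature names and ranges are ILLUSTRATIVE placeholders.

# Hypothetical feature order; the model expects this exact ordering.
FEATURE_ORDER = ["Left_Eye_Openness", "Right_Eye_Openness", "Velocity"]

# Hypothetical (min, max) ranges used during training.
FEATURE_RANGES = {
    "Left_Eye_Openness": (0.0, 1.0),
    "Right_Eye_Openness": (0.0, 1.0),
    "Velocity": (0.0, 10.0),
}

def build_feature_vector(sample: dict) -> list:
    """Order and min-max normalize raw sensor readings."""
    vector = []
    for name in FEATURE_ORDER:
        lo, hi = FEATURE_RANGES[name]
        value = (sample[name] - lo) / (hi - lo)   # scale to [0, 1]
        vector.append(min(max(value, 0.0), 1.0))  # clamp out-of-range readings
    return vector

print(build_feature_vector(
    {"Left_Eye_Openness": 0.8, "Right_Eye_Openness": 0.9, "Velocity": 5.0}
))  # [0.8, 0.9, 0.5]
```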
- REST API: /predict accepts a feature vector (e.g., eye and head tracking features) and returns the predicted cybersickness severity (None, Low, Medium, High).
- Adversarial Modes: Switchable attacks using CleverHans: MI-FGSM, PGD, and C&W (edit the code to enable/disable each).
- Trained Model: Loads the .keras model.
# 1. Create environment
python3 -m venv venv
source venv/bin/activate # or venv\Scripts\activate on Windows
# 2. Install dependencies
pip install -r requirements.txt # includes Flask, TensorFlow, CleverHans, h5py
# 3. Place your model file:
# model/Transformer.keras
# 4. Run Flask API
python AdversarialAttack.py
The API runs by default at http://localhost:8000/predict
POST /predict
{
"Left_Eye_Openness": ...,
"Right_Eye_Openness": ...,
...
"Velocity": ...
}
Returns: predicted class, confidence, and adversarial class/confidence (if enabled).
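As a quick smoke test, the request above can be issued from Python with only the standard library. The response field names shown are assumptions — match them to the actual API response:

```python
# Sketch: querying the /predict endpoint from Python. Only a subset of
# features is shown; the real model expects the full, ordered feature vector.
import json
import urllib.request

API_URL = "http://localhost:8000/predict"  # default backend address

def build_request(features: dict) -> urllib.request.Request:
    """Construct the JSON POST request for /predict."""
    body = json.dumps(features).encode("utf-8")
    return urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )

def predict(features: dict) -> dict:
    """POST a feature vector and return the parsed JSON prediction."""
    with urllib.request.urlopen(build_request(features)) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    result = predict({"Left_Eye_Openness": 0.93,
                      "Right_Eye_Openness": 0.91,
                      "Velocity": 1.2})
    # Hypothetical response fields -- check the backend for the real keys.
    print(result.get("predicted_class"), result.get("confidence"))
```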
- Edit AdversarialAttack.py and comment/uncomment the adversarial attack line you want (MI-FGSM, PGD, or C&W).
- Restart the Flask server for the change to take effect.
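As an alternative to commenting/uncommenting lines, the attack choice could be driven by a single mode string. This is a minimal dispatch sketch; the attack functions are placeholders standing in for the corresponding CleverHans calls, not the real implementations:

```python
# Sketch: selecting the adversarial attack by name instead of editing code.
# The functions below are PLACEHOLDERS for the CleverHans attack calls.

def mi_fgsm(x):   # stand-in for momentum iterative FGSM
    return ("MI-FGSM", x)

def pgd(x):       # stand-in for projected gradient descent
    return ("PGD", x)

def cw(x):        # stand-in for Carlini & Wagner L2
    return ("C&W", x)

ATTACKS = {
    "mi-fgsm": mi_fgsm,
    "pgd": pgd,
    "cw": cw,
    "none": lambda x: ("clean", x),  # pass features through unperturbed
}

def perturb(features, mode: str = "none"):
    """Apply the configured attack (or none) to the incoming features."""
    try:
        return ATTACKS[mode.lower()](features)
    except KeyError:
        raise ValueError(f"unknown attack mode: {mode!r}")

print(perturb([0.5, 0.5], "pgd")[0])  # PGD
```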
- Maze simulation with coin collection, first-person movement.
- Real-time streaming of eye/head tracking data to Flask backend.
- Automatic mitigation: Dynamic field-of-view ("tunneling vignette") that narrows FOV based on model's predicted severity.
- Visual indicators for severity class, coin count, and vignette effect.
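Conceptually, the mitigation reduces to a lookup from severity class to vignette settings. The aperture/feather values below are illustrative only; the actual mapping is given in Table 2 of the paper:

```python
# Sketch: mapping predicted severity to tunneling-vignette parameters.
# Numbers are ILLUSTRATIVE -- see Table 2 of the paper for the real mapping.

VIGNETTE_PARAMS = {
    "None":   {"aperture": 1.0, "feather": 0.0},  # vignette effectively off
    "Low":    {"aperture": 0.8, "feather": 0.2},
    "Medium": {"aperture": 0.6, "feather": 0.3},
    "High":   {"aperture": 0.4, "feather": 0.4},  # strongest FOV narrowing
}

def vignette_for(severity: str) -> dict:
    """Return vignette settings for a predicted severity class."""
    return VIGNETTE_PARAMS.get(severity, VIGNETTE_PARAMS["None"])

print(vignette_for("High"))  # {'aperture': 0.4, 'feather': 0.4}
```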
- Unity 2020.3 LTS or newer
- HTC Vive Pro Eye (or compatible with eye/head tracking)
- SRanipal SDK and Tobii Vive SDK imported to project
- Unity XR plugin
- Import SRanipal & Tobii SDKs, set up eye/head tracking scripts.
- Import the Tunneling Vignette component and custom controller script.
- Implement a script for streaming feature vectors to the backend (UnityWebRequest).
- Update the vignette parameters in real time according to backend predictions (mapping severity levels to aperture/feather; see Table 2 in the paper).
- Set backend URL in your Unity client scripts.
- See Backend Setup for creating environment and running server.
- Open Unity project, ensure dependencies above are imported.
- Set backend URL for prediction.
- Build for PCVR (default) or Android (with adaptation).
- Start the Flask API (python AdversarialAttack.py).
- Play in Unity: The system streams real-time sensor data to the backend and applies mitigation automatically.
- Enable Attacks: Change the adversarial mode in the backend as needed and observe the Unity response.
If this is useful for your work, please cite our paper:
@INPROCEEDINGS{AdversarialVR,
author={Ahmed, Istiak and Kundu, Ripan Kumar and Hoque, Khaza Anuarul},
booktitle={2025 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)},
title={Adversarial VR: An Open-Source Testbed for Evaluating Adversarial Robustness of VR Cybersickness Detection and Mitigation},
year={2025},
volume={},
number={},
pages={281-288},
keywords={Headphones;Deep learning;Solid modeling;Visualization;Accuracy;Cybersickness;Prevention and mitigation;Transformers;Robustness;Real-time systems;Cybersickness;Virtual Reality;Deep Learning;Adversarial Attacks;Cybersickness Mitigation},
doi={10.1109/ISMAR-Adjunct68609.2025.00061}
}
Maintained by Khaza Anuarul Hoque.
MIT License
- MazeSick Dataset
- CleverHans adversarial attack library
- Unity Tunneling Vignette
- HTC SRanipal, Tobii, and Unity XR SDKs
- Android/mobile VR support may require extra development and custom eye-tracking code.
- The Flask backend and Unity must be able to communicate over the network; ensure your firewall/port settings allow the connection.
- Some adversarial attacks (especially C&W) are computationally expensive and may add response latency.
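To quantify the extra delay an attack mode introduces, prediction calls can be wrapped in a small timing helper — a generic sketch, not code from the repository:

```python
# Sketch: measuring round-trip latency of a prediction call, useful for
# comparing attack modes (C&W in particular can add noticeable delay).
import time

def timed(fn, *args):
    """Run fn(*args) and return (result, elapsed milliseconds)."""
    start = time.perf_counter()
    result = fn(*args)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return result, elapsed_ms

# Example with a stand-in for the actual predict call:
result, ms = timed(lambda x: x * 2, 21)
print(result, ms >= 0.0)  # 42 True
```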
