VR Table Tennis is an accessible virtual reality table tennis game specifically designed for Blind and Low-Vision (BLV) users. The game provides an immersive table tennis experience with comprehensive accessibility features including audio guidance, haptic feedback, and spatial audio cues.
- Audio Guidance System: Beeping sounds help BLV users locate the paddle in the game environment
- Haptic Feedback: Tactile feedback through haptic gloves provides spatial awareness of ball position
- Spatial Audio: 3D audio cues for ball movement and collision detection
- Accessible Controls: Simplified interaction system optimized for VR accessibility
- Adaptive Difficulty: Automatic ball tossing system with configurable parameters
The codebase is organized into modular components for maintainability and extensibility:
Assets/Scripts/
├── Core/ # Core game management
│ └── GameManager.cs # Central game controller and state management
├── Gameplay/ # Core gameplay mechanics
│ ├── Ball.cs # Ball physics and collision handling
│ └── Tosser.cs # Automatic ball tossing system
├── Interaction/ # VR interaction systems
│ ├── HandData.cs # Hand tracking and controller input
│ └── PaddleAttacher.cs # Paddle pickup and interaction
├── Accessibility/ # Accessibility features for BLV users
│ ├── AudioGuide.cs # Audio guidance system
│ └── HapticsController.cs # Haptic feedback system
└── UI/ # User interface components
└── MenuController.cs # Menu navigation and scene management
- GameManager: Singleton pattern for centralized game state management
  - Coordinates between different game systems
  - Handles game start, pause, restart, and quit functionality
- Ball: Handles ball physics, collision detection, and audio feedback
- Tosser: Manages automatic ball tossing with configurable positions and timing
- HandData: Manages VR hand tracking and controller input
- PaddleAttacher: Handles paddle pickup mechanics and interaction feedback
- AudioGuide: Provides audio guidance for paddle location
- HapticsController: Delivers haptic feedback based on ball position and distance
- MenuController: Handles menu navigation and scene transitions
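The singleton pattern noted for GameManager might be sketched as follows. Only the class name and the start/pause/restart/quit responsibilities come from this project; the method bodies and field layout are illustrative assumptions, not the actual script:

```csharp
using UnityEngine;

// Minimal sketch of a singleton game manager (assumed structure,
// not the project's real GameManager.cs).
public class GameManager : MonoBehaviour
{
    public static GameManager Instance { get; private set; }

    void Awake()
    {
        // Enforce a single instance that survives scene loads.
        if (Instance != null && Instance != this)
        {
            Destroy(gameObject);
            return;
        }
        Instance = this;
        DontDestroyOnLoad(gameObject);
    }

    public void StartGame()  { /* notify Tosser, AudioGuide, etc. */ }
    public void PauseGame()  { Time.timeScale = 0f; }
    public void ResumeGame() { Time.timeScale = 1f; }
    public void QuitGame()   { Application.Quit(); }
}
```

The singleton lets other components (Ball, Tosser, MenuController) reach shared game state via `GameManager.Instance` without scene-specific wiring.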
- Unity 2022.3 LTS: Game engine
- XR Interaction Toolkit: VR interaction framework
- Bhaptics SDK: Haptic feedback system
- Meta XR SDK: Oculus/Meta VR support
- Universal Render Pipeline: Graphics rendering
- Unity 2022.3 LTS or later
- Meta Quest 2/3 or compatible VR headset
- Bhaptics haptic gloves (optional, but recommended for the full experience)
- Meta XR SDK installed
```
git clone https://github.com/xability/a11y-vr-tabletennis.git
cd a11y-vr-tabletennis
```
- Launch Unity Hub
- Open the project folder in Unity 2022.3 LTS
- Wait for Unity to import all assets and packages
- Go to Edit > Project Settings > XR Plug-in Management
- Enable the "Oculus" and "OpenXR" plugins
- Configure your VR headset settings
- Install the Bhaptics SDK from the Assets folder
- Configure haptic glove settings in the HapticsController component
- Test haptic feedback in the scene
- Go to File > Build Settings
- Select the Android platform
- Configure build settings for your target device
- Build and deploy to your VR headset
- Open the project in Unity
- Open the main scene: `Assets/Scenes/game scene.unity`
- Press Play in the Unity Editor (for testing without VR)
- Use VR headset for full immersive experience
- Build the project for Android
- Install the APK on your Meta Quest device
- Launch the app from your Quest library
- Adjust `activationDistance` in AudioGuide for paddle location sensitivity
- Configure audio sources for different collision types
- Set volume levels for optimal accessibility
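A sketch of how such a distance-gated guide can work is below. The `activationDistance` field is the parameter described above; the class name, other fields, and volume curve are assumptions for illustration:

```csharp
using UnityEngine;

// Sketch of distance-gated paddle guidance: beep only within
// activationDistance, louder as the hand gets closer.
// Everything except activationDistance is an assumed detail.
public class AudioGuideSketch : MonoBehaviour
{
    public Transform paddle;
    public AudioSource beepSource;        // looping beep clip
    public float activationDistance = 2f; // meters

    void Update()
    {
        float d = Vector3.Distance(transform.position, paddle.position);
        bool inRange = d <= activationDistance;

        if (inRange && !beepSource.isPlaying) beepSource.Play();
        if (!inRange && beepSource.isPlaying) beepSource.Stop();

        // Simple linear falloff: full volume at the paddle, silent at the edge.
        if (inRange) beepSource.volume = 1f - d / activationDistance;
    }
}
```

Raising `activationDistance` makes the guide respond from farther away; lowering it keeps the beep quiet until the hand is near the paddle.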
- Modify `minDistance` and `maxDistance` in HapticsController
- Adjust `minIntensity` and `maxIntensity` for haptic sensitivity
- Configure motor patterns for different feedback types
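The four HapticsController parameters above imply a distance-to-intensity mapping along these lines; the helper name and exact curve are illustrative assumptions, and sending the result to the gloves would go through the Bhaptics SDK:

```csharp
using UnityEngine;

// Sketch of the mapping implied by the HapticsController parameters:
// maxIntensity at minDistance or closer, minIntensity at maxDistance
// or beyond, linear in between. Helper name is hypothetical.
public static class HapticsMath
{
    public static float IntensityFor(float distance,
        float minDistance, float maxDistance,
        float minIntensity, float maxIntensity)
    {
        // closeness = 1 when the ball is at minDistance, 0 at maxDistance
        // (InverseLerp clamps outside the range).
        float closeness = Mathf.InverseLerp(maxDistance, minDistance, distance);
        return Mathf.Lerp(minIntensity, maxIntensity, closeness);
    }
}
```

Narrowing the `minDistance`/`maxDistance` window makes intensity ramp up more sharply as the ball approaches.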
- Tune `throwStrength` in Tosser for ball speed
- Adjust `reload` time for ball frequency
- Configure shot positions for varied gameplay
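An automatic tosser with these parameters might look like the sketch below. `throwStrength`, `reload`, and the idea of configurable shot positions come from the list above; the spawning details are assumptions:

```csharp
using UnityEngine;

// Sketch of an automatic ball tosser: every `reload` seconds, launch
// a ball from a random configured position at `throwStrength` speed.
// Not the project's actual Tosser.cs.
public class TosserSketch : MonoBehaviour
{
    public Rigidbody ballPrefab;
    public Transform[] shotPositions;  // varied launch points
    public float throwStrength = 5f;   // meters per second
    public float reload = 3f;          // seconds between tosses
    float timer;

    void Update()
    {
        timer += Time.deltaTime;
        if (timer < reload) return;
        timer = 0f;

        Transform spawn = shotPositions[Random.Range(0, shotPositions.Length)];
        Rigidbody ball = Instantiate(ballPrefab, spawn.position, spawn.rotation);
        ball.velocity = spawn.forward * throwStrength; // toss along the spawn's facing
    }
}
```

Shorter `reload` values increase rally frequency; multiple `shotPositions` vary where the ball comes from, which also exercises the spatial audio cues.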
- Beeping sounds guide users to paddle location
- Distance-based audio intensity
- Automatic audio stop when paddle is picked up
- Distance-based haptic intensity
- Height-based motor activation patterns
- Real-time feedback for ball position
- 3D audio cues for ball movement
- Collision-specific sound effects
- Volume-based distance indication
- Use C# naming conventions (PascalCase for classes, camelCase for variables)
- Add XML documentation for all public methods
- Follow Unity's component-based architecture
- Keep scripts focused on single responsibilities
- Use namespaces to organize code modules
- Minimize dependencies between modules
- Test accessibility features with BLV users
- Validate haptic feedback patterns
- Ensure audio cues are clear and helpful
- Fork the repository
- Create a feature branch: `git checkout -b feature/new-feature`
- Make your changes following the coding guidelines
- Test thoroughly, especially accessibility features
- Submit a pull request with detailed description
- Accessibility Improvements: Enhanced audio/haptic feedback
- Gameplay Features: New game modes or mechanics
- UI/UX: Better menu systems and user experience
- Performance: Optimizations for smooth, comfortable VR frame rates
- Documentation: Improved guides and tutorials
- Always test new features with BLV users
- Gather feedback on accessibility effectiveness
- Consider different levels of visual impairment
- Test with various assistive technologies
- Principal Investigator: JooYoung Seo
- Project Lead: Sanchita S. Kamath
- Developer: Dhruv Sethi
- Co-Designer: Aziz N. Zeidieh
This project is licensed under the MIT License - see the LICENSE file for details.
For technical support or accessibility questions:
- Create an issue on GitHub
- Contact the development team
This project is part of ongoing research into accessible VR gaming for BLV users. The goal is to create inclusive gaming experiences that provide meaningful engagement and physical activity opportunities for the BLV community.