A sophisticated eye-tracking system that allows users to control their computer mouse using eye movements and blink gestures. This project provides a production-ready solution for hands-free computer interaction.
- Eye Tracking: Real-time iris and gaze direction tracking using webcam
- Mouse Movement: Smooth cursor movement based on eye gaze direction
- Blink Detection:
  - Single blink = Left click
  - Double blink = Right click
  - Long blink = Drag and drop
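The blink gestures above can be sketched as a classifier over closed-eye duration. This is an illustrative sketch, not the project's actual API: the class and method names are hypothetical, and the default thresholds mirror the values shown later in `config/settings.json`. A real implementation would also defer the single-blink decision briefly to avoid firing a left click before a double blink completes.

```python
import time

class BlinkClassifier:
    """Hypothetical sketch: map one completed blink to a gesture name."""

    def __init__(self, single=0.3, double_gap=0.5, long=1.0):
        self.single = single          # max duration (s) of a short blink
        self.double_gap = double_gap  # max gap (s) between two short blinks
        self.long = long              # min duration (s) of a long blink
        self.last_blink_end = None    # time the previous blink ended

    def classify(self, closed_duration, now=None):
        now = time.monotonic() if now is None else now
        if closed_duration >= self.long:
            gesture = "drag"          # long blink = drag and drop
        elif (self.last_blink_end is not None
              and now - self.last_blink_end <= self.double_gap):
            gesture = "right_click"   # two short blinks in quick succession
        else:
            gesture = "left_click"    # isolated short blink
        self.last_blink_end = now
        return gesture
```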
- Scroll Control: Vertical scrolling when looking up or down
- Anti-Jitter: Advanced smoothing algorithms to prevent cursor jitter
- Debug Mode: Visual calibration interface with landmark detection
- Cross-Platform: Compatible with Linux, Windows, and macOS
- Configurable: Customizable sensitivity, thresholds, and behavior
- Production Ready: Optimized performance with minimal CPU usage
- Accessibility: Designed for users with mobility limitations
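The anti-jitter smoothing listed above is commonly implemented as an exponential moving average over raw gaze coordinates; the sketch below assumes that approach. The `smoothing_factor` plays the same role as the one in `config/settings.json`; the function itself is illustrative, not the project's actual code.

```python
def smooth(prev_xy, raw_xy, smoothing_factor=0.7):
    """Blend the previous cursor position with the new raw gaze point.

    A factor near 1.0 favours the previous position (more smoothing,
    more lag); a factor near 0.0 follows the raw signal (more jitter).
    """
    px, py = prev_xy
    rx, ry = raw_xy
    return (smoothing_factor * px + (1 - smoothing_factor) * rx,
            smoothing_factor * py + (1 - smoothing_factor) * ry)
```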
- Python 3.8 or higher
- Webcam (USB or built-in)
- Minimum 4GB RAM
- Modern CPU (Intel i5/AMD Ryzen 5 or better recommended)
See requirements.txt for complete list of Python packages.
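As an illustration, a plausible `requirements.txt` might look like the following; the package names follow the libraries credited in the Acknowledgments (MediaPipe, OpenCV, PyAutoGUI), but the exact pins are assumptions:

```text
opencv-python>=4.8
mediapipe>=0.10
pyautogui>=0.9
numpy>=1.24
```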
```bash
git clone https://github.com/yourusername/ai-virtual-mouse.git
cd ai-virtual-mouse
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt
```

Linux:

```bash
# Install additional system dependencies
sudo apt-get update
sudo apt-get install python3-opencv libgl1-mesa-glx
```

Windows:

- Ensure Visual Studio Build Tools are installed
- Install MediaPipe dependencies through pip

macOS:

```bash
# Install additional dependencies
brew install opencv
```

Run the main application:

```bash
python src/main.py
```

For calibration and testing:

```bash
python src/debug.py
```

Edit config/settings.json to customize:
- Sensitivity settings
- Blink detection thresholds
- Mouse movement smoothing
- Scroll sensitivity
```json
{
  "eye_tracking": {
    "sensitivity": 1.0,
    "smoothing_factor": 0.7,
    "dead_zone": 0.1
  },
  "blink_detection": {
    "single_blink_threshold": 0.3,
    "double_blink_threshold": 0.5,
    "long_blink_threshold": 1.0
  },
  "mouse_control": {
    "movement_speed": 2.0,
    "scroll_sensitivity": 1.0,
    "click_delay": 0.1
  },
  "debug": {
    "show_landmarks": false,
    "show_blink_detection": false,
    "fps_display": true
  }
}
```

| Gesture | Action |
|---|---|
| Eye Movement | Move mouse cursor |
| Single Blink | Left click |
| Double Blink | Right click |
| Long Blink | Drag and drop |
| Look Up | Scroll up |
| Look Down | Scroll down |
| Look Left/Right | Move cursor horizontally |
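The cursor-movement gestures above can be sketched as a mapping from a normalized gaze offset to a pixel delta, using the `dead_zone`, `sensitivity`, and `movement_speed` values from `config/settings.json`. PyAutoGUI (credited in the Acknowledgments) could then apply the delta via `pyautogui.moveRel(dx, dy)`; everything else here (names, the pixel scale) is an illustrative assumption, not the project's actual code.

```python
def gaze_to_delta(gaze_x, gaze_y, sensitivity=1.0, dead_zone=0.1,
                  movement_speed=2.0):
    """Convert gaze offsets in [-1, 1] to pixel deltas for the cursor.

    Offsets inside the dead zone are ignored, so small involuntary
    eye movements do not move the cursor.
    """
    def axis(v):
        if abs(v) < dead_zone:
            return 0.0
        sign = 1.0 if v > 0 else -1.0
        # Re-scale so motion ramps up smoothly from the dead-zone edge
        scaled = (abs(v) - dead_zone) / (1.0 - dead_zone)
        return sign * scaled * sensitivity * movement_speed * 10.0

    return axis(gaze_x), axis(gaze_y)
```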
```
ai_virtual_mouse/
├── src/
│   ├── core/
│   │   ├── eye_tracker.py        # Core eye tracking logic
│   │   ├── mouse_controller.py   # Mouse control implementation
│   │   └── blink_detector.py     # Blink detection algorithms
│   ├── ui/
│   │   ├── debug_interface.py    # Debug/calibration UI
│   │   └── main_interface.py     # Main application UI
│   ├── calibration/
│   │   ├── calibrator.py         # Calibration system
│   │   └── settings_manager.py   # Configuration management
│   ├── main.py                   # Production application
│   └── debug.py                  # Debug/calibration application
├── utils/
│   ├── image_processing.py       # Image processing utilities
│   ├── smoothing.py              # Smoothing algorithms
│   └── platform_utils.py         # Cross-platform utilities
├── config/
│   └── settings.json             # Configuration file
├── data/
│   └── calibration_data/         # Calibration data storage
├── tests/
│   ├── test_eye_tracker.py
│   ├── test_blink_detector.py
│   └── test_mouse_controller.py
├── docs/
│   ├── API.md
│   └── CALIBRATION.md
├── requirements.txt
└── README.md
```
Run the test suite:

```bash
python -m pytest tests/
```

Format and lint the code:

```bash
black src/ tests/
flake8 src/ tests/
```

- Voice Commands: Integration with speech recognition
- Multi-Monitor Support: Extended desktop navigation
- Gesture Recognition: Hand gesture integration
- Machine Learning: Personalized calibration using ML
- Mobile Support: Android/iOS companion app
- Web Interface: Browser-based control panel
- Accessibility Features: Enhanced accessibility options
- Performance Optimization: GPU acceleration support
- Multi-User Support: User profiles and switching
- Cloud Calibration: Sync settings across devices
- Analytics: Usage tracking and optimization
- Plugin System: Extensible architecture
- API Integration: Third-party application support
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- MediaPipe for facial landmark detection
- OpenCV for computer vision capabilities
- PyAutoGUI for cross-platform mouse control
- The open-source community for inspiration and support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Email: [email protected]
This project is based on research in:
- Computer Vision
- Human-Computer Interaction
- Accessibility Technology
- Eye Tracking Systems
Made with ❤️ for accessibility and innovation