
AI Virtual Mouse using Eye Gestures

A sophisticated eye-tracking system that allows users to control their computer mouse using eye movements and blink gestures. This project provides a production-ready solution for hands-free computer interaction.

🚀 Features

Core Functionality

  • Eye Tracking: Real-time iris and gaze-direction tracking using a webcam
  • Mouse Movement: Smooth cursor movement based on eye gaze direction
  • Blink Detection:
    • Single blink = Left click
    • Double blink = Right click
    • Long blink = Drag and drop
  • Scroll Control: Vertical scrolling when looking up or down
  • Anti-Jitter: Smoothing algorithms that prevent cursor jitter (see the sketch after this list)
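
The anti-jitter behaviour can be thought of as a dead zone combined with exponential smoothing. Below is a minimal, hypothetical sketch of that idea in Python, reusing the smoothing_factor and dead_zone names from config/settings.json; the project's actual logic lives in utils/smoothing.py and may differ:

# Hypothetical sketch of gaze smoothing; the real implementation is in utils/smoothing.py
class GazeSmoother:
    def __init__(self, smoothing_factor=0.7, dead_zone=0.1):
        self.smoothing_factor = smoothing_factor   # weight kept from the previous position
        self.dead_zone = dead_zone                 # ignore movements smaller than this
        self.prev_x, self.prev_y = 0.5, 0.5        # normalized gaze position (0..1)

    def update(self, raw_x, raw_y):
        # Suppress jitter: small movements inside the dead zone are ignored
        if abs(raw_x - self.prev_x) < self.dead_zone and abs(raw_y - self.prev_y) < self.dead_zone:
            return self.prev_x, self.prev_y
        # Exponential moving average pulls the cursor smoothly toward the new gaze point
        self.prev_x = self.smoothing_factor * self.prev_x + (1 - self.smoothing_factor) * raw_x
        self.prev_y = self.smoothing_factor * self.prev_y + (1 - self.smoothing_factor) * raw_y
        return self.prev_x, self.prev_y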

Advanced Features

  • Debug Mode: Visual calibration interface with landmark detection
  • Cross-Platform: Compatible with Linux, Windows, and macOS
  • Configurable: Customizable sensitivity, thresholds, and behavior
  • Production Ready: Optimized performance with minimal CPU usage
  • Accessibility: Designed for users with mobility limitations

📋 Requirements

System Requirements

  • Python 3.8 or higher
  • Webcam (USB or built-in)
  • Minimum 4GB RAM
  • Modern CPU (Intel i5/AMD Ryzen 5 or better recommended)

Dependencies

See requirements.txt for the complete list of Python packages.

🛠️ Installation

1. Clone the Repository

git clone https://github.com/yourusername/ai-virtual-mouse.git
cd ai-virtual-mouse

2. Create Virtual Environment

python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

3. Install Dependencies

pip install -r requirements.txt

4. Platform-Specific Setup

Linux

# Install additional system dependencies
sudo apt-get update
sudo apt-get install python3-opencv libgl1-mesa-glx

Windows

  • Ensure Visual Studio Build Tools are installed
  • Install MediaPipe dependencies through pip

macOS

# Install additional dependencies
brew install opencv

🎯 Usage

Production Mode

Run the main application:

python src/main.py

Debug/Calibration Mode

For calibration and testing:

python src/debug.py

Configuration

Edit config/settings.json to customize:

  • Sensitivity settings
  • Blink detection thresholds
  • Mouse movement smoothing
  • Scroll sensitivity

⚙️ Configuration

Settings File (config/settings.json)

{
  "eye_tracking": {
    "sensitivity": 1.0,
    "smoothing_factor": 0.7,
    "dead_zone": 0.1
  },
  "blink_detection": {
    "single_blink_threshold": 0.3,
    "double_blink_threshold": 0.5,
    "long_blink_threshold": 1.0
  },
  "mouse_control": {
    "movement_speed": 2.0,
    "scroll_sensitivity": 1.0,
    "click_delay": 0.1
  },
  "debug": {
    "show_landmarks": false,
    "show_blink_detection": false,
    "fps_display": true
  }
}
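
The settings are plain JSON, so they can be loaded with the standard library. A minimal, hypothetical sketch (the project's settings_manager.py may wrap this differently):

import json

# Load runtime settings from the configuration file shown above
with open("config/settings.json") as f:
    settings = json.load(f)

sensitivity = settings["eye_tracking"]["sensitivity"]
long_blink = settings["blink_detection"]["long_blink_threshold"]
print(f"sensitivity={sensitivity}, long blink threshold={long_blink}s")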

🎮 Controls

Gesture            Action
Eye Movement       Move mouse cursor
Single Blink       Left click
Double Blink       Right click
Long Blink         Drag and drop
Look Up            Scroll up
Look Down          Scroll down
Look Left/Right    Move cursor horizontally
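
Cursor movement and clicks are ultimately issued through PyAutoGUI (see Acknowledgments). A minimal, hypothetical sketch of how a normalized gaze position could be mapped to screen coordinates:

import pyautogui

def move_cursor(gaze_x, gaze_y):
    # Map a normalized gaze position (0..1) to absolute screen pixels
    screen_w, screen_h = pyautogui.size()
    pyautogui.moveTo(gaze_x * screen_w, gaze_y * screen_h, duration=0.05)

# Blink gestures translate into standard PyAutoGUI click calls
def left_click():
    pyautogui.click()

def right_click():
    pyautogui.click(button="right")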

๐Ÿ—๏ธ Project Structure

ai_virtual_mouse/
โ”œโ”€โ”€ src/
โ”‚   โ”œโ”€โ”€ core/
โ”‚   โ”‚   โ”œโ”€โ”€ eye_tracker.py      # Core eye tracking logic
โ”‚   โ”‚   โ”œโ”€โ”€ mouse_controller.py # Mouse control implementation
โ”‚   โ”‚   โ””โ”€โ”€ blink_detector.py   # Blink detection algorithms
โ”‚   โ”œโ”€โ”€ ui/
โ”‚   โ”‚   โ”œโ”€โ”€ debug_interface.py  # Debug/calibration UI
โ”‚   โ”‚   โ””โ”€โ”€ main_interface.py   # Main application UI
โ”‚   โ”œโ”€โ”€ calibration/
โ”‚   โ”‚   โ”œโ”€โ”€ calibrator.py       # Calibration system
โ”‚   โ”‚   โ””โ”€โ”€ settings_manager.py # Configuration management
โ”‚   โ”œโ”€โ”€ main.py                 # Production application
โ”‚   โ””โ”€โ”€ debug.py                # Debug/calibration application
โ”œโ”€โ”€ utils/
โ”‚   โ”œโ”€โ”€ image_processing.py     # Image processing utilities
โ”‚   โ”œโ”€โ”€ smoothing.py            # Smoothing algorithms
โ”‚   โ””โ”€โ”€ platform_utils.py      # Cross-platform utilities
โ”œโ”€โ”€ config/
โ”‚   โ””โ”€โ”€ settings.json           # Configuration file
โ”œโ”€โ”€ data/
โ”‚   โ””โ”€โ”€ calibration_data/       # Calibration data storage
โ”œโ”€โ”€ tests/
โ”‚   โ”œโ”€โ”€ test_eye_tracker.py
โ”‚   โ”œโ”€โ”€ test_blink_detector.py
โ”‚   โ””โ”€โ”€ test_mouse_controller.py
โ”œโ”€โ”€ docs/
โ”‚   โ”œโ”€โ”€ API.md
โ”‚   โ””โ”€โ”€ CALIBRATION.md
โ”œโ”€โ”€ requirements.txt
โ””โ”€โ”€ README.md

🔧 Development

Running Tests

python -m pytest tests/

Code Style

black src/ tests/
flake8 src/ tests/

🚀 Future Enhancements

Planned Features

  • Voice Commands: Integration with speech recognition
  • Multi-Monitor Support: Extended desktop navigation
  • Gesture Recognition: Hand gesture integration
  • Machine Learning: Personalized calibration using ML
  • Mobile Support: Android/iOS companion app
  • Web Interface: Browser-based control panel
  • Accessibility Features: Enhanced accessibility options
  • Performance Optimization: GPU acceleration support

Advanced Features

  • Multi-User Support: User profiles and switching
  • Cloud Calibration: Sync settings across devices
  • Analytics: Usage tracking and optimization
  • Plugin System: Extensible architecture
  • API Integration: Third-party application support

๐Ÿค Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

๐Ÿ™ Acknowledgments

  • MediaPipe for facial landmark detection
  • OpenCV for computer vision capabilities
  • PyAutoGUI for cross-platform mouse control
  • The open-source community for inspiration and support

📞 Support

🔬 Research

This project is based on research in:

  • Computer Vision
  • Human-Computer Interaction
  • Accessibility Technology
  • Eye Tracking Systems

Made with ❤️ for accessibility and innovation
