# AR Hand Tracking UI

This project is an Augmented Reality (AR) hand-tracking user interface demo. It uses Python, OpenCV, and MediaPipe to detect your hand via webcam and overlays futuristic UI graphics (radial gauges, HUD elements, and gesture-based controls) directly onto your hand in real time.
## Features

- Real-time hand tracking using MediaPipe
- AR-style radial and pinch UI overlays
- Gesture-based switching (open hand, pinch, fist)
- Futuristic HUD graphics: concentric circles, radial ticks, core pattern, numeric overlays
- All graphics generated programmatically
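Because all graphics are generated programmatically, HUD elements such as the radial ticks reduce to simple polar-to-Cartesian geometry. A minimal sketch (the helper name is illustrative, not taken from this repo):

```python
import math

def radial_tick_endpoints(cx, cy, r_inner, r_outer, n_ticks):
    """Compute (start, end) pixel coordinates for n_ticks marks
    spaced evenly around a circle centred at (cx, cy)."""
    ticks = []
    for i in range(n_ticks):
        angle = 2 * math.pi * i / n_ticks
        start = (cx + r_inner * math.cos(angle), cy + r_inner * math.sin(angle))
        end = (cx + r_outer * math.cos(angle), cy + r_outer * math.sin(angle))
        ticks.append((start, end))
    return ticks
```

Each returned pair can then be drawn onto the camera frame with `cv2.line`, with the circle centre anchored to a detected hand landmark.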
## Requirements

- Python 3.8+
- OpenCV
- MediaPipe
- NumPy
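A matching `requirements.txt` for the dependencies above might look like the following (left unpinned here; the exact versions that work together depend on your Python build):

```text
opencv-python
mediapipe
numpy
```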
## Installation

1. Clone this repository:

   ```bash
   git clone https://github.com/<your-username>/<your-repo>.git
   cd <your-repo>
   ```

2. Install the dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Run the project:

   ```bash
   python main.py
   ```
## Usage

- Allow webcam access when prompted.
- Move your hand in front of the camera to interact with the AR UI overlays.
- Try different gestures (open hand, pinch, fist) to see UI changes.
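Gesture switching of this kind is typically derived from the 21 hand landmarks MediaPipe returns per frame. A hedged sketch of how the three gestures could be distinguished (the threshold value and function name are assumptions, not taken from this project):

```python
import math

# MediaPipe hand landmark indices (coordinates are normalized to [0, 1])
THUMB_TIP, INDEX_TIP = 4, 8
FINGER_TIPS = (8, 12, 16, 20)   # index, middle, ring, pinky tips
FINGER_PIPS = (6, 10, 14, 18)   # corresponding PIP joints

def classify_gesture(landmarks, pinch_thresh=0.05):
    """landmarks: list of 21 (x, y) tuples in normalized image coords.
    Returns 'pinch', 'fist', or 'open'."""
    tx, ty = landmarks[THUMB_TIP]
    ix, iy = landmarks[INDEX_TIP]
    # Pinch: thumb tip and index tip are close together.
    if math.hypot(tx - ix, ty - iy) < pinch_thresh:
        return "pinch"
    # Fist: every fingertip lies below (larger y than) its PIP joint,
    # i.e. the fingers are curled toward the palm.
    if all(landmarks[t][1] > landmarks[p][1]
           for t, p in zip(FINGER_TIPS, FINGER_PIPS)):
        return "fist"
    return "open"
```

In the live demo, the result of a classifier like this would drive which HUD overlay is rendered on the next frame.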
## Author

Made by Tuba Khan

## License

This project is licensed under the MIT License.