CVFit is an advanced fitness tracking application that uses computer vision and pose detection to track, analyze, and improve your workout experience. Unlike traditional fitness trackers that require wearable sensors, CVFit leverages your computer's camera to monitor your movements, providing real-time metrics, performance analysis, and workout recommendations.
The app focuses on tracking hand/arm movements to estimate running metrics, making it ideal for indoor workouts where GPS tracking isn't available. Using deep learning-based pose estimation, CVFit delivers a comprehensive fitness experience with minimal setup requirements.
CVFit captures and displays key running metrics in real-time:
- Speed: Estimated from arm movement patterns and cadence derived from pose keypoints
- Distance: Accumulated based on speed over time
- Step Count: Detected from rhythmic arm swing patterns during running
- Duration: Precise workout time tracking
- Calories: Energy expenditure estimation based on speed and duration
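For illustration, a minimal sketch of how these per-session metrics could be accumulated frame by frame is shown below; the class and field names are assumptions for the example, not CVFit's actual code.

# Illustrative per-frame accumulator for the metrics above; names are assumed, not CVFit's code.
class WorkoutMetrics:
    def __init__(self) -> None:
        self.distance_m = 0.0
        self.duration_s = 0.0
        self.steps = 0

    def update(self, speed_mps: float, dt_s: float, new_steps: int = 0) -> None:
        """Fold one frame's estimates into the running totals."""
        self.duration_s += dt_s               # duration: elapsed workout time
        self.distance_m += speed_mps * dt_s   # distance: speed accumulated over time
        self.steps += new_steps               # steps detected from arm swings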
The system evaluates several performance dimensions:
- Stability Score: Measures the consistency of vertical arm movements
- Form Analysis: Evaluates running form by analyzing arm symmetry
- Efficiency Measurement: Combines speed and cadence data for overall movement efficiency
- Consistency Rating: Tracks variability in pace throughout the session
Additional features include:
- Hand-focused tracking for greater privacy and performance
- Real-time visual feedback with arm/hand keypoint visualization
- Efficient processing optimized for standard webcams
- Session summaries with key performance metrics
- Historical data tracking across multiple sessions
- Visual performance trends through time-series graphs
CVFit is distributed for all major desktop platforms:
- Windows: Native executable with installer
- macOS: Native .app bundle with DMG installer
- Linux: AppImage and tarball distribution
- Automatic releases: Built and distributed through GitHub Releases
CVFit employs YOLOv8 for efficient pose estimation, specifically focusing on hand/arm tracking:
- Model: YOLOv8n-pose, a lightweight model optimized for real-time inference
- Keypoints: Tracks shoulders (5,6), elbows (7,8), and wrists (9,10)
- Confidence Threshold: Only uses keypoints with confidence > 0.5 for reliable metrics
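As a reference point, the sketch below shows one way to pull those arm keypoints from an Ultralytics YOLOv8-pose result and discard low-confidence points. It is a minimal example of the general approach, not the code in pose_engine.py; the webcam index and the output handling are assumptions.

# Sketch: arm keypoint extraction with YOLOv8n-pose (illustrative, not CVFit's pose_engine.py).
import cv2
from ultralytics import YOLO

ARM_KEYPOINTS = [5, 6, 7, 8, 9, 10]   # shoulders, elbows, wrists (COCO keypoint order)
CONF_THRESHOLD = 0.5

model = YOLO("yolov8n-pose.pt")
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    result = model(frame, verbose=False)[0]
    kpts = result.keypoints
    if kpts is not None and kpts.data.shape[0] > 0:
        person = kpts.data[0]   # (17, 3) tensor: x, y, confidence per keypoint
        arm_points = {
            idx: (float(person[idx, 0]), float(person[idx, 1]))
            for idx in ARM_KEYPOINTS
            if float(person[idx, 2]) > CONF_THRESHOLD   # keep only reliable keypoints
        }
        print(arm_points)
cap.release()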
Speed is derived from arm movement patterns and cadence:
speed = arm_movement_speed * 0.7 + (cadence / 160.0) * 2.0
Where:
- `arm_movement_speed` is calculated from the displacement of wrist keypoints
- The pixel-to-meter ratio is calibrated based on the frame height
- The cadence factor adjusts based on steps per minute
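A compact sketch of this calculation is below; the pixel-to-meter calibration and the assumed person height are illustrative stand-ins, while the 0.7, 160.0, and 2.0 constants come from the formula above.

# Sketch of the speed estimate; the calibration constants here are illustrative.
def estimate_speed(wrist_displacement_px: float, dt_s: float,
                   cadence_spm: float, frame_height_px: int) -> float:
    assumed_height_m = 1.7                               # illustrative calibration anchor
    pixel_to_meter = assumed_height_m / frame_height_px  # rough frame-height-based ratio
    arm_movement_speed = (wrist_displacement_px * pixel_to_meter) / max(dt_s, 1e-6)
    return arm_movement_speed * 0.7 + (cadence_spm / 160.0) * 2.0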
Steps are counted by identifying the characteristic arm swing patterns during running:
# Simplified representation of the core algorithm
if (((y_vals[1] > y_vals[0] and y_vals[2] > y_vals[1]) or
     (y_vals[1] < y_vals[0] and y_vals[2] < y_vals[1]))
        and abs(y_vals[2] - y_vals[0]) > movement_threshold):
    step_count += 1  # Step detected
The algorithm looks for both "valleys" and "peaks" in wrist vertical position with appropriate cooldown periods to avoid multiple detections of the same step.
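The sketch below puts that logic, including a cooldown, into a runnable form; the movement threshold and the 0.3 s cooldown are illustrative values rather than CVFit's exact tuning.

# Runnable sketch of step detection with a cooldown; threshold and cooldown are illustrative.
import time

class StepCounter:
    def __init__(self, movement_threshold: float = 8.0, cooldown_s: float = 0.3):
        self.movement_threshold = movement_threshold   # pixels of wrist travel (illustrative)
        self.cooldown_s = cooldown_s                   # minimum gap between steps (illustrative)
        self.y_vals = []                               # last three wrist y positions
        self.last_step_time = 0.0
        self.steps = 0

    def update(self, wrist_y: float) -> bool:
        """Feed one wrist y sample; returns True when a new step is detected."""
        self.y_vals.append(wrist_y)
        if len(self.y_vals) < 3:
            return False
        self.y_vals = self.y_vals[-3:]
        y = self.y_vals
        swinging = ((y[1] > y[0] and y[2] > y[1]) or   # wrist y moving consistently down
                    (y[1] < y[0] and y[2] < y[1]))     # or consistently up
        now = time.time()
        if (swinging and abs(y[2] - y[0]) > self.movement_threshold
                and now - self.last_step_time > self.cooldown_s):
            self.last_step_time = now
            self.steps += 1
            return True
        return False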
Calories are estimated using the MET (Metabolic Equivalent of Task) method:
calories = (MET × 3.5 × weight_kg) / (200 × 60) × seconds
Where MET values vary by speed:
- Walking (< 1.5 m/s): 2.5 METs
- Jogging (1.5-2.5 m/s): 7.0 METs
- Running (2.5-4.0 m/s): 10.0 METs
- Fast running (> 4.0 m/s): 12.5 METs
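Combining the formula and the MET table gives a straightforward estimator; the default body weight here is an assumed example value.

# Sketch of the MET-based calorie estimate using the table above.
def estimate_calories(speed_mps: float, seconds: float, weight_kg: float = 70.0) -> float:
    if speed_mps < 1.5:
        met = 2.5       # walking
    elif speed_mps < 2.5:
        met = 7.0       # jogging
    elif speed_mps < 4.0:
        met = 10.0      # running
    else:
        met = 12.5      # fast running
    return (met * 3.5 * weight_kg) / (200 * 60) * seconds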
stability = base_value + left_arm_stability + right_arm_stability
Where each arm's stability is calculated by analyzing variance in vertical position.
Form analysis evaluates symmetry between the left and right arm movements, with better scores when the arms move in opposite directions (a negative correlation coefficient).
Efficiency combines speed and cadence factors, with optimal scores at 160-180 steps per minute coupled with good speed.
Consistency is calculated from the coefficient of variation (CV) of recent speeds:
consistency = 50 + 45 * (1 - CV)
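A compact sketch of these scores is shown below. The consistency formula matches the one above; the stability base value and scaling and the form-score mapping from the correlation coefficient are illustrative choices, not CVFit's exact constants.

# Sketch of the performance scores; constants other than the consistency formula are illustrative.
import numpy as np

def stability_score(left_wrist_y: list, right_wrist_y: list, base_value: float = 50.0) -> float:
    # Lower variance in vertical wrist position -> higher per-arm stability.
    def arm_stability(ys):
        return 25.0 / (1.0 + np.var(ys))   # illustrative scaling
    return base_value + arm_stability(left_wrist_y) + arm_stability(right_wrist_y)

def form_score(left_wrist_y: list, right_wrist_y: list) -> float:
    # Arms swinging in opposition give a negative correlation -> better form score.
    r = np.corrcoef(left_wrist_y, right_wrist_y)[0, 1]
    return 50.0 * (1.0 - r)                # r = -1 maps to 100, r = +1 maps to 0

def consistency_score(recent_speeds: list) -> float:
    mean = np.mean(recent_speeds)
    cv = np.std(recent_speeds) / mean if mean > 0 else 1.0
    return 50 + 45 * (1 - cv)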
Download the latest version for your operating system from our GitHub Releases page:
- Windows: Download `CVFit-Windows.zip` (or the installer if available)
- macOS: Download `CVFit-macOS.zip` (contains CVFit.app)
- Linux: Download `CVFit-Linux.tar.gz` (or the AppImage if available)
- Windows: Extract the ZIP or run the installer. You may need to approve security warnings as the app isn't signed.
- macOS: Extract the ZIP, move CVFit.app to your Applications folder. Right-click → Open for first run to bypass Gatekeeper.
- Linux: Extract the tarball, or make the AppImage executable with `chmod +x CVFit-*.AppImage`
Requirements:
- Python 3.8 or higher
- Webcam or built-in camera
- 4GB RAM minimum (8GB recommended for smoother experience)
For a hassle-free setup, we provide scripts that handle all installation steps automatically.

On macOS or Linux:
# Navigate to the CVFit directory
cd CVFit
# Make the script executable (if needed)
chmod +x setup.sh
# Run the setup script
./setup.sh
On Windows (PowerShell):
# Navigate to the CVFit directory
cd CVFit
# You may need to allow script execution first
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope CurrentUser
# Run the setup script
.\setup.ps1
These scripts will:
- Check if Python 3.8+ is installed
- Create a virtual environment
- Install all required dependencies
- Download the YOLOv8 pose detection model
- Launch the application automatically
If you prefer to install manually, follow these steps:
- Clone the repository:
git clone https://github.com/S1D007/CVFit
cd CVFit
- Create and activate a virtual environment:
python -m venv env
source env/bin/activate # On Windows: env\Scripts\activate
- Install required packages:
pip install -r requirements.txt
- Download the pose detection model:
python download_model.py
If you didn't use the setup scripts, start the application with:
python cvfit.py
You can build standalone executables for your platform:
chmod +x build_macos.sh
./build_macos.sh
This creates `CVFit.app` in the `dist` folder and optionally a DMG installer.
.\build_windows.ps1
This creates executables in the `dist\CVFit` folder and optionally a `setup.exe` installer.
chmod +x build_linux.sh
./build_linux.sh
This creates executables in the `dist/CVFit` folder and optionally an AppImage.
- Launch the application using one of the methods described above
- Select your camera source and preferred resolution
- Click "Start Tracking" to begin your workout session
- Run in place or on a treadmill while facing the camera
- Monitor your metrics on the dashboard in real-time
- Click "Stop Tracking" when finished to save your session
CVFit follows a modular architecture:
- Core: Fundamental components for pose detection and motion analysis
  - `pose_engine.py`: YOLOv8-based pose detection
  - `activity_tracker.py`: Converts pose data to fitness metrics
  - `motion_analyzer.py`: Analyzes movement patterns
- GUI: User interface components
  - `app.py`: Main application window and UI logic
- Services: Backend services for data handling
  - `analytics_service.py`: Session data storage and analysis
  - `recommendation_service.py`: Workout recommendations based on performance
  - `pose_service.py`: WebSocket-based pose data processing
- Utils: Helper utilities
  - `pose_utils.py`: Mathematical utilities for pose processing
  - `video_capture.py`: Thread-safe video capture
- Build System: Cross-platform executable generation
  - `build_macos.sh`: macOS app bundle and DMG creation
  - `build_windows.ps1`: Windows executable and installer creation
  - `build_linux.sh`: Linux binary and AppImage creation
  - GitHub Actions workflows for automated releases
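To illustrate how these modules relate at runtime, the sketch below wires stand-in classes together in the per-frame pipeline. Every class and method name here is a hypothetical placeholder for the files listed above, not CVFit's actual interface.

# Hypothetical sketch of the per-frame data flow between the modules above.
# All class and method names are illustrative stand-ins, not CVFit's real API.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class FrameMetrics:
    speed_mps: float = 0.0
    steps: int = 0
    calories: float = 0.0

class PoseEngineStub:
    """Stand-in for pose_engine.py: frame -> arm/hand keypoints."""
    def detect(self, frame) -> Dict[int, Tuple[float, float]]:
        return {9: (0.0, 0.0), 10: (0.0, 0.0)}   # dummy wrist keypoints

class ActivityTrackerStub:
    """Stand-in for activity_tracker.py: keypoints -> fitness metrics."""
    def update(self, keypoints: Dict[int, Tuple[float, float]]) -> FrameMetrics:
        return FrameMetrics()

class AnalyticsServiceStub:
    """Stand-in for analytics_service.py: stores metrics for the session history."""
    def __init__(self) -> None:
        self.session: List[FrameMetrics] = []
    def record(self, metrics: FrameMetrics) -> None:
        self.session.append(metrics)

# Per-frame pipeline: capture -> pose detection -> metric tracking -> analytics.
engine, tracker, analytics = PoseEngineStub(), ActivityTrackerStub(), AnalyticsServiceStub()
frame = None   # a real frame would come from video_capture.py
analytics.record(tracker.update(engine.detect(frame)))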
- Camera Angle: Requires a clear front view for accurate tracking
- Lighting Conditions: Best performance under good lighting
- Clothing: Loose or baggy clothing may affect pose detection accuracy
- Calibration: Speed calculations are approximate and may require individual calibration
- Privacy: Although tracking focuses on the hands only, the app still requires camera access
- Processing Power: Frame rates and responsiveness may drop on low-end systems
CVFit uses GitHub Actions to automatically build executables for all platforms when a new release is created:
- Tag your release:
git tag v1.0.0 && git push --tags
- Create a release on GitHub using the tag
- GitHub Actions will automatically build Windows, macOS, and Linux executables
- The executables will be attached to the release for easy download
- Fork the repository
- Create a feature branch:
git checkout -b feature/my-feature
- Make your changes
- Run tests:
pytest
- Submit a pull request
- User Profiles: Personalized tracking with user-specific parameters
- Calibration Wizard: Guide users through customized calibration for more accurate metrics
- Export Functionality: Allow exporting workout data to common fitness platforms
- Voice Feedback: Audio cues and coaching during workouts
- Additional Workout Types: Support for strength training, yoga, and HIIT exercises
- Full-Body Analysis: Optional full-body tracking for comprehensive form analysis
- Multi-Person Support: Track multiple users simultaneously
- Mobile App Version: Port to mobile platforms for greater accessibility
- AI Coaching: Personalized workout recommendations and form corrections
- VR/AR Integration: Immersive workout experience with virtual environments
- YOLOv8 team for the pose estimation model
- OpenCV community for computer vision tools
- All contributors and testers who have helped improve CVFit