
DeepMReye Validation

A BrainHack Vanderbilt 2025 validation project for DeepMReye, a neuroimaging tool that uses deep learning to estimate gaze position directly from functional MRI (fMRI) data. Our primary aim is to compare DeepMReye’s gaze predictions with traditional eye-tracking measurements to assess their accuracy and reliability.


Table of Contents

  1. Onboarding Documentation
  2. Prerequisites
  3. Installation
  4. Project Structure
  5. Data Access
  6. Contributing
  7. License

Onboarding Documentation

Below are some essential resources, additional documentation, and links to help you get started:


Prerequisites

Ensure you have the following installed or available:

  • Python 3.8+
  • Jupyter Notebook (for running the project notebooks)
  • Git (for version control)
  • Anaconda or Miniconda (optional, but recommended for managing Python environments)

Installation

Follow these steps to set up the project environment:

# 1. Clone the DeepMReye Validation repository
git clone https://github.com/Zoha-Arif/DeepMReye_Validation.git
cd DeepMReye_Validation

# 2. Clone the main DeepMReye repository (if needed)
git clone https://github.com/DeepMReye/DeepMReye.git

# 3. Open the cloned repository in a Python editor of your choice (we use Jupyter Notebooks)
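
After cloning, DeepMReye itself must be importable from Python before the notebooks will run. The minimal sanity check below assumes the package has been installed into your active environment (e.g. pip install deepmreye, or pip install ./DeepMReye from the clone); it only verifies that the import succeeds.

# Quick check that the DeepMReye package is importable.
# Assumes DeepMReye has been installed into the active environment
# (e.g. `pip install deepmreye` or `pip install ./DeepMReye`).
import deepmreye

print("DeepMReye imported from:", deepmreye.__file__)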

Project Structure

DeepMReye_Validation/
├── notebooks/        # Jupyter notebooks for analysis, demos, tests
├── scripts/          # Standalone scripts for running analyses and pre-processing
├── DeepMReye/        # Cloned DeepMReye repository
├── requirements.txt  # Python dependencies
└── README.md         # This readme

Data Access

This project uses both example data from DeepMReye and project-specific datasets to compare fMRI-derived gaze with traditional eye-tracking.

Data Sources

  1. DeepMReye Sample Data

    • OSF Link
    • An example dataset you can use to verify that DeepMReye is set up correctly.
  2. LAND Lab Data (Vanderbilt University)

    • OSF Link
    • Contains both fMRI and corresponding in-bore eye-tracking data for direct comparison.
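
To quantify accuracy and reliability, DeepMReye’s predicted gaze trace can be compared against the in-bore eye-tracking recordings. The sketch below is a minimal, hypothetical example: the file names, array layout, and the choice of Pearson correlation plus mean Euclidean error are assumptions for illustration, not part of the DeepMReye API, and should be adapted to your own outputs.

# Minimal sketch: compare DeepMReye gaze predictions with eye-tracking data.
# File names are hypothetical placeholders; both arrays are assumed to hold
# (x, y) gaze positions resampled onto a common time base, shape (n_samples, 2).
import numpy as np
from scipy.stats import pearsonr

predicted = np.load("gaze_predicted_deepmreye.npy")  # DeepMReye output
measured = np.load("gaze_eyetracker.npy")            # in-bore eye tracking

# Per-axis agreement between predicted and measured gaze.
r_x, _ = pearsonr(predicted[:, 0], measured[:, 0])
r_y, _ = pearsonr(predicted[:, 1], measured[:, 1])

# Overall positional error: mean Euclidean distance between the two traces.
mean_error = np.linalg.norm(predicted - measured, axis=1).mean()

print(f"Pearson r (x): {r_x:.3f}, (y): {r_y:.3f}")
print(f"Mean Euclidean error: {mean_error:.3f}")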

Contributing

We welcome contributions from the community! Feel free to contact us if you have questions, suggestions, or want to get involved.

License

Please contact Dr. Sophia Vinci-Booher ([email protected]) if you plan to reuse or distribute the LAND Lab dataset used in this project.
