This is a validation project for DeepMReye, a neuroimaging tool that uses deep learning to estimate gaze position directly from functional MRI (fMRI) data. Our primary aim is to compare DeepMReye's gaze predictions against traditional eye-tracking measurements in order to assess their accuracy and reliability.
- Onboarding Documentation
- Prerequisites
- Installation
- Data Access
- Project Structure
- Contributing
- License
Below are some essential resources and references to help you get started:
- fMRI Basics: Andy's Brain Book: Introduction to fMRI
- Introduction to Neural Networks and Deep Learning: 3Blue1Brown YouTube Series
- Get an OSF account to access the datasets: OSF website
Additional Documentation and Links:
- DeepMReye Repository
- DeepMReye Example Usage Notebook
- BIDS-MReye (Alternative Repo)
- DeepMReye Nature Neuroscience Paper
Ensure you have the following installed or available:
- Python 3.8+
- Jupyter Notebook: Download & Installation
- Git (for version control)
- (Optional) Anaconda/Miniconda (recommended for managing Python environments)
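Before moving on, you can quickly confirm these prerequisites from Python. The snippet below is a minimal sketch based on the list above; the module and tool names it checks are assumptions, so adjust them to your setup:

```python
# Quick environment sanity check (a minimal sketch based on the prerequisites
# above; the module/tool names checked here are assumptions, not a requirements file).
import importlib.util
import shutil
import sys

# The workflows in this project assume Python 3.8 or newer.
assert sys.version_info >= (3, 8), f"Python 3.8+ required, found {sys.version.split()[0]}"

# Git is a command-line tool, so look for it on PATH.
assert shutil.which("git"), "Git not found on PATH"

# Jupyter should be importable once installed (e.g. via pip or conda).
if importlib.util.find_spec("notebook") is None:
    print("Jupyter Notebook does not appear to be installed in this environment.")

# Conda is optional; just report whether it is available.
print("Conda available:", bool(shutil.which("conda")))
```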
Follow these steps to set up the project environment:
# 1. Clone the DeepMReye Validation repository
git clone https://github.com/Zoha-Arif/DeepMReye_Validation.git
cd DeepMReye_Validation
# 2. Clone the main DeepMReye repository (if needed)
git clone https://github.com/DeepMReye/DeepMReye.git
# 3. Open the cloned repository in a Python editor of your choice (we use Jupyter Notebooks)
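Once the repositories are cloned and the Python dependencies are installed (e.g. via `pip install -r requirements.txt`, plus DeepMReye itself if it is not already in your environment), a short sanity check along the lines below can confirm that everything is in place. The paths and install commands mentioned here are assumptions rather than a prescribed setup step:

```python
# Verify the clones are where the project structure expects them and that
# DeepMReye imports. A minimal sketch; it assumes you run it from the
# DeepMReye_Validation repository root.
from pathlib import Path

for item in ("DeepMReye", "notebooks", "scripts", "requirements.txt"):
    print(f"{item}: {'found' if Path(item).exists() else 'MISSING'}")

# If DeepMReye is installed in this environment (e.g. `pip install deepmreye`
# or `pip install -e ./DeepMReye`), this import should succeed.
import deepmreye
print("DeepMReye imported from:", Path(deepmreye.__file__).parent)
```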
DeepMReye_Validation/
├── notebooks/ # Jupyter notebooks for analysis, demos, tests
├── scripts/ # Standalone scripts for running analyses and pre-processing
├── DeepMReye/ # Cloned DeepMReye repository
├── requirements.txt # Python dependencies
└── README.md # This readme
This project uses both example data from DeepMReye and project-specific datasets to compare fMRI-derived gaze with traditional eye-tracking.
- DeepMReye Sample Data
  - OSF Link
  - Example dataset you can use to test if DeepMReye is set up correctly.
- LAND Lab Data (Vanderbilt University)
  - OSF Link
  - Contains both fMRI and corresponding in-bore eye-tracking data for direct comparison (see the sketch after this list).
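Once gaze estimates from DeepMReye and the in-bore eye-tracker are loaded as time-locked arrays, the accuracy comparison can be sketched roughly as below. The file names, units, and the assumption that both signals are already aligned to TR resolution are illustrative, not the project's fixed pipeline:

```python
# Rough sketch of the fMRI-vs-eye-tracker comparison. Illustrative assumptions:
# both gaze traces are already time-locked (one sample per TR) and expressed in
# the same units (e.g. degrees of visual angle); the file names are hypothetical.
import numpy as np
from scipy.stats import pearsonr

predicted = np.load("deepmreye_gaze.npy")   # DeepMReye estimates, shape (n_TRs, 2): x, y
measured = np.load("eyetracker_gaze.npy")   # eye-tracker gaze averaged per TR, same shape

# Axis-wise agreement between the two gaze sources.
r_x, _ = pearsonr(predicted[:, 0], measured[:, 0])
r_y, _ = pearsonr(predicted[:, 1], measured[:, 1])

# Spatial error per TR (Euclidean distance between predicted and measured gaze).
errors = np.linalg.norm(predicted - measured, axis=1)

print(f"Pearson r: x = {r_x:.2f}, y = {r_y:.2f}")
print(f"Median gaze error: {np.median(errors):.2f} (same units as the inputs)")
```

Axis-wise correlation and per-TR Euclidean error are common summary metrics for this kind of validation; the metrics actually used in the project notebooks may differ.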
We welcome contributions from the community! Feel free to contact us if you have questions or suggestions, or if you would like to get involved.
Please contact Dr. Sophia Vinci-Booher ([email protected]) if you plan to reuse or distribute the LAND Lab dataset used in this project.