PCSM combines Finite Impulse Response (FIR) modeling of BOLD activity with a Gaussian Mixture Model-Hidden Markov Model (GMM-HMM) to quantify dynamic, spatially distributed brain states. From the resulting state posteriors, PCSM derives interpretable, emergent properties including serial-parallel processing, cognitive demand, resource level, and serial bottleneck.
This repository holds the code with links to the simulated data and derivatives for evaluating PCSM.
| Open Science | Presentations | Data/Code | Examples |
|---|---|---|---|
| Preregistration | University of Zurich | Data | Human Data Example |
| BioRxiv Preprint | American College of Neuropsychopharmacology | GitHub | |
| OSF Repository | | Initial Package | |
Depicting the PCSM pipeline. (A) Modeling BOLD from task-based fMRI using a Finite Impulse Response (FIR) model. (B) FIR-derived timeseries serve as inputs to the GMM-HMM with PCSM alignment; the emergent properties from these dynamic outputs are decoded in the following steps. (C) The number of responding nodes at each timepoint is decoded to estimate parallel and serial processing periods, which are then projected to brain space. (D) Cognitive demand and resource levels are computed from these dynamics. (E) From these metrics, PCSM derives a scalar index of serial bottleneck severity.
This image outlines the PCSM computational workflow; these steps are described in more detail in the preprint and the upcoming peer-reviewed publication.
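As a rough illustration of steps (B) and (C), the sketch below fits a two-state GMM-HMM to each node's FIR-derived timeseries with hmmlearn and counts how many nodes are in a "responding" state at each timepoint. The dimensions, number of states and mixtures, state-labeling rule, and threshold are illustrative assumptions, not the exact PCSM settings.
# Python------------------------------------------------
# Minimal sketch of steps (B)-(C): per-node GMM-HMMs on FIR-derived
# timeseries, then a count of "responding" nodes at each timepoint.
import numpy as np
from hmmlearn.hmm import GMMHMM

rng = np.random.default_rng(0)
n_timepoints, n_nodes = 132, 20                        # placeholder sizes (nodes reduced from 200 for speed)
fir_ts = rng.standard_normal((n_timepoints, n_nodes))  # stand-in for FIR-derived node timeseries

p_responding = np.zeros((n_timepoints, n_nodes))       # posterior of each node's "responding" state
for node in range(n_nodes):
    X = fir_ts[:, node].reshape(-1, 1)                 # hmmlearn expects (n_samples, n_features)
    model = GMMHMM(n_components=2, n_mix=2, covariance_type="diag",
                   n_iter=100, random_state=0)
    model.fit(X)
    post = model.predict_proba(X)                      # (n_timepoints, n_states) state posteriors
    # Treat the state with the larger mean amplitude as "responding"
    responding_state = int(np.argmax(model.means_.mean(axis=1).ravel()))
    p_responding[:, node] = post[:, responding_state]

# Step (C): number of responding nodes per timepoint (0.5 threshold is illustrative)
n_responding = (p_responding > 0.5).sum(axis=1)
print(n_responding[:10])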
Examples of using PCSM with human data can be found in this repository here or under code/30_notebooks/PCSM_example_human_data_reduced.ipynb
The Open Science Framework (OSF) repository for PCSM holding the simulations can be found here.
The human data used to inform the simulations can be found on OpenNeuro here. The derivatives for this dataset can be imported via Nilearn's 'fetch_openneuro_dataset' function, with an example of how to do this in a tutorial here.
The simulated data and derivatives - comprising 15,000 simulations with 200-node timeseries each over ~132 trials at a TR of 2 s - exceed OSF's file-size limits. I therefore compressed the files and split them into parts for storage. Reassembling them requires concatenation and unpacking, which can be done with the code below.
# Bash------------------------------------------------
# change directory to where the files were downloaded
cd ~/<download directory>
# concatenating
cat simulated_data.part-* > simulated_data.tar.gz
# decompressing
tar -xzf simulated_data.tar.gz
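If you prefer to stay in Python, the same reassembly can be done with the standard library; the download path below is a placeholder you should adjust to wherever the parts were saved.
# Python------------------------------------------------
# Standard-library alternative to the Bash steps above.
import glob
import shutil
import tarfile
from pathlib import Path

download_dir = Path.home() / "Downloads"               # placeholder download location

# Concatenate the split parts back into a single archive
parts = sorted(glob.glob(str(download_dir / "simulated_data.part-*")))
archive = download_dir / "simulated_data.tar.gz"
with open(archive, "wb") as out:
    for part in parts:
        with open(part, "rb") as src:
            shutil.copyfileobj(src, out)

# Decompress and unpack the archive
with tarfile.open(archive, "r:gz") as tar:
    tar.extractall(path=download_dir)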
The functions used by the scripts folder are in the 'code/10_functions/projlib' folder. This folder can be downloaded as a package and imported into your coding environment for immediate use.
While the manuscript is under review, I anticipate some changes - upon acceptance, I will place these functions in a user-friendly format and upload the package to PyPI for easy installation via pip.
For now, PCSM can be tested and used by cloning the repository and installing it into your session using the following code example:
# Bash-----------------------------------------
git clone https://github.com/drewwint/pcsm
cd pcsm
pip install .
# Python---------------------------------------
import projlib
from projlib import metric_calculation as mc
# from projlib import <etc..>
You can then import modules and functions to run PCSM yourself. Until the PCSM package is finalized and on PyPI, you will need to install and import the dependencies outlined in the .txt file in the /code/env folder, including 'hmmlearn' and 'nilearn'. For the final package, the required portions of these packages will be forked so that PCSM is self-contained and robust to future changes in external dependencies.
