VBI: Virtual Brain Inference


Getting Started | Documentation

Installation

Quick Start

# Create conda environment (recommended)
conda create --name vbi python=3.10
conda activate vbi


# Install VBI
export SKIP_CPP=1                    # Optional: skip C++ compilation
pip install vbi                      # Light version (CPU only)
pip install "vbi[inference]"         # With parameter inference (quotes guard against shell globbing)
pip install "vbi[inference-gpu]"     # Full functionality with GPU support

Installation Options

  • pip install vbi - Light version with CPU simulation only
  • pip install "vbi[inference]" - Adds parameter inference capabilities
  • pip install "vbi[inference-gpu]" - Full functionality with GPU acceleration
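If you are unsure which variant is active in your current environment, you can probe for the optional dependencies each extra pulls in. This sketch assumes the extras map to the packages shown in the dependency check (sbi and torch for inference, cupy for GPU support); it only tests importability, not versions:

```python
from importlib.util import find_spec

def check_extras():
    """Report which optional VBI dependencies are importable.

    Assumption: the [inference] extra provides sbi/torch and the
    [inference-gpu] extra additionally provides cupy.
    """
    return {name: find_spec(name) is not None
            for name in ("sbi", "torch", "cupy")}

print(check_extras())
```

If all three come back False, you have the light install; sbi and torch without cupy suggests the inference variant.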

Using Docker

# Quick start with pre-built image
docker run --rm -it -p 8888:8888 ghcr.io/ins-amu/vbi:main

# With GPU support
docker run --gpus all --rm -it -p 8888:8888 ghcr.io/ins-amu/vbi:main

Other Installation Methods

For detailed installation instructions including:

  • Installing from source
  • Windows-specific installation
  • Building Docker locally
  • Platform-specific guides (Google Colab, EBRAINS)
  • Troubleshooting

See our comprehensive Installation Guide in the documentation.

Quick Verification

import vbi
vbi.tests()
vbi.test_imports()

Example output:

              Dependency Check              
                                         
  Package      Version       Status        
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 
  vbi          v0.2.1        ✅ Available  
  numpy        1.24.4        ✅ Available  
  scipy        1.10.1        ✅ Available  
  matplotlib   3.7.5         ✅ Available  
  sbi          0.22.0        ✅ Available  
  torch        2.4.1+cu121   ✅ Available  
  cupy         12.3.0        ✅ Available  
                                          
  Torch GPU available: True
  Torch device count: 1
  Torch CUDA version: 12.1
  CuPy GPU available: True
  CuPy device count: 1
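The GPU lines at the bottom of that report can also be reproduced directly with torch and cupy, which is handy when vbi itself fails to import. A minimal sketch, guarding each import since the light install ships without either package (None below means "not installed"):

```python
def gpu_summary():
    """Summarize GPU visibility for torch and cupy, tolerating absence."""
    info = {}
    try:
        import torch
        info["torch_gpu"] = torch.cuda.is_available()
        info["torch_devices"] = torch.cuda.device_count()
    except ImportError:
        info["torch_gpu"] = None
        info["torch_devices"] = 0
    try:
        import cupy
        try:
            info["cupy_devices"] = cupy.cuda.runtime.getDeviceCount()
        except Exception:  # no CUDA driver or no visible device
            info["cupy_devices"] = 0
        info["cupy_gpu"] = info["cupy_devices"] > 0
    except ImportError:
        info["cupy_gpu"] = None
        info["cupy_devices"] = 0
    return info

print(gpu_summary())
```

On the machine that produced the report above, this would show one device for each backend; a CPU-only install reports None for both.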

Getting Started

Interactive Examples

Try these examples directly in your browser using Google Colab:

Load simulated data

  Model                      Colab Link
  Jansen-Rit SDE             Open In Colab
  Wilson-Cowan SDE           Open In Colab
  Wong-Wang SDE              Open In Colab
  VEP SDE (84 Regions)       Open In Colab
  VEP SDE (few parameters)   Open In Colab
  MPR SDE                    Open In Colab
  Damped Oscillator          Open In Colab

Feedback and Contributions

We welcome contributions to the VBI project! If you have suggestions, bug reports, or feature requests, please open an issue on our GitHub repository. To contribute code, fork the repository, create a new branch for your feature or bugfix, and submit a pull request. Make sure to follow our coding standards and include tests for your changes. For detailed guidelines, please refer to our CONTRIBUTING.md file. Thank you for helping us improve VBI!

GDPR Compliance

VBI itself does not collect, store, or transmit personal data.
Users may provide their own datasets locally; in such cases, the responsibility for GDPR compliance lies with the user and their execution environment.

For details, see GDPR_COMPLIANCE.md.

Citation

@article{VBI,
  title = {Virtual Brain Inference (VBI): A flexible and integrative toolkit for efficient probabilistic inference on virtual brain models},
  author = {Ziaeemehr, Abolfazl and Woodman, Marmaduke and Domide, Lia and Petkoski, Spase and Jirsa, Viktor and Hashemi, Meysam},
  doi = {10.7554/elife.106194.2},
  url = {http://dx.doi.org/10.7554/eLife.106194.2},
  publisher = {eLife Sciences Publications, Ltd},
  year = {2025},
  abstract = {Network neuroscience has proven essential for understanding the principles and mechanisms
    underlying complex brain (dys)function and cognition. In this context, whole-brain network modeling,
    also known as virtual brain modeling, combines computational models of brain dynamics (placed at each network node)
    with individual brain imaging data (to coordinate and connect the nodes), advancing our understanding of
    the complex dynamics of the brain and its neurobiological underpinnings. However, there remains a critical
    need for automated model inversion tools to estimate control (bifurcation) parameters at large scales
    associated with neuroimaging modalities, given their varying spatio-temporal resolutions.
    This study aims to address this gap by introducing a flexible and integrative toolkit for efficient Bayesian inference
    on virtual brain models, called Virtual Brain Inference (VBI). This open-source toolkit provides fast simulations,
    taxonomy of feature extraction, efficient data storage and loading, and probabilistic machine learning algorithms,
    enabling biophysically interpretable inference from non-invasive and invasive recordings.
    Through in-silico testing, we demonstrate the accuracy and reliability of inference for commonly used
    whole-brain network models and their associated neuroimaging data. VBI shows potential to improve hypothesis
    evaluation in network neuroscience through uncertainty quantification, and contribute to advances in precision
    medicine by enhancing the predictive power of virtual brain models.}
}

See the VBI paper here.

This research has received funding from:

  • EU's Horizon 2020 Framework Programme for Research and Innovation under the Specific Grant Agreements:
    • No. 101147319 (EBRAINS 2.0 Project)
    • No. 101137289 (Virtual Brain Twin Project)
    • No. 101057429 (project environMENTAL)
  • Government grant managed by the Agence Nationale de la Recherche:
    • Reference ANR-22-PESN-0012 (France 2030 program)

The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
