Conversion scripts for converting Cai lab data to the Neurodata Without Borders (NWB) data format.
To use this conversion package, you'll need to install it directly from GitHub. This approach allows you to access the latest features and modify the source code if needed to adapt to your specific experimental requirements.
Before installation, ensure you have the following tools installed:
- git (installation instructions)
- conda (installation instructions) - recommended for managing dependencies
From a terminal (the conda installation provides one on your system if needed) run the following:
git clone https://github.com/catalystneuro/cai-lab-to-nwb
cd cai-lab-to-nwb
conda env create --file make_env.yml
conda activate cai_lab_to_nwb_env

This creates a conda environment that isolates the conversion code from your system libraries. We recommend running all conversion-related tasks and analyses from this environment to minimize issues related to package dependencies.
If you fork this repository and are running code from that fork, instead use:
git clone https://github.com/your_github_username/cai-lab-to-nwb

Alternatively, if you want to avoid conda altogether (for example, if you use another virtual environment tool), you can install the repository with the following commands using only pip:
git clone https://github.com/catalystneuro/cai-lab-to-nwb
cd cai-lab-to-nwb
pip install --editable .

Note: both of the methods above install the repository in editable mode. The dependencies for this environment are listed in the dependencies section of the pyproject.toml file.
Each conversion is organized in a directory of its own in the src directory:
cai-lab-to-nwb/
├── LICENSE
├── make_env.yml
├── pyproject.toml
├── README.md
├── dandi_upload.md
└── src
└── cai_lab_to_nwb
├── another_conversion
└── zaki_2024
├── interfaces
│ ├── __init__.py
│ ├── eztrack_interface.py
│ ├── minian_interface.py
│ ├── miniscope_imaging_interface.py
│ ├── zaki_2024_cell_registration_interface.py
│ ├── zaki_2024_edf_interface.py
│ ├── zaki_2024_shock_stimuli_interface.py
│ └── zaki_2024_sleep_classification_interface.py
├── notes
│ ├── zaki_2024_notes.md
│ └── ... .png
├── tutorials
│ ├── zaki_2024_tutorial.ipynb
│ └── ... .png
├── utils
│ ├── __init__.py
│ ├── conversion_parameters.yaml
│ ├── define_conversion_parameters.py
│ ├── edf_slicing.py
│ ├── generate_session_description.py
│ └── source_data_path_resolver.py
├── __init__.py
├── zaki_2024_convert_all_sessions.py
├── zaki_2024_convert_session.py
├── zaki_2024_convert_week_session.py
├── zaki_2024_metadata.yaml
└── zaki_2024_run_conversion.ipynb
For example, for the zaki_2024 conversion you can find a directory located in src/cai_lab_to_nwb/zaki_2024. Inside each conversion directory you can find the following files:
- zaki_2024_convert_session.py: defines the function to convert one full session of the conversion.
- zaki_2024_convert_week_session.py: defines the function to convert a week-long experimental session into an NWB file.
- zaki_2024_convert_all_sessions.py: defines the function to perform a batch conversion of a set of sessions. The conversion parameters of each session need to be defined in utils/conversion_parameters.yaml.
- zaki_2024_nwbconverter.py: the place where the NWBConverter class is defined.
- zaki_2024_metadata.yaml: YAML file containing experimental metadata for the session.
- zaki_2024_run_conversion.ipynb: notebook with a tutorial on how to run the conversion.
- notes/zaki_2024_notes.md: notes and comments concerning this specific conversion.
- interfaces/: directory containing the interface classes for this specific conversion.
- tutorials/: directory containing tutorials for this specific conversion.
- utils/: directory containing utility functions for this specific conversion.
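The utils/source_data_path_resolver.py helper suggests that per-session raw-data paths are derived from a common directory layout. As an illustration only (the layout, directory names, and file suffixes below are hypothetical assumptions, not the lab's actual structure, which is documented in notes/zaki_2024_notes.md), such a resolver might look like:

```python
from pathlib import Path


def resolve_session_paths(data_dir: str, subject_id: str, session_id: str) -> dict:
    """Hypothetical sketch: build the expected raw-data paths for one session,
    assuming a data_dir/subject_id/session_id layout."""
    session_dir = Path(data_dir) / subject_id / session_id
    return {
        "miniscope_folder": session_dir / "miniscope",
        "behavior_video": session_dir / f"{session_id}_behavior.avi",
        "eztrack_output": session_dir / f"{session_id}_eztrack.csv",
    }
```

A batch driver such as zaki_2024_convert_all_sessions.py can then loop over the sessions listed in conversion_parameters.yaml and pass each resolved path dictionary to the per-session conversion function.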
The conversion notes are located in src/cai_lab_to_nwb/zaki_2024/notes/zaki_2024_notes.md.
This file contains information about the expected file structure and the conversion process.
For detailed documentation on how to run the NWB conversion for the zaki_2024 dataset, see
zaki_2024_run_conversion.ipynb
The tutorials directory contains Jupyter notebooks that demonstrate how to read the NWB files generated by the conversion scripts.
The notebooks are located in the src/cai_lab_to_nwb/zaki_2024/tutorials directory.
You might need to install jupyter before running the notebooks:
pip install jupyter
cd src/cai_lab_to_nwb/zaki_2024/tutorials
jupyter lab
Detailed instructions on how to upload the data to the DANDI archive can be found in dandi_upload.md.
To create a new conversion or adapt this one for different experimental paradigms:
Follow the naming convention and create a new directory under src/cai_lab_to_nwb/:
mkdir src/cai_lab_to_nwb/new_experiment_2025

Create custom interfaces inheriting from existing ones:
from neuroconv.basedatainterface import BaseDataInterface

class CustomInterface(BaseDataInterface):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)

    def add_to_nwbfile(self, nwbfile, metadata, **kwargs):
        # Custom processing logic that adds this interface's data to the NWB file
        ...

Combine all interfaces for your dataset:
from neuroconv import NWBConverter
from neuroconv.datainterfaces import ExternalVideoInterface

class NewExperimentNWBConverter(NWBConverter):
    data_interface_classes = dict(
        Behavior=CustomInterface,
        Video=ExternalVideoInterface,
        # Add other interfaces as needed
    )

Create scripts for single sessions and batch processing following the established patterns.
Develop YAML metadata files with dataset-specific experimental parameters:
NWBFile:
  experiment_description: "Description of your new experiment"
  institution: "Your Institution"
  lab: "Your Lab"
Subject:
  species: "Mus musculus"
  # Add subject-specific metadata
# Add other experimental metadata

Each conversion should be self-contained within its directory and follow the established patterns for consistency and maintainability.
- Use stub_test=True for initial testing with small data subsets
- Process sessions in parallel for large datasets
- Consider using SSD storage for faster I/O operations
- Monitor memory usage for large video files
For issues specific to this conversion:
- Check the notes.md file in the conversion directory
- Review the metadata YAML files for parameter examples
- Examine the conversion scripts for usage patterns
For general neuroconv issues:
- Visit the neuroconv documentation
- Check the neuroconv GitHub repository
If you use this conversion in your research, please cite:
This project is licensed under the terms specified in the LICENSE file.