Merged

77 commits
9ba9323
Update file_read.rst
ehennestad Jun 30, 2025
211f737
Update nwbfile.rst
ehennestad Jun 30, 2025
299ee8e
Create schemas_and_generation.rst
ehennestad Jun 30, 2025
aba4874
Update schemas_and_generation.rst
ehennestad Jun 30, 2025
3e26eb0
Fix typos
ehennestad Jun 30, 2025
59b8808
Merge branch 'main' into update-docs-read-section
ehennestad Jun 30, 2025
4d999f8
Update conf.py
ehennestad Sep 3, 2025
3e1d843
Reorganize docs based on diataxis framework
ehennestad Sep 4, 2025
78f5882
Fix links
ehennestad Sep 4, 2025
0aa79ce
smaller rewordings
ehennestad Sep 4, 2025
2923c40
Update overview.rst
ehennestad Sep 4, 2025
b7a4609
Removed troubleshooting section
ehennestad Sep 4, 2025
80b0a28
Minor rewording
ehennestad Sep 4, 2025
9a4a24b
Update installation.rst
ehennestad Sep 4, 2025
843e9d5
Update quickstart.rst
ehennestad Sep 4, 2025
b1c18bf
Update file_create.rst
ehennestad Sep 4, 2025
3c5c3ed
Update file_create.rst
ehennestad Sep 4, 2025
a817423
Update nwbfile.rst
ehennestad Sep 4, 2025
6879b2a
Add neurodata types page
ehennestad Sep 4, 2025
419af7b
Merge branch 'main' into update-docs-read-section
ehennestad Sep 24, 2025
eafbddc
Minor reformulations
ehennestad Sep 25, 2025
343c303
Update hdf5_considerations.rst
ehennestad Sep 25, 2025
9fb18e9
Update performance_optimization.rst
ehennestad Sep 25, 2025
354455b
Update overview.rst
ehennestad Sep 25, 2025
96382e8
Update performance_optimization.rst
ehennestad Sep 25, 2025
097f3f4
Update docs/source/pages/getting_started/quickstart.rst
ehennestad Sep 25, 2025
f5c1ae5
Merge branch 'main' into update-docs-read-section
ehennestad Sep 25, 2025
926c832
Merge branch 'main' into update-docs-read-section
ehennestad Sep 25, 2025
806a5ae
Update nwbfile.rst
ehennestad Sep 26, 2025
7452da5
Update docs/source/pages/concepts/file_create.rst
bendichter Sep 26, 2025
65c0d1e
Update docs/source/pages/concepts/file_create/hdf5_considerations.rst
bendichter Sep 26, 2025
99d27bd
Update docs/source/pages/concepts/file_create/hdf5_considerations.rst
bendichter Sep 26, 2025
69f67aa
Update docs/source/pages/concepts/file_create/nwbfile.rst
bendichter Sep 26, 2025
f40db55
Update docs/source/pages/concepts/file_create/performance_optimizatio…
bendichter Sep 26, 2025
994e6d2
Update docs/source/pages/getting_started/overview.rst
bendichter Sep 26, 2025
6ad5451
Update docs/source/pages/getting_started/overview.rst
bendichter Sep 26, 2025
9af98a6
Rename considerations.rst to dimension_ordering.rst
ehennestad Sep 27, 2025
0523f6a
Rename hdf5_considerations.rst to about_hdf5.rst
ehennestad Sep 29, 2025
b152e3b
Update file_create.rst
ehennestad Sep 29, 2025
fc47c8d
Update overview.rst
ehennestad Sep 29, 2025
f038ff5
Updating the file_create concept pages
ehennestad Sep 29, 2025
f4290e2
Update editing_nwb_files.rst
ehennestad Sep 29, 2025
e54b1ad
Change performance page and add how-to for using config profiles
ehennestad Sep 29, 2025
4c2ff13
Update performance_optimization.rst
ehennestad Sep 29, 2025
b73e8d4
Update compression_profiles.rst
ehennestad Sep 29, 2025
8d15987
Simplify config-profile how-to guide, add to main index
ehennestad Sep 29, 2025
d19e137
Update compression_profiles.rst
ehennestad Sep 29, 2025
67b566e
Update neurodata_types.rst
ehennestad Sep 29, 2025
c0a30fe
Update compression_profiles.rst
ehennestad Sep 29, 2025
af42d1c
Update neurodata_types.rst
ehennestad Sep 29, 2025
0e71485
Update index.rst
ehennestad Sep 29, 2025
a1c738a
Rename performance_optimization to storage_optimization
ehennestad Sep 29, 2025
b57e049
Update compression_profiles.rst
ehennestad Sep 29, 2025
06f2bad
Merge branch 'main' into update-docs-read-section
ehennestad Oct 2, 2025
b0e0cb6
Update storage_optimization.rst
ehennestad Oct 7, 2025
0045e35
Update storage_optimization.rst
ehennestad Oct 7, 2025
b5588d9
Improve api for applying dataset configuration profiles to file befor…
ehennestad Oct 21, 2025
25143ca
Document limitations on editing NWB datasets in MatNWB
ehennestad Oct 23, 2025
f218a96
Merge branch 'main' into update-docs-read-section
ehennestad Oct 23, 2025
516b57e
Update editing_nwb_files.rst
ehennestad Oct 23, 2025
c3fc4e9
Merge branch 'update-docs-read-section' of https://github.com/Neuroda…
ehennestad Oct 23, 2025
df6488b
Fix links and formatting issues
ehennestad Oct 23, 2025
6cd3415
Update documentation with citation info and improved links
ehennestad Oct 23, 2025
44342b8
Remove concepts page on neurodata types
ehennestad Oct 23, 2025
449e474
Clarify NWB schema usage and class regeneration docs
ehennestad Oct 23, 2025
ed17a14
Add note on lazy loading with DataStub in MatNWB
ehennestad Oct 23, 2025
a6826b2
Update storage_backends.rst
ehennestad Oct 23, 2025
e9ec92f
Apply suggestion from @bendichter
ehennestad Oct 23, 2025
50c618d
Update overview.rst
ehennestad Oct 23, 2025
e3f3977
Revise documentation on editing NWB files in MatNWB
ehennestad Oct 23, 2025
df23311
Merge branch 'main' into update-docs-read-section
ehennestad Oct 24, 2025
68d082e
Adjust introductions for index and overview pages
ehennestad Oct 24, 2025
adf1064
Remove page on editing NWB files
ehennestad Oct 24, 2025
ff95a84
Update nwbfile.rst
ehennestad Oct 24, 2025
3eee884
Update storage_optimization.rst
ehennestad Oct 24, 2025
9ea1b52
Merge branch 'main' into update-docs-read-section
ehennestad Oct 27, 2025
33881e6
Merge branch 'main' into update-docs-read-section
ehennestad Oct 31, 2025
63 changes: 63 additions & 0 deletions docs/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,63 @@
# MatNWB Documentation

This directory contains the documentation for MatNWB, built using Sphinx.

## Building the Documentation Locally

### Prerequisites

1. **Install Python dependencies:**
```bash
cd docs
pip install -r requirements.txt
```

This installs the required packages:
- sphinx
- sphinx-rtd-theme
- sphinx-copybutton
- sphinxcontrib-matlabdomain

### Build the Documentation

**On macOS/Linux:**
```bash
cd docs
make html
```

**On Windows:**
```bash
cd docs
make.bat html
```

### View the Documentation

After building, open `docs/build/html/index.html` in your web browser to view the generated documentation.

### Other Build Options

- `make clean` - Remove build files
- `make help` - See all available build targets
- `make linkcheck` - Check for broken links

## Documentation Structure

- `source/` - Source files for the documentation
  - `pages/` - Main documentation pages
  - `conf.py` - Sphinx configuration
- `build/` - Generated documentation (created after building)
- `requirements.txt` - Python dependencies for building docs
- `Makefile` - Build commands for Unix systems
- `make.bat` - Build commands for Windows

## Contributing to Documentation

When editing documentation:

1. Make changes to files in the `source/` directory
2. Build locally to test your changes
3. Ensure the documentation builds without warnings

The documentation uses reStructuredText (`.rst`) format. See the [Sphinx documentation](https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html) for syntax reference.
5 changes: 5 additions & 0 deletions docs/source/_links.rst
@@ -0,0 +1,5 @@
.. _MatNWB: https://github.com/NeurodataWithoutBorders/matnwb
.. _PyNWB: https://github.com/NeurodataWithoutBorders/pynwb
.. _NWB: https://nwb.org

.. |NWB| replace:: Neurodata Without Borders
31 changes: 28 additions & 3 deletions docs/source/conf.py
@@ -8,6 +8,8 @@

import os
import sys
import re
from datetime import datetime

sys.path.append('sphinx_extensions')
from docstring_processors import process_matlab_docstring
@@ -24,10 +26,30 @@ def setup(app):
    app.add_role('matclass', MatClassRole())

project = 'MatNWB'
copyright = '2024, Neurodata Without Borders' # Todo: compute year
copyright = f'{datetime.now().year}, Neurodata Without Borders'
author = 'Neurodata Without Borders'

release = '2.7.0' # Todo: read from Contents.m
# Read version from Contents.m
def get_version_from_contents():
    """Extract version number from Contents.m file."""
    script_dir = os.path.dirname(os.path.abspath(__file__))
    contents_path = os.path.abspath(os.path.join(script_dir, '..', '..', 'Contents.m'))

    try:
        with open(contents_path, 'r', encoding='utf-8') as f:
            for line in f:
                # Look for line with "% Version X.Y.Z"
                match = re.search(r'%\s*Version\s+(\d+\.\d+\.\d+)', line)
                if match:
                    return match.group(1)
    except FileNotFoundError:
        print(f"Warning: Contents.m not found at {contents_path}")
        return 'unknown'  # fallback when file is missing

    print("Warning: Version not found in Contents.m")
    return 'unknown'  # fallback when version cannot be parsed

release = get_version_from_contents()

# -- General configuration ---------------------------------------------------
# https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
@@ -82,7 +104,10 @@ def linkcode_resolve(domain, info):
html_logo = os.path.join(matlab_src_dir, 'logo', 'logo_matnwb_small.png')
html_favicon = os.path.join(matlab_src_dir, 'logo', 'logo_favicon_32.png')
html_theme_options = {
    "style_nav_header_background": "#000000"
    "style_nav_header_background": "#000000",
    "navigation_depth": 2,
    "collapse_navigation": False,
    "sticky_navigation": True
}

html_context = {
79 changes: 63 additions & 16 deletions docs/source/index.rst
@@ -1,32 +1,79 @@
.. include:: _links.rst

##############
NWB for MATLAB
##############

MatNWB is a MATLAB package for working with NWB files. It provides a high-level
API for efficiently working with neurodata stored in the NWB format. If you are
new to NWB and would like to learn more, then please also visit the
:nwb_overview:`NWB Overview <>` website, which provides an entry point for
researchers and developers interested in using NWB.
MatNWB_ is a MATLAB package for working with |NWB|_ (NWB) files.
It provides a high‑level, efficient interface for reading and writing neurophysiology data in the NWB format and includes tutorial Live Scripts that show you how to read NWB files or convert your own data to NWB.

This documentation focuses on MatNWB. If you are new to NWB or want to learn more about the format itself, these resources are a great starting point:

..
   - :nwb_overview:`NWB Overview` | Placeholder

- `NWB Overview Introduction <https://nwb-overview.readthedocs.io/en/latest/intro_to_nwb/1_intro_to_nwb.html>`_: Entry point providing a high-level and general overview of the NWB format

- `NWB Format Specification <https://nwb-schema.readthedocs.io/en/latest/index.html#>`_: Detailed overview of the NWB format and the neurodata type specifications that define it.

For a quick introduction to MatNWB, go to the :ref:`Overview <matnwb-overview>`
page. If you immediately want to see how to read or write files, take a look at the
:ref:`Quickstart <quickstart-tutorial>` tutorial.

********
Contents
********
For more in-depth examples of how to create NWB files, we recommend starting
with the :ref:`Introduction<intro-tutorial>` tutorial and then moving on to one or
more of the domain-focused tutorials:

- :ref:`behavior-tutorial`
- :ref:`ecephys-tutorial`
- :ref:`icephys-tutorial`
- :ref:`images-tutorial`
- :ref:`ogen-tutorial`
- :ref:`ophys-tutorial`

To explore the growing world of open-source neuroscience data stored in the
NWB format, check out the :ref:`Read from Dandihub<read_demo_dandihub-tutorial>` tutorial.

..
   This documentation is based on the `diataxis <https://diataxis.fr>`_ framework.
   When you browse the table of contents below, look for tutorials, how-to guides,
   concepts (explanation) and reference sections to help orient yourself.
   <This is only aspirational at the moment. We will refactor the documentation to be more diataxis-compliant over time:>

Looking for a specific topic which has not been mentioned? Check out the full table of contents below:

.. toctree::
   :maxdepth: 2
   :caption: Getting Started
   :maxdepth: 1
   :caption: Get Started

   pages/getting_started/installation_users
   pages/getting_started/important
   pages/getting_started/file_read
   pages/getting_started/using_extenstions.rst
   pages/getting_started/overview
   pages/getting_started/installation
   pages/getting_started/quickstart

.. toctree::
   :maxdepth: 2
   :caption: Tutorials

   pages/tutorials/index
   pages/getting_started/overview_citing

.. toctree::
   :maxdepth: 2
   :caption: MatNWB Documentation
   :caption: How-tos

   pages/how_to/index

.. toctree::
   :maxdepth: 2
   :caption: Concepts

   pages/concepts/considerations
   pages/concepts/file_read
   pages/concepts/file_create
   pages/concepts/using_extensions

.. toctree::
   :maxdepth: 1
   :caption: MatNWB Reference

   pages/functions/index
   pages/neurodata_types/core/index
@@ -1,5 +1,5 @@
Important
=========
Important considerations (MatNWB)
=================================

When using MatNWB, it is important to understand the differences in how array
dimensions are ordered in MATLAB versus HDF5. While the NWB documentation and
44 changes: 44 additions & 0 deletions docs/source/pages/concepts/file_create.rst
@@ -0,0 +1,44 @@
Creating NWB Files
==================

When creating an NWB file, you're translating your experimental data and metadata into a structure that follows the NWB schema. MatNWB provides MATLAB classes that represent the different components (neurodata types) of an NWB file, allowing you to build up the file piece by piece.

.. tip::
   To understand the general structure of an NWB file, the NWB Overview documentation has a
   :nwb_overview:`great introduction <intro_to_nwb/2_file_structure.html>`.

As demonstrated in the :doc:`Quickstart </pages/getting_started/quickstart>` tutorial, you begin by creating an instance of the :class:`NwbFile` class. This returns an :class:`NwbFile` object, a container whose properties are derived directly from the NWB schema. Some properties are required, others are optional. Some need specific MATLAB types like ``char`` or ``datetime``, while others need specific neurodata types defined in the NWB schema.

.. note::
   An "object" is an instance of a class. Objects are similar to MATLAB structs, but with additional functionality. The fields (called properties) are defined by the class definition (a .m file), and the class can enforce rules about what values are allowed. This helps ensure that your data conforms to the NWB schema.
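
To make this concrete, here is a minimal sketch of creating an :class:`NwbFile` object (the identifier, description, and start time below are placeholder values, not part of the original text):

.. code-block:: matlab

   % Placeholder metadata; replace with values describing your session
   nwb = NwbFile( ...
       'identifier', 'EXAMPLE_001', ...             % unique identifier for this file
       'session_description', 'demo session', ...   % free-text session description
       'session_start_time', datetime(2025, 1, 1, 'TimeZone', 'local'));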

**The Assembly Process**

Building an NWB file follows a logical pattern:

- **Create neurodata objects**: You create objects for your data (like :class:`types.core.TimeSeries` for time-based measurements)

- **Add to containers**: You add these data objects to your :class:`NwbFile` object (or other NWB container objects) in appropriate locations

- **File export**: You save everything to disk using :func:`nwbExport`, which translates your objects into NWB/HDF5 format

This approach ensures your data is properly organized and validated before it becomes a file.
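
The three assembly steps above can be sketched as follows (a hedged example assuming an ``nwb`` :class:`NwbFile` object already exists; names like ``raw_signal`` and ``assembled.nwb`` are illustrative):

.. code-block:: matlab

   % 1) Create a neurodata object for the measurements
   ts = types.core.TimeSeries( ...
       'data', rand(1, 100), ...
       'data_unit', 'volts', ...
       'starting_time', 0.0, ...
       'starting_time_rate', 30.0);

   % 2) Add it to a container (here, the acquisition group of the NwbFile)
   nwb.acquisition.set('raw_signal', ts);

   % 3) Translate the objects into an NWB/HDF5 file on disk
   nwbExport(nwb, 'assembled.nwb');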

**Schema Validation**

The NWB schema acts as a blueprint that defines what makes a valid neuroscience data file. When you export your file, MatNWB checks that:

- All required properties are present
- Data types match what the schema expects
- Relationships between different parts of the file are correct

If anything is missing or incorrect, you'll get an error message explaining what needs to be fixed. This validation helps ensure your files will work with other NWB tools and can be understood by other researchers.
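
One way to see this validation in action is to export a deliberately incomplete file and inspect the error (a sketch only; the exact message text depends on the MatNWB version):

.. code-block:: matlab

   % Required session metadata is intentionally omitted here
   nwb = NwbFile('identifier', 'incomplete-demo');
   try
       nwbExport(nwb, 'incomplete.nwb');
   catch err
       disp(err.message)  % explains which required properties are missing
   end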

.. toctree::
   :maxdepth: 1
   :titlesonly:

   Understanding the NwbFile Object <file_create/nwbfile>
   Understanding Neurodata Types <file_create/neurodata_types>
   HDF5 Considerations <file_create/hdf5_considerations>
   Performance Optimization <file_create/performance_optimization>
60 changes: 60 additions & 0 deletions docs/source/pages/concepts/file_create/hdf5_considerations.rst
@@ -0,0 +1,60 @@
.. _hdf5-considerations:
Contributor review comment: I am apprehensive about this entire file


HDF5 Considerations and Limitations
===================================
Contributor review comment (suggested change):

HDF5 Considerations and Limitations
===================================
HDF5 Considerations
===================


Working with NWB files in MATLAB involves interacting with the **HDF5** storage format.
HDF5 provides excellent performance, hierarchical organization, and portability — but it also imposes some important **limitations** that influence how you create, modify, and manage NWB files.
This page explains these limitations conceptually, so you can design data pipelines and workflows that avoid common pitfalls.

Why limitations matter
----------------------

HDF5 is designed for efficient, large-scale data storage — not for frequent editing or multi-user collaboration.
Once data is written, changing the file structure or contents is often constrained by the format itself.

Understanding these constraints will help you:

- Plan ahead when designing datasets and attributes
- Avoid costly re-writes and data corruption
- Structure workflows for safe and efficient data access

Key limitations in practice
---------------------------

Existing datasets cannot be freely modified
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Once a dataset is written to disk, it is essentially fixed in size and structure.
If you need to **append** or **stream** additional data (for example, writing trial data as it becomes available), you must create the dataset with this in mind from the start.

In MatNWB, this is typically done with the :class:`~types.untyped.DataPipe` class, which supports writing data incrementally to an extendable dataset.
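
A hedged sketch of incremental writing with :class:`~types.untyped.DataPipe` (assumes an existing ``nwb`` :class:`NwbFile` object; dataset names, sizes, and file names are illustrative):

.. code-block:: matlab

   % Create a dataset whose second dimension can grow without bound
   dataPipe = types.untyped.DataPipe( ...
       'data', rand(1, 1000), ...   % initial chunk of data
       'maxSize', [1 Inf], ...      % Inf marks the extendable dimension
       'axis', 2);                  % append along the second dimension

   ts = types.core.TimeSeries( ...
       'data', dataPipe, ...
       'data_unit', 'volts', ...
       'starting_time', 0.0, ...
       'starting_time_rate', 1000);
   nwb.acquisition.set('streamed_signal', ts);
   nwbExport(nwb, 'streamed.nwb');

   % After export, additional data can be appended to the dataset on disk
   ts.data.append(rand(1, 1000));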

Data and attributes cannot be removed — and deletion does not reduce file size
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Contributor review comment: this is a bit too pessimistic. Let's find another way to phrase this

HDF5 does not support in-place removal of datasets or attributes in the way a database might.
While it is possible at a low level to "unlink" objects from the file, space is not reclaimed.
If you need to significantly restructure a file, the standard approach is to **create a new NWB file** and copy the desired data into it.

**Implication:**
Plan carefully which datasets and metadata to include before writing. Making changes later often means recreating the file from scratch.

Multiple-writer access is not supported
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

HDF5 files are not designed for concurrent writes.
If multiple processes or threads attempt to write to the same file at the same time, the result can be **file corruption**.
In most workflows, this means ensuring that **only one process writes to an NWB file** at any time.

**Best practice:**

- Use a single writer process and close the file before reading it elsewhere.
- If multiple processes need access, coordinate reads and writes through a shared queue or write data separately and merge later.
Contributor review comment: parallel read is OK. This topic goes deeper outside of matlab


Takeaway
--------

These limitations reflect HDF5’s design priorities: efficient, large-scale storage and high-performance sequential access — **not** dynamic modification or multi-writer concurrency.

When working with NWB in MatNWB, it is therefore important to: design file structure in advance, write data in predictable ways, and treat files as *immutable records* rather than *editable databases*.