Official implementation of the iScience paper
A Distribution-aware Semi-Supervised Pipeline for Cost-effective Neuron Segmentation in Volume Electron Microscopy
📄 Paper: https://www.sciencedirect.com/science/article/pii/S2589004225027683
Semi-supervised learning offers a cost-effective approach for neuron segmentation in electron microscopy (EM) volumes. This technique leverages unlabeled data to regularize supervised training for robust neuron boundary prediction. However, distribution mismatch between labeled and unlabeled data, caused by limited annotations and diverse neuronal structures, limits model generalization. In this study, we develop a distribution-aware pipeline to address the inherent mismatch issue and enhance semi-supervised neuron segmentation in EM volumes. At the data level, we select representative sub-volumes for annotation using an unsupervised measure of distributional similarity, ensuring broad coverage of neuronal structures. At the model level, we encourage consistent predictions across mixed views of labeled and unlabeled data. This design prompts the network to align feature distributions and learn shared semantics. Experiments on diverse EM datasets demonstrate the effectiveness of our method, which holds the potential to reduce proofreading demands and accelerate large-scale connectomic reconstruction efforts.
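The model-level idea can be pictured with a short sketch. This is a minimal illustration, not the repository's training code: it assumes a simple convex mixing of labeled and unlabeled sub-volumes and an MSE consistency term, and the names `mix_volumes`, `consistency_step`, and `lambda_cons` are made up for this example.

```python
import torch
import torch.nn.functional as F

def mix_volumes(labeled, unlabeled, alpha=0.5):
    """Blend a labeled and an unlabeled sub-volume into a mixed view.
    A plain convex combination is used here for illustration; the paper's
    mixing strategy may differ (e.g., region-based pasting)."""
    return alpha * labeled + (1.0 - alpha) * unlabeled

def consistency_step(model, labeled_x, labeled_y, unlabeled_x, lambda_cons=1.0):
    """One hedged training step: supervised loss on labeled data plus a
    consistency term asking the mixed-view prediction to agree with the
    correspondingly mixed single-view predictions."""
    # Supervised boundary prediction on labeled data.
    pred_l = model(labeled_x)
    sup_loss = F.binary_cross_entropy_with_logits(pred_l, labeled_y)

    # Predictions on the unlabeled view and on the mixed view.
    with torch.no_grad():
        pred_u = model(unlabeled_x)
    mixed_x = mix_volumes(labeled_x, unlabeled_x)
    pred_mixed = model(mixed_x)

    # Consistency: the mixed-view prediction should match the mixture of
    # single-view predictions, encouraging aligned feature distributions.
    target = mix_volumes(torch.sigmoid(pred_l.detach()), torch.sigmoid(pred_u))
    cons_loss = F.mse_loss(torch.sigmoid(pred_mixed), target)

    return sup_loss + lambda_cons * cons_loss
```

The actual mixing strategy and loss weighting used in the paper are implemented in the training scripts listed below; this sketch only conveys the shape of the objective.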
This repository contains the official implementation used in the iScience publication and can be readily adapted to other volumetric EM datasets.
To help users better understand and apply our method, we provide an interactive demo on Colab that showcases the subvolume selection process.
This demo can be readily adapted to your own EM datasets.
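For intuition, the following is a minimal sketch of selecting representative sub-volumes with an unsupervised measure of distributional similarity. It is not the CGS implementation: the intensity-histogram descriptor, the greedy farthest-point selection, and the helper names `intensity_histogram` and `select_representative` are all illustrative assumptions.

```python
import numpy as np

def intensity_histogram(subvol, bins=64):
    """Simple unsupervised descriptor: normalized intensity histogram."""
    hist, _ = np.histogram(subvol, bins=bins, range=(0, 255), density=True)
    return hist / (hist.sum() + 1e-8)

def select_representative(subvolumes, k=4):
    """Greedily pick k sub-volumes whose descriptors cover the candidate
    distribution broadly (illustrative stand-in for the CGS selection)."""
    feats = np.stack([intensity_histogram(v) for v in subvolumes])
    mean_feat = feats.mean(axis=0)

    # Start from the candidate closest to the overall distribution.
    selected = [int(np.argmin(np.linalg.norm(feats - mean_feat, axis=1)))]
    while len(selected) < k:
        chosen = feats[selected]
        # Distance of each candidate to its nearest already-selected one.
        d = np.linalg.norm(feats[:, None, :] - chosen[None, :, :], axis=2).min(axis=1)
        d[selected] = -np.inf  # never re-pick a selected sub-volume
        selected.append(int(np.argmax(d)))
    return selected

# Usage on random stand-in data:
# vols = [np.random.randint(0, 256, (64, 64, 64), dtype=np.uint8) for _ in range(20)]
# print(select_representative(vols, k=4))
```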
cd Pretraining
python pretraining.py
cd CGS
python CGS.py
cd IIC-Net
python warmup.py
python semi_tuning.py
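If you prefer to run all four stages in one go, a hypothetical wrapper like the following (not part of the repository) chains the commands above with their default configurations:

```python
import subprocess

# Hypothetical convenience wrapper; run it from the repository root.
# It only chains the commands listed above and assumes each script's
# default configuration.
STAGES = [
    ("Pretraining", "pretraining.py"),
    ("CGS", "CGS.py"),
    ("IIC-Net", "warmup.py"),
    ("IIC-Net", "semi_tuning.py"),
]

for workdir, script in STAGES:
    print(f"Running {script} in {workdir} ...")
    subprocess.run(["python", script], cwd=workdir, check=True)
```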
This code builds on SSNS-Net (IEEE TMI'22) by Wei Huang et al. The post-processing tools are based on constantinpape/elf and funkey/waterz. If you have any further questions, please let us know. Thanks again for your interest.

