This file contains project-specific instructions and environment setup for working on the biahub repository.
Custom conda environments location: `/hpc/mydata/$USER/envs/`

List all available conda environments (the custom prefix may not be registered with conda, so list the directory directly):

```shell
ls /hpc/mydata/$USER/envs/
```

For neuroglancer visualization tasks:

```shell
conda activate /hpc/mydata/$USER/envs/neuroglancer_iohub
```

Tools available:
- `neuroglancer_view.py` - Neuroglancer viewer CLI for OME-Zarr datasets
For iohub and general bioimage analysis tasks:

```shell
module load anaconda
module load comp_micro
conda activate biautils
```

Tools available:
- `iohub` - OME-Zarr I/O operations
- `biahub` CLI - core processing modules
Current repository structure:

```
biahub/
├── pyproject.toml       # setuptools-based configuration
├── biahub/              # Main package
│   ├── cli/             # Click-based CLI
│   ├── *.py             # Processing modules
│   └── vendor/          # Vendored dependencies
├── settings/            # Example YAML configurations
├── tests/               # Test suite
└── docs/                # Documentation
```

Planned UV workspace structure:

```
biahub/
├── pyproject.toml       # Workspace root
├── packages/
│   ├── biahub/          # Core processing (unchanged name)
│   └── analysis-templates/  # Hydra-based workflows
└── ...
```

See plan: `/hpc/mydata/$USER/code/biahub/plans/biahub-uv.md`
Related repositories:
- biahub: `/hpc/mydata/$USER/code/biahub` (this repository)
- infected_vs: `/hpc/mydata/$USER/code/infected_vs` (hummingbird microscope processing)
Test data locations:

Widefield microscope (Hummingbird):
- `/hpc/projects/intracellular_dashboard/virtual_stain_ft_infected/2026_01_29_A549_H2B_CAAX_DAPI_DENV_ZIKV/`
- Stages: 0-convert, 1-reconstruct, 2-concatenate (+ zarrv3 variants)

Light-sheet + label-free microscope (Mantis):
- `/hpc/projects/intracellular_dashboard/organelle_dynamics/2025_08_26_A549_SEC61_TOMM20_ZIKV/`
- Stages: 0-convert, 1-preprocess, 2-assemble, 3-visualization, 4-phenotyping
Check the current repository state with `git status` and `git branch`.
```shell
# In current environment
biahub --help
biahub <command> --help

# Common commands
biahub reconstruct
biahub register
biahub stabilize
biahub concatenate
```

Run the test suite:

```shell
pytest tests/
pytest tests/test_concatenate.py -v
```

UV migration status: Planning phase
Plan file: `plans/biahub-uv.md`
Key decisions:
- Keep `biahub` package name unchanged
- New `analysis-templates` package for Hydra-based workflows
- Start with 4-stage Dragonfly workflow (convert → reconstruct → register → virtual stain)
- Data layout must match existing bash-based template output structure
Related PRs:
- PR #200 (feature/merge-mantis-analysis-template): Do NOT merge
- VisCy issue #353: Reference for UV workspaces pattern
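The planned workspace layout could be wired together with uv roughly as follows. This is a sketch only: the member paths come from the planned tree above, but the exact tables and fields are assumptions to be confirmed against `plans/biahub-uv.md`, not the final configuration.

```toml
# Hypothetical workspace-root pyproject.toml (illustrative sketch only;
# see plans/biahub-uv.md for the authoritative plan)
[tool.uv.workspace]
members = ["packages/biahub", "packages/analysis-templates"]

# Resolve the biahub dependency from the workspace instead of PyPI,
# so analysis-templates always builds against the local core package
[tool.uv.sources]
biahub = { workspace = true }
```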
Import order:
- Standard library first
- Third-party packages second
- Local imports last
- Prefer absolute imports: `from biahub.X import Y`
CLI:
- Use Click for command-line interfaces
- Lazy loading for command discovery (see `biahub/cli/main.py`)

Configuration:
- Pydantic for configuration validation
- Pydantic v2 models in `biahub/settings.py`
- YAML configuration files in the `settings/` directory
- Extra fields forbidden to catch typos: `extra="forbid"`
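A minimal sketch of the `extra="forbid"` convention. The `DeskewSettings` model and its fields here are hypothetical stand-ins, not the actual models in `biahub/settings.py`; only the Pydantic v2 pattern itself is the point.

```python
from pydantic import BaseModel, ConfigDict, ValidationError

# Hypothetical settings model; the real ones live in biahub/settings.py
class DeskewSettings(BaseModel):
    model_config = ConfigDict(extra="forbid")  # reject unknown keys to catch typos
    ls_angle_deg: float
    px_to_scan_ratio: float = 1.0

# A dict as parsed from a YAML config file (e.g. via yaml.safe_load)
ok = DeskewSettings.model_validate({"ls_angle_deg": 30.0})
print(ok.px_to_scan_ratio)  # 1.0 (default applied)

# A typo'd key raises ValidationError instead of being silently ignored
try:
    DeskewSettings.model_validate({"ls_angle_deg": 30.0, "px_to_scan_ration": 2.0})
    typo_caught = False
except ValidationError:
    typo_caught = True
print(typo_caught)  # True
```

Without `extra="forbid"`, the misspelled `px_to_scan_ration` key would validate silently and the default would be used, which is exactly the failure mode this convention prevents.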
Two options for visualizing zarr stores during analysis or development:
Option 1: `neuroglancer_view.py` (preferred — faster)

```shell
conda activate /hpc/mydata/$USER/envs/neuroglancer_iohub
neuroglancer_view.py /path/to/data.zarr --position A/1/0
```

Option 2: `ndimg_view.py` (from nd-embedding-atlas — interactive browser UI)

```shell
conda activate /hpc/mydata/$USER/envs/idetik_iohub
python /hpc/mydata/$USER/code/nd-embedding-atlas/scripts/ndimg_view.py /path/to/data.zarr --position A/1/0
```

- Launches a web dashboard at `http://localhost:5055` with a table of FOVs and a WebGL viewer
- Supports multiple zarr stores for side-by-side comparison
- Filter channels with `--channels DAPI,TXR`
Create test data:

```shell
module load anaconda
module load comp_micro
conda activate biautils
python << 'EOF'
from iohub import open_ome_zarr
# ... see plans/biahub-uv.md for test data creation script
EOF
```

Quick inspection of an existing store:

```shell
conda activate biautils
python -c "from iohub import open_ome_zarr; d = open_ome_zarr('/path/to/data.zarr'); print(d.channel_names, d.data.shape)"
```

- When working with HPC paths, always use absolute paths
- Check for existing conda environments before suggesting new ones
- The user prefers Click over argparse for CLI development
- The user works with OME-Zarr format via the iohub library
- Always check `plans/biahub-uv.md` for current migration status
- Test data locations are important for validation