How to add your custom ReID model to BoxMOT

Mike edited this page Oct 30, 2025 · 1 revision

Registering a Custom ReID Model

This guide walks you through the repository changes required to plug a brand-new person or vehicle re-identification (ReID) model into BoxMOT. By the end you will be able to select your weights with --reid-model across the CLI and Python APIs just like any built-in backbone.  

Tip: Keep your implementation modular. If your model can be exported to ONNX, OpenVINO, TorchScript, or TensorRT later, follow the existing modules for clean separation between the backbone definition, the factory, and configuration metadata.  

1. Add the Backbone Under boxmot/appearance/backbones

  Create a new module that exposes your architecture. Use existing implementations, such as boxmot/appearance/backbones/osnet.py or mobilenetv2.py, as references for structure, type hints, and docstrings. Two key rules apply:  

  1. Inference must return feature vectors. The forward pass must emit embedding vectors rather than classification logits so that trackers can compare appearance descriptors directly.
  2. Follow PyTorch conventions. Your module should subclass torch.nn.Module, accept torch.Tensor inputs shaped like [batch, channels, height, width], and run on the device provided by upstream code.

  If your model needs helper functions (e.g., custom blocks), keep them in the same module or a dedicated subpackage so they can be imported without side effects.  
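The two rules above can be sketched as follows. This is an illustrative toy backbone, not part of BoxMOT: the class name, layer choices, and embedding size are placeholders for your own architecture.

```python
import torch
import torch.nn as nn


class MyBackbone(nn.Module):
    """Toy ReID backbone sketch: returns embeddings, never logits."""

    def __init__(self, embedding_dim: int = 256) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> [batch, 64, 1, 1]
        )
        self.embedding = nn.Linear(64, embedding_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [batch, channels, height, width]
        feats = self.features(x).flatten(1)  # [batch, 64]
        return self.embedding(feats)         # [batch, embedding_dim]
```

Because the module only depends on torch and lives in one file, it can be imported by the factory without side effects.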

2. Register a Constructor in boxmot/appearance/reid/factory.py

  Add an import for your backbone and register it in the MODEL_FACTORY map:  

from boxmot.appearance.backbones.my_backbone import MyBackbone
 
MODEL_FACTORY = {
    # ...existing entries...
    "my_backbone": MyBackbone,
}

  The key string ("my_backbone" in the example) becomes the public identifier that users pass through configuration files or CLI flags. Make sure the constructor signature matches how you expect the model to be instantiated (e.g., keyword arguments for number of classes, pretrained weights, or input size).  
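To see why the signature matters, here is a self-contained sketch of the factory pattern. The MyBackbone class and its num_classes/pretrained keywords are hypothetical stand-ins; match whatever your real constructor accepts.

```python
from typing import Callable, Dict

import torch.nn as nn


class MyBackbone(nn.Module):
    """Hypothetical constructor: adjust keywords to your model's real arguments."""

    def __init__(self, num_classes: int = 751, pretrained: bool = False) -> None:
        super().__init__()
        self.num_classes = num_classes
        self.pretrained = pretrained


# Mirrors the MODEL_FACTORY pattern: keys map to constructors, not instances.
MODEL_FACTORY: Dict[str, Callable[..., nn.Module]] = {
    "my_backbone": MyBackbone,
}

# Keyword arguments flow straight through to the constructor you registered.
model = MODEL_FACTORY["my_backbone"](num_classes=751, pretrained=False)
```

Storing the class (not an instance) in the map lets callers decide the configuration at instantiation time.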

3. Declare the Model Type in boxmot/appearance/reid/config.py

  Update two structures so that the rest of BoxMOT knows about the new backbone and any pretrained checkpoints you distribute:  

  1. Append the same identifier used in the factory to MODEL_TYPES.
  2. Add any downloadable weights to TRAINED_URLS. Use stable, publicly accessible URLs (GitHub Releases, Google Drive with uc?id=..., Hugging Face, etc.). For traceability, append the dataset name to the registered weights filename. For example:  
MODEL_TYPES = [
    # ...
    "my_backbone",
]
 
TRAINED_URLS = {
    # ...
    "my_backbone_market1501.pt": "https://github.com/your-org/your-repo/releases/download/v1.0/my_backbone_market1501.pt",
}
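A quick local sanity check for these two structures is a sketch like the one below, assuming the naming convention above in which each weights filename begins with its model identifier.

```python
# Values copied from the example above; extend with your own entries.
MODEL_TYPES = [
    "my_backbone",
]

TRAINED_URLS = {
    "my_backbone_market1501.pt": "https://github.com/your-org/your-repo/releases/download/v1.0/my_backbone_market1501.pt",
}

# Every weights filename should start with a registered identifier, so the
# backbone can be resolved from the weights name alone.
unresolved = [
    name for name in TRAINED_URLS
    if not any(name.startswith(model_type) for model_type in MODEL_TYPES)
]
```

If unresolved is non-empty, a weights file was added without its matching MODEL_TYPES entry.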

 

4. Test the Integration

  Before opening a pull request, validate the end-to-end workflow:  

  1. Instantiate from the factory
from boxmot.appearance.reid.factory import MODEL_FACTORY
 
model = MODEL_FACTORY["my_backbone"]()  # the factory maps names to constructors, so call it
model.eval()
  2. Run a forward pass with a dummy tensor to confirm that embeddings are returned without errors.
  3. Track with the CLI on a short video to ensure the new model plays nicely with the tracker of your choice, for example:
boxmot track --source assets/MOT17-mini/track.mp4 \
--yolo-model yolov8n.pt \
--reid-model my_backbone_market1501.pt \
--tracking-method botsort
  4. Optional exports: run boxmot export --weights my_backbone_market1501.pt --include onnx if you intend to distribute additional formats.
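The dummy-tensor check can go one step further and compare two embeddings the way a tracker would. The stand-in model below keeps the sketch self-contained; substitute the backbone you instantiated from the factory.

```python
import torch
import torch.nn.functional as F

# Stand-in for your registered backbone: any module mapping an image crop
# to a fixed-size embedding vector will do for this smoke test.
model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(3 * 256 * 128, 128),
)
model.eval()

with torch.no_grad():
    a = model(torch.randn(1, 3, 256, 128))  # embedding of crop A
    b = model(torch.randn(1, 3, 256, 128))  # embedding of crop B
    # Cosine similarity is how appearance descriptors are typically compared.
    sim = F.cosine_similarity(a, b).item()
```

If this runs without shape errors and sim lands in [-1, 1], the embedding interface is sound.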