This repository contains code that was presented at SGAI 25. More information is available at https://www.bcs-sgai.org/ai2025/?section=workshops
To cite this work, please use the following publication:
Manss, Christoph, and Tarek A. El-Mihoub.
"Evaluation of Explanations for Object Detection Using Transformers with Sonar Data."
International Conference on Innovative Techniques and Applications of Artificial Intelligence.
Cham: Springer Nature Switzerland, 2025.
- Create and activate a Python environment (recommended); Python 3.10+ is required.
- Install the dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Get the demo data (MarineDebris); choose one:
  - Download it manually from the release page: https://github.com/mvaldenegro/marine-debris-fls-datasets/releases/tag/watertank-v1.0
  - Or run the helper script, which downloads and unpacks the data into `data/`. Before executing `download_marineDebris.sh`, please install `python3.12-venv` for virtual environments.

    ```bash
    bash download_marineDebris.sh
    ```

- Get the pretrained model checkpoints into `models/`:

  ```bash
  bash download_models.sh
  ```

  This script downloads a tar archive from the provided OwnCloud link and unpacks it directly into `models/`.
Notes
- Requirements: `bash`, `curl`, and `tar` must be available (standard on Linux/macOS; on Windows, use Git Bash or WSL).
- Both scripts are idempotent to the extent supported by tar extraction; re-running them is safe and will overwrite existing files if they are present in the archive.
Launch Jupyter and open the notebooks in the `notebooks/` folder. Each notebook installs the project requirements in its first cell, so running it top to bottom should work if the `data/` and `models/` folders are prepared as above.
Start Jupyter (example):

```bash
jupyter lab  # or: jupyter notebook
```

- `notebooks/workshop_attentions_from_transformer.ipynb`: Explores self- and cross-attention from transformer-based object detectors as explanations. Loads a batch, runs inference, and visualizes attention maps interactively.
- `notebooks/workshop_D-Rise_on_Transformer.ipynb`: Demonstrates D-RISE for object detection with transformer backbones. Generates saliency maps and visualizes detections together with their explanations. Includes simple guidance on thresholds and parameters.
- `notebooks/workshop_ODAM_on_Transformer.ipynb`: Shows ODAM (Object Detector Activation Maps) on transformer detectors. Extracts intermediate features with hooks, computes activation-based saliency, and evaluates it with localization/faithfulness metrics.
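To give a feel for what the attention notebook visualizes, the sketch below reshapes decoder cross-attention weights (one row per object query, one column per encoder feature cell, as in DETR-style detectors) into a spatial map for a single query. The shapes and the random weights are stand-ins for this example, not values from the repository.

```python
import numpy as np

# Simulated cross-attention weights from a DETR-style decoder layer:
# one row per object query, one column per encoder token (H * W feature cells).
num_queries, H, W = 100, 16, 20
rng = np.random.default_rng(0)
attn = rng.random((num_queries, H * W))
attn /= attn.sum(axis=1, keepdims=True)  # rows sum to 1, like softmax outputs

query_id = 42  # the query whose detection we want to explain (arbitrary here)
attn_map = attn[query_id].reshape(H, W)

# Min-max normalize to [0, 1] so the map can be overlaid on the sonar image.
attn_map = (attn_map - attn_map.min()) / (attn_map.max() - attn_map.min())
```

In the notebook, the real weights come from the model's attention outputs; the reshape-and-normalize step is the same idea.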
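The D-RISE notebook is easier to follow with the core idea in mind: sample random masks, score how well the target object is still detected in each masked image, and average the masks weighted by those scores. The dependency-free sketch below replaces the detector with a stand-in scoring function and uses made-up image and box sizes; it is an illustration of the technique, not the notebook's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, n_masks = 32, 32, 200

def random_masks(n, h, w, grid=8, p=0.5):
    """RISE-style masks: low-resolution random grids upsampled to image size
    (nearest-neighbour upsampling keeps the sketch dependency-free)."""
    small = (rng.random((n, grid, grid)) < p).astype(float)
    return np.repeat(np.repeat(small, h // grid, axis=1), w // grid, axis=2)

def detection_score(masked_image):
    """Stand-in for the detector: how strongly the target detection survives.
    Here simply the mean intensity inside a hypothetical target box."""
    return masked_image[10:20, 12:22].mean()

image = rng.random((H, W))
masks = random_masks(n_masks, H, W)
scores = np.array([detection_score(image * m) for m in masks])

# Saliency map: score-weighted average of the masks.
saliency = (scores[:, None, None] * masks).sum(axis=0) / (scores.sum() + 1e-8)
```

In the real D-RISE, the score is a similarity between the target detection and the detections on the masked image (box overlap times class-probability similarity), which is what the notebook's threshold and parameter guidance tunes.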
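For the ODAM notebook, the underlying computation can be sketched as an element-wise feature-times-gradient map, summed over channels and rectified. The feature map and gradient below are random placeholders for the tensors that the notebook captures with forward/backward hooks; treat this as a rough sketch of the idea, not the exact formulation used in the repository.

```python
import numpy as np

rng = np.random.default_rng(0)
C, H, W = 64, 16, 20

# Placeholders for quantities captured via hooks on a detector layer:
# the feature map and its gradient w.r.t. one detection's score.
features = rng.standard_normal((C, H, W))
gradients = rng.standard_normal((C, H, W))

# ODAM-style map: element-wise feature * gradient, summed over channels, ReLU.
odam_map = np.maximum((features * gradients).sum(axis=0), 0.0)

# Normalize for visualization.
if odam_map.max() > 0:
    odam_map = odam_map / odam_map.max()
```

Unlike Grad-CAM, the gradient here is not spatially pooled into per-channel weights, which is what makes the resulting map instance-specific.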
Tips
- If you have multiple checkpoints under `models/**/checkpoints/`, adjust the `checkpoint` path at the top of each notebook to the one you want to use.
- The notebooks use the Hugging Face `AutoImageProcessor` for preprocessing and postprocessing; make sure `requirements.txt` has been installed.
The work presented in this repository is funded by the Federal Ministry of Education and Research, Germany, grant number 01IW23003.