
DFKI-NI/Applied-XAI-Workshop-SGAI25


SGAI25 Workshop - Applied XAI

This repository contains the code presented at the SGAI 2025 workshop on Applied XAI. More information at https://www.bcs-sgai.org/ai2025/?section=workshops

To cite this work, please use the following publication:

Manss, Christoph, and Tarek A. El-Mihoub. 
"Evaluation of Explanations for Object Detection Using Transformers with Sonar Data." 
International Conference on Innovative Techniques and Applications of Artificial Intelligence. 
Cham: Springer Nature Switzerland, 2025.

Quick setup

  1. Create and activate a Python environment (recommended)
     • Python 3.10+
     • Then install the dependencies:
       pip install -r requirements.txt
  2. Get the demo data (MarineDebris):
       bash download_marineDebris.sh
  3. Get the pretrained model checkpoints into models/:
       bash download_models.sh

The download_models.sh script downloads a tar archive from the provided OwnCloud link and unpacks it directly into models/.
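The download-and-unpack pattern both scripts rely on can be sketched as below. To keep the example self-contained, a local archive stands in for the remote one; the real scripts fetch theirs from OwnCloud (something along the lines of `curl -L "$OWNCLOUD_URL" | tar -x -C models` — the exact URL and flags live in the scripts themselves):

```shell
# Build a stand-in archive locally instead of downloading one, so the
# example runs offline. "checkpoint.bin" is a hypothetical file name.
mkdir -p staging models
echo "dummy-weights" > staging/checkpoint.bin
tar -cf archive.tar -C staging checkpoint.bin

# Unpack directly into models/, as download_models.sh does with its archive.
tar -xf archive.tar -C models
ls models/
```

Note that tar extraction overwrites files of the same name, which is why re-running the download scripts is safe.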

Notes

  • Requirements: bash, curl, and tar must be available (standard on Linux/macOS; on Windows use Git Bash or WSL).
  • Both scripts can be re-run safely: tar extraction simply overwrites any existing files that are also present in the archive.
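Step 1 of the quick setup can be done with Python's built-in venv module, for example:

```shell
# Create and activate a virtual environment (Python 3.10+),
# then install the workshop dependencies.
python3 -m venv .venv
. .venv/bin/activate            # on Windows (Git Bash): . .venv/Scripts/activate
python -m pip install --upgrade pip
pip install -r requirements.txt
```

Any other environment manager (conda, uv, ...) works equally well; the notebooks only assume the packages from requirements.txt are importable.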

Running the workshop notebooks

Launch Jupyter and open the notebooks in the notebooks/ folder. Each notebook installs the project requirements in its first cell, so running top-to-bottom should work if the data and models folders are prepared as above.

Start Jupyter (example):

jupyter lab  # or: jupyter notebook
  • notebooks/workshop_attentions_from_transformer.ipynb
    Explores self- and cross-attention from transformer-based object detectors as explanations. Loads a batch, runs inference, and visualizes attention maps interactively.
  • notebooks/workshop_D-Rise_on_Transformer.ipynb
    Demonstrates D-RISE for object detection with transformer backbones. Generates saliency maps and visualizes detections alongside their explanations, with brief guidance on thresholds and parameters.
  • notebooks/workshop_ODAM_on_Transformer.ipynb
    Shows ODAM (Object Detector Activation Maps) on transformer detectors. Extracts intermediate features with hooks, computes activation-based saliency, and evaluates it with localization and faithfulness metrics.
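As a toy illustration of what the attention notebook visualizes: a DETR-style decoder exposes cross-attention weights of shape (heads, queries, H*W) per layer; averaging over heads and reshaping one query's row gives a spatial attention map over the image features. All shapes and values below are hypothetical placeholders, not the workshop model's actual dimensions:

```python
import numpy as np

# Hypothetical dimensions: 8 attention heads, 100 object queries,
# and a 20x20 feature map (so 400 spatial positions).
heads, queries, H, W = 8, 100, 20, 20

# Dummy cross-attention weights, normalized per query as real
# softmax attention would be (e.g. from output_attentions=True).
rng = np.random.default_rng(0)
attn = rng.random((heads, queries, H * W))
attn /= attn.sum(axis=-1, keepdims=True)

def attention_map(attn, query_idx):
    """Head-averaged spatial attention map for one object query."""
    return attn.mean(axis=0)[query_idx].reshape(H, W)

amap = attention_map(attn, query_idx=0)
print(amap.shape)  # -> (20, 20); values still sum to 1 across the map
```

In the notebook the same reshaped maps are overlaid on the sonar image for the queries that produced detections.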

Tips

  • If you have multiple checkpoints under models/**/checkpoints/, adjust the checkpoint path at the top of each notebook to the one you want to use.
  • The notebooks use Hugging Face AutoImageProcessor for preprocessing and postprocessing; make sure the packages from requirements.txt are installed.
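For intuition on the D-RISE notebook, its core loop can be sketched with NumPy and a toy score function standing in for the detector. The real method compares detection vectors (box, objectness, class scores) between masked and unmasked inputs; everything below is a deliberately simplified, self-contained stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, N = 32, 32, 200  # toy image size and number of random masks

def random_masks(n, h, w, cell=8, p=0.5):
    """RISE-style masks: coarse random on/off grids upsampled to image size."""
    grid = (rng.random((n, h // cell, w // cell)) < p).astype(float)
    return grid.repeat(cell, axis=1).repeat(cell, axis=2)

def drise_saliency(score_fn, masks):
    """Saliency = masks averaged, weighted by the detector score under each mask."""
    weights = np.array([score_fn(m) for m in masks])
    return (weights[:, None, None] * masks).sum(axis=0) / (masks.sum(axis=0) + 1e-8)

# Toy stand-in for a detector: score is the fraction of a fixed
# "target object" region that the mask keeps visible.
target = np.zeros((H, W))
target[8:16, 8:16] = 1.0
score = lambda m: (m * target).sum() / target.sum()

masks = random_masks(N, H, W)
saliency = drise_saliency(score, masks)
# The saliency map peaks inside the target region.
```

The thresholds and mask parameters discussed in the notebook (number of masks, grid cell size, keep probability p) correspond to N, cell, and p here.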

Acknowledgements

The work presented in this repository is funded by the Federal Ministry of Education and Research, Germany, grant number 01IW23003.

