TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation
Hyeongwon Jang*, Changhun Kim*, Eunho Yang (*: equal contribution)
International Conference on Machine Learning (ICML), 2025 (Spotlight Presentation, 313/12107=2.6%)
Official implementation of TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation. TIMING is implemented in PyTorch and tested on a range of time series datasets, including switch-feature, state, MIMIC-III, PAM, Epilepsy, boiler, freezer, and wafer. Our experiments build on time_interpret, ContraLSP, TimeX++, and WinIT. Sincere thanks to each of the original authors!
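As background, TIMING extends Integrated Gradients (IG). The sketch below shows plain IG for a time series classifier in PyTorch; it is an illustrative baseline only (the model, target index, and tensor shapes are assumptions), not the temporality-aware variant proposed in the paper. See the paper and the code in this repository for the actual method.

import torch

def integrated_gradients(model, x, baseline=None, target=0, steps=50):
    # Plain IG for a classifier taking (batch, time, features) inputs.
    if baseline is None:
        baseline = torch.zeros_like(x)  # all-zeros baseline, a common default
    grads = torch.zeros_like(x)
    # Riemann approximation of the path integral from baseline to x
    for alpha in torch.linspace(0.0, 1.0, steps).tolist():
        point = (baseline + alpha * (x - baseline)).requires_grad_(True)
        score = model(point)[:, target].sum()  # assumes (batch, classes) logits
        grads += torch.autograd.grad(score, point)[0]
    return (x - baseline) * grads / steps  # attribution per (time step, feature)

To set up the environment: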
conda create -n timing python==3.10.16
conda activate timing
pip install -r requirements.txt --no-deps

The requirements.txt file installs the necessary packages into the virtual environment.
To test with switch-feature, additional setup is required.
git clone https://github.com/TimeSynth/TimeSynth.git
cd TimeSynth
python setup.py install
cd ..
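To sanity-check the TimeSynth install before generating data, you can run a short snippet like the one below (adapted from the TimeSynth README; the signal and noise settings are illustrative and are not the generator settings used by the repo's script):

import timesynth as ts

# Sample 500 regular time points and synthesize a noisy sinusoid
time_sampler = ts.TimeSampler(stop_time=20)
times = time_sampler.sample_regular_time(num_points=500)
signal = ts.signals.Sinusoidal(frequency=0.25)
noise = ts.noise.GaussianNoise(std=0.3)
series = ts.TimeSeries(signal, noise_generator=noise)
samples, signals, errors = series.sample(times)
print(samples.shape)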
Then generate the switch-feature data:

python synthetic/switchstate/switchgenerator.py

To use the MIMIC-III dataset, you need to download the original dataset and set up a PostgreSQL database.
1. Download MIMIC-III Dataset
- Download from PhysioNet (requires credentialed access and CITI training)
- Sign the Data Use Agreement (DUA) and download all CSV files
2. Load into PostgreSQL
- Clone the MIMIC-III code repository:
git clone https://github.com/MIT-LCP/mimic-code.git
cd mimic-code/mimic-iii/buildmimic/postgres
- Follow the repository instructions to create the database mimic with schema mimiciii and load the CSV files
3. Configure Database Connection
Edit datasets/mimic3.py (around lines 151-157) to set your PostgreSQL connection:
dbname = "mimic"
schema_name = "mimiciii"
# Set host, user, and password as needed
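For reference, these values feed a standard PostgreSQL connection. A minimal sketch of what such a connection looks like follows (psycopg2 and the placeholder credentials are assumptions; the exact code in datasets/mimic3.py may differ):

import psycopg2

# Connect to the local MIMIC-III database created above
conn = psycopg2.connect(
    dbname="mimic",
    user="postgres",      # replace with your PostgreSQL user
    password="postgres",  # replace with your password
    host="localhost",
)
cur = conn.cursor()
cur.execute("SET search_path TO mimiciii;")  # query the mimiciii schema
cur.execute("SELECT COUNT(*) FROM admissions;")
print(cur.fetchone())
conn.close()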
We have divided our experiments into two categories: Synthetic and Real. All experiments can be executed using the scripts located in scripts/real, scripts/hmm, or scripts/switchfeature.
Here is an example execution for MIMIC-III (ours):
bash scripts/real/train.sh
bash scripts/real/run_mimic_our.sh
bash scripts/real/run_mimic_baseline.sh

Due to differences between our training environment and the released code, the paper’s results may not be fully reproducible with the training scripts alone. All XAI evaluations were conducted on a single NVIDIA RTX 3090 GPU (24GB VRAM). We have publicly released all model checkpoints, except those trained on the restricted-access MIMIC-III dataset.
All results will be stored in the current working directory.
Then save the parsed results:
python real/parse.py --model state --data mimic3 --top_value 100
python real/parse.py --model state --data mimic3 --experiment_name baseline --top_value 100

All parsed results will be saved in the results/ directory.
@inproceedings{jang2025timing,
title={{TIMING: Temporality-Aware Integrated Gradients for Time Series Explanation}},
author={Jang, Hyeongwon and Kim, Changhun and Yang, Eunho},
booktitle={International Conference on Machine Learning (ICML)},
year={2025}
}
