[ACM SIGCHI 2025]
Official repository for the paper *AEGIS: Human Attention-based Explainable Guidance for Intelligent Vehicle Systems*, presented at ACM SIGCHI 2025.
Make sure you have Anaconda or Miniconda installed.
Then, run the following command in your terminal:

```bash
conda env create -f environment.yml
```

or, alternatively:

```bash
conda create -n aegis python==3.7
conda activate aegis
pip install -r requirements.txt
```
```bash
# Create and navigate to the CARLA directory
mkdir carla
cd carla

# Download and extract CARLA 0.9.14
wget https://carla-releases.s3.us-east-005.backblazeb2.com/Linux/CARLA_0.9.14.tar.gz
tar -xvf CARLA_0.9.14.tar.gz

# Launch the simulator
./CarlaUE4.sh --world-port=2000
```

Training AEGIS for the Car-Following Scenario

```bash
python train_car_following.py --simulator_port 2000
```
Training AEGIS for the Left-Turn Scenario

```bash
python train_left_turn.py --simulator_port 2000
```
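If a training script cannot connect to the simulator, you can first confirm that CARLA is reachable on the chosen port. The snippet below is a minimal, optional check using the CARLA Python API; it assumes the `carla` 0.9.14 Python client is installed in the `aegis` environment (e.g. from PyPI or from the wheel/egg shipped under `PythonAPI/carla/dist` in the CARLA release) and is not part of the AEGIS scripts.

```python
# check_carla.py -- optional, hypothetical helper; not part of this repository.
# Confirms that a CARLA 0.9.14 server is reachable before starting training.
import carla

# Use the same port you passed to CarlaUE4.sh (--world-port) and to --simulator_port.
client = carla.Client("localhost", 2000)
client.set_timeout(10.0)  # seconds; increase on slow machines

print("Client version:", client.get_client_version())
print("Server version:", client.get_server_version())
print("Loaded map:", client.get_world().get_map().name)
```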
```bash
python eval_car_following_save.py
```

You should be able to visualize the machine attention using this evaluation script.
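What the evaluation script writes to disk depends on its implementation, so the following is only a generic sketch of how a saved attention map could be overlaid on a camera frame for inspection; the file names and array shapes are placeholders, not outputs guaranteed by `eval_car_following_save.py`.

```python
# overlay_attention.py -- illustrative sketch only; paths and shapes are assumptions.
import numpy as np
import matplotlib.pyplot as plt

frame = plt.imread("outputs/frame_0000.png")        # H x W x 3 camera image (assumed path)
attention = np.load("outputs/attention_0000.npy")   # H x W attention map (assumed path)

# Normalize the attention map to [0, 1] so it renders cleanly as a heatmap.
attention = (attention - attention.min()) / (attention.max() - attention.min() + 1e-8)

plt.imshow(frame)
plt.imshow(attention, cmap="jet", alpha=0.4)  # semi-transparent overlay
plt.axis("off")
plt.title("Machine attention overlay")
plt.savefig("outputs/overlay_0000.png", bbox_inches="tight")
```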
- Zhuang Z, Lu CY, Wang YK, Chang YC, Do T, Lin CT. "AEGIS: Human Attention-based Explainable Guidance for Intelligent Vehicle Systems". ACM CHI Conference on Human Factors in Computing Systems, 2025.
You can read our paper on arXiv here:
[AEGIS: Human Attention-based Explainable Guidance for Intelligent Vehicle Systems](https://arxiv.org/abs/2504.05950) (arXiv:2504.05950)