AI-4-SE/FSE-2025-Understanding-Debugging-as-Episodes

1 Replication Package Description

This repository provides supplementary material to the FSE 2025 paper "Understanding Debugging as Episodes: A Case Study on Performance Bugs in Configurable Software Systems", including (1.1) SoftVR training videos, (1.2) the anonymized videos of the debugging sessions (without the audio recordings), (1.3) interview transcripts, (1.4) our fine-grained coding framework, (1.5) the coding framework of debugging episodes, and (1.6) data analysis and visualization scripts. The latter complement the presentation of the study results in the paper and allow for reproducing our analyses and findings.

Note

This repository contains only 4 study videos, as the free plan of Git LFS does not support uploading all videos (file size limitations). Therefore, we provide all videos (from 1.1 and 1.2) directly in the Zenodo archive DOI.

Prerequisites

  • Docker installed on your system. Get Docker
  • Git for cloning the repository

Project Structure

project/
│
├── Dockerfile
├── run_all.py
├── requirements.txt
├── ...
└── data/
    ├── literature/
    ├── main study videos/
    └── user study/
        ├── study material/
        └── debugging actions data/

1.1 SoftVR Training Videos

We provide the training videos for the user study in this folder.

1.2 Debugging Sessions Videos

In this folder, we provide all videos of the user study in reduced quality and with the audio removed.

1.3 Interview

We transcribed the interview in a two-step approach: first, transcribing the whole interview automatically with Whisper, and second, correcting the two relevant questions by hand. We provide both the automatic Whisper interview transcript and the manually corrected interview transcript.
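
For reference, a minimal sketch of the automatic transcription step, assuming the openai-whisper Python package (which requires ffmpeg); the model size and file names are assumptions for illustration, not necessarily the ones used in the study:

import whisper

# Load a Whisper model (model size is an assumption)
model = whisper.load_model("medium")

# Transcribe the interview recording (file name is hypothetical)
result = model.transcribe("interview.wav")

# Write the raw transcript to a text file
with open("interview_transcript_whisper.txt", "w") as f:
    f.write(result["text"])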

1.4 Debugging Strategies and Actions Coding Framework

We extracted debugging strategies from the literature, which we provide in this folder. We also provide the resulting fine-grained coding framework per participant, including the coding results.

1.5 Goal-Oriented Debugging Episodes Framework

We provide the goal-oriented episodes, which are the results of the open coding, here. The table contains the following columns:

  • participant – the participant
  • start_time – start time of the episode in the video
  • episode – intermediate episode name
  • timestamp – corrected start time
  • time_delta – duration of the episode
  • Episode Code – final episode name
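
As an illustration of how this table can be processed with pandas (the file name, format, and duration encoding are assumptions, not necessarily what our scripts use):

import pandas as pd

# Load the episodes table (hypothetical file name; the actual file is in data/user study/)
episodes = pd.read_csv("goal_oriented_episodes.csv")

# Convert durations, assuming time_delta is stored as HH:MM:SS strings
episodes["time_delta"] = pd.to_timedelta(episodes["time_delta"])

# Total time spent per final episode type, per participant
summary = (episodes
           .groupby(["participant", "Episode Code"])["time_delta"]
           .sum()
           .reset_index())
print(summary)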

1.6 Data Analysis and Visualization Scripts

The data analysis and visualization scripts read in and process all data and generate the figures shown in our publication. All scripts, as well as the requirements.txt, are located in the root folder of the project and can be executed either by running run_all.py directly or via the provided Dockerfile.
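
For a local run without Docker (assuming a Python 3 environment with pip available), the scripts can be executed as follows:

pip install -r requirements.txt
python run_all.py

Alternatively, use the provided Dockerfile: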

Build the Docker image:

docker build -t eval-debugging-process-data .

Running the Docker Container

Run the container with the following command:

docker run -t eval-debugging-process-data

Accessing Output

After the container finishes running, you can copy the generated files from the Docker container to an output directory on your host machine:

docker cp <container_id>:/app/output .

Replace:

  • <container_id> with the correct container ID of the executed Docker container. Find the ID using docker ps -a.
