Principal Context-aware Diffusion Guided Data Augmentation for Fault Localization
Abstract
Test cases are indispensable for conducting effective fault localization (FL). However, test cases in practice are severely class imbalanced, i.e., the number of failing test cases (the minority class) is far smaller than that of passing ones (the majority class). This severe class imbalance between failing and passing test cases hinders FL effectiveness. To address this issue, we propose PCD-DAug: a Principal Context-aware Diffusion guided Data Augmentation approach that generates synthesized failing test cases to improve FL. PCD-DAug first combines program slicing with principal component analysis to construct a principal context that captures how a set of statements influences the faulty output via statistical program dependencies. Then, PCD-DAug devises a conditional diffusion model that learns from principal contexts to generate synthesized failing test cases, yielding a class-balanced dataset for FL. We conduct large-scale experiments on six state-of-the-art FL approaches and compare PCD-DAug against six data augmentation baselines. The results show that PCD-DAug significantly improves FL effectiveness, achieving average improvements of 383.83%, 227.08%, and 224.19% across the six FL approaches under the Top-1, Top-3, and Top-5 metrics, respectively.
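The pipeline below is a minimal, illustrative sketch, not the authors' implementation. It assumes binary statement-coverage vectors restricted to a (hypothetical) program slice as the failing-test representation, uses scikit-learn's PCA as a stand-in for the principal-context construction, and replaces the conditional diffusion model with simple Gaussian perturbation in the principal subspace, purely to show where synthesis of failing test cases fits into the augmentation workflow.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical statement-coverage matrix for failing tests:
# rows = failing test cases, columns = statements kept by a program slice.
rng = np.random.default_rng(0)
failing_cov = rng.integers(0, 2, size=(5, 40)).astype(float)  # 5 failing tests, 40 sliced statements

# Step 1 (illustrative): derive a low-dimensional "principal context"
# from the sliced coverage vectors via PCA.
pca = PCA(n_components=3)
context = pca.fit_transform(failing_cov)  # principal-context coordinates

# Step 2 (stand-in): synthesize new failing-test vectors by perturbing real
# failing tests in the principal subspace and mapping back to coverage space.
# NOTE: PCD-DAug uses a conditional diffusion model here; Gaussian perturbation
# is only a placeholder to show where the synthesis step sits in the pipeline.
n_synth = 20
base = context[rng.integers(0, context.shape[0], size=n_synth)]
synth_context = base + 0.1 * rng.standard_normal(base.shape)
synth_cov = pca.inverse_transform(synth_context)

# Round back to {0, 1} coverage and append to the failing class to rebalance.
synth_cov = (synth_cov > 0.5).astype(float)
balanced_failing = np.vstack([failing_cov, synth_cov])
print(balanced_failing.shape)  # (25, 40): failing class enlarged before running FL
```

The balanced failing-class matrix would then be combined with the passing tests and fed to an FL technique; the synthesis mechanism (diffusion model vs. the placeholder perturbation above) is the only part swapped out in this sketch.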