My student Pingyu Zhang will be presenting our work on automated amplification of tests to validate code that handles exceptions.
This is part of a larger effort aimed at amplifying the value of existing test cases through automated transformations into new
tests. This larger effort also includes Pingyu's previous work on load test case generation (ASE 2011), our work on test carving (FSE 2006 and TSE 2009), and
our work on test aggregation (ASE 2008).
Validating code that handles exceptional behavior is difficult, particularly when dealing with external resources that
may be noisy and unreliable, as it requires: 1) the systematic exploration of the space of exceptions that may be thrown
by the external resources, and 2) the setup of the context to trigger specific patterns of exceptions.
In this work we present an approach that addresses those difficulties by performing an exhaustive amplification of the space of exceptional behavior
associated with an external resource that is exercised by a test suite. Each amplification attempts to expose a program
exception handling construct to new behavior by mocking the external resource so that it returns normally or throws an
exception following a predefined pattern. Our assessment indicates that the approach can be fully automated, is powerful
enough to detect 65% of the reported bugs of this kind, and is precise enough that 77% of the detected anomalies correspond
to faults later fixed by the developers.
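To make the mocking idea concrete, here is a minimal sketch of one such amplification in Python using the standard library's unittest.mock. The function name copy_stream and the "one normal return, then an IOError" pattern are illustrative assumptions, not the paper's actual subjects or tooling; the point is only to show how a mocked resource can follow a predefined exception pattern to drive a handler that an ordinary test would never reach.

```python
# Hypothetical example: amplify a test by mocking an external resource so it
# follows a predefined pattern (return one chunk normally, then raise IOError),
# forcing the exception-handling path in the code under test to execute.
from unittest import mock

def copy_stream(reader, writer):
    """Code under test: copies chunks from reader to writer,
    closing the writer if a read fails."""
    copied = 0
    while True:
        try:
            chunk = reader.read()
        except IOError:
            writer.close()  # the exception-handling construct we want to exercise
            return copied
        if not chunk:
            return copied
        writer.write(chunk)
        copied += 1

# Mock the external resource: side_effect encodes the exception pattern.
reader = mock.Mock()
reader.read.side_effect = [b"data", IOError("simulated resource failure")]
writer = mock.Mock()

result = copy_stream(reader, writer)
```

A systematic amplification would enumerate such patterns (fail on the first call, the second, and so on) rather than hand-picking one, which is what makes the exploration exhaustive.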