Our community constantly pushes the state of the art by introducing “new” techniques. These techniques often build on top of, and are compared against, existing systems that realize previously published techniques. The underlying assumption is that existing systems correctly represent the techniques they implement, or that they represent the state of the art. Guess what? That assumption often does not hold.
Our ICSE 2016 paper (authored by Eric, Matt, and me) examines that assumption through a study of KLEE, a popular, well-engineered, and well-cited tool in our community. We briefly describe six improvements we made to KLEE, none of which can be considered a “new” technique, that provide order-of-magnitude performance gains. Given these improvements, we then investigate how the results and conclusions of a sample of papers that cite KLEE are affected. Our findings indicate that the focus on introducing “new” techniques is overemphasized at the cost of robust foundations, leading to wasted effort, an accretion of artifact complexity, and questionable conclusions (in our study, 27% of the papers that depend on KLEE). We also found that even approximating the studies presented is often difficult, due to the lack of basic details and artifacts.
What can we do about it? In the paper, we identify a few initiatives that may help realign incentives to better support the foundations on which we build.