ASE 2019
Sun 10 - Fri 15 November 2019 San Diego, California, United States
Tue 12 Nov 2019 16:00 - 16:20 at Cortez 1 - Testing and Visualization Chair(s): Amin Alipour

Compilers, like other software systems, contain bugs, and compiler testing is the most widely used way to assure compiler quality. A critical task in compiler testing is generating test programs that can effectively and efficiently discover bugs. Although test generators such as Csmith can be configured to control the features of the generated programs, it is not clear which test configurations are effective. In particular, an effective test configuration needs to generate test programs that are both bug-revealing, i.e., likely to trigger bugs, and diverse, i.e., able to discover different types of bugs; satisfying both properties is not easy. In this paper, we propose a novel test-program generation approach, called HiCOND, which uses historical data for configuration diversification to solve this challenge. HiCOND first infers, from historical data, the range for each option in a test configuration within which bug-revealing test programs are more likely to be generated. It then identifies a set of test configurations that lead to diverse test programs through a search method (particle swarm optimization). Finally, HiCOND generates test programs from this set of configurations; these programs are likely to be both bug-revealing and diverse. We have conducted experiments on two popular compilers, GCC and LLVM, and the results confirm the effectiveness of our approach. For example, HiCOND detects 75.00%, 133.33%, and 145.00% more bugs than three existing approaches, respectively. Moreover, HiCOND has been successfully applied to actual compiler testing in a global IT company, detecting 11 bugs during the practical evaluation.
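The abstract only sketches HiCOND's search step. As an illustration of the general technique it names, here is a minimal particle swarm optimization loop over a hypothetical set of Csmith-style options. The option names, their ranges, and the fitness function are all made-up placeholders: the abstract does not specify HiCOND's real options, the ranges it infers from historical data, or its bug-revealing/diversity objective.

```python
import random

# Hypothetical Csmith-style option ranges, standing in for the per-option
# ranges HiCOND infers from historical bug data (not the paper's real values).
OPTION_RANGES = {
    "max_block_depth": (1.0, 8.0),
    "max_expr_complexity": (2.0, 12.0),
    "max_funcs": (5.0, 30.0),
}
KEYS = list(OPTION_RANGES)

def fitness(position):
    # Placeholder objective. HiCOND's actual objective combines bug-revealing
    # capability and diversity across configurations, which the abstract does
    # not detail; here we simply reward larger option values.
    return sum(position)

def clamp(position):
    # Keep each option inside its historically inferred range.
    return [min(max(v, OPTION_RANGES[k][0]), OPTION_RANGES[k][1])
            for k, v in zip(KEYS, position)]

def pso(swarm_size=10, iterations=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    dim = len(KEYS)
    # Initialize particles uniformly within the option ranges.
    positions = [[rng.uniform(*OPTION_RANGES[k]) for k in KEYS]
                 for _ in range(swarm_size)]
    velocities = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in positions]          # each particle's best position
    gbest = max(positions, key=fitness)[:]     # swarm-wide best position
    for _ in range(iterations):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard PSO velocity update: inertia + cognitive + social.
                velocities[i][d] = (w * velocities[i][d]
                                    + c1 * r1 * (pbest[i][d] - positions[i][d])
                                    + c2 * r2 * (gbest[d] - positions[i][d]))
                positions[i][d] += velocities[i][d]
            positions[i] = clamp(positions[i])
            if fitness(positions[i]) > fitness(pbest[i]):
                pbest[i] = positions[i][:]
            if fitness(positions[i]) > fitness(gbest):
                gbest = positions[i][:]
    # Round to an integer-valued test configuration for the generator.
    return {k: round(v) for k, v in zip(KEYS, gbest)}

best_config = pso()
```

In HiCOND the search runs over many configurations at once to promote diversity; the single-best-configuration loop above is only the core PSO mechanics under those stated assumptions.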

Tue 12 Nov

16:00 - 17:40: Papers - Testing and Visualization at Cortez 1
Chair(s): Amin Alipour (University of Houston)
16:00 - 16:20 (ase-2019-papers)
History-Guided Configuration Diversification for Compiler Test-Program Generation (ACM SIGSOFT Distinguished Paper Award)
Junjie Chen (Tianjin University), Guancheng Wang (Peking University), Dan Hao (Peking University), Yingfei Xiong (Peking University), Hongyu Zhang (The University of Newcastle), Lu Zhang (Peking University)
16:20 - 16:40 (ase-2019-papers)
Data-Driven Compiler Testing and Debugging
Junjie Chen (Tianjin University)
16:40 - 17:00 (ase-2019-papers)
Targeted Example Generation for Compilation Errors
Umair Z. Ahmed (National University of Singapore), Renuka Sindhgatta (Queensland University of Technology, Australia), Nisheeth Srivastava (Indian Institute of Technology, Kanpur), Amey Karkare (IIT Kanpur)
17:00 - 17:20 (ase-2019-Journal-First-Presentations)
Lightweight Assessment of Test-Case Effectiveness using Source-Code-Quality Indicators
Giovanni Grano (University of Zurich), Fabio Palomba (Department of Informatics, University of Zurich), Harald Gall (University of Zurich)
17:20 - 17:30 (ase-2019-Demonstrations)
Visual Analytics for Concurrent Java Executions
Cyrille Artho (KTH Royal Institute of Technology, Sweden), Monali Pande (KTH Royal Institute of Technology), Qiyi Tang (University of Oxford)
17:30 - 17:40 (ase-2019-Demonstrations)
NeuralVis: Visualizing and Interpreting Deep Learning Models
Xufan Zhang (State Key Laboratory for Novel Software Technology, Nanjing University), Ziyue Yin (State Key Laboratory for Novel Software Technology, Nanjing University), Yang Feng (University of California, Irvine), Qingkai Shi (Hong Kong University of Science and Technology), Jia Liu (State Key Laboratory for Novel Software Technology, Nanjing University), Zhenyu Chen (Nanjing University)