ASE 2019
Sun 10 - Fri 15 November 2019 San Diego, California, United States
Wed 13 Nov 2019 16:00 - 16:20 at Hillcrest - Performance. Chair(s): Tim Menzies

The performance of a software system plays a crucial role in how users perceive it. Learning from the history of a software system's performance behavior not only helps with discovering and locating performance bugs, but also with identifying evolutionary performance patterns and general trends, such as technical debt accumulating in a slow but steady performance degradation. Exhaustive regression testing is usually impractical, because rigorous performance benchmarking requires executing a realistic workload per commit, which results in long execution times. In this paper, we propose a novel active revision sampling approach, which aims at tracking and understanding a system's performance history by approximating the performance behavior of a software system across all of its revisions. In a nutshell, we iteratively sample and measure the performance of specific revisions that help us build an exact performance-evolution model, and we use Gaussian Process models to assess in which revision ranges our model is most uncertain, with the goal of sampling further revisions for measurement. We have conducted an empirical analysis of the evolutionary performance behavior, modeled as a time series, of the history of 6 real-world software systems. Our evaluation demonstrates that Gaussian Process models are able to accurately estimate the performance-evolution history of real-world software systems with only a few measurements and to reveal interesting behaviors and trends.
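The abstract's core loop (fit a Gaussian Process over measured revisions, then measure next wherever the model is most uncertain) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `measure` function is a hypothetical stand-in for benchmarking one revision, and the kernel choice, seed points, and fixed measurement budget are all assumptions.

```python
# Sketch of active revision sampling with a Gaussian Process:
# repeatedly measure the revision with the highest predictive uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

n_revisions = 200
revisions = np.arange(n_revisions)

def measure(rev):
    # Hypothetical benchmark result: a slow, steady slowdown plus one
    # temporary performance regression between revisions 120 and 150.
    return 1.0 + 0.005 * rev + (0.5 if 120 <= rev < 150 else 0.0)

# Seed the model with a few measurements spread across the history.
sampled = [0, n_revisions // 2, n_revisions - 1]
values = [measure(r) for r in sampled]

gp = GaussianProcessRegressor(
    kernel=Matern(nu=1.5) + WhiteKernel(noise_level=1e-4),
    normalize_y=True,
)

for _ in range(20):  # assumed measurement budget
    gp.fit(np.array(sampled).reshape(-1, 1), np.array(values))
    _, std = gp.predict(revisions.reshape(-1, 1), return_std=True)
    std[sampled] = 0.0               # never re-measure a known revision
    next_rev = int(np.argmax(std))   # most uncertain revision
    sampled.append(next_rev)
    values.append(measure(next_rev))

print(f"measured {len(sampled)} of {n_revisions} revisions")
```

The uncertainty-driven selection tends to concentrate measurements around abrupt changes (such as the injected regression), which is what lets the model approximate the full history from few benchmark runs.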

Wed 13 Nov

16:00 - 17:50: Papers - Performance at Hillcrest
Chair(s): Tim Menzies (North Carolina State University)
16:00 - 16:20
Talk
Accurate Modeling of Performance Histories for Evolving Software Systems
Stefan Mühlbauer (Bauhaus-University Weimar), Sven Apel (Saarland University), Norbert Siegmund (Bauhaus-University Weimar)
Pre-print
16:20 - 16:40
Talk
An Industrial Experience Report on Performance-Aware Refactoring on a Database-centric Web Application
Boyuan Chen (York University), Zhen Ming (Jack) Jiang (York University), Paul Matos (Copywell Inc.), Michael Lacaria (Copywell Inc.)
Authorizer link Pre-print
16:40 - 17:00
Talk
An Experience Report of Generating Load Tests Using Log-recovered Workloads at Varying Granularities of User Behaviour
Jinfu Chen (Jiangsu University), Weiyi (Ian) Shang (Concordia University, Canada), Ahmed E. Hassan (Queen's University), Yong Wang (Alibaba Group), Jiangbin Lin (Alibaba Group)
Pre-print
17:00 - 17:10
Talk
How Do API Selections Affect the Runtime Performance of Data Analytics Tasks?
Yida Tao (Shenzhen University), Shan Tang (Shenzhen University), Yepang Liu (Southern University of Science and Technology), Zhiwu Xu (Shenzhen University), Shengchao Qin (University of Teesside)
17:10 - 17:20
Talk
Demystifying Application Performance Management Libraries for Android
Yutian Tang (The Hong Kong Polytechnic University), Zhan Xian (The Hong Kong Polytechnic University), Hao Zhou (The Hong Kong Polytechnic University), Xiapu Luo (The Hong Kong Polytechnic University), Zhou Xu (Wuhan University), Yajin Zhou (Zhejiang University), Qiben Yan (Michigan State University)
17:20 - 17:30
Demonstration
PeASS: A Tool for Identifying Performance Changes at Code Level
David Georg Reichelt (Universität Leipzig), Stefan Kühne (Universität Leipzig), Wilhelm Hasselbring (Kiel University)
Pre-print Media Attached File Attached
17:30 - 17:50
Talk
ReduKtor: How We Stopped Worrying About Bugs in Kotlin Compiler
Daniil Stepanov (Saint Petersburg Polytechnic University), Marat Akhin (Saint Petersburg Polytechnic University / JetBrains Research), Mikhail Belyaev (Saint Petersburg Polytechnic University)
Pre-print