Simon’s paper “A Case Study on the Stability of Performance Tests for Serverless Applications” was accepted for publication in the Journal of Systems and Software (JSS)! This paper was a collaboration with Diego Costa, Lizhi Liao, Weiyi Shang, Andre van Hoorn and Samuel Kounev through the SPEC RG DevOps Performance Working Group.
Abstract:
“Context. While, in serverless computing, application resource management and operational concerns are generally delegated to the cloud provider, ensuring that serverless applications meet their performance requirements is still the responsibility of the developers. Performance testing is a commonly used performance assessment practice; however, it traditionally requires visibility of the resource environment.
Objective. In this study, we investigate whether performance tests of serverless applications are stable, that is, if their results are reproducible, and what implications the serverless paradigm has for performance tests.
Method. We conduct a case study where we collect two datasets of performance test results: (a) repetitions of performance tests for varying memory sizes and load intensities, and (b) three repetitions of the same performance test every day for ten months.
Results. We find that performance tests of serverless applications are comparatively stable if conducted on the same day. However, we also observe short-term performance variations and frequent long-term performance changes.
Conclusion. Performance tests for serverless applications can be stable; however, the serverless model impacts the planning, execution, and analysis of performance tests.”
See our Publications for the full paper.