Abstract
The performance of today’s enterprise applications is influenced by a variety of parameters across different layers. Evaluating the performance of such systems is therefore a time- and resource-consuming process: the number of possible parameter combinations and configurations requires many experiments to derive meaningful conclusions. Although many tools for automated performance testing are available, controlling experiments and analyzing results still require considerable manual effort. In this paper, we apply statistical model inference techniques, namely Kriging and MARS (Multivariate Adaptive Regression Splines), to select experiments adaptively. Our approach automatically selects and conducts experiments based on the accuracy observed for the models inferred from the currently available data. We validated the approach in an industrial ERP scenario. The results demonstrate that we can automatically infer a prediction model with a mean relative error of 1.6% using only 18% of the measurement points in the configuration space.
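The adaptive loop the abstract describes can be sketched as follows. This is a minimal illustration, not the paper’s implementation: it uses inverse-distance-weighted interpolation as a simple stand-in for Kriging/MARS, a hypothetical `measure()` function in place of real experiment runs, and leave-one-out cross-validation as the accuracy criterion that decides whether further experiments are needed.

```python
import math

# Hypothetical system under test: response time as a function of a single
# configuration parameter (e.g., number of concurrent users). In the real
# approach this would be an actual measurement run.
def measure(x):
    return 50.0 + 10.0 * math.sin(x / 3.0) + 0.5 * x

def idw_predict(samples, x, power=2.0):
    """Inverse-distance-weighted interpolation: a deliberately simple
    stand-in for the Kriging/MARS models used in the paper."""
    num = den = 0.0
    for xi, yi in samples:
        d = abs(x - xi)
        if d < 1e-9:
            return yi  # exact hit: return the measured value
        w = 1.0 / d ** power
        num += w * yi
        den += w
    return num / den

def loo_mre(samples):
    """Leave-one-out mean relative error of the model on the measured
    points; serves as the accuracy estimate that drives the loop."""
    errs = []
    for i, (xi, yi) in enumerate(samples):
        rest = samples[:i] + samples[i + 1:]
        errs.append(abs(idw_predict(rest, xi) - yi) / abs(yi))
    return sum(errs) / len(errs)

def adaptive_selection(candidates, target_mre=0.02):
    """Start from a small seed design, then repeatedly run the candidate
    experiment farthest from all measured points until the model's
    estimated error drops below target_mre (or candidates run out)."""
    seeds = {candidates[0], candidates[len(candidates) // 2], candidates[-1]}
    measured = [(x, measure(x)) for x in sorted(seeds)]
    remaining = [x for x in candidates if x not in seeds]
    while remaining and loo_mre(measured) > target_mre:
        # Select the candidate farthest from any measured point
        # (max-min distance), i.e. where the model is least informed.
        nxt = max(remaining,
                  key=lambda x: min(abs(x - xi) for xi, _ in measured))
        measured.append((nxt, measure(nxt)))
        remaining.remove(nxt)
    return measured

design = adaptive_selection([float(x) for x in range(0, 61, 2)])
print(f"{len(design)} of 31 candidate experiments run")
```

The key property mirrored here is that the stopping decision depends only on quantities available during the run (the cross-validated error of the current model), so the loop conducts exactly as many experiments as the desired accuracy demands rather than exhausting the configuration space.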
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Westermann, D., Krebs, R., Happe, J. (2011). Efficient Experiment Selection in Automated Software Performance Evaluations. In: Thomas, N. (eds) Computer Performance Engineering. EPEW 2011. Lecture Notes in Computer Science, vol 6977. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-24749-1_24
Print ISBN: 978-3-642-24748-4
Online ISBN: 978-3-642-24749-1