
Transferring Performance Prediction Models Across Different Hardware Platforms

Published: 17 April 2017

ABSTRACT

Many software systems provide user-relevant configuration options, often called features. Features influence functional properties of software systems as well as non-functional ones, such as performance and memory consumption. Researchers have successfully demonstrated the correlation between feature selection and performance. However, the generality of such performance models across different hardware platforms has not yet been evaluated.

We propose a technique for enhancing the generality of performance models across different hardware environments using linear transformation. Empirical studies on three real-world software systems show that our approach is computationally efficient and achieves high accuracy (less than 10% mean relative error) when predicting system performance across 23 different hardware platforms. Moreover, we investigate why the approach works by comparing the performance distributions of the systems and the structure of the performance models across the different platforms.
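The core idea of linear-transformation-based transfer can be sketched as follows. This is a hypothetical illustration, not the paper's exact procedure: it assumes the relationship between source-platform predictions and target-platform measurements is approximately linear, fits the slope and intercept by least squares on a small sample of configurations measured on both platforms, and then reuses the source model on the target hardware.

```python
import numpy as np

rng = np.random.default_rng(0)

# Predictions of an existing performance model on the SOURCE platform
# for 10 sampled configurations (e.g., runtime in seconds).
y_source = rng.uniform(1.0, 10.0, size=10)

# Measurements of the same configurations on the TARGET platform.
# Here we simulate a roughly linear hardware relationship plus noise;
# in practice these would be real benchmark measurements.
y_target = 1.8 * y_source + 0.5 + rng.normal(0.0, 0.05, size=10)

# Least-squares fit of the linear transformation: slope a, intercept b.
a, b = np.polyfit(y_source, y_target, deg=1)

def transfer(y_src):
    """Map a source-model prediction onto the target platform."""
    return a * y_src + b

# Mean relative error of the transferred predictions on the sample.
mre = np.mean(np.abs(transfer(y_source) - y_target) / y_target)
print(f"slope={a:.3f}, intercept={b:.3f}, MRE={mre:.4f}")
```

Under this assumption, only a handful of target-platform measurements are needed to reuse a model that was expensive to learn on the source platform, which is what makes the approach computationally cheap.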


Published in

ICPE '17: Proceedings of the 8th ACM/SPEC on International Conference on Performance Engineering
April 2017, 450 pages
ISBN: 9781450344043
DOI: 10.1145/3030207

      Copyright © 2017 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • research-article

Acceptance Rates

ICPE '17 paper acceptance rate: 27 of 83 submissions (33%). Overall acceptance rate: 252 of 851 submissions (30%).
