Research Article
DOI: 10.1145/2554850.2555022

An experimental study on execution time variation in computer experiments

Published: 24 March 2014

Abstract

In computer experiments, many research works rely on the accuracy of measured program execution times. We observe that not all studies account for the fact that repeated executions of the same program, under the same experimental conditions, may produce statistically significantly different completion times. In this work, we experimentally demonstrate that several sources of OS jitter affect the execution time of computer programs. We compare various execution-time samples using three test protocols that apply different statistical techniques. The results show that significant differences are detected in all evaluated scenarios.
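The central claim above — that two samples of repeated timings of the same program can differ in a statistically significant way — can be illustrated with a small sketch. The abstract does not name the paper's three test protocols, so the choice of a two-sided Mann-Whitney U test, the sample sizes, and the synthetic timings below are illustrative assumptions, not the authors' method:

```python
# Illustrative sketch (NOT the paper's protocol): compare two execution-time
# samples with a two-sided Mann-Whitney U test, using the normal approximation
# and assuming no tied measurements. The synthetic samples stand in for
# repeated timings of the same program with and without simulated OS jitter.
import math
import random

def mann_whitney_u(a, b):
    """Return (U statistic, two-sided p-value) via the normal approximation."""
    n1, n2 = len(a), len(b)
    pooled = sorted(a + b)
    rank = {v: i + 1 for i, v in enumerate(pooled)}   # no-ties assumption
    r1 = sum(rank[v] for v in a)                      # rank sum of sample a
    u = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u, p

rng = random.Random(42)
quiet = [100 + rng.gauss(0, 1) for _ in range(30)]    # baseline timings (ms)
noisy = [103 + rng.gauss(0, 1) for _ in range(30)]    # timings under jitter
u_stat, p_value = mann_whitney_u(quiet, noisy)
print(f"U = {u_stat:.0f}, p = {p_value:.4g}")
```

A nonparametric test is a natural choice in this setting because execution-time distributions are often skewed and far from normal, so rank-based comparisons are more robust than a t-test on the raw timings.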




    Published In

    SAC '14: Proceedings of the 29th Annual ACM Symposium on Applied Computing
    March 2014
    1890 pages
    ISBN:9781450324694
    DOI:10.1145/2554850
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. OS jitter
    2. execution time variability
    3. experiments

    Qualifiers

    • Research-article

    Conference

    SAC 2014
    Sponsor:
    SAC 2014: Symposium on Applied Computing
    March 24 - 28, 2014
    Gyeongju, Republic of Korea

    Acceptance Rates

    SAC '14 paper acceptance rate: 218 of 939 submissions (23%)
    Overall acceptance rate: 1,650 of 6,669 submissions (25%)



    Cited By

    • (2025) "A multicriteria framework for assessing energy audit software for low-income households in the United States." Energy Efficiency 18(1). DOI: 10.1007/s12053-025-10295-4. Online publication date: 27 Jan 2025.
    • (2023) "Complexity of meshed and bidirectional maritime DC-System Simulations." 2023 International Conference on Future Energy Solutions (FES), 1-6. DOI: 10.1109/FES57669.2023.10182908. Online publication date: 12 Jun 2023.
    • (2022) "Performance Modeling of Computer Vision-based CNN on Edge GPUs." ACM Transactions on Embedded Computing Systems 21(5), 1-33. DOI: 10.1145/3527169. Online publication date: 26 Mar 2022.
    • (2022) "Assessing the Complexity of DC-System Simulations." 2022 IEEE Workshop on Complexity in Engineering (COMPENG), 1-4. DOI: 10.1109/COMPENG50184.2022.9905428. Online publication date: 18 Jul 2022.
    • (2021) "An Open-Source Many-Scenario Approach for Power System Dynamic Simulation on HPC Clusters." Electronics 10(11), 1330. DOI: 10.3390/electronics10111330. Online publication date: 1 Jun 2021.
    • (2021) "What's Wrong with My Benchmark Results? Studying Bad Practices in JMH Benchmarks." IEEE Transactions on Software Engineering 47(7), 1452-1467. DOI: 10.1109/TSE.2019.2925345. Online publication date: 1 Jul 2021.
    • (2021) "Interpolated binary search: An efficient hybrid search algorithm on ordered datasets." Engineering Science and Technology, an International Journal. DOI: 10.1016/j.jestch.2021.02.009. Online publication date: Mar 2021.
