
Why Your Experimental Results Might Be Wrong

Published: 18 June 2023

Abstract

Research projects in the database community are often evaluated based on experimental results. A typical evaluation setup looks as follows: the methods to be compared are embedded in a single shared benchmarking codebase, in which all methods execute an identical workload so that their individual execution times can be collected. This seems reasonable: since the only difference between individual test runs is the method itself, any observed time difference can be attributed to that method. Such a benchmarking codebase can also be used for gradual optimization: if one method runs slowly, its code can be optimized and re-evaluated; if its performance improves, the improvement can be attributed to the particular optimization.
Unfortunately, we had to learn the hard way that it is not that simple. The reason lies in a component that sits right between our benchmarking codebase and the produced experimental results: the compiler. As the following case study shows, this black-box component has the power to completely ruin any meaningful comparison between methods, even if we set up our experiments as equally and fairly as possible.




Published In

DaMoN '23: Proceedings of the 19th International Workshop on Data Management on New Hardware
June 2023
119 pages
ISBN:9798400701917
DOI:10.1145/3592980

Publisher: Association for Computing Machinery, New York, NY, United States


Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

SIGMOD/PODS '23

Acceptance Rates

DaMoN '23 Paper Acceptance Rate 17 of 23 submissions, 74%;
Overall Acceptance Rate 94 of 127 submissions, 74%

