DOI: 10.1145/2248418.2248440
research-article

Creating portable, repeatable, realistic benchmarks for embedded systems and the challenges thereof

Published: 12 June 2012

ABSTRACT

To appreciate the challenges of analysing embedded processor behaviour, step back in time to understand the evolution of embedded processors. Only a few decades ago, embedded processors were relatively simple devices (compared to today), represented by a host of 8- and 16-bit microcontrollers and 32-bit microprocessors with minimal integration. Today, these processors (even the so-called low-end microcontrollers) have evolved into highly integrated SoCs with a wide variety of architectures capable of tackling both specific and general-purpose tasks. Alongside this transformation, the benchmarks used to quantify their capabilities have also grown in complexity and range. At the simplest level, benchmarks such as CoreMark analyse the fundamental processor cores. At the other end of the spectrum, system benchmarks, such as BrowsingBench, analyse the entire SoC as well as the system software stack and even the physical interfaces. This paper examines some of the challenges of applying such benchmarks, and explains the methodologies used at EEMBC to manage portability, repeatability, and realism.
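The distinction between a core benchmark and a system benchmark is easiest to see in code. The sketch below is not EEMBC source; it is a minimal, hypothetical illustration (the port_* names and the toy workload are invented) of how a portable core-level benchmark is commonly structured: a fixed, deterministic workload compiled unchanged on every target, with all platform-specific timing isolated behind a small porting layer. Portability lives in the porting layer, repeatability comes from the fixed workload and the reported checksum, and realism is exactly what a toy loop cannot provide, which is where system-level benchmarks such as BrowsingBench take over.

/* Minimal sketch of a portable, self-checking benchmark harness.
 * NOT EEMBC code: the port_* hooks and toy_workload are hypothetical. */
#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Porting layer: the only code a target must adapt. On a bare-metal
 * board, replace clock() with a hardware timer or cycle counter. */
static clock_t port_start, port_stop;
static void   port_start_timer(void) { port_start = clock(); }
static void   port_stop_timer(void)  { port_stop  = clock(); }
static double port_elapsed_secs(void) {
    return (double)(port_stop - port_start) / CLOCKS_PER_SEC;
}

/* Fixed, deterministic workload (a toy CRC-style loop) so every run on
 * every platform executes the same work and yields the same checksum. */
static uint16_t toy_workload(uint32_t iterations) {
    uint16_t crc = 0xFFFFu;
    for (uint32_t i = 0; i < iterations; i++) {
        crc ^= (uint16_t)(i & 0xFFu);
        for (int b = 0; b < 8; b++)
            crc = (crc & 1u) ? (uint16_t)((crc >> 1) ^ 0xA001u)
                             : (uint16_t)(crc >> 1);
    }
    return crc;
}

int main(void) {
    const uint32_t iterations = 1000000u;

    port_start_timer();
    uint16_t checksum = toy_workload(iterations);
    port_stop_timer();

    /* Reporting the checksum makes runs comparable across platforms and
     * keeps the compiler from optimising the workload away. */
    double secs = port_elapsed_secs();
    printf("checksum=0x%04X time=%.3fs iterations/s=%.0f\n",
           checksum, secs, iterations / secs);
    return 0;
}

In a scheme like this, a run whose checksum differs across platforms signals a broken port rather than a faster one, which is what makes cross-platform comparison of the timing results meaningful.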


Published in

LCTES '12: Proceedings of the 13th ACM SIGPLAN/SIGBED International Conference on Languages, Compilers, Tools and Theory for Embedded Systems. June 2012, 153 pages. ISBN: 9781450312127. DOI: 10.1145/2248418.

Also published as ACM SIGPLAN Notices, Volume 47, Issue 5 (LCTES '12), May 2012, 152 pages. ISSN: 0362-1340. EISSN: 1558-1160. DOI: 10.1145/2345141.

      Copyright © 2012 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

Association for Computing Machinery, New York, NY, United States




      Acceptance Rates

Overall acceptance rate: 116 of 438 submissions, 26%
