DOI: 10.1145/1383559.1383565

SciSim: a software performance estimation framework using source code instrumentation

Published: 23 June 2008

ABSTRACT

Recently, software performance estimation based on source code instrumentation has shown promising results in the literature. Compared with cycle-accurate simulation, it achieves significant speedup without compromising accuracy. However, much work remains to make this technique flexible and accurate enough to estimate software performance on complex processors. To the best of our knowledge, we are the first to propose ways to tackle microarchitecture-related issues in the source code instrumentation approach. We perform static instruction scheduling for superscalar architectures at instrumentation time and combine instrumented code with microarchitecture simulators to model runtime interactions between software and microarchitecture. We have developed a new framework, SciSim, to provide a common infrastructure for the proposed approach. It is designed to be easily extensible and retargetable to different instruction set architectures and processors. Using SciSim, SystemC modules can be automatically generated to integrate software into system-level simulation. We also present the applicability of SciSim to system-level design exploration of multiprocessor systems. Finally, experiments with standard benchmarks are presented to validate the speed and accuracy of SciSim.
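The paper itself details the instrumentation flow; as a rough illustration of the general idea only (not SciSim's actual output), a back-annotated source file might look like the sketch below. The names CONSUME_CYCLES, g_cycles, and sum_array, as well as the per-block cycle costs, are hypothetical placeholders for values a tool would derive by statically scheduling each basic block's machine instructions for the target pipeline.

```cpp
#include <cstdint>
#include <cstdio>

// Hypothetical global cycle counter maintained by the instrumented program.
static uint64_t g_cycles = 0;

// Hypothetical annotation hook: charges the statically estimated cost of one
// basic block. In a SystemC integration, this could instead advance simulated
// time on the wrapping module rather than increment a counter.
static inline void CONSUME_CYCLES(uint64_t n) { g_cycles += n; }

// Original function with back-annotated basic blocks. The cycle numbers are
// illustrative placeholders; a real flow would obtain them by statically
// scheduling the cross-compiled instructions of each block for the target
// superscalar pipeline, as the abstract describes.
int sum_array(const int* a, int len) {
    CONSUME_CYCLES(3);            // block 1: prologue (assumed cost)
    int s = 0;
    for (int i = 0; i < len; ++i) {
        CONSUME_CYCLES(2);        // block 2: loop body (assumed cost)
        s += a[i];
    }
    CONSUME_CYCLES(1);            // block 3: epilogue (assumed cost)
    return s;
}

int main() {
    int a[4] = {1, 2, 3, 4};
    int s = sum_array(a, 4);
    std::printf("sum=%d, estimated cycles=%llu\n", s,
                (unsigned long long)g_cycles);
    return 0;
}
```

In a SystemC-based system-level simulation, the same annotation points could consume simulated time (e.g., via wait() in a generated module) instead of updating a counter, which is one common way instrumented software is coupled with microarchitecture or platform models.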


Published in

WOSP '08: Proceedings of the 7th international workshop on Software and performance
June 2008
218 pages
ISBN: 9781595938732
DOI: 10.1145/1383559

        Copyright © 2008 ACM


        Publisher

        Association for Computing Machinery

        New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 149 of 241 submissions, 62%
