An Extensible Open-Source Compiler Infrastructure for Testing

  • Conference paper
Hardware and Software, Verification and Testing (HVC 2005)

Part of the book series: Lecture Notes in Computer Science (LNPSE, volume 3875)


Abstract

Testing forms a critical part of the development process for large-scale software, and there is a growing need for automated tools that can read, represent, analyze, and transform an application's source code to help carry out testing tasks. However, the support required to compile applications written in common general-purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran 90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between application testing and the ways in which compiler technology can aid in understanding those applications. We emphasize aspects of ROSE, such as support for whole-program analysis, that are particularly well-suited to the testing research community and the scale of the problems that community solves.
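
To make the kind of source-based tool described above concrete, the sketch below shows a minimal whole-program analysis written against ROSE's publicly documented C++ interface (frontend, backend, AstSimpleProcessing). The specific checker it implements, a reporter that lists every function declaration, is a hypothetical example and is not taken from the paper.

// A minimal sketch of a ROSE-based source analysis tool (illustrative only).
// frontend() parses the whole application into a single AST, an
// AstSimpleProcessing subclass visits every AST node, and backend()
// regenerates compilable source from the (possibly transformed) AST.
#include "rose.h"
#include <iostream>

// Hypothetical checker: report the name and location of each function declaration.
class FunctionReporter : public AstSimpleProcessing {
protected:
    void visit(SgNode* node) {
        if (SgFunctionDeclaration* fn = isSgFunctionDeclaration(node)) {
            Sg_File_Info* info = fn->get_file_info();
            std::cout << fn->get_name().getString() << " at "
                      << info->get_filename() << ":" << info->get_line()
                      << std::endl;
        }
    }
};

int main(int argc, char* argv[]) {
    SgProject* project = frontend(argc, argv);       // build the AST for all input files
    FunctionReporter reporter;
    reporter.traverseInputFiles(project, preorder);  // visit only the user's own source files
    return backend(project);                         // unparse the AST and invoke the backend compiler
}

In typical ROSE usage, a tool like this is compiled and linked against the ROSE library and then substituted for the ordinary compiler on the application's build line, so that existing build systems drive the analysis.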

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Quinlan, D., Ur, S., Vuduc, R. (2006). An Extensible Open-Source Compiler Infrastructure for Testing. In: Ur, S., Bin, E., Wolfsthal, Y. (eds) Hardware and Software, Verification and Testing. HVC 2005. Lecture Notes in Computer Science, vol 3875. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11678779_9

  • DOI: https://doi.org/10.1007/11678779_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-32604-5

  • Online ISBN: 978-3-540-32605-2
