
Performance engineering evaluation of object-oriented systems with SPE·ED™

  • Conference paper
Computer Performance Evaluation Modelling Techniques and Tools (TOOLS 1997)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 1245))

Abstract

Although object-oriented methods have been shown to help construct software systems that are easy to understand and modify, have a high potential for reuse, and are relatively quick and easy to implement, concern over the performance of object-oriented systems represents a significant barrier to their adoption. Our experience has shown that it is possible to design object-oriented systems that have adequate performance and still exhibit the other qualities, such as reusability, maintainability, and modifiability, that have made object-oriented development (OOD) so successful. However, doing this requires careful attention to performance goals throughout the life cycle. This paper describes the use of SPE·ED, a performance modeling tool that supports the software performance engineering (SPE) process, for early life-cycle performance evaluation of object-oriented systems. The use of SPE·ED for performance engineering of object-oriented software is illustrated with a simple example.
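
For readers unfamiliar with this style of analysis, the sketch below illustrates the general flavor of early life-cycle evaluation that the SPE process supports: a key scenario is reduced to per-device resource demands (a software execution model) and solved first without contention, then with a simple open-queue (M/M/1) approximation for contention. It is a minimal illustration only; the step names, demand values, and arrival rate are assumptions, and the code is not the SPE·ED tool or the paper's example.

    # Minimal sketch (Python), not the SPE·ED tool itself: one scenario reduced to
    # per-device resource demands, evaluated best-case and with a simple M/M/1
    # contention approximation. All names and numbers below are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Step:
        name: str
        cpu_sec: float   # CPU service demand per invocation, in seconds
        io_ops: int      # physical I/O operations per invocation

    IO_SEC_PER_OP = 0.02     # assumed device service time per I/O operation
    ARRIVAL_RATE = 5.0       # assumed scenario arrivals per second

    scenario = [
        Step("validate request", 0.001, 0),
        Step("retrieve account", 0.003, 2),
        Step("update balance",   0.002, 1),
        Step("log transaction",  0.001, 1),
    ]

    # Total service demand placed on each device by one scenario execution.
    cpu_demand = sum(s.cpu_sec for s in scenario)
    io_demand = sum(s.io_ops for s in scenario) * IO_SEC_PER_OP

    # Best case: the scenario runs with no competing work.
    no_contention = cpu_demand + io_demand

    def residence_time(demand: float, rate: float) -> float:
        """Open M/M/1 residence time: demand inflated by queueing delay."""
        utilization = rate * demand
        if utilization >= 1.0:
            raise ValueError("device saturated at this arrival rate")
        return demand / (1.0 - utilization)

    with_contention = (residence_time(cpu_demand, ARRIVAL_RATE)
                       + residence_time(io_demand, ARRIVAL_RATE))

    print(f"best-case response time: {no_contention * 1000:.1f} ms")
    print(f"with contention (M/M/1): {with_contention * 1000:.1f} ms")

At design time the useful output of such a model is the comparison of these estimates against the performance objective for the scenario, and their sensitivity to the assumed demands, rather than the absolute numbers.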

Author information

Authors: C. U. Smith and L. G. Williams

Editor information

Editors: Raymond Marie, Brigitte Plateau, Maria Calzarossa, and Gerardo Rubino

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Smith, C.U., Williams, L.G. (1997). Performance engineering evaluation of object-oriented systems with SPE·ED™. In: Marie, R., Plateau, B., Calzarossa, M., Rubino, G. (eds) Computer Performance Evaluation Modelling Techniques and Tools. TOOLS 1997. Lecture Notes in Computer Science, vol 1245. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0022203

  • DOI: https://doi.org/10.1007/BFb0022203

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63101-9

  • Online ISBN: 978-3-540-69131-0

  • eBook Packages: Springer Book Archive
