
Performance assessment of intelligent distributed systems through software performance ontology engineering (SPOE)

Software Quality Journal

Abstract

In the computer science community there is a growing interest in the field of Ambient Intelligent Systems. These systems surround their human users with computing and networking technology unobtrusively embedded in their environment. This technology aims to provide users with useful information and to take action to make the environment more convenient for them. As the number of users increases, the resources that make Ambient Intelligence possible can easily become saturated, making the system unstable and projecting an image of poor QoS to the users. The main goal of this paper is to provide the means for Ambient Intelligent Systems to monitor themselves and take corrective action automatically if performance starts to drop. Our approach uses a Performance Ontology that structures the knowledge about Software Performance Engineering, and a reasoning engine that acts as an expert system with the Performance Ontology as its foundation. The case study at the end shows the applicability of the developed techniques.
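The self-monitoring idea in the abstract, an engine that evaluates performance knowledge as rules and proposes corrective actions when QoS degrades, can be illustrated with a minimal sketch. All class names, metric names, thresholds, and actions below are illustrative assumptions for exposition; they are not the paper's ontology or API.

```python
# Minimal sketch of rule-based performance self-assessment.
# Names, metrics, and thresholds are hypothetical, not the paper's SPOE model.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Metric:
    """A single observed performance measurement."""
    name: str
    value: float


@dataclass
class PerformanceRule:
    """One expert-system rule: a condition over metrics plus a corrective action."""
    description: str
    condition: Callable[[Dict[str, float]], bool]
    action: Callable[[], str]


def assess(metrics: List[Metric], rules: List[PerformanceRule]) -> List[str]:
    """Fire every rule whose condition holds and collect the suggested actions."""
    facts = {m.name: m.value for m in metrics}
    return [r.action() for r in rules if r.condition(facts)]


rules = [
    PerformanceRule(
        "CPU utilization above 90% indicates resource saturation",
        lambda f: f.get("cpu_utilization", 0.0) > 0.90,
        lambda: "migrate agents to a less loaded node",
    ),
    PerformanceRule(
        "Mean response time above the 2 s QoS threshold",
        lambda f: f.get("response_time_s", 0.0) > 2.0,
        lambda: "shed low-priority requests",
    ),
]

observed = [Metric("cpu_utilization", 0.95), Metric("response_time_s", 1.2)]
print(assess(observed, rules))  # only the saturation rule fires
```

In the paper's approach the rule knowledge would come from the Performance Ontology rather than being hard-coded; this sketch only shows the monitor-assess-act loop in miniature.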



Acknowledgment

The authors acknowledge the partial financial support of this research through the programme Accions especials del Govern de les Illes Balears from Conselleria d'Economia, Hisenda i Innovació.

Author information


Corresponding author

Correspondence to Isaac Lera.


About this article

Cite this article

Lera, I., Sancho, P.P., Juiz, C. et al. Performance assessment of intelligent distributed systems through software performance ontology engineering (SPOE). Software Qual J 15, 53–67 (2007). https://doi.org/10.1007/s11219-006-9004-1
