
Broadened support for software and system model interchange


Abstract

Although sound performance analysis theories and techniques exist, they are not widely used because they require extensive expertise in performance modeling and measurement. The overall goal of our work is to make performance modeling more accessible by automating much of the modeling effort. We have proposed a model interoperability framework that enables performance models to be automatically exchanged among modeling (and other) tools. The core of the framework is a set of model interchange formats (MIF): a common representation for data required by performance modeling tools. Our previous research developed a representation for system performance models (PMIF) and another for software performance models (S-PMIF), both based on the Queueing Network Modeling (QNM) paradigm. In order to manage the research scope and focus on model interoperability issues, the initial MIFs were limited to QNMs that can be solved by efficient, exact solution algorithms. The overall model interoperability approach has now been demonstrated to be viable. This paper broadens the scope of PMIF and S-PMIF to represent models that can be solved with additional methods such as analytical approximations or simulation solutions. It presents the extensions considered, describes the extended meta-models, and provides verification with examples and a case study.


Notes

  1. We opted not to change the name to QueueingNetworkModelPlus so that previous models are still compatible with this meta-model.

References

  1. Smith, C.U., Lladó, C.M., Puigjaner, R.: Model interchange format specifications for experiments, output and results. Comput. J. 54, 674–690 (2011)

  2. Smith, C.U., Lladó, C.M.: Performance model interchange format (PMIF 2.0): XML definition and implementation. In: Proceedings of the First International Conference on the Quantitative Evaluation of Systems, pp. 38–47, September (2004)

  3. Smith, C.U., Lladó, C.M., Puigjaner, R.: Performance model interchange format (PMIF 2): a comprehensive approach to queueing network model interoperability. Perform. Eval. 67(7), 548–568 (2010)

  4. Smith, C.U., Cortellessa, V., Di Marco, A., Lladó, C.M., Williams, L.G.: From UML models to software performance results: an SPE process based on XML interchange formats. In: Proceedings of the Fifth International Workshop on Software and Performance (WOSP), pp. 87–98, July (2005)

  5. Moreno, G.A., Smith, C.U.: Performance analysis of real-time component architectures: an enhanced model interchange approach. Perform. Eval. (Special Issue on Software and Performance) 67, 612–633 (2010)

  6. Woodside, M., Petriu, D.C., Merseguer, J., Petriu, D.B., Alhaj, M.: Transformation challenges: from software models to performance models. Softw. Syst. Model. 13(4), 1529–1552 (2014)

  7. PNML: Petri Net Markup Language. www2.informatik.hu-berlin.de/top/pnml/

  8. Smith, C.U., Williams, L.G.: A performance model interchange format. J. Syst. Softw. 49(1), 63–80 (1999)

  9. Williams, L.G., Smith, C.U.: Information requirements for software performance engineering. In: Beilner, H., Bause, F. (eds.) Quantitative Evaluation of Computing and Communication Systems, Lecture Notes in Computer Science, pp. 86–101. Springer, Berlin (1995)

  10. Troya, J., Vallecillo, A.: Specification and simulation of queuing network models using domain–specific languages. Comput. Stand. Interfaces 36(5), 863–879 (2014)

  11. Berardinelli, L., Maetzler, E., Mayerhofer, T., Wimmer, M.: Integrating performance modeling in industrial automation through AutomationML and PMIF. In: 2016 IEEE 14th International Conference on Industrial Informatics (INDIN), pp. 383–388, July (2016)

  12. Lladó, C.M., Smith, C.U., Bonet, P.: A model transformation tool: PMIF+ to QNAP. In: Proceedings of the 8th International Conference on Performance Evaluation Methodologies and Tools (2014)

  13. Lladó, C.M., Smith, C.U.: PMIF+: extensions to broaden the scope of supported models. In: Computer Performance Engineering, LNCS 8168, Proceedings of the 10th European Workshop, EPEW 2013 (2013)

  14. Smith, C.U., Lladó, C.M.: SPE for the internet of things and other real-time embedded systems. In: Proceedings of the 8th ACM/SPEC International Conference on Performance Engineering Companion, ICPE ’17 Companion, pp. 227–232. ACM, New York, NY (2017)

  15. Smith, C.U., Williams, L.G.: Performance Solutions: A Practical Guide to Creating Responsive, Scalable Software. Addison-Wesley, Boston (2002)

  16. Woodside, M.: Tutorial introduction to layered modeling of software performance (2013). www.sce.carleton.ca/rads/lqns/lqn-documentation/tutorialh.pdf

  17. Smith, C.U.: Performance Engineering of Software Systems. Addison-Wesley, Boston (1990)

  18. Mesquite Software. www.mesquite.com

  19. SPE-ED. LS Computer Technology Inc. Performance Engineering Services Division. www.spe-ed.com

  20. Simulog. MODLINE 2.0 QNAP2 9.3: Reference Manual (1996)

  21. Java Modelling Tools (JMT). http://jmt.sourceforge.net/

  22. Bertoli, M., Casale, G., Serazzi, G.: JMT: performance engineering tools for system modeling. SIGMETRICS Perform. Eval. Rev. 36(4), 10–15 (2009)

  23. CA Hyperformix. CA Hyperformix documentation/manuals (2016). https://supportcontent.ca.com/phpdocs/0/8481/8481_docindex.html

  24. Zallocco, S.: An introduction to WEASEL, a web service for analyzing queueing networks with multiple solvers and PMIF editor user manual. http://www.zallocco.net/Documents/weasel_and_pmif_editor_user_manual_English.pdf

  25. Computer Aids for VLSI Design. Appendix D: Electronic Design Interchange Format. www.rulabinsky.com/cavd/text/chapd.html

  26. Electronic Industries Association. CDIF – CASE Data Interchange Format Overview. EIA/IS-106 (1994)

  27. Woodside, C.M., Petriu, D.B.: An intermediate metamodel with scenarios and resources for generating performance models from UML designs. Softw. Syst. Model. 6(2), 163–184 (2007)

  28. Woodside, C.M., Petriu, D.C., Petriu, D.B., Shen, H., Israr, T., Merseguer, J.: Performance by unified model analysis (PUMA). In: Proceedings of the Fifth International Workshop on Software and Performance (WOSP), pp. 1–12, July (2005)

  29. Becker, S., Koziolek, H., Reussner, R.: The palladio component model for model-driven performance prediction. J. Syst. Softw. 82(1), 3–22 (2009)

  30. Happe, J., Koziolek, H., Reussner, R.: Facilitating performance predictions using software components. IEEE Softw. 28(3), 27–33 (2011)

  31. Grassi, V., Mirandola, R., Randazzo, E., Sabetta, A.: KLAPER: an intermediate language for model-driven predictive analysis of performance and reliability. In: The Common Component Modeling Example, Lecture Notes in Computer Science, vol. 5153, pp. 327–356. Springer, Berlin (2008)

  32. Object Management Group. The Unified Modeling Language. www.uml.org

  33. Object Management Group. The UML profile for MARTE: Modeling and analysis of real-time and embedded systems. www.omgmarte.org

  34. Selic, B., Gérard, S.: Modeling and Analysis of Real-Time and Embedded Systems with UML and MARTE: Developing Cyber-Physical Systems. The MK/OMG Press, Los Altos (2013)

  35. Medina, J.L., Cuesta, A.G.: From composable design models to schedulability analysis with UML and the UML profile for MARTE. SIGBED Rev. 8(1), 64–68 (2011)

  36. Mohammad, A., Petriu, D.: Aspect-oriented modelling of platforms in software and performance models. In: Proceedings of the International Conference on Electrical and Computer Systems (2012)

  37. Mohammad, A., Petriu, D.: Using aspects for platform-independent to platform-dependent model transformations. Int. J. Electr. Comput. Eng. 1, 35–48 (2012)

  38. Petriu, D.: Software model-based performance analysis. In: Model Driven Engineering for Distributed Real-Time Systems: MARTE Modelling, Model Transformations and their Usages, pp. 139–166. Wiley, New York (2010)

  39. Merseguer, J., Bernardi, S.: Dependability analysis of DES based on MARTE and UML state machines models. Discret. Event Dyn. Syst. 22(2), 163–178 (2012)

  40. Berardinelli, L., Bernardo, M., Cortellessa, V., Di Marco, A.: Multidimensional context modeling applied to non-functional analysis of software. Softw. Syst. Model., pp. 1–40 (2017)

  41. Bandyopadhyay, A., Ghosh, S.: Developing model transformation tools using the UML metamodel: challenges and solutions. In: Proceedings of the International Conference on Software Engineering and Applications (2007)

  42. Smith, C.U., Lladó, C.M.: Model Interoperability for Performance Engineering: Survey of Milestones and Evolution, pp. 10–23. Springer, Berlin (2011)

  43. Casale, G., Gribaudo, M., Serazzi, G.: Tools for Performance Evaluation of Computer Systems: Historical Evolution and Perspectives, pp. 24–37. Springer, Berlin (2011)

  44. Smith, C.U., Lladó, C.M., Puigjaner, R.: PMIF extensions: increasing the scope of supported models. In: Proceedings of the 1st Joint WOSP/SIPEW International Conference on Performance Engineering (ICPE), pp. 255–256, January (2010)

  45. Cicchetti, A., Di Ruscio, D., Eramo, R., Pierantonio, A.: Automating co-evolution in model-driven engineering. In: 2008 12th International IEEE Enterprise Distributed Object Computing Conference, pp. 222–231, September (2008)

  46. Erl, T.: SOA Design Patterns. Prentice Hall, Upper Saddle River (2009)

  47. Smith, C.U., Smith, M.A.: Automated performance prediction for model-driven engineering of real-time embedded systems. In: Proceedings of the Systems and Software Technology Conference (2011)

  48. Eclipse Modeling Project. www.eclipse.org/modeling

  49. Mesquite Software Inc. www.mesquite.com

  50. Medina, J.L.: The UML profile for MARTE: modelling predictable real-time systems with UML (2011). http://www.artist-embedded.org/docs/Events/2011/Models_for_SA/01-MARTE-SAM-Julio_Medina.pdf

  51. Drake, J.M., Medina, J.L.: Robot teleoperado: ejemplo UML MAST [Teleoperated robot: UML MAST example] (2001). http://mast.unican.es/simmast/simmast-example.pdf

  52. Balsamo, S., Marzolla, M.: Performance evaluation of UML software architectures with multiclass queueing network models. In: Proceedings of the Fifth International Workshop on Software and Performance (WOSP), July (2005)

  53. Gómez, A., Smith, C.U., Spellmann, A., Cabot, J.: Enabling performance modeling for the masses: initial experiences. In: Khendek, F., Gotzhein, R. (eds.) System Analysis and Modeling. Languages, Methods, and Tools for Systems Engineering, pp. 105–126. Springer International Publishing, Cham (2018)

  54. Platform Independent Petri net Editor 2. http://pipe2.sourceforge.net/

  55. Garcia, D., Lladó, C.M., Smith, C.U., Puigjaner, R.: A PMIF semantic validation tool. In: Proceedings of the Third International Conference on the Quantitative Evaluation of Systems, pp. 121–122, September (2006)

Acknowledgements

Smith’s participation was sponsored by US Air Force Contract FA8750-15-C-0171 Case number 88ABW-2016-3702.

Author information

Corresponding author

Correspondence to Catalina M. Lladó.

Additional information

Communicated by Dr Jeff Gray.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

A S-PMIF+ specification for the CommandProcess scenario

[Figure: S-PMIF+ XML specification listing for the CommandProcess scenario]
Table 9 PMIF meta-model differences
Table 10 S-PMIF meta-model differences

B Comparison of the meta-model versions

Table 9 compares the PMIF2 and PMIF+ versions with respect to entities, references, and attributes. Apart from the arcs, which were removed because they are needed only for the graphical representation of QNMs, all of the changes are extensions of PMIF+ with respect to PMIF. WorkUnitServiceRequest was also removed in PMIF+ to avoid duplication, since it can be expressed as a ServiceRequestPlus.
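
For illustration, the sketch below shows a request that PMIF2 would specify with a WorkUnitServiceRequest and how the same request could be written as a ServiceRequestPlus in PMIF+. The attribute names follow the general PMIF style but are illustrative assumptions, not the normative PMIF+ schema.

    <!-- PMIF2 (illustrative attributes): a work-unit request at server Disk1 -->
    <WorkUnitServiceRequest WorkloadName="Order" ServerID="Disk1"
                            NumberOfVisits="3"/>

    <!-- PMIF+ (hypothetical attributes): the same request expressed as a
         ServiceRequestPlus, so a separate WorkUnitServiceRequest element
         is no longer needed -->
    <ServiceRequestPlus WorkloadName="Order" ServerID="Disk1"
                        NumberOfVisits="3" WorkUnits="1"/>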

Earlier versions of S-PMIF required each scenario to be assigned to one facility. S-PMIF+ no longer has this requirement; it is implemented with several changes (a minimal illustrative fragment follows the list):

  • Both Facility and OverheadMatrix are now optional,

  • The reference from scenario to facility has been removed; there is a reference from ServiceSpec to Server instead,

  • A reference from Server to Project relaxes the previous requirement that Server must be a child of Facility,

  • A reference from the new element CalculatedService to SoftwareResourceRequirement makes explicit when the OverheadMatrix is used to calculate the ServiceSpec.
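
The following minimal S-PMIF+ fragment illustrates these changes: no Facility or OverheadMatrix appears, the Server is declared under the Project rather than under a Facility, and the ServiceSpec references the Server directly. The element nesting and attribute names are illustrative assumptions rather than the normative S-PMIF+ schema.

    <!-- Illustrative S-PMIF+ fragment (nesting and attributes are assumptions) -->
    <Project Name="CommandProcess">
      <!-- Server is a child of Project; no Facility is required -->
      <Server Name="CPU" Quantity="1" SchedulingPolicy="PS"/>
      <PerformanceScenario Name="ProcessCommand">
        <!-- ServiceSpec references the Server directly -->
        <ServiceSpec ServerName="CPU" ServiceTime="0.005" NumberOfVisits="1"/>
      </PerformanceScenario>
    </Project>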

Some other minor details related to the OverheadMatrix were discovered and corrected. They change only its representation, not its content, so they are not covered in detail here.

SchedulingPolicy was extended to allow additional policies such as round robin, last-come first-served, and others. Table 10 details the changes between the versions.
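
For example, with the extended enumeration, server specifications such as the hypothetical fragments below become expressible, whereas earlier versions restricted SchedulingPolicy to the values needed for efficient, exact solutions (e.g., FCFS, PS, IS). The attribute names and policy codes are illustrative assumptions.

    <!-- Illustrative fragments; attributes and policy codes are assumptions -->
    <Server Name="CPU"   Quantity="1" SchedulingPolicy="RR"/>    <!-- round robin -->
    <Server Name="Disk1" Quantity="1" SchedulingPolicy="LCFS"/>  <!-- last-come first-served -->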

Table 11 MIF features supported by tools

C MIF feature interoperability

Table 11 shows the features supported by various tools. A D means that the corresponding tool directly supports the feature. For example, CSIM [49] has a primitive called buffer that has the same behavior as defined in Sect. 3.1. An I indicates that the defined behavior can be represented with other features provided by the tool. For example, CSIM does not have a primitive join operation, but the behavior can be implemented with an event that is set by individual workloads as they complete. An N indicates that the feature is not currently supported by the tool. The 3 shown for “Phases” indicates tools that support exactly 3 phases, rather than an unlimited number as in the definition.

All tools provide ProbabilityDistributions and queue SchedulingPolicies beyond the basic ones that can be solved by efficient, exact solution algorithms, but there are significant differences in what each tool supports. Thus, the transformation or import of distributions and scheduling policies must substitute specifications that are not supported. Of course, the model solution is likely to differ among tools that do not use the same specifications. It is still useful to compare solutions, and it may be possible to determine how much the specification affects performance (when there are not many differences). It is also possible to move automatically from a tool that does not support the desired specification to one that does in order to quantify its benefit.
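
As a hypothetical example of such a substitution, a PMIF+ model that specifies a Pareto service-time distribution could be imported into a target tool without Pareto support by substituting an exponential distribution with the same mean. The element and attribute names below are illustrative assumptions, not the normative PMIF+ schema.

    <!-- Source PMIF+ model (illustrative): heavy-tailed service times -->
    <Server Name="WebServer" SchedulingPolicy="PS">
      <ServiceTimeDistribution Type="Pareto" Shape="2.5" Scale="0.012"/>
    </Server>

    <!-- After import into a tool without Pareto support: substitute an
         exponential distribution with the same mean,
         2.5 * 0.012 / (2.5 - 1) = 0.02 -->
    <Server Name="WebServer" SchedulingPolicy="PS">
      <ServiceTimeDistribution Type="Exponential" Mean="0.02"/>
    </Server>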

Note that all new features are supported by one or more tools, and most are supported by multiple tools. Other than distributions and scheduling policies, we did not find modeling features in other tools that are not considered in the MIF extensions.


Cite this article

Lladó, C.M., Smith, C.U. Broadened support for software and system model interchange. Softw Syst Model 18, 3527–3550 (2019). https://doi.org/10.1007/s10270-019-00728-x
