Abstract
Controlled experiments in model-based software engineering, especially those in which human subjects perform modeling tasks, often require comparing the models produced by experiment subjects with reference models that are considered correct and complete. The purpose of such comparison is to assess the quality of the subjects' models so that experiment hypotheses can be accepted or rejected. Model quality is typically measured quantitatively using metrics. Manually defining such metrics for a rich modeling language is often cumbersome and error-prone; it can also yield metrics that fail to systematically consider relevant details and may therefore produce biased results. In this paper, we present a framework to automatically generate quality metrics for MOF-based metamodels, which in turn can be used to measure the quality of models (instances of those metamodels). The framework was evaluated by comparing its results with manually derived quality metrics for UML class and sequence diagrams, and it has been used to derive metrics for measuring the quality of UML state machine diagrams. Results show that defining quality metrics with the framework is more efficient and systematic than defining them manually.
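The abstract does not detail the framework's algorithm, but the underlying idea — scoring a subject's model against a reference model, per metaclass of the metamodel — can be illustrated with a minimal sketch. All names below (`per_metaclass_metrics`, `completeness`, `correctness`, the example models) are hypothetical illustrations, not the paper's actual metric definitions; real MOF/UML instances would be traversed via a modeling API rather than represented as plain sets.

```python
# Illustrative sketch only: approximate a "model" as a set of
# (metaclass, element-name) pairs and, for each metaclass, score a
# subject model against a reference model that is assumed correct
# and complete.

from collections import defaultdict

def per_metaclass_metrics(reference, subject):
    """For each metaclass, compute completeness (fraction of reference
    elements the subject recovered) and correctness (fraction of the
    subject's elements that appear in the reference)."""
    by_type = defaultdict(lambda: {"ref": set(), "sub": set()})
    for metaclass, name in reference:
        by_type[metaclass]["ref"].add(name)
    for metaclass, name in subject:
        by_type[metaclass]["sub"].add(name)

    metrics = {}
    for metaclass, sets in by_type.items():
        matched = sets["ref"] & sets["sub"]
        completeness = len(matched) / len(sets["ref"]) if sets["ref"] else 1.0
        correctness = len(matched) / len(sets["sub"]) if sets["sub"] else 1.0
        metrics[metaclass] = {"completeness": completeness,
                              "correctness": correctness}
    return metrics

# Hypothetical class-diagram fragments: the subject found "Order" and the
# "places" association but modeled "Cart" instead of "Customer".
reference_model = {("Class", "Order"), ("Class", "Customer"),
                   ("Association", "places")}
subject_model = {("Class", "Order"), ("Class", "Cart"),
                 ("Association", "places")}

print(per_metaclass_metrics(reference_model, subject_model))
```

Deriving such per-metaclass scores by walking the metamodel, rather than hand-writing one metric per diagram type, is what makes the generation systematic: every metaclass of the MOF-based metamodel is covered uniformly.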
© 2014 Springer International Publishing Switzerland
Yue, T., Ali, S. (2014). A MOF-Based Framework for Defining Metrics to Measure the Quality of Models. In: Cabot, J., Rubin, J. (eds) Modelling Foundations and Applications. ECMFA 2014. Lecture Notes in Computer Science, vol 8569. Springer, Cham. https://doi.org/10.1007/978-3-319-09195-2_14
Print ISBN: 978-3-319-09194-5
Online ISBN: 978-3-319-09195-2