DOI: 10.1145/2424563.2424571
research-article

Making the case for measuring mental effort

Published: 01 October 2012

ABSTRACT

To empirically investigate conceptual modeling languages, subjects are typically confronted with experimental tasks, such as the creation, modification, or understanding of conceptual models. In these studies, performance is usually assessed in terms of accuracy, i.e., the number of correctly performed tasks divided by the total number of tasks. Even though accuracy is widely adopted, it suffers from two often overlooked problems. First, accuracy is a rather insensitive measure. Second, for tasks of low complexity, the measurement of accuracy may be distorted by peculiarities of the human mind. To tackle these problems, we propose to additionally assess the subject's mental effort, i.e., the mental resources required to perform a task. In particular, we show how the aforementioned problems connected to accuracy can be resolved, argue that mental effort is a valid measure of performance, and demonstrate how mental effort can easily be assessed in empirical research.
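The abstract's definition of accuracy, and the argument that equal accuracy can hide unequal mental effort, can be sketched with a toy computation. The data below are invented for illustration, and the 1-to-9 self-report effort scale is an assumption modeled on common cognitive-load instruments, not taken from the paper:

```python
def accuracy(results):
    """Fraction of correctly performed tasks: correct / total."""
    return sum(results) / len(results)

def mean_effort(ratings):
    """Average self-reported mental effort on an assumed 1-9 scale."""
    return sum(ratings) / len(ratings)

# Two hypothetical subjects solving the same four tasks (1 = correct, 0 = wrong).
subject_a = {"correct": [1, 1, 1, 0], "effort": [2, 3, 2, 4]}
subject_b = {"correct": [1, 1, 1, 0], "effort": [7, 8, 6, 9]}

for name, data in [("A", subject_a), ("B", subject_b)]:
    print(name, accuracy(data["correct"]), mean_effort(data["effort"]))

# Both subjects reach identical accuracy (0.75), yet B reports far higher
# mental effort, a performance difference accuracy alone cannot reveal.
```

This is only a minimal illustration of the measurement argument; the paper itself discusses validated ways of eliciting mental-effort ratings.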


Published in

EESSMod '12: Proceedings of the Second Edition of the International Workshop on Experiences and Empirical Studies in Software Modelling
October 2012, 57 pages
ISBN: 9781450318112
DOI: 10.1145/2424563

      Copyright © 2012 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States



EESSMod '12 paper acceptance rate: 9 of 18 submissions, 50%.
