
Student Task Modeling in Design and Evaluation of Open Problem-Solving Environments

Published in: Education and Information Technologies

Abstract

The design and evaluation of computer-based open problem-solving environments is a non-trivial task. The research described in this paper concerns the definition of a design framework that incorporates a strong field-evaluation phase. The framework is based on the concept of student task modeling. Tools to support design and evaluation have been built and used in the course of this study. The framework and the developed tools produced promising results during the evaluation of an open problem-solving educational environment.
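
The abstract does not specify how the student task models are represented or how the supporting tools compare them with observed student behaviour. The following is a minimal sketch only, under the assumption that a task model can be expressed as a hierarchical goal/sub-task tree (in the spirit of hierarchical task analysis) and compared against a logged student action sequence; the Task class, the coverage measure, and the area-measurement example are illustrative inventions, not the paper's actual framework or tools.

```python
# Illustrative sketch, not the authors' framework: assumes a student task model
# is a hierarchical goal/sub-task tree, and that evaluation compares the
# model's prescribed elementary actions with a logged student action sequence.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Task:
    """A node in a hierarchical task model: a goal decomposed into sub-tasks."""
    name: str
    subtasks: List["Task"] = field(default_factory=list)

    def leaf_actions(self) -> List[str]:
        """Flatten the tree into the elementary actions it prescribes."""
        if not self.subtasks:
            return [self.name]
        actions: List[str] = []
        for sub in self.subtasks:
            actions.extend(sub.leaf_actions())
        return actions


def coverage(model: Task, observed: List[str]) -> float:
    """Fraction of prescribed elementary actions found in the student's log
    (order ignored) -- a deliberately crude evaluation measure."""
    prescribed = model.leaf_actions()
    if not prescribed:
        return 1.0
    done = sum(1 for action in prescribed if action in observed)
    return done / len(prescribed)


# Hypothetical designer task model for an area-measurement activity,
# compared with one student's logged actions.
designer_model = Task("measure area", [
    Task("select shape"),
    Task("choose unit", [Task("open unit menu"), Task("pick square unit")]),
    Task("tile shape with units"),
    Task("count units"),
])

student_log = ["select shape", "tile shape with units", "count units"]
print(f"model coverage: {coverage(designer_model, student_log):.2f}")  # 0.60
```

In this sketch, a low coverage score would flag the sub-tasks the student skipped (here, choosing a measurement unit), which is the kind of diagnostic comparison a task-model-based evaluation phase could feed back into the design.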

Cite this article

Tselios, N.K., Avouris, N.M. & Kordaki, M. Student Task Modeling in Design and Evaluation of Open Problem-Solving Environments. Education and Information Technologies 7, 17–40 (2002). https://doi.org/10.1023/A:1015306507126
