ABSTRACT
Evaluating interface usability and system functionality is time-consuming and effort-intensive. The short time spans of development iterations, such as those in agile development, make it challenging for software teams to perform ongoing interface usability evaluation and system functionality testing. We propose an approach to ongoing product evaluation that enables software teams to perform interface usability evaluation alongside automated system functionality testing. We use formal task models, written in our TaMoGolog task modeling language, to conduct product evaluation experiments through TaMUlator, a tool we developed for use at the Integrated Development Environment (IDE) level. Our case study showed that software teams can readily engage in system evaluation on an iterative basis using TaMUlator.
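To illustrate the general idea behind task model-based evaluation, the sketch below compares a recorded user interaction log against the ordered actions a task model predicts, flagging missing and unexpected actions. This is a minimal, hypothetical illustration of the technique in Python; the class and method names are assumptions for exposition only and do not reflect TaMUlator's actual API or the TaMoGolog language.

```python
from dataclasses import dataclass


@dataclass
class TaskModel:
    """Hypothetical stand-in for a formal task model: an ordered
    list of the user actions the model expects for a task."""
    name: str
    expected_actions: list

    def evaluate(self, log: list) -> dict:
        """Compare a recorded action log against the model and
        report completion rate, missing steps, and deviations."""
        matched = [a for a in self.expected_actions if a in log]
        missing = [a for a in self.expected_actions if a not in log]
        extra = [a for a in log if a not in self.expected_actions]
        return {
            "task": self.name,
            "completion": len(matched) / len(self.expected_actions),
            "missing": missing,
            "unexpected": extra,
        }


# Evaluate one recorded session against a "login" task model.
login_task = TaskModel("login", ["open_form", "enter_credentials", "submit"])
report = login_task.evaluate(
    ["open_form", "enter_credentials", "click_help", "submit"]
)
print(report["completion"])   # 1.0 — every expected action occurred
print(report["unexpected"])   # ['click_help'] — a deviation worth reviewing
```

In a real model-based evaluation tool, the task model would be richer (hierarchical subtasks, temporal operators, preconditions), but the core loop is the same: run recorded sessions against the model and surface deviations as usability findings.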