Abstract
Existing approaches to HCI quality evaluation are marked by a lack of integration between subjective methods (such as questionnaires) and objective methods (such as electronic informers) to support a final evaluation decision. Over the past decades, researchers have defined various quality criteria together with their measures. However, the absence of a defined way to integrate qualitative and quantitative data led us to specify new indicators for HCI quality evaluation. This paper aims at defining and constructing quality indicators, with their measures, related to existing quality criteria and based on the ISO/IEC 15939 standard. These indicators allow the integration of qualitative and quantitative data and provide a basis for decision making about the quality of an HCI with respect to the evaluation quality criteria. The paper presents a proposal for defining and constructing quality indicators and illustrates it with an example. A feasibility study of a quality indicator is presented through the evaluation of the traffic supervision system of Valenciennes (France), as part of the CISIT-ISART project.
References
Al-Wabil, A., Al-Khalifa, H.: A Framework for Integrating Usability Evaluations Methods: The Mawhiba Web Portal Case Study. In: The International Conference on the Current Trends in Information Technology (CTIT 2009), Dubai, UAE, pp. 1–6 (2009)
Assila, A., de Oliveira, K.M., Ezzedine, H.: Towards qualitative and quantitative data integration approach for enhancing HCI quality evaluation. In: Kurosu, M. (ed.) HCI 2014, Part I. LNCS, vol. 8510, pp. 469–480. Springer, Heidelberg (2014)
Bastien, J.M.C., Scapin, D.: Ergonomic Criteria for the Evaluation of Human-Computer Interfaces. Technical Report n° 156, Institut National de Recherche en Informatique et en Automatique, France (1993)
Charfi, S., Ezzedine, H., Kolski, C.: RITA: A Framework based on multi-evaluation techniques for user interface evaluation, Application to a transport network supervision system. In: ICALT, May 29-31, pp. 263–268. IEEE, Tunisia (2013) ISBN 978-1-4799-0312-2
Hardin, M., Hom, D., Perez, R., Williams, L.: Quel diagramme ou graphique vous convient le mieux? Copyright Tableau Software, Inc. (2012)
Hartson, H.R., Andre, T.S., Williges, R.C.: Criteria for evaluating usability evaluation methods. International Journal of Human-Computer Interaction 15(1), 145–181 (2003)
Hwang, W., Salvendy, G.: Number of people required for usability evaluation: the 10±2 rule. Commun. ACM 53(5), 130–133 (2010)
ISO/IEC 15939 Systems and software engineering — Measurement process (2007)
ISO/IEC. ISO 9241-11: Ergonomic requirements for office work with visual display terminals (VDTs) – Part 11: Guidance on usability. ISO/IEC 9241-11:1998(E)
Kerzazi, N., Lavallée, M.: Inquiry on usability of two software process modeling systems using ISO/IEC 9241. In: CCECE, pp. 773–776 (2011)
Lewis, J.R.: IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use. International Journal of Human-Computer Interaction 7(1), 57–78 (1995)
Monteiro, L., Oliveira, K.: Defining a catalog of indicators to support process performance analysis. Journal of Software Maintenance and Evolution: Research and Practice 23(6), 395–422 (2010)
Nielsen, J.: Usability Engineering. Morgan Kaufmann Publishers Inc., San Francisco (1993)
Trabelsi, A., Ezzedine, H.: Evaluation of an Information Assistance System based on an agent-based architecture in transportation domain: first results. International Journal of Computers, Communications and Control 8(2), 320–333 (2013)
Tran, C., Ezzedine, H., Kolski, C.: EISEval, a Generic Reconfigurable Environment for Evaluating Agent-based Interactive Systems. International Journal of Human-Computer Studies 71(6), 725–761 (2013)
Whiting, M.A., Haack, J., Varley, C.: Creating realistic, scenario-based synthetic data for test and evaluation of information analytics software. In: Proc. Conference on Beyond Time and Errors: Novel Evaluation Methods For information Visualization, A Workshop of the ACM CHI 2008 Conference, Florence, Italy, pp. 1–9 (2008)
Yang, T., Linder, J., Bolchini, D.: DEEP: Design-Oriented Evaluation of Perceived Usability. International Journal of Human-Computer Interaction, 308–346 (2012)
© 2014 Springer International Publishing Switzerland
Cite this paper
Assila, A., de Oliveira, K.M., Ezzedine, H. (2014). Towards Indicators for HCI Quality Evaluation Support. In: Indulska, M., Purao, S. (eds) Advances in Conceptual Modeling. ER 2014. Lecture Notes in Computer Science, vol 8823. Springer, Cham. https://doi.org/10.1007/978-3-319-12256-4_19
Print ISBN: 978-3-319-12255-7
Online ISBN: 978-3-319-12256-4