Abstract
Semantic Web technologies are being applied to increasingly diverse areas where user involvement is crucial. While a number of user interfaces for Semantic Web systems have become available in recent years, their evaluation and reporting often still suffer from weaknesses. Empirical evaluations are essential to compare different approaches, demonstrate their benefits, reveal their drawbacks, and thus facilitate further adoption of Semantic Web technologies. In this paper, we review empirical user studies of user interfaces, visualizations and interaction techniques recently published at relevant Semantic Web venues, assessing both the user studies themselves and their reporting. We then chart the design space of available methods for user studies in Semantic Web contexts. Finally, we propose a framework for their comprehensive reporting, taking into consideration user expertise, experimental setup, task design, experimental procedures and results analysis.
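To make the reporting dimensions named above concrete, the following is a minimal, hypothetical sketch of how a study report could be checked for completeness against them; the class, field names and helper are illustrative assumptions, not the framework defined in the paper.

```python
# Hypothetical sketch: the five reporting dimensions named in the abstract,
# encoded as a simple checklist accompanying a user-study report.
# Names are illustrative only, not the paper's actual framework.
from dataclasses import dataclass


@dataclass
class UserStudyReport:
    user_expertise: str = ""          # e.g. participants' Semantic Web background
    experimental_setup: str = ""      # apparatus, datasets, tool configuration
    task_design: str = ""             # evaluation tasks and their rationale
    experimental_procedure: str = ""  # instructions, ordering, pilot runs
    results_analysis: str = ""        # statistical or qualitative analysis used

    def missing_sections(self) -> list[str]:
        """Return the names of dimensions left unreported."""
        return [name for name, value in vars(self).items() if not value.strip()]


report = UserStudyReport(task_design="5 lookup tasks on a sample dataset")
print(report.missing_sections())  # lists the four dimensions still unreported
```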
Notes
1. VOILA: International workshop series on "Visualization and Interaction for Ontologies and Linked Data", see http://voila.visualdataweb.org.
2. The classified papers can be accessed at http://survey.visualdataweb.org.
3. We use the term "operation" instead of "task" here to differentiate it from evaluation tasks.
Acknowledgements
Catia Pesquita is funded by the Portuguese FCT through the LASIGE Strategic Project (UID/CEC/00408/2013), and also by FCT grant PTDC/EEI-ESS/4633/2014. Patrick Lambrix is funded by the Swedish e-Science Society (SeRC). Steffen Lohmann is partly funded by the Fraunhofer Cluster of Excellence Cognitive Internet Technologies (CIT).