Abstract
The conception and usage of methods for evaluating information visualizations is a challenge that accompanies the development of the visualizations themselves. The scientific literature contains a myriad of proposals for such methods. However, none of them has managed to pacify the field or establish itself as a de facto standard, due to difficulties such as: (a) the complexity of their usage; (b) high financial and time costs; and (c) the need for a large number of raters to guarantee the reliability of the results. One way to circumvent these adversities is Heuristic Evaluation, given its simplicity, low cost of application and the quality of the results it yields. This article conducts an empirical methodological study of the use of three such methods (Zuk et al., Forsell & Johansson and Wall et al.) for the evaluation of visualizations in the context of Educational Timetabling Problems. Five different visualizations were evaluated using the original methods and versions modified by the current authors (in which an importance factor was assigned to each statement being evaluated, as well as the rater's level of confidence) in order to improve their efficiency when measuring the quality of visualizations. The experimental results demonstrated that, for the first two heuristic sets, only the modification concerning the importance of the statements proved to be statistically relevant. For the third, neither factor induced different results.
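The modifications described above weight each heuristic statement by an importance factor and by the rater's confidence. The paper's exact aggregation formula is not reproduced here; the sketch below shows one plausible form of such a weighted score (a confidence-and-importance-weighted mean of the ratings), purely as an illustration of the idea.

```python
def weighted_score(ratings, importance, confidence):
    """One plausible aggregation of heuristic ratings: a mean of the per-
    statement ratings, weighted by each statement's importance factor and
    by the rater's confidence in that rating. This is an illustrative
    assumption, not the formula used in the paper."""
    num = sum(r * w * c for r, w, c in zip(ratings, importance, confidence))
    den = sum(w * c for w, c in zip(importance, confidence))
    return num / den

# Example: three statements rated on a 1-5 scale, with hypothetical
# importance factors and confidence levels.
score = weighted_score([4, 2, 5], [1.0, 0.5, 2.0], [0.9, 0.6, 1.0])
print(round(score, 4))  # → 4.4375
```

Under this scheme a low-confidence or low-importance rating contributes little to the final score, which is the intuition behind both modifications studied in the paper.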
Notes
- 1.
Chris North [17] clarifies that the ability to measure whether a given visualization can induce insights is subjective and individual. It is also known that insights are characterized by being complex, deep, qualitative, unexpected and relevant. In other words, an insight cannot simply be extracted directly from the visualization.
References
de Souza Alencar, W., do Nascimento, H.A.D., Jradi, W.A.R., Soares, F.A.A.M.N., Felix, J.P.: Information visualization for highlighting conflicts in educational timetabling problems. In: Bebis, G., et al. (eds.) ISVC 2019. LNCS, vol. 11844, pp. 275–288. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-33720-9_21
Amar, R., Stasko, J.: A knowledge task-based framework for design and evaluation of information visualizations. In: IEEE Symposium on Information Visualization, pp. 143–150. IEEE (2004). https://doi.org/10.1109/INFVIS.2004.10
Babaei, H., Karimpour, J., Hadidi, A.: A survey of approaches for university course timetabling problem. Comput. Indust. Eng. 86, 43–59 (2015). https://doi.org/10.1016/j.cie.2014.11.010
Carpendale, S.: Evaluating information visualizations. In: Kerren, A., Stasko, J.T., Fekete, J.-D., North, C. (eds.) Information Visualization. LNCS, vol. 4950, pp. 19–45. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-70956-5_2
Elmqvist, N., Yi, J.S.: Patterns for visualization evaluation. Inform. Vis. 14(3), 250–269 (2015). https://doi.org/10.1177/1473871613513228
Forsell, C.: A guide to scientific evaluation in information visualization. In: Proceedings of the 2010 14th International Conference Information Visualisation, IV 2010, pp. 162–169. IEEE Computer Society, USA (2010). https://doi.org/10.1109/IV.2010.33
Forsell, C., Johansson, J.: An heuristic set for evaluation in information visualization. In: Proceedings of the International Conference on Advanced Visual Interfaces, AVI 2010, pp. 199–206. ACM, New York (2010). https://doi.org/10.1145/1842993.1843029
Fu, X., Wang, Y., Dong, H., Cui, W., Zhang, H.: Visualization assessment: a machine learning approach. In: Proceedings of the 2019 IEEE Visualization Conference (VIS 2019), pp. 126–130. IEEE (2019). https://doi.org/10.1109/VISUAL.2019.8933570
Gomez, S.R., Guo, H., Ziemkiewicz, C., Laidlaw, D.H.: An insight and task-based methodology for evaluating spatiotemporal visual analytics. In: Proceedings of the IEEE Symposium on Visual Analytics Science and Technology 2014, pp. 9–14. IEEE (2014). https://doi.org/10.1109/VAST.2014.7042482
Harrison, L., Reinecke, K., Chang, R.: Infographic aesthetics: designing for the first impression. In: Proceedings of the ACM Conference on Human Factors in Computing Systems, pp. 1186–1190. ACM (2015). https://doi.org/10.1145/2702123.2702545
Hearst, M.A., Laskowski, P., Silva, L.: Evaluating information visualization via the interplay of heuristic evaluation and question-based scoring. In: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI 2016, pp. 5028–5033. ACM, New York (2016). https://doi.org/10.1145/2858036.2858280
Hermawati, S., Lawson, G.: Establishing usability heuristics for heuristics evaluation in a specific domain: is there a consensus? Appl. Ergon. 56, 34–51 (2016). https://doi.org/10.1016/j.apergo.2015.11.016
Isenberg, T., Isenberg, P., Chen, J., Sedlmair, M., Möller, T.: A systematic review on the practice of evaluating visualization. IEEE Trans. Vis. Comput. Graphics 19, 2818–2827 (2013). https://doi.org/10.1109/TVCG.2013.126
Lam, H., Bertini, E., Isenberg, P., Plaisant, C., Carpendale, S.: Empirical studies in information visualization: seven scenarios. IEEE Trans. Vis. Comput. Graphics 18, 1520–1536 (2012). https://doi.org/10.1109/TVCG.2011.279
Mühlenthaler, M.: Fairness in Academic Course Timetabling. Lecture Notes in Economics and Mathematical Systems, vol. 678. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-12799-6
Nielsen, J.: Finding usability problems through heuristic evaluation. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 1992, pp. 373–380. ACM (1992). https://doi.org/10.1145/142750.142834
North, C.: Toward measuring visualization insight. IEEE Comput. Graph. Appl. 26(3), 6–9 (2006). https://doi.org/10.1109/MCG.2006.70
Saket, B., Endert, A., Stasko, J.: Beyond usability and performance: a review of user experience-focused evaluations in visualization. In: Proceedings of the Sixth Workshop on Beyond Time and Errors on Novel Evaluation Methods for Visualization, BELIV 2016, pp. 133–142. ACM (2016). https://doi.org/10.1145/2993901.2993903
Santos, B.S., Silva, S.S., Dias, P.: Heuristic evaluation in visualization: an empirical study (position paper). In: 2018 IEEE Evaluation and Beyond-Methodological Approaches for Visualization, pp. 78–85 (2018). https://doi.org/10.1109/beliv.2018.8634108
Stasko, J.: Value-driven evaluation of visualizations. In: Lam, H., Isenberg, P. (eds.) Proceedings of the Fifth Workshop on Beyond Time and Errors: Novel Evaluation Methods for Visualizations, BELIV 2014, pp. 46–53. ACM, New York (2014). https://doi.org/10.1145/2669557.2669579
Thomas, J.J., Khader, A.T., Belaton, B.: Visualization techniques on the examination timetabling pre-processing data. In: 2009 Sixth International Conference on Computer Graphics, Imaging and Visualization, pp. 454–458. IEEE (2009). https://doi.org/10.1109/CGIV.2009.23
Tory, M., Moller, T.: Evaluating visualizations: do expert reviews work? IEEE Comput. Graph. Appl. 25(5), 8–11 (2005). https://doi.org/10.1109/MCG.2005.102
Wall, E., et al.: A heuristic approach to value-driven evaluation of visualizations. IEEE Trans. Vis. Comput. Graphics 25, 491–500 (2019). https://doi.org/10.1109/TVCG.2018.2865146
Zhou, X., Xue, C., Zhou, L., Yafeng, N.: An evaluation method of visualization using visual momentum based on eye-tracking data. Int. J. Pattern Recognit. Artif. Intell. 32(5), 1850016 (2018). https://doi.org/10.1142/S0218001418500167
Zuk, T., Schlesier, L., Neumann, P., Hancock, M.S., Carpendale, S.: Heuristics for information visualization evaluation. In: Proceedings of the 2006 AVI Workshop on BEyond Time and Errors, BELIV 2006, pp. 1–6. ACM (2006). https://doi.org/10.1145/1168149.1168162
Acknowledgements
The first author is a PhD candidate and thanks the Brazilian research-supporting agency FAPEG for scholarships. The other authors thank CAPES.
Copyright information
© 2020 Springer Nature Switzerland AG
About this paper
Cite this paper
de Souza Alencar, W., Abdala Rfaei Jradi, W., Dantas do Nascimento, H.A., Felix, J.P., Alves de Melo Nunes Soares, F.A. (2020). An Empirical Methodological Study of Evaluation Methods Applied to Educational Timetabling Visualizations. In: Bebis, G., et al. Advances in Visual Computing. ISVC 2020. Lecture Notes in Computer Science(), vol 12509. Springer, Cham. https://doi.org/10.1007/978-3-030-64556-4_17
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-64555-7
Online ISBN: 978-3-030-64556-4
eBook Packages: Computer Science (R0)