Abstract
Visual analytic systems, especially mixed-initiative systems, can steer analytical models and adapt views by making inferences from users’ behavioral patterns with the system. Because such systems rely on incorporating implicit and explicit user feedback, they are particularly susceptible to the injection and propagation of human biases. To ultimately guard against the potentially negative effects of systems biased by human users, we must first qualify what we mean by the term bias. Thus, in this chapter we describe four different perspectives on human bias that are particularly relevant to visual analytics. We discuss the interplay of human and computer system biases, particularly their roles in mixed-initiative systems. Given that the term bias is used to describe several different concepts, our goal is to facilitate a common language in research and development efforts by encouraging researchers to mindfully choose the perspective(s) considered in their work.
Notes
1. Commission errors are contrasted with automation omission errors, which occur if the human-machine team fails to respond to system irregularities or the system fails to provide an indicator of a problematic state. In visual analytics, an omission error could occur if a system "knows" an algorithm might be mismatched to a data type but does not alert the analyst.
Acknowledgements
The research described in this document was sponsored by the U.S. Department of Defense. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the U.S. Government.
Copyright information
© 2018 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Wall, E., Blaha, L.M., Paul, C.L., Cook, K., Endert, A. (2018). Four Perspectives on Human Bias in Visual Analytics. In: Ellis, G. (eds) Cognitive Biases in Visualizations. Springer, Cham. https://doi.org/10.1007/978-3-319-95831-6_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-95830-9
Online ISBN: 978-3-319-95831-6