Abstract
Social signals (e.g., facial expressions, gestures) play an important role in social interaction. Most of them are visual cues, which are largely inaccessible to visually impaired people and thus cause difficulties in their daily lives. In human–computer interaction (HCI), assistive systems for social interaction are receiving increasing attention owing to related technological advances. Yet there is still a lack of a comprehensive and vivid understanding of how visually impaired people perceive social signals, which is needed to broadly identify their needs in face-to-face communication. To fill this gap, we conducted in-depth interviews to study the lived experiences of 20 visually impaired participants. We analyzed a rich set of qualitative empirical data, structured by a comprehensive taxonomy of social signals, using a standard qualitative content analysis method. Our results provide vivid examples and an overview of visually impaired people’s lived experiences with social signals, covering both their capabilities and their limitations. As reported, the participants perceived social signals through compensatory modalities such as hearing, touch, smell, or the obstacle sense. However, their perception of social signals was generally of low resolution and constrained by certain environmental factors (e.g., crowdedness or the noise level of the surroundings). Interestingly, low-vision participants still relied heavily on their residual sight to perceive social signals (e.g., rough postures and gestures). In addition, the participants had difficulty sensing others’ subtle emotional states, which are often revealed by nuanced behaviors (e.g., a smile). Based on these rich empirical findings, we propose a set of design implications to inform future HCI work aimed at supporting visually impaired users’ social signal perception.




Acknowledgements
We would like to thank Gordon, Xiang Cheng, and Liang Zang for helping us recruit participants from the Hong Kong Blind Union and the Yangzhou Special Education School. This research was supported by the China Scholarship Council and facilitated by the Eindhoven University of Technology.
Author information
Contributions
S. Qiu, P. An, and J. Hu contributed equally to this work as co-first authors.
Ethics declarations
Conflict of interest
On behalf of all authors, the corresponding author states that there is no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
Qiu, S., An, P., Hu, J. et al. Understanding visually impaired people’s experiences of social signal perception in face-to-face communication. Univ Access Inf Soc 19, 873–890 (2020). https://doi.org/10.1007/s10209-019-00698-3