
Theoretical foundations of multimodal interfaces and systems

Published: 24 April 2017

Abstract

This chapter discusses the theoretical foundations of multisensory perception and multimodal communication. It provides a basis for understanding the performance advantages of multimodal interfaces, as well as how to design them to reap these advantages. Historically, the major theories that have influenced contemporary views of multimodal interaction and interface design are Gestalt theory, Working Memory theory, and Activity theory. Together they span perception-action dynamic theories and limited-resource theories that focus on constraints involving attention and short-term memory. This chapter emphasizes these theories in part because they are heavily supported by neuroscience findings, and because their predictions have been corroborated by studies on multimodal human-computer interaction. In addition to summarizing these three main theories and their impact, the chapter describes several related theoretical frameworks that have influenced multimodal interface design, including Multiple Resource theory, Cognitive Load theory, Embodied Cognition, Communication Accommodation theory, and Affordance theory.
The large and multidisciplinary body of research on multisensory perception, production, and multimodal interaction confirms many Gestalt, Working Memory, and Activity theory predictions that will be discussed in this chapter. These theories provide conceptual anchors and a path for understanding how to design more powerful systems, so we can gain better control over our own future. In spite of this, it is surprising how many systems are developed from a sophisticated engineering perspective, yet in a complete theoretical vacuum that Leonardo da Vinci would have ridiculed:
Those who fall in love with practice without science are like a sailor who enters a ship without helm or compass, and who never can be certain whither he is going. Richter and Wells [2008]
This chapter aims to provide a stronger basis for motivating and accelerating future multimodal system design, and for improving the quality of its impact on human users. For a definition of highlighted terms in this chapter, see the Glossary. For other related terms and concepts, also see the textbook on multimodal interfaces by Oviatt and Cohen [2015]. Focus Questions to aid comprehension are available at the end of this chapter.

References

[1]
T. Anastasio and P. Patton. 2004. Analysis and modeling of multisensory enhancement in the deep superior colliculus. In G. Calvert, C. Spence, and B. Stein, editors, The Handbook of Multisensory Processing, pp. 265--283. MIT Press, Cambridge, MA.
[2]
M. Anderson and C. Green. 2001. Suppressing unwanted memories by executive control. Nature, 410:366--369.
[3]
A. Baddeley. 1986. Working Memory. Oxford University Press, New York.
[4]
A. Baddeley. 2003. Working memory: Looking back and looking forward. Nature Reviews Neuroscience, 4:829--839.
[5]
A. D. Baddeley and G. J. Hitch. 1974. Working memory. In G. H. Bower, editor, The Psychology of Learning and Motivation: Advances in Research and Theory, vol. 8, pp. 47--89. Academic Press, New York.
[6]
S. L. Beilock, I. M. Lyons, A. Mattarella-Micke, H. C. Nusbaum, and S. L. Small. 2008. Sports experience changes the neural processing of action language. Proceedings of the National Academy of Sciences, 105:13269--13273.
[7]
L. E. Berk. 1994. Why children talk to themselves. Scientific American, 271(5):78--83.
[8]
V. Berninger, R. Abbott, A. Augsburger, and N. Garcia. 2009. Comparison of pen and keyboard transcription modes in children with and without learning disabilities. Learning Disability Quarterly, 32:123--141.
[9]
L. Bernstein and C. Benoit. 1996. For speech perception by humans or machines, three senses are better than one. In Proceedings of the International Conference on Spoken Language Processing, vol. 3, pp. 1477--1480.
[10]
P. Bertelson and B. de Gelder. 2004. The psychology of multimodal perception. In C. Spence and J. Driver, editors, Crossmodal Space and Crossmodal Attention, pp. 141--177. Oxford University Press, Oxford, UK.
[11]
J. Black, K. Isaacs, B. Anderson, A. Alcantara, and W. Greenough. 1990. Learning causes synaptogenesis, whereas motor activity causes angiogenesis, in cerebellar cortex of adult rats. Proceedings of the National Academy of Sciences, 87:5568--5572.
[12]
A. S. Bregman. 1990. Auditory Scene Analysis. MIT Press, Cambridge, MA.
J. Burgoon, L. Stern, and L. Dillman. 1995. Interpersonal Adaptation: Dyadic Interaction Patterns. Cambridge University Press, Cambridge, UK.
[13]
G. Calvert, C. Spence, and B. E. Stein, editors. 2004. The Handbook of Multisensory Processing. MIT Press, Cambridge, MA.
[14]
P. R. Cohen, M. Dalrymple, D. B. Moran, F. C. N. Pereira, J. W. Sullivan, R. A. Gargan, J. L. Schlossberg, and S. W. Tyler. 1989. Synergistic use of direct manipulation and natural language. In Proceedings of the Conference on Human Factors in Computing Systems (CHI'89), pp. 227--234. ACM Press, New York. Reprinted in M. T. Maybury and W. Wahlster, editors. 1998. Readings in Intelligent User Interfaces, pp. 29--37. Morgan Kaufmann, San Francisco.
[15]
A. Comblain. 1994. Working memory in Down's Syndrome: Training the rehearsal strategy. Down's Syndrome Research and Practice, 2(3):123--126.
[16]
K. Daffner and M. Searl. 2008. The dysexecutive syndromes. In G. Goldenberg and B. Miller, editors, Handbook of Clinical Neurology, vol. 88, ch. 12, pp. 249--267. Elsevier B.V.
[17]
M. D'Esposito. 2008. Working memory. In G. Goldenberg and B. Miller, editors, Handbook of Clinical Neurology, vol. 88, ch. 11, pp. 237--248. Elsevier B.V.
[18]
N. F. Dixon and L. Spitz. 1980. The detection of auditory visual desynchrony. Perception, 9:719--721.
[19]
R. M. Duncan and J. A. Cheyne. 2002. Private speech in young adults: Task difficulty, self-regulation, and psychological predication. Cognitive Development, 16:889--906.
[20]
M. Ernst and M. Banks. 2002. Humans integrate visual and haptic information in a statistically optimal fashion. Nature, 415:429--433.
[21]
M. Ernst and H. Bulthoff. 2004. Merging the senses into a robust percept. Trends in Cognitive Sciences, 8(4):162--169.
[22]
N. Fay, S. Garrod, L. Roberts, and N. Swoboda. 2010. The interactive evolution of human communication systems. Cognitive Science, 34:351--386.
[23]
P. Ferchmin and E. Bennett. 1975. Direct contact with enriched environment is required to alter cerebral weight in rats. Journal of Comparative and Physiological Psychology, 88:360--367.
[24]
W. Gaver. 1991. Technology affordances. In Proceedings of the CHI Conference, pp. 79--84. ACM Press, New York.
[25]
J. Gibson. 1977. The theory of affordances. In R. Shaw and J. Bransford, editors, Perceiving, Acting and Knowing, vol. 3, pp. 67--82. Erlbaum, Hillsdale, NJ.
J. Gibson. 1979. The Ecological Approach to Visual Perception. Houghton Mifflin, Boston.
[26]
H. Giles, A. Mulac, J. Bradac, and P. Johnson. 1987. Speech accommodation theory: The first decade and beyond. In M. L. McLaughlin, editor, Communication Yearbook 10, pp. 13--48. Sage Publications, London.
[27]
S. Goldin-Meadow. 2003. The Resilience of Language: What Gesture Creation in Deaf Children Can Tell Us About How Children Learn Language. Psychology Press, New York.
[28]
S. Goldin-Meadow and S. Beilock. 2010. Action's influence on thought: The case of gesture. Perspectives on Psychological Science, 5(6):664--674.
[29]
S. Goldin-Meadow, H. Nusbaum, S. J. Kelly, and S. Wagner. 2001. Explaining math: Gesturing lightens the load. Psychological Science, 12(6):516--522.
[30]
J. Greeno. 1994. Gibson's affordances. Psychological Review, 101(2):336--342.
[31]
J. Hayes and V. Berninger. 2010. Relationships between idea generation and transcription: How the act of writing shapes what children write. In C. Bazerman, R. Krut, K. Lunsford, S. McLeod, S. Null, P. Rogers, and A. Stansell, editors, Traditions of Writing Research, pp. 116--180. Routledge, New York.
[32]
M. Howison, D. Trninic, D. Reinholz, and D. Abrahamson. 2011. The mathematical imagery trainer: From embodied interaction to conceptual learning. In Proceedings of the CHI Conference, pp. 1989--1998. ACM Press, New York.
[33]
X. Huang and S. Oviatt. 2005. Toward adaptive information fusion in multimodal systems. In Second Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms (MIML'05). Springer-Verlag, Edinburgh, UK.
[34]
X. Huang, S. L. Oviatt, and R. Lunsford. 2006. Combining user modeling and machine learning to predict users' multimodal integration patterns. In S. Renals and S. Bengio, editors, Third Joint Workshop on Multimodal Interaction and Related Machine Learning Algorithms (MIML'06), Springer Lecture Notes in Computer Science. Springer-Verlag GmbH.
[35]
F. Hummel and C. Gerloff. 2005. Larger interregional synchrony is associated with greater behavioral success in a complex sensory integration task in humans. Cerebral Cortex, 15:670--678.
[36]
H. Ishibashi, S. Obayashi, and A. Iriki. 2004. Cortical mechanisms of tool use subserved by multisensory integration. In G. Calvert, C. Spence, and B. E. Stein, editors, The Handbook of Multisensory Processing, pp. 453--462. MIT Press, Cambridge, MA.
[37]
K. James. 2010. Sensori-motor experience leads to changes in visual processing in the developing brain. Developmental Science, 13:279--288.
[38]
K. James and L. Engelhardt. 2012. The effects of handwriting experience on functional brain development in pre-literate children. Trends in Neuroscience and Education, 1:32--42.
[39]
K. James and S. Swain. 2010. Only self-generated actions create sensori-motor systems in the developing brain. Developmental Science, 1--6.
[40]
K. James, S. Vinci-Booher, and F. Munoz-Rubke. 2017. The impact of multimodal-multisensory learning on human performance and brain activation patterns. In S. Oviatt, B. Schuller, P. Cohen, D. Sonntag, G. Potamianos, and A. Krüger, editors, The Handbook of Multimodal-Multisensor Interfaces, Volume 1: Foundations, User Modeling, and Common Modality Combinations. Morgan & Claypool Publishers, San Rafael, CA.
[41]
J. Kegl, A. Senghas, and M. Coppola. 1999. Creation through contact: Sign language emergence and sign language change in Nicaragua. In M. DeGraff, editor, Language Creation and Language Change: Creolization, Diachrony and Development, pp. 179--237. MIT Press, Cambridge, MA.
[42]
A. Kersey and K. James. 2013. Brain activation patterns resulting from learning letter forms through active self-production and passive observation in young children. Frontiers in Psychology, 4(567):1--15.
[43]
A. J. King and A. R. Palmer. 1985. Integration of visual and auditory information in bimodal neurons in the guinea-pig superior colliculus. Experimental Brain Research, 60:492--500.
[44]
J. Kleim, K. Vij, J. Kelly, D. Ballard, and W. Greenough. 1997. Learning-dependent synaptic modifications in the cerebellar cortex of the adult rat persist for at least 4 weeks. Journal of Neuroscience, 17:717--721.
[45]
K. Koffka. 1935. Principles of Gestalt Psychology. Harcourt, Brace and Company, New York.
[46]
W. Kohler. 1929. Dynamics in Psychology. Liveright, New York.
[47]
E. Kohler, C. Keysers, M. Umilta, L. Fogassi, V. Gallese, and G. Rizzolatti. 2002. Hearing sounds, understanding actions: Action representation in mirror neurons. Science, 297:846--848.
[48]
M. Longcamp, C. Boucard, J.-C. Gilhodes, J.-L. Anton, M. Roth, B. Nazarian, and J.-L. Velay. 2008. Learning through hand- or typewriting influences visual recognition of new graphic shapes: Behavioral and functional imaging evidence. Journal of Cognitive Neuroscience, 20(5):802--815.
[49]
M. Longcamp, M.-T. Zerbato-Poudou, and J.-L. Velay. 2005. The influence of writing practice on letter recognition in preschool children: A comparison of handwriting and typing. Acta Psychologica, 119:67--79.
[50]
A. R. Luria. 1961. The Role of Speech in the Regulation of Normal and Abnormal Behavior. Liveright, Oxford.
[51]
Y. Maehara and S. Saito. 2007. The relationship between processing and storage in working memory span: Not two sides of the same coin. Journal of Memory and Language, 56(2):212--228.
[52]
J. Markham and W. Greenough. 2004. Experience-driven brain plasticity: Beyond the synapse. Neuron Glia Biology, 1(4):351--363.
[53]
R. Mayer and R. Moreno. 1998. A split-attention effect in multimedia learning: Evidence for dual processing systems in working memory. Journal of Educational Psychology, 90:312--320.
[54]
M. A. Meredith, J. W. Nemitz, and B. E. Stein. 1987. Determinants of multisensory integration in superior colliculus neurons. I. Temporal factors. Journal of Neuroscience, 7:3215--3229.
[55]
G. Miller. 1956. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63(2):81--97.
[56]
G. A. Miller, E. Galanter, and K. H. Pribram. 1960. Plans and the Structure of Behavior. Holt, Rinehart and Winston, New York.
[57]
S. Morein-Zamir, S. Soto-Faraco, and A. Kingstone. 2003. Auditory capture of vision: Examining temporal ventriloquism. Cognitive Brain Research, 17(1):154--163.
[58]
S. Mousavi, R. Low, and J. Sweller. 1995. Reducing cognitive load by mixing auditory and visual presentation modes. Journal of Educational Psychology, 87(2):319--334.
[59]
K. Nakamura, W.-J. Kuo, F. Pegado, L. Cohen, O. Tzeng, and S. Dehaene. 2012. Universal brain systems for recognizing word shapes and handwriting gestures during reading. Proceedings of the National Academy of Sciences, 109(50):20762--20767.
[60]
D. Norman. 1988. The Design of Everyday Things. Basic Books, New York.
[61]
S. L. Oviatt. 2000. Multimodal signal processing in naturalistic noisy environments. In B. Yuan, T. Huang, and X. Tang, editors, Proceedings of the International Conference on Spoken Language Processing (ICSLP'2000), vol. 2, pp. 696--699. Chinese Friendship Publishers, Beijing.
[62]
S. L. Oviatt. 2006. Human-centered design meets cognitive load theory: Designing interfaces that help people think. In Proceedings of the Conference on ACM Multimedia, pp. 871--880. ACM, New York.
[63]
S. L. Oviatt. 2012. Multimodal interfaces. In J. Jacko, editor, The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications (revised 3rd edition), ch. 18, pp. 405--430. CRC Press, Boca Raton, FL.
[64]
S. Oviatt. 2013. The Design of Future Educational Interfaces. Routledge Press, New York.
[65]
S. Oviatt, A. Arthur, and J. Cohen. 2006. Quiet interfaces that help students think. In Proceedings of the Conference on User Interface Software Technology, pp. 191--200. ACM Press, New York.
[66]
S. Oviatt, A. Arthur, Y. Brock, and J. Cohen. 2007. Expressive pen-based interfaces for math education. In Proceedings of the Conference on Computer-Supported Collaborative Learning. International Society of the Learning Sciences.
[67]
S. Oviatt, A. Cohen, A. Miller, K. Hodge, and A. Mann. 2012. The impact of interface affordances on human ideation, problem solving and inferential reasoning. ACM Transactions on Computer-Human Interaction. ACM Press, New York.
[68]
S. Oviatt and P. Cohen. 2015. The Paradigm Shift to Multimodality in Contemporary Computer Interfaces. Morgan & Claypool Synthesis Series. Morgan & Claypool Publishers, San Rafael, CA.
[69]
S. L. Oviatt, R. Coulston, S. Shriver, B. Xiao, R. Wesson, R. Lunsford, and L. Carmichael. 2003. Toward a theory of organized multimodal integration patterns during human-computer interaction. In Proceedings of the International Conference on Multimodal Interfaces (ICMI'03), pp. 44--51. ACM Press, New York.
[70]
S. L. Oviatt, R. Coulston, and R. Lunsford. 2004a. When do we interact multimodally? Cognitive load and multimodal communication patterns. In Proceedings of the International Conference on Multimodal Interfaces (ICMI'04). ACM Press, New York.
[71]
S. L. Oviatt, C. Darves, and R. Coulston. 2004b. Toward adaptive conversational interfaces: Modeling speech convergence with animated personas. Transactions on Computer-Human Interaction (TOCHI), 11(3):300--328.
[72]
S. L. Oviatt, R. Lunsford, and R. Coulston. 2005. Individual differences in multimodal integration patterns: What are they and why do they exist? In Proceedings of the Conference on Human Factors in Computing Systems (CHI'05), CHI Letters, pp. 241--249. ACM Press, New York.
[73]
S. L. Oviatt, M. MacEachern, and G. Levow. 1998. Predicting hyperarticulate speech during human-computer error resolution. Speech Communication, 24(2):1--23.
[74]
A. Owen, K. McMillan, A. Laird, and E. Bullmore. 2005. N-back working memory paradigm: A meta-analysis of normative functional neuroimaging studies. Human Brain Mapping, 25:46--59.
[75]
F. Paas, J. Tuovinen, H. Tabbers, and P. Van Gerven. 2003. Cognitive load measurement as a means to advance cognitive load theory. Educational Psychologist, 38(1):63--71.
[76]
L. Reeves, J. Lai, J. Larson, S. Oviatt, T. Balaji, S. Buisine, P. Collings, P. Cohen, B. Kraal, J.-C. Martin, M. McTear, T. V. Raman, K. Stanney, H. Su, and Q. Wang. 2004. Guidelines for multimodal user interface design. Communications of the ACM, 47(1):57--59.
[77]
I. A. Richter and T. Wells, editors. 2008. Leonardo da Vinci Notebooks. Oxford World's Classics (2nd edition). Oxford University Press.
[78]
G. Rizzolatti and L. Craighero. 2004. The mirror-neuron system. Annual Review of Neuroscience, 27:169--192.
[79]
A. Sale, N. Berardi, and L. Maffei. 2009. Enrich the environment to empower the brain. Trends in Neurosciences, 32:233--239.
[80]
E. Saund, D. Fleet, D. Larner, and J. Mahoney. 2003. Perceptually-supported image editing of text and graphics. In Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology (UIST'2003), pp. 183--192. ACM Press, New York.
[81]
C. Schroeder and J. Foxe. 2004. Multisensory convergence in early cortical processing. In G. Calvert, C. Spence, and B. Stein, editors, The Handbook of Multisensory Processing, pp. 295--309. MIT Press, Cambridge, MA.
[82]
L. Shapiro, editor. 2014. The Routledge Handbook of Embodied Cognition. Routledge Press, New York.
[83]
B. Smith, editor. 1988. Foundations of Gestalt Theory. Philosophia Verlag, Munich and Vienna.
[84]
C. Spence and S. Squire. 2003. Multisensory integration: Maintaining the perception of synchrony. Current Biology, 13:R519--R521.
[85]
B. E. Stein, editor. 2012. The New Handbook of Multisensory Processing, 2nd ed. MIT Press, Cambridge, MA.
[86]
B. E. Stein and M. Meredith. 1993. The Merging of the Senses. MIT Press, Cambridge, MA.
[87]
J. Sweller. 1988. Cognitive load during problem solving: Effects on learning. Cognitive Science, 12(2):257--285.
[88]
S. Tindall-Ford, P. Chandler, and J. Sweller. 1997. When two sensory modes are better than one. Journal of Experimental Psychology: Applied, 3(4):257--287.
[89]
J. van Merrienboer and J. Sweller. 2005. Cognitive load theory and complex learning: Recent developments and future directions. Educational Psychology Review, 17(2):147--177.
[90]
F. Varela, E. Thompson, and E. Rosch. 1991. The Embodied Mind: Cognitive Science and Human Experience. MIT Press, Cambridge, MA.
[91]
L. Vygotsky. 1962. Thought and Language. MIT Press, Cambridge, MA. (Translated by E. Hanfmann and G. Vakar from the 1934 original.)
[92]
L. Vygotsky. 1978. Mind in Society: The Development of Higher Psychological Processes. M. Cole, V. John-Steiner, S. Scribner, and E. Souberman, editors. Harvard University Press, Cambridge, MA.
[93]
L. Vygotsky. 1987. The Collected Works of L. S. Vygotsky, Volume I: Problems of General Psychology. Edited and translated by N. Minick. Plenum, New York.
[94]
N. Waugh and D. Norman. 1965. Primary memory. Psychological Review, 72:89--104.
[95]
J. Welkowitz, G. Cariffe, and S. Feldstein. 1976. Conversational congruence as a criterion of socialization in children. Child Development, 47:269--272.
[96]
M. Wertheimer. 1938. Laws of organization of perceptual forms. In W. Ellis, editor, translation published in A Sourcebook of Gestalt Psychology, pp. 71--88. Routledge and Kegan Paul, London.
[97]
C. Wickens, D. Sandry, and M. Vidulich. 1983. Compatibility and resource competition between modalities of input, central processing, and output. Human Factors, 25(2):227--248.
[98]
C. Wickens. 2002. Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3(2):159--177.
[99]
B. Xiao, C. Girand, and S. L. Oviatt. 2002. Multimodal integration patterns in children. In Proceedings of the International Conference on Spoken Language Processing, pp. 629--632.
[100]
B. Xiao, R. Lunsford, R. Coulston, M. Wesson, and S. L. Oviatt. 2003. Modeling multimodal integration patterns and performance in seniors: Toward adaptive processing of individual differences. In Proceedings of the Fifth International Conference on Multimodal Interfaces (ICMI). ACM, Vancouver.
[101]
J. Zhang and V. Patel. 2006. Distributed cognition, representation, and affordance. In I. Dror and S. Harnad, editors, Cognition Distributed: How Cognitive Technology Extends Our Mind, pp. 137--144. John Benjamins, Amsterdam.
[102]
G. Yang, F. Pan, and W. B. Gan. 2009. Stably maintained dendritic spines are associated with lifelong memories. Nature, 462:920--924.
[103]
J. Zhou, K. Yu, F. Chen, Y. Wang, and S. Arshad. 2017. Multimodal behavioral and physiological signals as indicators of cognitive load. In S. Oviatt, B. Schuller, P. Cohen, D. Sonntag, G. Potamianos, and A. Krüger, editors, The Handbook of Multimodal-Multisensor Interfaces, Volume 2: Signal Processing, Architectures, and Detection of Emotion and Cognition. Morgan & Claypool Publishers, San Rafael, CA.
[104]
E. Zoltan-Ford. 1991. How to get people to say and type what computers can understand. International Journal of Man-Machine Studies, 34:527--547.

Published In

The Handbook of Multimodal-Multisensor Interfaces: Foundations, User Modeling, and Common Modality Combinations - Volume 1
April 2017
662 pages
ISBN:9781970001679
DOI:10.1145/3015783

Publisher

Association for Computing Machinery and Morgan & Claypool


Qualifiers

  • Chapter

Appears in

ACM Books


Cited By

  • (2024) Unlocking Understanding: An Investigation of Multimodal Communication in Virtual Reality Collaboration. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, pp. 1-16. DOI: 10.1145/3613904.3642491. Online publication date: 11-May-2024.
  • (2024) Less-is-more: auditory strategies for reduced reality. Personal and Ubiquitous Computing, 28(5):713-725. DOI: 10.1007/s00779-024-01808-6. Online publication date: 15-Jun-2024.
  • (2023) Evaluation of the effectiveness of preschool English learning applications based on touch and voice multimodal interaction technique. Universal Access in the Information Society. DOI: 10.1007/s10209-023-01075-x. Online publication date: 19-Dec-2023.
  • (2023) An Approach to Model Haptic Awareness in Groupware Systems. Human-Computer Interaction, pp. 1-14. DOI: 10.1007/978-3-031-24709-5_1. Online publication date: 22-Jan-2023.
  • (2021) I Know What You Know: What Hand Movements Reveal about Domain Expertise. ACM Transactions on Interactive Intelligent Systems, 11(1):1-26. DOI: 10.1145/3423049. Online publication date: 15-Mar-2021.
  • (2021) Enhanced Accessibility: An Elevator with an Interactive Media Surface. Proceedings of the Design Society, 1:1383-1390. DOI: 10.1017/pds.2021.138. Online publication date: 27-Jul-2021.
  • (2020) 'Pataphysical Software. Proceedings of the 2020 ACM Designing Interactive Systems Conference, pp. 1859-1871. DOI: 10.1145/3357236.3395526. Online publication date: 3-Jul-2020.
  • (2018) Multimodal learning analytics. The Handbook of Multimodal-Multisensor Interfaces, pp. 331-374. DOI: 10.1145/3107990.3108003. Online publication date: 1-Oct-2018.
  • (2018) Multimodal behavioral and physiological signals as indicators of cognitive load. The Handbook of Multimodal-Multisensor Interfaces, pp. 287-329. DOI: 10.1145/3107990.3108002. Online publication date: 1-Oct-2018.
  • (2017) Perspectives on learning with multimodal technology. The Handbook of Multimodal-Multisensor Interfaces, pp. 547-570. DOI: 10.1145/3015783.3015798. Online publication date: 24-Apr-2017.
