DOI: 10.1145/3050385.3050390

Why and how to study multimodal interaction in cockpit design

Published: 06 July 2016

Abstract

Technological evolution opens up new ways of interacting and broadens the design perspectives for the use of multimodality in the future cockpit. Adaptive multimodality is expected to provide more natural and intuitive interactions and to increase pilots' physical and cognitive performance within a given context. However, this augmented set of input/output modalities and their possible combinations greatly increases the number of design solutions to be considered. Design consequently becomes more complex and requires more iterative loops to think through, develop, and validate interaction concepts. There is therefore a strong need (1) to define a method that marks out the boundaries of the design space by providing recommendations and guidelines, and (2) to develop a multi-agent platform to iterate quickly on potentially interesting multimodal design solutions that could enhance human performance in future cockpits. This paper presents a preliminary approach to addressing multimodal interaction in cockpit design.
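
To make the abstract's combinatorial point concrete, here is a minimal illustrative sketch (the modality names and the counting model are assumptions for illustration, not taken from the paper): even a handful of freely combinable input and output modalities yields hundreds of candidate pairings per interaction task, before any temporal or semantic combination rules are considered.

    from itertools import combinations, product

    # Hypothetical modality sets, chosen only to illustrate the growth of
    # the design space; they are not the paper's inventory.
    inputs = ["touch", "voice", "gaze", "gesture", "rotary knob"]
    outputs = ["visual display", "3D audio", "haptic feedback"]

    def nonempty_subsets(items):
        """Yield every non-empty subset of `items`."""
        for r in range(1, len(items) + 1):
            yield from combinations(items, r)

    # Each design candidate pairs one subset of input modalities with one
    # subset of output modalities.
    candidates = list(product(nonempty_subsets(inputs), nonempty_subsets(outputs)))
    print(len(candidates))  # (2**5 - 1) * (2**3 - 1) = 31 * 7 = 217 per task

Each added modality doubles the number of input (or output) subsets, which is why the authors argue both for guidelines that bound the design space and for a platform that supports rapid iteration over the remaining candidates.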


Cited By

  • (2023) Advanced Multimodal Interfaces Design Using Speech Control. E3S Web of Conferences, 446, 05006. DOI: 10.1051/e3sconf/202344605006. Online publication date: 10 Nov 2023.
  • (2021) Control Rooms from a Human-Computer Interaction Perspective. Sense, Feel, Design, 281--289. DOI: 10.1007/978-3-030-98388-8_25. Online publication date: 30 Aug 2021.
  • (2018) IngeScape. Proceedings of the 16th Ergo'IA "Ergonomie Et Informatique Avancée" Conference, 1--8. DOI: 10.1145/3317326.3317330. Online publication date: 3 Oct 2018.


Published In

Ergo'IA '16: Proceedings of the 15th Ergo'IA "Ergonomie Et Informatique Avancée" Conference
July 2016
163 pages
ISBN:9781450347853
DOI:10.1145/3050385

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. HCI theory
  2. cockpit design
  3. concepts and models
  4. interaction paradigms
  5. multimodality

Qualifiers

  • Research-article

Conference

Ergo'IA 2016
July 6 - 8, 2016
Bidart, France
