Involving Hearing, Haptics and Kinesthetics into Non-visual Interaction Concepts for an Augmented Remote Tower Environment

  • Conference paper
Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2019)

Abstract

We investigated the contribution of specific HCI concepts for providing multimodal information to Air Traffic Controllers in the context of Remote Control Towers (i.e., when an airport is controlled from a distant location). We considered interactive spatial sound, tactile stimulation and body movements to design four different interaction and feedback modalities. Each of these modalities was designed to provide a specific solution to typical use cases identified in Air Traffic Control. Sixteen professional Air Traffic Controllers (ATCos) participated in the experiment, which was structured in four distinct scenarios. ATCos were immersed in an ecological setup in which they were asked to control (i) one airport without augmentation modalities, (ii) two airports without augmentations, (iii) one airport with augmentations and (iv) two airports with augmentations; these conditions constituted the four experimental scenarios. Behavioral results showed a significant increase in overall participants’ performance when augmentation modalities were activated in remote control tower operations for one airport.


Notes

  1. In such a case we talk about Remote Tower Centers (RTC), which should emerge at the operational level in the relatively near future.

  2. En-route ATC must be distinguished from approach ATC, the type considered in this study: en-route ATC concerns aircraft in the cruise phase, while approach ATC concerns aircraft in the approach, descent, landing or take-off phases.

  3. The terms embodiment [18] and sense of presence can also be used.

  4. It should be noted that this type of sound cannot normally be heard in a physical control tower; it is therefore an augmentation.

  5. Muret aerodrome, near Toulouse, South of France.

  6. Lasbordes aerodrome, South of France.

  7. This solution was chosen because of demonstration constraints; this way, the platform could be seen and heard by several people at the same time when it was demonstrated.

  8. This can be seen as unsuitable and expensive; however, we could not use an inertial unit because of the magnetic field induced by the two transducers. The HoloLens was only used to measure the participants’ head orientation; its augmented reality features were not used in this experiment.

  9. This transformation consists of computing the arcsine of the square root of the value considered, which must lie between 0 and 1 (see the sketch below).
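A minimal illustration of this variance-stabilizing transform, y = arcsin(√p) for a proportion p in [0, 1], written here in Python with NumPy purely as a sketch (it is not taken from the authors’ analysis code):

    import numpy as np

    def arcsine_sqrt(p):
        # Arcsine square-root transform for proportions in [0, 1].
        p = np.asarray(p, dtype=float)
        if np.any((p < 0.0) | (p > 1.0)):
            raise ValueError("values must lie between 0 and 1")
        return np.arcsin(np.sqrt(p))

    # Example: arcsine_sqrt([0.0, 0.25, 1.0]) -> [0.0, 0.5236, 1.5708] (radians)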

References

  1. Aricó, P., et al.: Human-machine interaction assessment by neurophysiological measures: a study on professional air traffic controllers. In: 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 4619–4622, July 2018. https://doi.org/10.1109/EMBC.2018.8513212

  2. Blattner, M.M., Sumikawa, D.A., Greenberg, R.M.: Earcons and icons: their structure and common design principles (abstract only). SIGCHI Bull. 21(1), 123–124 (1989). https://doi.org/10.1145/67880.1046599

  3. Bolt, R.A.: Gaze-orchestrated dynamic windows. SIGGRAPH Comput. Graph. 15(3), 109–119 (1981). https://doi.org/10.1145/965161.806796

  4. Braathen, S.: Air transport services in remote regions. International Transport Forum Discussion Paper (2011)

  5. Brewster, S., Brown, L.M.: Tactons: Structured tactile messages for non-visual information display. In: Proceedings of the Fifth Conference on Australasian User Interface, AUIC 2004, vol. 28, pp. 15–23. Australian Computer Society Inc., Darlinghurst (2004)

  6. Brewster, S.A., Wright, P.C., Edwards, A.D.N.: Experimentally derived guidelines for the creation of earcons (1995)

  7. Brewster, S.A.: Providing a structured method for integrating non-speech audio into human-computer interfaces. Technical report (1994)

  8. Brungart, D.S., Simpson, B.D.: Auditory localization of nearby sources in a virtual audio display. In: Proceedings of the 2001 IEEE Workshop on the Applications of Signal Processing to Audio and Acoustics (Cat. No. 01TH8575), pp. 107–110, October 2001. https://doi.org/10.1109/ASPAA.2001.969554

  9. Buisson, M., et al.: Ivy: un bus logiciel au service du développement de prototypes de systèmes interactifs [Ivy: a software bus for the development of interactive system prototypes]. In: IHM 2002, 14ème Conférence Francophone sur l’Interaction Homme-Machine, p. 223 (2002)

  10. Calvo, J.: SESAR solution regulatory overview - single airport remote tower. Technical report (2009)

  11. Carter, T., Seah, S.A., Long, B., Drinkwater, B., Subramanian, S.: UltraHaptics: multi-point mid-air haptic feedback for touch surfaces. In: Proceedings of the 26th Annual ACM Symposium on User Interface Software and Technology, UIST 2013, pp. 505–514. ACM, New York (2013). https://doi.org/10.1145/2501988.2502018

  12. Chatty, S.: The Ivy software bus. White paper. www.tls.cena.fr/products/ivy/documentation/ivy.pdf (2003)

  13. Cheng, C.I., Wakefield, G.H.: Introduction to head-related transfer functions (HRTFs): representations of HRTFs in time, frequency, and space. J. Audio Eng. Soc. 49(4), 231–249 (2001)

  14. Cordeil, M., Dwyer, T., Hurter, C.: Immersive solutions for future air traffic control and management. In: Proceedings of the 2016 ACM Companion on Interactive Surfaces and Spaces, ISS Companion 2016, pp. 25–31. ACM, New York (2016). https://doi.org/10.1145/3009939.3009944

  15. Craig, J.C., Sherrick, C.E.: Dynamic tactile displays. In: Tactual Perception: A Sourcebook, pp. 209–233 (1982)

  16. Crispien, K., Fellbaum, K., Savidis, A., Stephanidis, C.: A 3D-auditory environment for hierarchical navigation in non-visual interaction. Georgia Institute of Technology (1996)

  17. Crispien, K., Würz, W., Weber, G.: Using spatial audio for the enhanced presentation of synthesised speech within screen-readers for blind computer users. In: Zagler, W.L., Busby, G., Wagner, R.R. (eds.) ICCHP 1994. LNCS, vol. 860, pp. 144–153. Springer, Heidelberg (1994). https://doi.org/10.1007/3-540-58476-5_117

  18. Csordas, T.J.: Embodiment as a paradigm for anthropology. Ethos 18(1), 5–47 (1990)

  19. Dourish, P.: Where the Action Is. MIT Press, Cambridge (2001)

  20. Erp, J.B.F.V., Veen, H.A.H.C.V., Jansen, C., Dobbins, T.: Waypoint navigation with a vibrotactile waist belt. ACM Trans. Appl. Percept. 2(2), 106–117 (2005). https://doi.org/10.1145/1060581.1060585

  21. Fogtmann, M.H., Fritsch, J., Kortbek, K.J.: Kinesthetic interaction: revealing the bodily potential in interaction design. In: Proceedings of the 20th Australasian Conference on Computer-Human Interaction: Designing for Habitus and Habitat, OZCHI 2008, pp. 89–96. ACM, New York (2008). https://doi.org/10.1145/1517744.1517770

  22. Fürstenau, N. (ed.): Virtual and Remote Control Tower: Research, Design, Development and Validation. RTA. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-28719-5

  23. Fürstenau, N., Schmidt, M., Rudolph, M., Möhlenbrink, C., Papenfuß, A., Kaltenhäuser, S.: Steps towards the virtual tower: remote airport traffic control center (RAiCe). Reconstruction 1(2), 14 (2009)

  24. Gaver, W.W.: The SonicFinder: an interface that uses auditory icons (abstract only). SIGCHI Bull. 21(1), 124 (1989). https://doi.org/10.1145/67880.1046601

  25. Guldenschuh, M., Sontacchi, A.: Application of transaural focused sound reproduction. In: 6th Eurocontrol INO-Workshop 2009 (2009)

  26. Gunther, E., O’Modhrain, S.: Cutaneous grooves: composing for the sense of touch. J. New Music Res. 32(4), 369–381 (2003). https://doi.org/10.1076/jnmr.32.4.369.18856

  27. Gunther, E.E.L.: Skinscape: a tool for composition in the tactile modality. Ph.D. thesis, Massachusetts Institute of Technology (2001)

  28. Hamdan, N.A.h., Wagner, A., Voelker, S., Steimle, J., Borchers, J.: Springlets: expressive, flexible and silent on-skin tactile interfaces. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, CHI 2019, pp. 488:1–488:14. ACM, New York (2019). https://doi.org/10.1145/3290605.3300718

  29. Hart, S.G.: NASA-Task Load Index (NASA-TLX); 20 years later. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 50(9), 904–908 (2006). https://doi.org/10.1177/154193120605000909

  30. Hart, S.G., Staveland, L.E.: Development of NASA-TLX (task load index): results of empirical and theoretical research. In: Hancock, P.A., Meshkati, N. (eds.) Human Mental Workload, Advances in Psychology, vol. 52, pp. 139–183. North-Holland (1988). https://doi.org/10.1016/S0166-4115(08)62386-9

  31. Hermann, T., Hunt, A., Neuhoff, J.G.: The Sonification Handbook. Logos Verlag Berlin, Germany (2011)

  32. Hoshi, T., Takahashi, M., Iwamoto, T., Shinoda, H.: Noncontact tactile display based on radiation pressure of airborne ultrasound. IEEE Trans. Haptics 3(3), 155–165 (2010). https://doi.org/10.1109/TOH.2010.4

  33. Hurter, C., Lesbordes, R., Letondal, C., Vinot, J.L., Conversy, S.: Strip-TIC: exploring augmented paper strips for air traffic controllers. In: Proceedings of the International Working Conference on Advanced Visual Interfaces, AVI 2012, pp. 225–232. ACM, New York (2012). https://doi.org/10.1145/2254556.2254598

  34. Hutchins, E.L., Hollan, J.D., Norman, D.A.: Direct manipulation interfaces. Hum. Comput. Inter. 1(4), 311–338 (1985). https://doi.org/10.1207/s15327051hci0104_2

  35. Keen, P.G., Scott Morton, M.S.: Decision support systems: an organizational perspective. Technical report (1978)

  36. Klemmer, S.R., Hartmann, B., Takayama, L.: How bodies matter: five themes for interaction design. In: Proceedings of the 6th Conference on Designing Interactive Systems, DIS 2006, pp. 140–149. ACM, New York (2006). https://doi.org/10.1145/1142405.1142429

  37. Kramer, G.: Auditory Display: Sonification, Audification, and Auditory Interfaces. Perseus Publishing, New York (1993)

  38. Lackner, J.R., DiZio, P.: Vestibular, proprioceptive, and haptic contributions to spatial orientation. Annu. Rev. Psychol. 56, 115–147 (2005)

  39. Loftin, R.B.: Multisensory perception: beyond the visual in visualization. Comput. Sci. Eng. 5(4), 56–58 (2003). https://doi.org/10.1109/MCISE.2003.1208644

  40. Long Jr., A.C., Landay, J.A., Rowe, L.A.: Implications for a gesture design tool. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 1999, pp. 40–47. ACM, New York (1999). https://doi.org/10.1145/302979.302985

  41. MacLean, K., Enriquez, M.: Perceptual design of haptic icons. In: Proceedings of EuroHaptics, pp. 351–363 (2003)

  42. Mertz, C., Chatty, S., Vinot, J.L.: The influence of design techniques on user interfaces: the DigiStrips experiment for air traffic control. In: HCI-Aero (2000)

  43. Moehlenbrink, C., Papenfuss, A.: ATC-monitoring when one controller operates two airports: research for remote tower centres. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 55, pp. 76–80. Sage Publications, Los Angeles (2011)

  44. Nene, V.: Remote tower research in the United States. In: Fürstenau, N. (ed.) Virtual and Remote Control Tower. RTA, pp. 279–312. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-28719-5_13

  45. Papenfuss, A., Friedrich, M.: Head up only - a design concept to enable multiple remote tower operations. In: 2016 IEEE/AIAA 35th Digital Avionics Systems Conference (DASC), pp. 1–10, September 2016. https://doi.org/10.1109/DASC.2016.7777948

  46. Parés, N., Carreras, A., Soler, M.: Non-invasive attitude detection for full-body interaction in MEDIATE, a multisensory interactive environment for children with autism. In: VMV, pp. 37–45. Citeseer (2004)

  47. Petermeijer, S., Cieler, S., de Winter, J.: Comparing spatially static and dynamic vibrotactile take-over requests in the driver seat. Accid. Anal. Prev. 99, 218–227 (2017). https://doi.org/10.1016/j.aap.2016.12.001

  48. Raj, A.K., Kass, S.J., Perry, J.F.: Vibrotactile displays for improving spatial awareness. Proc. Hum. Factors Ergon. Soc. Ann. Meet. 44(1), 181–184 (2000). https://doi.org/10.1177/154193120004400148

  49. Reynal, M., et al.: Investigating multimodal augmentations contribution to remote control tower contexts for air traffic management. In: Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, HUCAPP, vol. 2, pp. 50–61. INSTICC, SciTePress (2019). https://doi.org/10.5220/0007400300500061

  50. Reynal, M., Imbert, J.P., Aricó, P., Toupillier, J., Borghini, G., Hurter, C.: Audio focus: interactive spatial sound coupled with haptics to improve sound source location in poor visibility. Int. J. Hum. Comput. Stud. 129, 116–128 (2019). https://doi.org/10.1016/j.ijhcs.2019.04.001

  51. Savidis, A., Stephanidis, C., Korte, A., Crispien, K., Fellbaum, K.: A generic direct-manipulation 3D-auditory environment for hierarchical navigation in non-visual interaction. In: Proceedings of the Second Annual ACM Conference on Assistive Technologies, Assets 1996, pp. 117–123. ACM, New York (1996). https://doi.org/10.1145/228347.228366

  52. Schmidt, M., Rudolph, M., Werther, B., Fürstenau, N.: Remote airport tower operation with augmented vision video panorama HMI. In: 2nd International Conference Research in Air Transportation, pp. 221–230. Citeseer (2006)

  53. Schwalk, M., Kalogerakis, N., Maier, T.: Driver support by a vibrotactile seat matrix - recognition, adequacy and workload of tactile patterns in take-over scenarios during automated driving. Procedia Manuf. 3, 2466–2473 (2015). https://doi.org/10.1016/j.promfg.2015.07.507. 6th International Conference on Applied Human Factors and Ergonomics (AHFE 2015) and the Affiliated Conferences, AHFE 2015

  54. Simpson, B.D., Brungart, D.S., Gilkey, R.H., McKinley, R.L.: Spatial audio displays for improving safety and enhancing situation awareness in general aviation environments. Technical report, Wright State University, Department of Psychology, Dayton, OH (2005)

  55. Smith, D., et al.: Overhear: augmenting attention in remote social gatherings through computer-mediated hearing. In: CHI 2005 Extended Abstracts on Human Factors in Computing Systems, CHI EA 2005, pp. 1801–1804. ACM, New York (2005). https://doi.org/10.1145/1056808.1057026

  56. Stiefelhagen, R., Yang, J., Waibel, A.: Estimating focus of attention based on gaze and sound. In: Proceedings of the 2001 Workshop on Perceptive User Interfaces, pp. 1–9. ACM (2001)

  57. Van Erp, J., Jansen, C., Dobbins, T., Van Veen, H.: Vibrotactile waypoint navigation at sea and in the air: two case studies. In: Proceedings of EuroHaptics, pp. 166–173 (2004)

  58. Van Schaik, F., Roessingh, J., Lindqvist, G., Fält, K.: Assessment of visual cues by tower controllers, with implications for a remote tower control centre (2010)

  59. Weick, K.E.: The vulnerable system: an analysis of the Tenerife air disaster. J. Manag. 16(3), 571–593 (1990). https://doi.org/10.1177/014920639001600304

  60. Wenzel, E.M., Arruda, M., Kistler, D.J., Wightman, F.L.: Localization using nonindividualized head-related transfer functions. Acoust. Soc. Am. J. 94, 111–123 (1993). https://doi.org/10.1121/1.407089

  61. Wickens, C.D., Mavor, A.S., McGee, J., Council, N.R., et al.: Panel on human factors in air traffic control automation. In: Flight to the Future: Human Factors in Air Traffic Control (1997)

  62. Wilson, E., et al.: The arcsine transformation: has the time come for retirement? (2013)


Funding and Acknowledgments

This work was co-financed by the European Commission as part of the Horizon 2020 project Sesar-06-2015 “the embodied reMOte TOwer” (MOTO), grant agreement no. 699379. We would like to thank all the ATCos who participated in this experiment during their working time, as well as all the people involved in the MOTO project in France, Italy and the Netherlands.

Author information

Corresponding author

Correspondence to Maxime Reynal.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Reynal, M. et al. (2020). Involving Hearing, Haptics and Kinesthetics into Non-visual Interaction Concepts for an Augmented Remote Tower Environment. In: Cláudio, A., et al. Computer Vision, Imaging and Computer Graphics Theory and Applications. VISIGRAPP 2019. Communications in Computer and Information Science, vol 1182. Springer, Cham. https://doi.org/10.1007/978-3-030-41590-7_4

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-41590-7_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-41589-1

  • Online ISBN: 978-3-030-41590-7

  • eBook Packages: Computer Science, Computer Science (R0)
