Abstract
Mobile users lose a lot of time on their smartphones. They interact with them even with busy hands (hands-free interactions) or distracted eyes (eyes-free interactions), and in a variety of everyday situations (while walking, eating, working, etc.). Time Well Spent (TWS) is a movement that aims to design applications that respect users' choices and availability. In this paper, we discuss how multimodal mobile interactions and the TWS movement can support each other to protect users' time. We begin with an overview of mobile multimodal interaction and an outline of the TWS concept. We then present our vision of mobile multimodality as a means of protecting users' time. We show that multimodality can support TWS by encouraging self-restraint, while TWS supports multimodality by making the interaction modalities meaningful. Finally, we discuss our future work in this context.
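To make the abstract's thesis concrete, here is a minimal sketch (not from the paper) of how a TWS-aware application might combine the two ideas: the user's self-declared time budget and availability drive a modality choice, so the app either defers content or picks a less intrusive modality suited to the user's situation. All names here (UserPreferences, choose_modality, etc.) are hypothetical illustrations, not an API described in the paper.

```python
from dataclasses import dataclass

@dataclass
class UserPreferences:
    """Self-declared availability, in the spirit of Time Well Spent (TWS)."""
    daily_budget_min: int   # how long the user wants to spend per day
    do_not_disturb: bool    # user is busy (e.g., working, eating)

def choose_modality(prefs: UserPreferences, minutes_used: int,
                    hands_busy: bool, eyes_busy: bool) -> str:
    """Pick an output modality that respects the user's time and situation.

    Returns one of: 'defer', 'haptic', 'audio', 'visual'.
    """
    # TWS side: once the self-set budget is spent, stop competing for attention.
    if prefs.do_not_disturb or minutes_used >= prefs.daily_budget_min:
        return "defer"    # hold the content until the user is available again
    # Multimodality side: match the modality to the user's current situation.
    if eyes_busy and hands_busy:
        return "haptic"   # minimal cue that is both eyes-free and hands-free
    if eyes_busy:
        return "audio"    # short eyes-free summary instead of a screen
    return "visual"       # default immersive modality

if __name__ == "__main__":
    prefs = UserPreferences(daily_budget_min=30, do_not_disturb=False)
    print(choose_modality(prefs, minutes_used=12, hands_busy=False, eyes_busy=True))   # audio
    print(choose_modality(prefs, minutes_used=45, hands_busy=False, eyes_busy=False))  # defer
```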