
Time Well Spent with multimodal mobile interactions

Original Paper · Journal on Multimodal User Interfaces

Abstract

Mobile users waste a great deal of time on their smartphones. They interact with them even with busy hands (hands-free interaction), with distracted eyes (eyes-free interaction), and in many different life situations (while walking, eating, working, etc.). Time Well Spent (TWS) is a movement that aims to design applications that respect users' choices and availability. In this paper, we discuss how multimodal mobile interaction and the TWS movement can support each other to protect users' time. We begin with an overview of mobile multimodal interaction and an introduction to the TWS concept. We then present our vision of mobile multimodality as a means of protecting users' time. We show that multimodality can support TWS by encouraging self-restraint, while TWS supports multimodality by making interaction modalities meaningful. Finally, we outline our future work in this context.




Author information

Correspondence to Nadia Elouali.



About this article


Cite this article

Elouali, N. Time Well Spent with multimodal mobile interactions. J Multimodal User Interfaces 13, 395–404 (2019). https://doi.org/10.1007/s12193-019-00310-1

