
Multi-modal natural interaction in game design: a comparative analysis of player experience in a large scale role-playing game

  • Original Paper
  • Published in: Journal on Multimodal User Interfaces

Abstract

Previous player experience research has focused on identifying the major factors involved in content creation and interaction. This has encouraged large investments in new types of physical interaction artefacts (e.g. Wiimote™, Rock Band™, Kinect™). However, these artefacts still require custom interaction schemes to be developed for them, which critically limits the number of commercial videogames and multimedia applications that can benefit from them. Moreover, there is currently no agreement on which factors best describe the impact that natural, complex multi-modal interaction schemes have on users’ experiences; this gap is created in part by the difficulty of adapting this type of interaction to existing software. This paper therefore presents a generic middleware framework for multi-modal natural interfaces that enables game-independent data acquisition and encourages further advances in this domain. The framework can redefine the interaction scheme of any software tool by mapping body poses and voice commands to traditional input means (keyboard and mouse). We focus on digital games, where physical interaction artefacts have become mainstream. The tool was validated through a series of stress tests of increasing difficulty with a total of 25 participants. In addition, a pilot study conducted on a further 16 subjects demonstrated a mainly positive impact of natural interfaces on player experience. These results were obtained while subjects played a complex commercial role-playing game whose mechanics were adapted using our framework; statistical tests on the obtained Fun ratings, together with subjective participant opinions, indicate that this kind of natural interaction has a significant impact on player experience and enjoyment. However, different impact patterns emerge from this analysis, and these seem to fit standing theories of player experience and immersion.
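The core idea of the middleware described above can be sketched as a mapping layer that translates recognizer outputs (pose or voice labels) into traditional input events, independently of the target game. The sketch below is illustrative only: the class and function names (`PoseMapper`, `InputEvent`, `on_recognized`) are hypothetical and not taken from the paper, and a real deployment would replace the event sink with an OS-level input injector.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict


@dataclass(frozen=True)
class InputEvent:
    """A traditional input action the middleware synthesizes."""
    device: str  # "keyboard" or "mouse"
    action: str  # e.g. "press:W", "click:left"


@dataclass
class PoseMapper:
    """Game-independent layer mapping pose/voice labels to input events.

    The sink is any callable that consumes an InputEvent; in practice it
    would inject the event at OS level so any game receives it unmodified.
    """
    bindings: Dict[str, InputEvent] = field(default_factory=dict)
    sink: Callable[[InputEvent], None] = print

    def bind(self, label: str, event: InputEvent) -> None:
        self.bindings[label] = event

    def on_recognized(self, label: str) -> None:
        # Called by the pose/speech recognizer; unknown labels are ignored,
        # so the game only ever sees inputs that were explicitly bound.
        event = self.bindings.get(label)
        if event is not None:
            self.sink(event)


# Example: a lean-forward pose walks forward; the voice command "attack"
# triggers a left mouse click.
mapper = PoseMapper()
mapper.bind("lean_forward", InputEvent("keyboard", "press:W"))
mapper.bind("voice:attack", InputEvent("mouse", "click:left"))
```

Because the mapping is defined per game profile rather than per game engine, the same recognizer front-end can be reused across titles, which is the game-independence property the abstract emphasizes.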



Notes

  1. When we cite the FAAST framework throughout this paper, we always refer to the version available in early 2012.

  2. The Kinect was introduced in 2009 through a tech demo called Project Milo, which explored a completely new interaction paradigm.

  3. OpenInterface is available at http://www.openinterface.org/.

  4. Kinect Open Interface is available at http://koi.codeplex.com/.


Acknowledgments

This research was partially funded by the Ph.D. grants with references SFRH/BD/71598/2010, SFRH/BD/77688/2011 and SFRH/BD/73607/2010.

Author information


Corresponding author

Correspondence to Luís Filipe Teófilo.


About this article


Cite this article

Nogueira, P.A., Teófilo, L.F. & Silva, P.B. Multi-modal natural interaction in game design: a comparative analysis of player experience in a large scale role-playing game. J Multimodal User Interfaces 9, 105–119 (2015). https://doi.org/10.1007/s12193-014-0172-1
