
Multi-robot Interaction with Mixed Reality for Enhanced Perception

  • Conference paper
  • In: Smart Applications and Data Analysis (SADASC 2024)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 2168)


Abstract

The growing development of Mixed Reality (MR) simulators in diverse areas such as vehicle operation, sports, and healthcare underscores the importance of high-quality immersion. Currently, the immersion quality of these simulators is evaluated solely through post-simulation feedback from users, typically gathered with subjective questionnaires. However, this method fails to capture the data needed to estimate the user's self-motion perception, which is crucial for optimizing immersion non-empirically. To address this gap, we propose a human homothetic perception framework that replicates human self-motion perception capabilities within a multimodal robotic system in MR environments. It combines a hybrid Gough-Stewart platform with a human-substitute NAO robot and recreates motion perception by integrating visual and vestibular information in a human perception model. We demonstrate this framework using trajectories from a Unity-based virtual ski simulator designed for the rehabilitation of individuals with disabilities. Immersion quality is evaluated by comparing the motion perception errors of the human model against the virtual trajectories. This approach offers the first calibration tool for MR applications that eliminates the need for human feedback.
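
To make the evaluation criterion concrete, the sketch below shows one plausible way to score immersion from a simulator trajectory: a toy first-order visual-vestibular blend produces a perceived self-motion signal, which is then compared to the reference velocity profile with an RMSE. The filter structure, time constants, function names (perceived_self_motion, immersion_error), and synthetic data are illustrative assumptions, not the authors' published perception model.

import numpy as np

def perceived_self_motion(a_ref, v_visual, dt, tau_vest=5.0, tau_vis=0.5, k_vis=0.8):
    # Toy visual-vestibular integration (assumption: a first-order complementary
    # blend, NOT the authors' exact perception model).
    #   a_ref    : reference linear acceleration from the virtual trajectory (m/s^2)
    #   v_visual : optic-flow-derived velocity cue (m/s)
    #   dt       : sample period (s)
    perceived = np.zeros_like(a_ref)
    vest = 0.0  # leaky-integrated vestibular velocity estimate (otolith-like)
    vis = 0.0   # low-pass filtered visual velocity estimate (optic-flow-like)
    for i in range(1, len(a_ref)):
        vest += dt * (a_ref[i] - vest / tau_vest)   # vestibular channel: fades under sustained motion
        vis += dt * (v_visual[i] - vis) / tau_vis   # visual channel: first-order low-pass
        perceived[i] = (1.0 - k_vis) * vest + k_vis * vis
    return perceived

def immersion_error(v_ref, v_perceived):
    # Immersion-quality proxy: RMSE between the virtual trajectory's velocity
    # profile and the modelled perceived self-motion.
    return float(np.sqrt(np.mean((v_ref - v_perceived) ** 2)))

# Usage on a synthetic, ski-like trajectory (hypothetical data).
dt = 0.01
t = np.arange(0.0, 10.0, dt)
v_ref = 5.0 * np.sin(0.5 * t)                      # reference velocity from the simulator
a_ref = np.gradient(v_ref, dt)                     # corresponding acceleration
v_visual = v_ref + 0.2 * np.random.randn(len(t))   # noisy optic-flow velocity cue
v_perceived = perceived_self_motion(a_ref, v_visual, dt)
print("motion perception error (RMSE, m/s):", immersion_error(v_ref, v_perceived))

In this sketch the vestibular channel leak-integrates acceleration (so sustained motion fades, as with otolith cues), the visual channel low-pass filters an optic-flow velocity estimate, and immersion quality is summarized as the RMSE between the perceived and reference velocity profiles, mirroring the paper's idea of scoring immersion by motion perception error rather than by questionnaires.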



Author information

Correspondence to Taha Houda.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Houda, T., Amouri, A., Beghdadi, A., Beji, L. (2024). Multi-robot Interaction with Mixed Reality for Enhanced Perception. In: Hamlich, M., Dornaika, F., Ordonez, C., Bellatreche, L., Moutachaouik, H. (eds) Smart Applications and Data Analysis. SADASC 2024. Communications in Computer and Information Science, vol 2168. Springer, Cham. https://doi.org/10.1007/978-3-031-77043-2_3
