Coordinated Motor Display System of ARM-COMS for Evoking Emotional Projection in Remote Communication

  • Conference paper
Human Interface and the Management of Information (HCII 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14015)

Abstract

The authors have proposed a coordinated motor display system called ARM-COMS (ARm-supported eMbodied COMmunication Monitor System), which detects the orientation of the remote subject’s head by face tracking using image-processing technology and physically moves a monitor to mimic that subject. The idea behind this is that the remote subject’s avatar behaves as if the remote subject were present during video communication and interacts with the local subject. In addition, ARM-COMS responds appropriately with voice-driven motion even when the remote subject’s head movements cannot be detected, and its reaction to the local subject’s voice makes it highly responsive. This paper introduces the basic concept of ARM-COMS, describes its development, and explains how the basic procedures were implemented in a prototype system. It then presents the results of teleconferencing experiments using ARM-COMS and discusses the findings obtained from them, including the effect of physical interaction through ARM-COMS, the camera-shake problem, and the evocation of emotional projection in remote communication.
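The coordination described above, tracking the remote subject’s head pose and driving the monitor arm to mimic it, can be sketched roughly as follows. This is a hypothetical illustration, not the authors’ implementation: the pose source, the smoothing factor, and the motion limits are all assumptions, and the exponential smoothing merely stands in for whatever filtering the prototype uses to damp jitter such as the camera-shake problem.

```python
# Hypothetical sketch: map a tracked head pose to smoothed, clamped
# motor targets for a monitor arm. Names and parameters are assumptions.

from dataclasses import dataclass


@dataclass
class HeadPose:
    yaw: float    # degrees, left/right head rotation
    pitch: float  # degrees, up/down head rotation


class ArmController:
    def __init__(self, alpha: float = 0.2, limit: float = 30.0):
        self.alpha = alpha   # smoothing factor in (0, 1]; lower = smoother
        self.limit = limit   # clamp commands to +/- limit degrees
        self.yaw = 0.0
        self.pitch = 0.0

    def update(self, pose):
        """Blend a new pose into the running estimate; hold if tracking fails."""
        if pose is not None:  # face was tracked this frame
            self.yaw += self.alpha * (pose.yaw - self.yaw)
            self.pitch += self.alpha * (pose.pitch - self.pitch)
        clamp = lambda v: max(-self.limit, min(self.limit, v))
        return clamp(self.yaw), clamp(self.pitch)


ctrl = ArmController()
# Two tracked frames, then one frame where the face is lost (None):
for pose in [HeadPose(10.0, 0.0), HeadPose(10.0, 0.0), None]:
    yaw_cmd, pitch_cmd = ctrl.update(pose)
```

Holding the last command when tracking drops out keeps the arm steady rather than snapping back to neutral; a smaller `alpha` trades responsiveness for smoother, less jittery motion.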



Acknowledgement

This work was partly supported by JSPS KAKENHI Grant Number JP22K12131, the Science and Technology Award 2022 of the Okayama Foundation for Science and Technology, and an Original Research Grant 2022 of Okayama Prefectural University. The author would like to acknowledge Dr. Takashi Oyama, Mr. Hiroki Kimachi, Mr. Shuto Misawa, Mr. Kengo Sadakane, and Mr. Tetsuo Kasahara for implementing the basic modules, and all members of the Kansei Information Engineering Labs at Okayama Prefectural University for their cooperation in conducting the experiments.

Author information

Corresponding author

Correspondence to Teruaki Ito.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Ito, T., Watanabe, T. (2023). Coordinated Motor Display System of ARM-COMS for Evoking Emotional Projection in Remote Communication. In: Mori, H., Asahi, Y. (eds) Human Interface and the Management of Information. HCII 2023. Lecture Notes in Computer Science, vol 14015. Springer, Cham. https://doi.org/10.1007/978-3-031-35132-7_28

  • DOI: https://doi.org/10.1007/978-3-031-35132-7_28

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35131-0

  • Online ISBN: 978-3-031-35132-7

  • eBook Packages: Computer Science (R0)
