ABSTRACT
Traveling to different places simultaneously is a dream for many people, but physical constraints make this aspiration difficult to realize. Virtual reality technologies can help alleviate these limits. To the best of the authors’ knowledge, no previous study has attempted to operate multiple telepresence robots in remote places simultaneously while presenting walk-sensation feedback to the operator for an immersive multi-space experience. In this study, we used two autonomous mobile robots, a dog-type and a wheel-type, whose movement direction can be controlled by an operator (Fig. 1). The operator can alternately choose/re-choose the space (robot) to attend to and can move the viewpoint using a head-mounted display (HMD) controller. Live video at 4K resolution is transmitted to the HMD over a Web Real-Time Communication (WebRTC) network from a 360° camera mounted on top of each robot. The operator perceives viewpoint-movement feedback as a visual cue and as a vestibular sensation via waist motion and leg proprioception. Our system also supports viewpoint sharing, in which up to fifty users can enjoy omnidirectional viewing of the remote environments through HMDs, although without walk-like sensation feedback.
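The abstract describes an operator attending to one of several robot-mediated spaces at a time and switching between them, while passive viewers share the viewpoint without walk-sensation feedback. The sketch below illustrates that selection logic only; the class and field names (`RobotFeed`, `MultiSpaceSession`, the stream URLs) are hypothetical and not taken from the paper's implementation.

```python
from dataclasses import dataclass


@dataclass
class RobotFeed:
    """One telepresence robot streaming 360-degree video (illustrative model)."""
    name: str
    stream_url: str          # e.g. a WebRTC signaling endpoint; hypothetical
    walk_feedback: bool = True  # walk-sensation feedback is for the operator only


class MultiSpaceSession:
    """Tracks which remote space (robot) the operator is currently attending to.

    The operator may choose/re-choose a robot at any time; viewers who join in
    viewpoint-sharing mode would receive video only, without walk-like feedback.
    """

    def __init__(self, feeds):
        self.feeds = {f.name: f for f in feeds}
        self.active = None

    def attend(self, name):
        """Switch the operator's attention to the named robot's space."""
        if name not in self.feeds:
            raise KeyError(f"unknown robot: {name}")
        self.active = name
        return self.feeds[name]


dog = RobotFeed("dog", "wss://example.invalid/dog")
wheel = RobotFeed("wheel", "wss://example.invalid/wheel")
session = MultiSpaceSession([dog, wheel])
session.attend("dog")        # operator enters the dog robot's space
session.attend("wheel")      # ...and re-chooses the wheel robot's space
print(session.active)
```

In the actual system the switch would also re-route the WebRTC video stream and re-engage the waist-motion and leg-proprioception feedback for the newly chosen space; this sketch models only the selection state.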