Abstract:
Tangible user interfaces are frequently utilized in projection-based VR to merge virtual space into real large-scale environments. In this work, we introduce a LIDAR-based touch system that enables tangible touch interactions with five surfaces (four walls and the floor) of a large room for projection-based VR. We develop a network of LIDAR distance sensors that identifies touch events from multiple human users by detecting surrounding obstacles omnidirectionally. We experiment with single-user and multi-user, multi-touch interaction scenarios in our specialized room-scale setting. As a result, we verified that our LIDAR network can provide appropriate feedback in real-time, with delay below the threshold of human perception.
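The core idea of surface touch detection with a scanning LIDAR can be illustrated with a short sketch (this is a hypothetical illustration, not the authors' implementation): a sensor sweeps a plane just above a wall or floor, and a reading noticeably shorter than the calibrated empty-room baseline at that angle indicates an object, such as a fingertip, intersecting the plane. The function name, threshold values, and data layout below are all assumptions for illustration.

```python
import math

def detect_touches(baseline, scan, min_drop=0.05, max_range=5.0):
    """Flag touch points by comparing a LIDAR scan to an empty-room baseline.

    baseline, scan: lists of (angle_rad, distance_m) pairs sampled at the
    same angles. A return that is shorter than the baseline by more than
    min_drop metres means an obstacle entered the scan plane hugging the
    surface; its position is recovered from the polar reading.
    Returns a list of (x, y) touch coordinates in the sensor's frame.
    """
    touches = []
    for (angle, base_d), (_, d) in zip(baseline, scan):
        if d < max_range and base_d - d > min_drop:
            # Convert the shortened polar return to Cartesian coordinates.
            touches.append((d * math.cos(angle), d * math.sin(angle)))
    return touches
```

In a multi-sensor network like the one described above, each sensor would report touches in its own frame, and a central node would transform them into room coordinates and merge detections from overlapping coverage.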
Published in: 2022 13th International Conference on Information and Communication Technology Convergence (ICTC)
Date of Conference: 19-21 October 2022
Date Added to IEEE Xplore: 25 November 2022