ABSTRACT
We have been working on "twirling" interfaces, featuring affordances spun in "padiddle" or "poi" style. The affordances, crafted from mobile devices (smartphones and tablets) embedded into twirlable toys, sense their orientation and allow "mobile-ambient" individual control of a public display, such as a large-format screen. Typically one or two users face such a graphical display and twirl their manipulables, while a representation of them and their respective toys is projected into a fantasy scene. The projected scene is rendered egocentrically, the projection centered on the avatars as a virtual camera orbits around them in an arcing, spin-around "inspection gesture." Besides dorsal, "tethered" viewing perspectives, the projection accommodates frontal, "mirror" perspectives as well as intermediate camera angles. The fantasy scene is not a totally faithful mapping of the real-life "meatspace" scene, since both avatar handedness and affordance phase are adjusted to flatter frontal views and other camera angles. Environmental lighting is deployed in the user space as a token indicating the position of the virtual camera in the fantasy scene relative to the self-identified avatars. Tracking the virtual camera position and the presumed perspective of human users drives supernatural ambidextrous hand-switching and continuous, deliberate distortion of affordance azimuth, parameterized by the unwrapped phase of the orbiting camera. But even with the environmental lighting as an indication of subjective viewpoint, such projection can be difficult for users to understand, so we developed an alternate view of the scene.
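The hand-switching and continuous azimuth distortion, parameterized by the orbiting camera's unwrapped phase, can be sketched roughly as follows. This is an illustrative assumption on our part, not the published formulation: the function name, the cosine-based blending between dorsal and frontal views, and the 0.5 flip threshold are all hypothetical.

```python
import math

def perturbed_affordance(twirl_azimuth, camera_phase):
    """Sketch (hypothetical) of phase-perturbed affordance projection:
    the displayed affordance azimuth and avatar handedness are adjusted
    as the virtual camera orbits, parameterized by its unwrapped phase."""
    # Blend factor: 0 at a dorsal ("tethered") view behind the avatar,
    # 1 at a frontal "mirror" view, varying continuously in between.
    # cos() is periodic, so the unwrapped (accumulating) phase works directly.
    mirror_weight = (1 - math.cos(camera_phase)) / 2
    # Frontal views flip handedness, mirroring the twirl (assumed threshold).
    flip_hand = mirror_weight > 0.5
    # Continuously distort the displayed azimuth toward its mirror image.
    shown_azimuth = (1 - mirror_weight) * twirl_azimuth + mirror_weight * (-twirl_azimuth)
    return shown_azimuth, flip_hand
```

At a dorsal view (`camera_phase = 0`) the azimuth passes through unchanged with no flip; at a frontal view (`camera_phase = π`) the azimuth is fully mirrored and handedness switches.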
To clarify the reality distortion, we introduce a 1st-order scene display of the pre-warped user space, logically interpolating between the 0th-order user space and the 2nd-order fantasy scene, and featuring an exocentric perspective with an essentially fixed camera position (except for interocular displacement to accommodate stereoscopic views). We use Alice (v. 3) for both the 1st- and 2nd-order renderings, associated by mixed-virtuality rigging with affordance-embedded mobile devices through middleware. The 1st-order scene is a more faithful rendering of the user space, stripped of the fantasy scene elements (setting, costume, props, etc.) but highlighting the virtual-camera-controlled avatar ambidexterity and affordance phase modulation with "ghost" appendages. It simulates the orbiting virtual camera, displaying its accumulated phase as a coil and making explicit the unresolved tension ("borrowed" but "unreturned") introduced by the orbit, which manifests as phase perturbation of the projected affordance. Simulation of the environmental lighting is included as well, showing the quadrant-wise determination of the demultiplexed light as the virtual camera sweeps around. The phase coil is portrayed as a helix attached to the orbiting virtual camera, the better to appreciate its unwrapped phase. The camera height is fixed at avatar head height, pitched slightly downward. The idea of meta-scenes and explicit, objective "full citizen" cameras within scenes is not new; it has been used, for instance, in movie blocking simulations and visualization of computer graphics projections. The research described here applies such ideas to mixed-virtuality environments with fluid perspective. Our prototype is contextualized as a proof-of-concept of mobile-ambient interfaces, using personal devices to control public displays, supported by the notions of hierarchies of perspective and subject-object relationships.
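The quadrant-wise determination of the demultiplexed environmental light can be sketched as a simple mapping from the camera's orbit phase to one of four room fixtures. The helper name, the wrap-to-quadrant mapping, and the four-fixture layout are our assumptions for illustration, not details drawn from the system itself.

```python
import math

def active_light(camera_phase, n_lights=4):
    """Sketch (hypothetical) of quadrant-wise light demultiplexing:
    wrap the orbiting camera's unwrapped phase to [0, 2*pi) and map it
    to one of n_lights environmental fixtures, so the lit quadrant
    tokens the virtual camera's bearing relative to the avatar."""
    wrapped = camera_phase % (2 * math.pi)       # discard whole orbits
    sector = 2 * math.pi / n_lights              # angular width per fixture
    return int(wrapped / sector)                 # fixture index 0..n_lights-1
```

Because the phase is wrapped before demultiplexing, multiple accumulated orbits (the tension the phase coil visualizes) select the same fixture; only the 1st-order scene's helix makes the unwrapped total visible.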
- M. Cohen, R. Ranaweera, B. Ryskeldiev, T. Oyama, A. Hashimoto, N. Tsukida, and M. Toshimune. Multimodal mobile-ambient transmedial twirling with environmental lighting to complement fluid perspective with phase-perturbed affordance projection. In SIGGRAPH Asia Symp. on Mobile Graphics and Interactive Applications, Shenzhen, China, Dec. 2014.
Index Terms
- Exocentric Rendering of "Reality Distortion" User Interface to Illustrate Egocentric Reprojection