DOI: 10.1145/2788940.2794357
Exocentric Rendering of "Reality Distortion" User Interface to Illustrate Egocentric Reprojection

Published: 08 August 2015

ABSTRACT

We have been working on "twirling" interfaces, featuring affordances spun in "padiddle" or "poi" style. The affordances, crafted from mobile devices (smartphones and tablets) embedded into twirlable toys, sense their orientation and allow "mobile-ambient" individual control of a public display, such as a large-format screen. Typically one or two users face such a display and twirl their manipulables while a representation of them and their respective toys is projected into a fantasy scene. The projected scene is rendered egocentrically by centering the projection on the avatars as a virtual camera orbits around them in an arcing, spin-around "inspection gesture." Besides dorsal, "tethered" viewing perspectives, the projection is intended for frontal, "mirror" perspectives, as well as intermediate camera angles. The fantasy scene is not a totally faithful mapping of the real-life "meatspace" scene, since both avatar handedness and affordance phase are adjusted to flatter frontal views and other camera angles. Environmental lighting is deployed in the user space as a token indicating the position of the virtual camera in the fantasy scene relative to the self-identified avatars. Awareness of the virtual camera position and the presumed perspective of the human users drives supernatural ambidextrous hand-switching and continuous, deliberate distortion of affordance azimuth, parameterized by the unwrapped phase of the orbiting camera. But even with the environmental lighting as an indication of subjective viewpoint, such a projection can be difficult for users to understand, so we developed an alternate view of the scene.
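The coupling of unwrapped camera phase to azimuth distortion, handedness switching, and environmental lighting might be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the quadrant numbering, and the "frontal at π" handedness convention are all assumptions, and the camera azimuth is assumed to arrive in radians.

```python
import math

def unwrap_phase(prev_unwrapped, prev_wrapped, wrapped):
    """Accumulate the camera's orbit angle without the 2*pi jump,
    so the total ("borrowed but unreturned") phase keeps growing."""
    delta = wrapped - prev_wrapped
    if delta > math.pi:
        delta -= 2 * math.pi
    elif delta < -math.pi:
        delta += 2 * math.pi
    return prev_unwrapped + delta

def lighting_quadrant(camera_azimuth):
    """Quadrant-wise demultiplexing: map the wrapped azimuth to one of
    four environmental lights (0..3), indicating camera position."""
    wrapped = camera_azimuth % (2 * math.pi)
    return int(wrapped // (math.pi / 2))

def mirrored(camera_azimuth):
    """Hypothetical handedness rule: frontal ("mirror") perspectives flip
    the avatar's hands; assumes dorsal at azimuth 0, frontal at pi."""
    wrapped = camera_azimuth % (2 * math.pi)
    return math.cos(wrapped) < 0
```

A controller would call `unwrap_phase` each frame with the camera's previous and current wrapped azimuths, feed the running total to the affordance-azimuth distortion, and use `lighting_quadrant` to select which room light to illuminate.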
To clarify the reality distortion, we introduce a 1st-order scene display of the pre-warped user space, logically interpolating between the 0th-order user space and the 2nd-order fantasy scene, featuring an exocentric perspective with an essentially fixed camera position (except for interocular displacement to accommodate stereoscopic views). We use Alice (v. 3) for both the 1st- and 2nd-order renderings, associated by mixed-virtuality rigging with the affordance-embedded mobile devices through middleware. The 1st-order scene is a more faithful rendering of the user space, stripped of the fantasy-scene elements (setting, costume, props, etc.), but highlighting the virtual-camera-controlled avatar ambidexterity and affordance phase modulation with "ghost" appendages. It features a simulation of the orbiting virtual camera, displaying its accumulated phase as a coil, making explicit the unresolved tension ("borrowed" but "unreturned") introduced by the orbit and manifesting as the consequent phase perturbation of the projected affordance. A simulation of the environmental lighting is included as well, showing the quadrant-wise determination of the demultiplexed light as the virtual camera sweeps around. The phase coil is portrayed as a helix attached to the orbiting virtual camera, the better to appreciate its unwrapped phase. The camera is fixed at the height of the avatar's head, pitched slightly downward. The idea of meta-scenes and explicit, objective "full citizen" cameras within scenes is not new; it has been used, for instance, in movie blocking simulations and visualization of computer graphics projections. The research described here applies such ideas to mixed-virtuality environments with fluid perspective. Our prototype is contextualized as a proof-of-concept of mobile-ambient interfaces, using personal devices to control public displays, and supported by the notions of hierarchies of perspective and subject-object relationships.
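The phase coil attached to the orbiting camera might be generated as in this hypothetical sketch, which samples a helix whose total sweep equals the camera's accumulated (unwrapped) orbit phase, with height rising per radian so multiple turns stack visibly. The radius, pitch, and sampling density are illustrative assumptions, not values from the prototype.

```python
import math

def phase_coil_points(unwrapped_phase, radius=1.0, pitch=0.05,
                      samples_per_turn=64):
    """Sample a helix sweeping through the camera's unwrapped orbit phase.
    Each point is (x, y, z); z grows linearly with angle, so a camera that
    has orbited twice produces a coil two turns tall."""
    n = max(2, int(abs(unwrapped_phase) / (2 * math.pi) * samples_per_turn) + 1)
    pts = []
    for i in range(n):
        theta = unwrapped_phase * i / (n - 1)
        pts.append((radius * math.cos(theta),
                    radius * math.sin(theta),
                    pitch * theta))  # height encodes accumulated radians
    return pts
```

Such a point list could be handed to any polyline or tube renderer; after two full orbits (`4 * math.pi`) the coil ends directly above its start, making the "unreturned" turns explicit.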

Published in: SUI '15: Proceedings of the 3rd ACM Symposium on Spatial User Interaction, August 2015, 152 pages. ISBN: 9781450337038. DOI: 10.1145/2788940.

Copyright © 2015 Owner/Author

Publisher: Association for Computing Machinery, New York, NY, United States



