
DRCmpVis: Visual Comparison of Physical Targets in Mobile Diminished and Mixed Reality


Abstract:

Numerous physical objects in our daily lives are grouped or ranked according to a stereotyped presentation style. For example, in a library, books are typically grouped and ranked by classification number. However, for better comparison, we often need to re-group or re-rank the books using additional attributes such as rating, publisher, comments, publication year, keywords, or price, or a combination of these factors. In this article, we propose a novel mobile DR/MR-based application framework named DRCmpVis to achieve in-context multi-attribute comparisons of physical objects bearing text labels or textual information. The physical objects are scanned in the real world using mobile cameras. All scanned objects are then segmented and labeled by a convolutional neural network and replaced (diminished) by their virtual avatars in a DR environment. We formulate three visual comparison strategies, filtering, re-grouping, and re-ranking, which can be intuitively, flexibly, and seamlessly performed on the avatars. This approach avoids breaking the original layouts of the physical objects. The computational resources of the virtual space can be fully utilized to support efficient object searching and multi-attribute visual comparisons. We demonstrate the usability, expressiveness, and efficiency of DRCmpVis through a user study, a NASA TLX assessment, a quantitative evaluation, and case studies involving different scenarios.
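The three comparison strategies operate on the virtual avatars rather than the physical objects themselves, so they reduce to ordinary queries over attribute records. The sketch below is purely illustrative and not the authors' implementation: the `Avatar` class and the book records are hypothetical stand-ins for the CNN-labeled scan results, with filtering, re-grouping, and re-ranking expressed as list operations over their attribute dictionaries.

```python
from dataclasses import dataclass

@dataclass
class Avatar:
    """Hypothetical virtual stand-in for a scanned, CNN-labeled physical object."""
    title: str
    attrs: dict  # arbitrary attributes: rating, publisher, year, price, ...

# Illustrative records for three scanned books.
books = [
    Avatar("A", {"rating": 4.5, "year": 2019, "publisher": "P1", "price": 30}),
    Avatar("B", {"rating": 3.8, "year": 2021, "publisher": "P2", "price": 25}),
    Avatar("C", {"rating": 4.9, "year": 2019, "publisher": "P1", "price": 40}),
]

def filter_avatars(avatars, predicate):
    """Filtering: keep only avatars whose attributes satisfy the predicate."""
    return [a for a in avatars if predicate(a.attrs)]

def regroup(avatars, key):
    """Re-grouping: bucket avatars by one attribute, independent of physical layout."""
    groups = {}
    for a in avatars:
        groups.setdefault(a.attrs[key], []).append(a)
    return groups

def rerank(avatars, keys, reverse=True):
    """Re-ranking: order avatars by a combination of attributes."""
    return sorted(avatars, key=lambda a: tuple(a.attrs[k] for k in keys),
                  reverse=reverse)

# Filter to highly rated books, group by year, rank by rating.
top = filter_avatars(books, lambda d: d["rating"] >= 4.0)       # A, C
by_year = regroup(books, "year")                                 # {2019: [A, C], 2021: [B]}
ranked = rerank(books, ["rating"])                               # C, A, B
```

Because the avatars are plain data, any combination of these operations can be applied without disturbing the real-world arrangement of the objects, which is the point of performing the comparison in the diminished-reality layer.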
Published in: IEEE Transactions on Visualization and Computer Graphics ( Volume: 30, Issue: 12, December 2024)
Page(s): 7672 - 7686
Date of Publication: 25 January 2024

PubMed ID: 38271164
