Abstract
The OMERO 2.0 system (Organized Multimodal Exploration of Relevant Virtual Objects) enables visually impaired users to explore and edit 3D virtual models through three interaction modalities: visual, haptic and auditory. Virtual models are designed to convey the information of interest in a polymorphous and redundant way: users can therefore choose the sensory modalities best suited to their own characteristics, accounting for specific limitations and/or impairments. Virtual models are specially organized to help visually impaired people build an integrated mental scheme of complex realities (cultural heritage objects and sites, large buildings, abstract concepts in fields such as geometry or chemistry, etc.): a challenging task when using a serial sense such as touch. Different semantic layers of the scene (scenarios) convey logically distinct views of the scene at hand and can be selected separately or in combination depending on the user's needs: this prevents users from being overwhelmed by too many simultaneous details. The software tools used in this new version of OMERO increase the generality of the system and support a larger number of haptic devices. Moreover, the completely new Interactive Haptic Editor of OMERO offers an innovative haptic interface: the haptic properties of the virtual models can be edited even without using the GUI. This redundant combination of vision and touch improves efficiency for sighted people and enables visually impaired users (who cannot use a GUI) to autonomously modify the rendering of virtual scenes. This results in their active involvement even in the design phase, improving their ability to match the rendering with their specific and individual needs.
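The abstract's notion of selectable semantic layers (scenarios), each carrying redundant renderings of the same information across modalities, can be illustrated with a minimal sketch. This is not OMERO's actual implementation; the `Layer` and `Scene` names, fields, and example content are all hypothetical, chosen only to show how layers might be enabled separately or in combination to avoid overwhelming the user:

```python
from dataclasses import dataclass

@dataclass
class Layer:
    """One semantic layer (scenario) of the virtual scene.

    The same information is encoded redundantly across modalities,
    so the user can pick the channel best suited to them.
    """
    name: str
    haptic: str = ""
    audio: str = ""
    visual: str = ""

class Scene:
    """A multimodal scene whose layers can be enabled independently."""

    def __init__(self, layers):
        self.layers = {layer.name: layer for layer in layers}
        self.active = set()

    def select(self, *names):
        """Enable only the requested layers (separately or in combination)."""
        self.active = {n for n in names if n in self.layers}

    def render(self, modality):
        """Return the chosen modality's content for each active layer only."""
        return {n: getattr(self.layers[n], modality)
                for n in sorted(self.active)}

# Hypothetical example: a building model with a structural and a
# decorative layer; exploring structure first avoids detail overload.
scene = Scene([
    Layer("walls", haptic="rigid surfaces", audio="collision tone"),
    Layer("frescoes", haptic="fine texture", audio="verbal description"),
])
scene.select("walls")
print(scene.render("haptic"))
```

Selecting `"walls"` alone renders only the structural layer; calling `select("walls", "frescoes")` later combines both views, mirroring the incremental construction of a mental scheme described above.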
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this paper
Palieri, M., Guaragnella, C., Attolico, G. (2018). Omero 2.0. In: De Paolis, L., Bourdot, P. (eds) Augmented Reality, Virtual Reality, and Computer Graphics. AVR 2018. Lecture Notes in Computer Science, vol. 10850. Springer, Cham. https://doi.org/10.1007/978-3-319-95270-3_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-95269-7
Online ISBN: 978-3-319-95270-3
eBook Packages: Computer Science (R0)