1 Introduction

The Advanced Exploitation of Mixed Reality Community of Interest (AEMR COI) is an initiative that began as an effort by the Battlespace Exploitation of Mixed Reality (BEMR) Laboratory at SSC Pacific to create a community of interest in the research area of mixed reality technology, spanning the spectrum from virtual reality to augmented reality. The AEMR COI is designed to include all government and non-government organizations or entities holding a significant stake in mixed reality activities, including smart avatar development, augmented and virtual reality research and development, and any related mixed reality technology. The goal of the AEMR COI is to discuss, explore, and demonstrate augmented and virtual reality technology, applications, standards, and use cases, and to provide guidance and recommendations that ensure effective and efficient community interoperability and collaboration. The community started in June 2015 and has quickly grown to over 250 representatives from organizations in nine different countries across government, industry, and academia. The community is organized into four major working groups focused on User Requirements, Technology, Human Performance, and Smart Avatars. Emerging working groups include Models, Information Exchange, and Reference Implementation Frameworks.

2 Community Approach

The community intends to further evolve science and technology in the virtual reality and mixed reality domain by: (1) Encouraging scientists to discuss current developments, immediate and potential uses, advantages, and limitations; (2) Establishing potential collaborations; and (3) Identifying community standards such as interoperability standards and technology evaluation tools. This community also intends to identify and evolve potential research areas for further development by: (1) Establishing a platform for research sponsors to present their needs to the scientific community; (2) Developing research gap analyses in areas such as smart avatars, human performance, and technology assessment; and (3) Generating research ideas and proposals to foster innovative research.

3 Mission and Venue

The community's mission is to foster innovation, collaboration, and interoperability among its members, and to guide the development of new technologies toward areas of interest. The community intends to provide a collaborative environment in which to explore rapidly evolving mixed reality technologies and to facilitate their application in different domains. The mission includes facilitating the implementation of mixed reality technology in operational and support (maintenance, logistics, and training) environments by demonstrating those technologies and applications to industry leaders, technical specialists, academia, and government program managers and officials. The community will assess the technical requirements necessary to bring these capabilities to the operational domain and to the support tasks that would benefit substantially from them.

The AEMR COI uses the All Partners Access Network (APAN) site to host its community. APAN is the Unclassified Information Sharing Service for the U.S. Department of Defense. Information about the community, related videos, documentation, and wikis are hosted on this website, along with discussions, news, updates, and other important information. Meetings are held via a teleconference line, with screen sharing offered through Defense Collaboration Services (DCS). Members are kept informed via email subscription to the different working groups and the overall group distribution list.

4 Working Groups

There are four active working groups, which concentrate their efforts on determining use cases and vignettes, evaluating current technologies, identifying research gaps, determining useful metrics and tools to evaluate performance with VR technologies, and developing smart avatar technology. These four working groups were developed to support each other in their mission to enhance the community's knowledge and practice in this research domain.

4.1 The User Requirements Working Group (URWG)

The User Requirements Working Group unites experts in industry, academia, and defense to explore how virtual and augmented technologies are, or can be, applied to real-world problems to enhance human performance. Core themes have centered on disaster relief, manufacturing, medical, and military applications. The group is currently collaborating to document use cases and vignettes for different areas and applications, as well as to highlight the research gaps for each application. Discussions include ongoing work in the medical field. For instance, at the Naval Health Research Center, virtual reality technology is used in operationally relevant ways to define the capabilities and limits of the warfighter, as well as to help wounded warriors heal. The Computer Assisted Rehabilitation Environment (CAREN) system is being used to enhance the performance of injured and healthy warfighters. This includes measuring the cognitive and physical performance of persons with lower limb amputation and traumatic brain injury, as well as of healthy warfighters who may show performance deficits due to worn gear (PPE), carried loads, or fatigue. The researchers take what they learn in the lab to the field, where only a handful of things can be measured, and use this technique to draw parallels between the two phases of testing [1, 2] (Fig. 1).

Fig. 1. The CAREN system being used at the Naval Health Research Center [1]

In the maintenance world, considerable effort is now being devoted to this technology. This is due, in large part, to the introduction of augmented reality HMDs (e.g., the Microsoft HoloLens, PLATS, and other types) in which the Computer Generated Imagery (CGI) is either presented on a display system with see-through lenses (or on a display-visor) such that the CGI is superimposed over the real world, or the CGI is combined with real-world imagery from HMD-mounted cameras. In other application areas, such as disaster relief, the participants in the system can be in different physical locations while collaborating on bringing in rescue teams and coordinating resource distribution. The shared environment database is transmitted to each participant prior to the meeting and stored locally for use when needed. This approach dramatically reduces bandwidth requirements, requiring only the real-time transmission of avatar movement and position data, interactions with the virtual environment (including the addition of features and objects), and voice communications (Fig. 2).

Fig. 2. Maintenance application demonstrations at the BEMR Laboratory [3]
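To make the bandwidth argument concrete, the following is a minimal sketch of the kind of real-time state update the distributed disaster-relief approach described above would require once the environment database has been pre-distributed. All field names, the message layout, and the sizes are illustrative assumptions, not part of any AEMR specification or BEMR implementation.

```python
import json
import struct
from dataclasses import dataclass, asdict

@dataclass
class AvatarStateUpdate:
    """Illustrative per-frame update for one participant; the shared
    environment database itself is distributed before the session."""
    avatar_id: int
    position: tuple      # (x, y, z) in meters, environment coordinates
    orientation: tuple   # (qx, qy, qz, qw) unit quaternion
    timestamp_ms: int    # sender clock, used for interpolation at receivers

    def to_bytes(self) -> bytes:
        # Fixed-size binary packing: 1 int + 3 floats + 4 floats + 1 long = 40 bytes
        return struct.pack(
            "<i3f4fq", self.avatar_id, *self.position, *self.orientation,
            self.timestamp_ms,
        )

# Example: a 30 Hz pose stream for one avatar is on the order of
# 40 bytes * 30 = ~1.2 kB/s, far below streaming full scene geometry.
update = AvatarStateUpdate(
    avatar_id=7,
    position=(12.4, 0.0, -3.1),
    orientation=(0.0, 0.707, 0.0, 0.707),
    timestamp_ms=1_656_000_123,
)
packet = update.to_bytes()
print(len(packet), "bytes per pose update")
print(json.dumps(asdict(update)))  # same data as JSON, useful for debugging
```

In such a scheme only these small pose messages, interaction events, and voice traffic cross the network in real time, which is what keeps the bandwidth requirements low.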

4.2 The Technology Assessment Working Group (TAWG)

The Technology Assessment Working Group unites experts in industry, academia, and government to explore, evaluate, and assess virtual and augmented reality technologies. Members share the latest ongoing activities in augmented and virtual reality technology and discuss the uses and advantages of that technology. Core themes center on evaluating a wide range of technologies, products, and applications in virtual and augmented reality. The group is currently collaborating to catalogue VR resources that members have developed or improved in their own work. This catalogue includes the Berkeley Teleimmersion experience, which uses 3D cameras to create a dynamic avatar in real time and project it at the remote location into a shared virtual environment, facilitating an experience similar to face-to-face interaction [4]. The users can take advantage of 3D interaction and display technologies to collaborate with their remote partners. The catalogue also includes discussions of alternative ways to control virtual and augmented reality, such as the gesture and LED tracking solutions by PhaseSpace [5], and light field near-eye displays developed by the Stanford Computational Imaging Laboratory at Stanford University [6]. A key concern has been the need for a new human-computer interface for the new generation of 3D augmented and virtual reality technologies (Fig. 3).

Fig. 3. Teleimmersion experience designed at UC Berkeley [4]

4.3 The Human Performance Working Group (HPWG)

The Human Performance Working Group unites experts in industry, academia, and government to explore how virtual and augmented technologies can enhance different facets of human endeavor. This group provides the human factors engineering perspective on performance measures and evaluation techniques that can be used to assess the benefits of using mixed reality technologies to improve human performance. Core themes have centered on virtual reality and augmented reality based training platforms, user interfaces, research tools, and human-autonomy teaming. The group is currently collaborating to create a compendium of new and existing virtual reality resources that members have developed or improved in their own work. This catalogue includes innovative biosensing devices such as the EEG-enhanced, Oculus-based Samsung Gear VR system. This device was recently created at the Swartz Center for Computational Neuroscience at the University of California, San Diego (UCSD) to record EEG and eye movement using a smart phone during full-body engagement in a virtual environment [7]. A second example from UCSD is ChronoViz, which supports the analysis and annotation of video data synchronized with other simultaneously recorded time-series measures [8]. Also included in the catalogue are guidance and tools on topics such as agent transparency and trust [9], and decision-making performance changes following motion sickness [10]. A key concern that has been raised is the transfer of virtual reality-based learning to a real-world context (Fig. 4).

Fig. 4. UCSD Samsung Gear VR goggle adapted with integrated EEG sensing technology and eye movement system [7]

4.4 The Smart Avatar Working Group (SAWG)

The Smart Avatar Working Group unites experts in industry, academia, and government to explore and advance the discussion and documentation of state-of-the-art smart avatar technology, its key components, gaps in related technologies, and interoperability standards and requirements, which can then be aligned with ongoing partner and community efforts to develop mixed reality solutions and improve information sharing and interoperability. Reusability and interoperability are key goals for this group. The group is currently collaborating to catalogue virtual technology resources that members have developed or improved in their own work. These include innovative software systems such as the Virtual Human Toolkit by the Institute for Creative Technologies (ICT) at the University of Southern California (USC) [11], which enables the creation of nuanced virtual characters who move, talk, and interact in true-to-life ways. They also include telepresence medical applications by Capitola Netherlands, which allow doctors to follow elderly patients through their daily lives to assess the impact of their activities on their health [12]. Various relevant topics have also been discussed during the meetings, including machine learning techniques for intelligent systems [13] (Fig. 5).

Fig. 5. USC ICT Virtual Human Toolkit [11]

5 Future Working Groups

There is a plan to open more working groups as the community expands; the groups below are examples of possible future working groups. Note that the working groups are linked and developed to support one another.

5.1 Models and Algorithms Working Group (MAWG)

The MAWG participants would collaborate on the development and evaluation of the algorithms and theoretical models used to identify and track objects, predict their trajectories, and predict potential conjunctions. These algorithms, and the object identification taxonomy, are part of the foundation of the AEMR community.

5.2 Adoption and Best Practices Working Group (ABPWG)

The ABPWG would coordinate AEMR COI activities such as managing the COI Wiki, managing the member list and associated e-mail groups, and producing the newsletter. The ABPWG would also run the COI configuration management effort, as well as related efforts and tasks. It would use JIRA to oversee changes to the data model, ensuring that the correct version of the data model is saved in the Document Library. It would conduct outreach efforts to other entities (e.g., U.S. Government organizations, NGAs, industry, academia, etc.) and coalition partners (e.g., AUS, CAN, NZ, UK, etc.) to enlist new contributors to the community, and identify conference and training opportunities for COI participants. The ABPWG would also support the creation and management of a common 3D model repository for U.S. Government users to minimize the duplication of effort and cost associated with creating, or buying, the 3D models needed by each virtual environment developer.

5.3 Data Model Working Group (DMWG)

The DMWG would include personnel from different organizations, including government and contractor computer scientists, who analyze the details of the XML, schemas, etc., used by legacy systems. One of the significant problems in the application of virtual reality and augmented reality technology is the large number of different data formats for 3D models. The goal of the DMWG is to identify duplication, or similarity, among actual data model formats and descriptions, and then to resolve these overlaps so as to satisfy the functional requirements. The DMWG members would coordinate with the MAWG, URWG, and ABPWG to ensure that existing and future requirements are integrated.
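As a rough illustration of the kind of overlap analysis such a group might perform, the sketch below compares attribute names from two hypothetical legacy 3D-model schemas and flags near-duplicates. The schema field names are invented for illustration only and do not come from any system actually analyzed by the DMWG.

```python
# Illustrative sketch only: the schema field names below are hypothetical,
# not taken from any actual legacy system analyzed by the DMWG.
from difflib import SequenceMatcher

legacy_schema_a = {"modelID", "vertexCount", "textureFormat", "unitsOfMeasure"}
legacy_schema_b = {"model_id", "vertex_count", "texture_fmt", "lod_levels"}

def normalize(name: str) -> str:
    """Strip underscores and case so near-duplicate field names compare equal."""
    return name.replace("_", "").lower()

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

# Flag attribute pairs that are likely the same concept under different names.
overlaps = [
    (a, b, round(similarity(a, b), 2))
    for a in legacy_schema_a
    for b in legacy_schema_b
    if similarity(a, b) >= 0.8
]
for a, b, score in sorted(overlaps, key=lambda t: -t[2]):
    print(f"{a:16s} ~ {b:16s} (similarity {score})")
```

Flagged pairs would then be candidates for consolidation into a single common definition that satisfies both systems' functional requirements.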

6 Community Roadmap

The AEMR roadmap illustrates the flow of products and information across the AEMR community working groups. Requirements emerging from the User Requirements group inform the technologists, and vice versa. The human performance metrics and insights from the Human Performance group are combined with the avatar improvements from the Smart Avatar group, and together these provide the input needed by the Models and Algorithms and Data Model groups to produce interoperability solutions for models and applications. To validate the usability of the community recommendations and guidance, a reference implementation is needed. This example implementation of the core components can support community-wide demonstrations and experiments in a virtual, distributed interoperability testbed. The recommendations, guidance, and results can be packaged in various forms of documentation (briefs, documents, wiki entries, and videos) to assist with education and outreach by the Adoption and Best Practices group. Finally, the documentation stored in APAN can feed member repositories, such as the DoD Data Services Environment (Fig. 6).

Fig. 6. AEMR COI roadmap

Fig. 7. Enterprise Data Model

One of the long-term goals of the community is to organize the research domain effectively so that past research can be used to support future research. To do this, the community would have to support the development of an Enterprise Data Model (EDM). The EDM will provide a framework for a common data model enabling net-centric information exchange among the AEMR stakeholders. The Common Data Model refers to the data structures, definitions, attributes, XML schemas, WSDLs, namespaces, etc. that are common across multiple virtual reality and mixed reality systems (Fig. 7).
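To give a feel for what a common data model element could look like, here is a minimal sketch of a shared asset description defined once and serialized in a neutral format for exchange between VR/MR systems. The field names and XML layout are illustrative assumptions, not an agreed AEMR or DoD schema.

```python
# Hypothetical common-data-model sketch; field names and the XML layout are
# illustrative assumptions, not an agreed AEMR or DoD schema.
from dataclasses import dataclass, field
import xml.etree.ElementTree as ET

@dataclass
class SharedAsset:
    """A 3D asset description that any participating VR/MR system can consume."""
    asset_id: str
    name: str
    geometry_format: str              # e.g. "gltf", "obj", "fbx"
    units: str = "meters"
    tags: list = field(default_factory=list)

    def to_xml(self) -> str:
        root = ET.Element("SharedAsset", attrib={"id": self.asset_id})
        ET.SubElement(root, "Name").text = self.name
        ET.SubElement(root, "GeometryFormat").text = self.geometry_format
        ET.SubElement(root, "Units").text = self.units
        tags_el = ET.SubElement(root, "Tags")
        for tag in self.tags:
            ET.SubElement(tags_el, "Tag").text = tag
        return ET.tostring(root, encoding="unicode")

asset = SharedAsset("a-0042", "Forklift", "gltf", tags=["maintenance", "HADR"])
print(asset.to_xml())
```

The point of such a shared definition is that each participating system maps its internal representation to the common element once, rather than maintaining pairwise converters for every other system.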

7 Community Opportunities

The AEMR community will be represented at the RIMPAC 2016 exercise as part of the SSC Pacific Battlespace Exploitation of Mixed Reality (BEMR) capability demonstration. The Rim of the Pacific Exercise (RIMPAC) is the world's largest international maritime exercise and is hosted by the U.S. Navy's Pacific Fleet. It is held every two years in the June-July timeframe; the 2014 RIMPAC included 22 nations, 49 surface ships, 6 submarines, more than 200 aircraft, and 25,000 personnel. RIMPAC is a unique training opportunity that helps participants foster and sustain the cooperative relationships that are critical to ensuring the safety of sea lanes and security on the world's oceans.

At the upcoming RIMPAC 2016 exercise, BEMR representatives will showcase the latest AR and VR gear, along with demonstrations tailored to common Humanitarian Assistance and Disaster Relief (HADR) use cases such as en-route training, parts-maintenance repair and setup, situational awareness, and emergency management collaboration. The value of mixed reality and of the community partnerships will be emphasized as BEMR representatives describe the equipment, capabilities, use cases, and emerging standards. The AEMR community is a great example of successful and innovative collaboration across industry, academia, government, and coalition partners (Fig. 8).

Fig. 8. RIMPAC exercise

8 Sponsor Opportunities and Contact Information

Funding opportunities are available through our sponsor program. This program enables sponsors to support the community in exchange for a community package that includes goal-oriented group discussions in areas of potential interest, along with deliverables such as a gap analysis in that domain and potential white papers.

To hear more about the AEMR COI, to join, or to request a presentation at one of our four working group meetings, please contact us at aemr-admin@spawar.navy.mil.