
1 Introduction

Display is an important issue in exhibition design. The typical way to display exhibits is to put them in showcases with transparent glass or plastic [1, 2]. However, although showcases provide good protection for the exhibits, the poor design of current showcases interferes with the interaction between visitors and the exhibits, resulting in a negative visiting experience. For instance, if a visitor wants to read the description of an exhibit while observing it, he or she normally needs to check a label card attached beside the showcase, which distracts attention from the exhibit [2]. This negative experience is even more pronounced in large showcases, which are mostly used to display numerous exhibits [1, 3]. In such cases, a visitor may find it difficult to distinguish and identify a specific exhibit while reading its description. In these situations, not only is the visitor's experience negatively affected, but the information about the exhibits is also poorly conveyed to the visitor.

In order to improve visitor experience in exhibitions, an AR smart window system using directive information fusion technology was developed. The window is integrated with a transparent AMOLED display on its front side, and visitors can select any object behind the display to have object-related information shown on it [4, 5]. In this way, visitors can read the description directly while observing the exhibits [1, 6, 7]. Moreover, the AR smart window incorporates directive information fusion technology, which places the information of an exhibit at a position that is always aligned with the visitor's line of sight to the object. When a visitor selects an exhibit behind the AMOLED display by touching the display, directive information fusion technology makes the touch point directive by connecting the locations of the touch point, the visitor's eyes, and the display. Through the directive touch point, the information of the exhibit is always shown at the correct place no matter how the visitor moves, which is a more intuitive way to obtain the information of the exhibit [6].

2 Related Work

Science fiction movies have depicted many applications for transparent display systems, such as communication, advertisement, and entertainment. As a result, many studies have emerged to explore these potential applications. Heo et al. proposed TransWall, a transparent display that can show information on both of its sides [6]. Its touch function enables a user to interact with another user on the opposite side. Li et al. proposed a two-sided collaborative transparent display with which users on both sides can work on the same project using gesture, gaze, and touch [4]; the information shown on the display is also visible to both users. Kim et al. presented TransLayer, a transparent touch display floating above a large display table [8]. Through TransLayer, the user can interact with the display table by touch. Xi et al. introduced an interactive lecture system using a transparent display that makes the instructor appear to be in the same place as the students [9].

Based on an intensive market research and evaluation process, Li et al. showed that the interactive show-window is a design solution that reinforces the connection between the customer and the product by emphasizing customized product visualization and direct interactivity [10]. To select an object behind the display, Hirakawa et al. used the relative distance between the length of the arm and the height of the eye [5]. However, arm length and eye height differ from person to person, which leads to inaccurate selection. To address binocular parallax, Lee et al. proposed a quantitative measure called the Binocular Selectability Discriminant (BSD) [11]. However, the working distance is fixed, so the user is confined to a fixed range. Yoshimura et al. proposed a binocular interface for far working distances, where the user cannot reach the transparent display [12]. Their prototype derives pointing positions from the positions of the eyes and a finger detected by an RGB-D camera.

Augmented reality (AR) technology was first introduced in 1990. It relies on the position and angle of images acquired through a camera: through image analysis, virtual objects are combined with real-world scenes on the display device to provide interaction. Milgram and Kishino proposed the concept of the Reality-Virtuality Continuum, which treats the real environment and the virtual environment as the two ends of a continuum [13]. Virtual reality combines virtual objects in a totally virtual environment, while augmented reality superimposes virtual objects onto a real environment. Azuma proposed that augmented reality should have three characteristics: combining virtual objects with the real environment, real-time interaction, and registration in three dimensions [14]. Augmented reality requires both hardware and software support. The hardware part comprises the processor, display, sensors, and input devices. The software part is mainly the algorithms for coordinate computation and virtual-object interaction, supported by many software development kits such as Vuforia for Unity, ARKit, ARCore, and ARToolKit.

Due to increases in processing power, the applications of augmented reality are becoming more diverse, spanning fields such as the military [15], industry [16], education [17], and games [18]. Andersen et al. proposed STAR, a novel surgical AR system that uses a virtual transparent display with telementoring to train the skills of surgical trainees [19]. Selvam, Yap, Ng, Tong, and Ho developed an information application that uses AR technology to enhance users' experience in museum exhibitions [20]. Their system provided an interactive experience in which users could hold up a smartphone to see the information of a specific marker. However, the tracking of the markers is unreliable when the system lacks 3D tracking. Rodrigues et al. proposed an adaptive user interface (UI) for AR museums; their study presented a framework and an initial marker-based object recognition approach for museums integrating AR technology [21].

3 Method

3.1 Implementation

In this section, we elaborate on how the AR smart window system operates and how directive information fusion technology is adopted in the system. The construction of the window system is also illustrated below.

Software.

Figure 1 shows the software structure of the AR smart window. Two important functions are adopted in the AR smart window system using directive information fusion technology: object recognition and eye tracking. The system continuously recognizes objects behind the AMOLED display and calculates their locations. After the locations are identified, the system creates initial region-of-interest (ROI) areas for the objects through coordinate transformation. Meanwhile, the system also detects users in front of the AMOLED display and calculates each user's eye position. The calculated eye position is then used as a variable to calibrate the ROI areas of the objects. Next, the system matches the user's touch point on the display against the calibrated ROI areas to show the information of the corresponding object at the proper position on the display. In this way, the user always sees the information on the line of sight to the object, no matter how he or she moves.

Fig. 1. The software structure of the AR smart window

As shown in Fig. 2, a user on the left wants to learn more about the butterfly, while another user on the right wants information about the dragonfly. After both touch the display, the system is able to respond with the correct information for each user because the users' eye positions are also calculated. Hence, although their touch points are close together, the difference in eye positions allows different information to be shown to each corresponding user.

Fig. 2. User interaction of the AR smart window system
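The selection mechanism described above can be sketched as a small ray-casting computation. The sketch below is our own illustrative reconstruction under simplified assumptions (a planar display at z = 0, exhibits on known planes behind it, and hypothetical function and object names); it is not the authors' implementation:

```python
import math

# Illustrative sketch of directive touch-point selection (hypothetical
# names and geometry). Units: cm. Display plane at z = 0, exhibits
# behind it at z > 0, the user's eyes in front at z < 0.

def project_through_touch(eye, touch, object_z):
    """Extend the ray from the 3D eye position through the 2D touch
    point on the display plane (z = 0) until it reaches the object
    plane z = object_z; return the (x, y) intersection."""
    ex, ey, ez = eye
    tx, ty = touch
    s = (object_z - ez) / (0.0 - ez)  # ray parameter where z = object_z
    return ex + s * (tx - ex), ey + s * (ty - ey)

def select_object(eye, touch, objects):
    """objects: {name: (cx, cy, cz, radius)} -- circular ROIs on their
    object planes. Return the name of the ROI hit by the eye-touch ray,
    or None if no ROI is hit."""
    best, best_dist = None, float("inf")
    for name, (cx, cy, cz, radius) in objects.items():
        px, py = project_through_touch(eye, touch, cz)
        dist = math.hypot(px - cx, py - cy)
        if dist <= radius and dist < best_dist:
            best, best_dist = name, dist
    return best

# Two users touch the same on-screen point, but their eye positions
# differ, so their rays land on different exhibits.
exhibits = {"butterfly": (-10.0, 0.0, 30.0, 6.0),
            "dragonfly": (10.0, 0.0, 30.0, 6.0)}
left_user = select_object((-25.0, 0.0, -50.0), (0.0, 0.0), exhibits)
right_user = select_object((25.0, 0.0, -50.0), (0.0, 0.0), exhibits)
# left_user -> "dragonfly", right_user -> "butterfly"
```

Because the eye position enters the projection, the same on-screen point maps to different exhibits for different viewers; this is the geometric reason the system can serve the two users in Fig. 2 correct, distinct information even when their touch points are close.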

Hardware.

The AR smart window consists of two camera modules, one touch module, a display module, and a host. The first camera module, called the in-camera module, is used to detect the user's eye position; the other, the out-camera module, handles object recognition. Both are OV-580 3D stereo cameras. The touch module used in the system is Airbar, an optical touch sensor. The physical arrangement of the hardware modules used in the AR smart window is shown in Fig. 3. The display module is a transparent AMOLED display, a 17-inch panel with a resolution of 640 × 480 and a frame rate of 60 Hz. The host is a gaming computer with 16 GB of RAM and a GPU with 6 GB of memory. The integration of the AR smart window system is shown in Fig. 4.

Fig. 3. The physical arrangement of hardware modules used in the AR smart window system

Fig. 4. The integration of the AR smart window system

Another good application of the system is aquariums, as shown in Fig. 5. Users might find it hard to identify a certain species in an aquarium tank only by checking the labels attached beside it, since the aquatic animals are usually moving around and hiding among the aquatic plants and stones in the tank. However, with the transparent display system, the aquarium is not only a tank displaying aquatic species but also a smart display showing their descriptions. In this way, users can read the description of moving species directly while observing them through the tank. No matter how the species move in the tank, once a user selects one of them, the directive information fusion technology enables the system to precisely identify the species, even at a distance from the display, and to show its information at the correct position on the display.

Fig. 5. The AR smart window system applied to an aquarium

3.2 Evaluation

The evaluation was conducted at the Taipei Nangang Exhibition Center from August 29th to 31st, 2018, to have real visitors evaluate the AR smart window. In order to create a setting similar to a science exhibition, as shown in Fig. 6, the two AR smart windows were built into two different wall-mounted display cases, and specimens of six different insects and two different species of fish were placed behind the displays. Visitors could touch the display to identify a certain specimen, and the description of the specimen would be shown directly on the display, as shown in Fig. 7.

Fig. 6. One of the AR smart window systems in the exhibition center

Fig. 7. Evaluation of the AR smart window system in the exhibition center

Experiment Settings.

Thirty-two visitors aged 25–40 were invited to take the evaluation test; 18 of them were male and 14 were female. The visitors were asked to complete a subjective questionnaire after they had tried the AR smart window system. The questionnaire was divided into three sections: overall review of the system, guiding function of the system, and system stability. Each section consisted of 4 to 7 items, and each item used a five-point rating scale ranging from “strongly disagree (1)” to “strongly agree (5)”.
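For clarity, the section score reported for such a questionnaire is simply the mean of all item ratings across visitors. A minimal sketch (the response data below are invented for illustration and are not the study's data):

```python
def section_average(responses):
    """responses: one list of 1-5 item ratings per visitor for a
    questionnaire section. Returns the mean rating over all visitors
    and all items in the section."""
    total = sum(sum(visitor) for visitor in responses)
    count = sum(len(visitor) for visitor in responses)
    return total / count

# Hypothetical ratings from three visitors on a four-item section.
guiding = [[5, 4, 4, 5], [4, 4, 5, 4], [3, 4, 4, 5]]
# section_average(guiding) -> 4.25
```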

4 Findings

Results of the questionnaire are shown in Fig. 8. The average scores for system stability, the guiding function, and the overall review were 3.27, 4.4, and 4.34, respectively. The scores above 4.0 indicate that the visitors had a positive attitude toward the AR smart window system. The visitors were satisfied with the guiding function of the system. This might be attributed to the fact that the AR smart window offered an intuitive and simple way for visitors to obtain the information of the exhibits: they could get it directly on the display rather than moving aside to check label cards. Moreover, since the system was able to detect a visitor's eye position, the information was always shown at the proper position, which was the strength they agreed with the most.

Fig. 8. The results of the questionnaire

“The information is always shown at the intersection of the display and my line of sight to the exhibit I am looking at,” a 29-year-old male visitor said.

However, system stability might be a concern for the visitors. The touch function did not always respond fluently: users sometimes needed to touch more than once for the information to be shown on the display successfully. Also, some visitors reported that it was hard for them to focus on both the information on the display and the exhibits behind it.

“I feel a bit dizzy focusing on the information and the exhibit simultaneously,” a 32-year-old female visitor said.

5 Discussion

In general, the AR smart window system serves as a great display tool for exhibitions and museums. The directive information fusion technology of the system makes it easier and smarter for visitors to get the information of the exhibits.

On the other hand, some issues remain to be solved to improve the system stability. First, the insensitive touch function of the current system might stem from the use of Airbar. Airbar is an optical touch sensor which relies on infrared (IR) light to send and receive signals. However, the sensitivity of an IR sensor is unstable due to physical factors in its surroundings. Hence, in order to improve the touch sensitivity of future systems, a capacitive touch panel module might be a better alternative. Second, the dizziness reported by the visitors might come from the fact that the information and the exhibit were not located at the same distance from the user's eyes, making it hard for the user to focus on both. To address this issue, a Fresnel lens can be attached to the display to change the focal distance of the information [22]. Since Fresnel lenses are able to lengthen the apparent distance of the information, users might focus on both the information and the exhibits more easily.
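The expected optical effect can be sketched with the textbook thin-lens equation (our illustration; the specific lens design in [22] may differ). With the displayed information at distance $d_o$ from the lens, inside its focal length $f$ (i.e., $d_o < f$), the lens forms a magnified virtual image farther away than the display itself:

```latex
\frac{1}{d_o} + \frac{1}{d_i} = \frac{1}{f}
\quad\Longrightarrow\quad
|d_i| = \frac{d_o f}{f - d_o} \qquad (d_o < f).
```

For example, with an assumed $f = 30$ cm and $d_o = 20$ cm, the information would appear at $|d_i| = 60$ cm, much closer to the depth of the exhibit behind the display, which would reduce the refocusing effort that made visitors feel dizzy.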

6 Conclusion

The AR smart window system with directive information fusion technology is an innovative showcase design for exhibition use. Through eye-tracking and object recognition technology, the AR smart window is able to identify the object that a user selects and to show the object-related information at the correct position along the user's line of sight. Users can obtain the information of an object more intuitively by touching the display while observing it, rather than moving aside to check label cards. Therefore, users can interact with the objects behind the display from any distance and in any direction.

For further improvement of the AR smart window system, we might replace the current modules with ones that meet the requirements of different exhibits and exhibitions. For instance, a capacitive touch panel and a wider-angle camera module might be incorporated into a future AR smart window system. In this way, we believe that the AR smart window can become a well-operated system that brings a more intuitive experience to its users.