
1 Introduction

Autism Spectrum Disorder (ASD) is a lifelong developmental disability that may impact people’s understanding of their environment. It can result in difficulties with social relationships, communication and behavior [2, 3]. Recent studies show that about 1 in 68 children is identified with ASD [4]. Because of the limiting properties of ASD, it is usually harder for individuals with autism to find jobs and succeed in them without proper training. There are many scientific studies on job training for individuals with ASD, and many of them use virtual reality. However, little work has examined which virtual reality interaction techniques are acceptable and useful for individuals with ASD.

Virtual reality training applications for individuals with ASD are usually reported to have positive effects [5, 6], but they show different results in terms of effectiveness and user acceptance. One of the main reasons for these differences could be the interaction techniques used in these systems. The literature has not yet extensively examined different virtual reality interaction techniques for individuals with ASD. Although there are many possible advantages of using virtual reality systems for job training, effective interaction techniques must be implemented for the users to truly benefit from these advantages. Since the perception and behaviors of individuals with ASD differ from those of neurotypical individuals, interaction techniques that work well for neurotypical individuals may not be a good fit for individuals with ASD.

In our VR4VR system, individuals with ASD are trained on six vocational skills in highly immersive virtual environments. These skills are: cleaning, shelving, environmental awareness, loading the back of a truck, money management and social skills. Several different interaction techniques for object selection and manipulation and for locomotion were implemented in these skill modules. The skills and interaction techniques were selected based on discussions with job coaches who are experts in vocational training of individuals with ASD. We worked with six professional job coaches who currently provide vocational training services to individuals with ASD. In selecting the tasks, their prevalence in the employment of individuals with ASD played an important role. For the interaction techniques, the most appropriate methods that allow for acclimatization to the tasks were identified, following both research and discussions with the job coaches. For object selection and manipulation, the tangible object manipulation, haptic device interaction, touch and snap, and touchscreen interaction techniques were designed and implemented. For locomotion, the real walking and walk in place techniques were designed and implemented. In this paper, we present our design considerations for the virtual reality interaction techniques for individuals with ASD in our VR4VR system, along with our preliminary user testing results and indications.

2 Related Work

Recent advances in technology have been used to assist individuals with ASD. Current applications usually use touchscreen devices since they are easy to use, affordable and widely available. Furthermore, one recent study showed that tablet applications with multi-touch interactions can make children with ASD more verbally and physically engaged compared to similar traditionally performed activities [7]. In a study conducted by Madsen et al., researchers developed touchscreen applications for teaching children with ASD to recognize facial expressions [8]. In that study, the authors briefly shared lessons learned about the software and hardware design of touchscreen applications for this specific population. In a study on developing an expression recognition game for individuals with ASD using touch enabled mobile devices, the authors explored available ASD games and tried to consolidate guidelines for designing user interfaces for children with autism [9].

Another popular approach in designing applications for individuals with ASD is using touchless interaction. The availability of depth sensors such as the Microsoft Kinect and their use for skeleton tracking have made this technique easy to implement and popular. Moreover, some researchers suggest avoiding wearable sensors, since some individuals with ASD may prefer not to wear sensors on their bodies [10]. A study by Bartoli et al. showed that games with touchless interaction helped improve attention skills in children with autism [11]. Another recent study aimed at improving the motor skills of individuals with ASD [12]; with this goal, Garzotto et al. developed and evaluated a motion-based touchless application.

There are also some applications that use more than one interaction technique. One study focused on full body interaction techniques for low functioning children with ASD [13]. An environment similar to a virtual reality cave was developed with projectors, cameras and sensors. Touchless as well as touch-based interaction techniques were implemented, and the children’s acceptance of the system was discussed. Most of the children accepted the system and used it effectively.

With the emerging technology of virtual reality, some researchers are integrating virtual reality interaction techniques into training applications for people with ASD. In a study, researchers used a virtual reality system to teach street-crossing skills to children with ASD [5]. The results showed that training in virtual reality improved these skills. In another study, a virtual reality driving training system was developed [6].

Another study that aimed at increasing social engagement of children with ASD used two different games with two different interaction techniques [14]. One used multiple mice while the other used a DiamondTouch surface. Another recent study reported observations on the usability of basic 3D interactions such as translation and rotation for adolescents with ASD [15]. The authors tried to find the differences in the use of 3D user interaction techniques between neurotypical individuals and individuals with autism. The results showed that deficits in hand-eye coordination in individuals with ASD caused some difficulties in using these 3D interaction techniques. The authors suggest that developers should add assistive cues to aid individuals with ASD with hand-eye coordination.

Although different interaction techniques and their effects on user experience have been thoroughly examined for neurotypical individuals, only limited research has explored this area for individuals with ASD. It must also be noted that most of these studies were designed for children. Previous studies that utilize interaction techniques in virtual reality applications for individuals with ASD do not examine user experience or provide insight on suitable interaction techniques for this specific user group.

3 Interaction Techniques

In our VR4VR system, there are six modules that were developed for the vocational training of individuals with ASD. The most suitable interaction technique for each skill was decided through research and discussions with the job coaches, who currently provide professional vocational training services to individuals with ASD and are experts in this field. In this study, the interaction techniques that were tested within the VR4VR system are categorized into two groups: selection/manipulation and locomotion.

3.1 Object Selection and Manipulation

For object selection and manipulation, four different interaction techniques were implemented and tested. These are used in different skill modules to interact with the virtual world. These four interaction techniques are presented in the following subsections.

Tangible Object Manipulation.

In this interaction technique, two types of real tangible objects are tracked and represented in the virtual world: (1) identical looking real boxes that are shown in the virtual world with different textures or labels, and (2) a broomstick handle that is represented in the virtual world as a vacuum cleaner or a mop that the user uses to clean the virtual environments.

The users tested these interaction techniques while training on some shelving and cleaning tasks. There were three different shelving tasks: the first task was to rotate the boxes on the shelves such that the front sides faced the user. In the second task, the users were asked to put the required boxes on the table and then place the boxes on the correct levels of the correct shelves according to their projected label textures. In the third task, instead of the label textures, labels with different code numbers were projected on the boxes. These code numbers indicated the shelf/level the boxes belonged to.

Fig. 1. Tangible object manipulation. Left: the user rotates a real box in shelving tasks. Right: the user uses a broomstick as a virtual mop in cleaning tasks.

An immersive tangible object manipulation technique was implemented and tested with the shelving tasks. With this technique, the users could move and rotate the real tangible boxes in the tracked area (Fig. 1 Left). A head mounted display (HMD) was used along with hand bands with reflective markers on them. This enabled real time head and hand tracking.

Two different cleaning tasks were implemented: vacuuming and mopping in the virtual world. The objectives of these tasks were to clean the virtual warehouse by using the real broomstick handle to vacuum the dry dirt piles and to mop the wet dirt piles. The users used a tangible broomstick handle to interact with the virtual world. The real broomstick handle was replaced with a virtual vacuum cleaner or a virtual mop in different tasks (Fig. 1 Right). To be able to track the real stick with the cameras in real time, we attached three pieces of reflective marker tape around the cylinder. Since the cylinder was symmetric along its longitudinal axis, we used software calculations to visualize the cleaning head (nozzle or mop) according to the angle between the cylinder and the ground. In these tasks, in addition to the HMD and hand bands, the user also wore feet bands with reflective markers. This enabled real time head, hand and feet tracking.
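As a rough illustration of this calculation, the sketch below (in Python, with the marker layout, axis convention, and handle-length offset as our own assumptions rather than details from the VR4VR implementation) estimates where the virtual cleaning head should be drawn and the stick's angle with the ground from two tracked marker positions on the handle.

```python
# Illustrative sketch, not the actual VR4VR code: placing the virtual cleaning
# head at the lower end of a tracked, longitudinally symmetric stick.
import numpy as np

def cleaning_head_pose(marker_a, marker_b, handle_below_marker=0.4):
    """Estimate the floor-contact point and tilt of the broomstick handle.

    marker_a, marker_b: 3D positions of two reflective markers on the stick,
    with y as the vertical (up) axis. handle_below_marker is an assumed length
    (in metres) of stick below the lower marker. Returns the position where
    the virtual nozzle/mop should be drawn and the stick's angle with the
    ground in degrees.
    """
    a, b = np.asarray(marker_a, float), np.asarray(marker_b, float)
    axis = b - a
    axis /= np.linalg.norm(axis)                 # unit vector along the stick

    # Angle between the stick and the ground plane (y = 0).
    angle_with_ground = np.degrees(np.arcsin(abs(axis[1])))

    # Follow the stick axis downward from the lower of the two markers.
    lower = a if a[1] < b[1] else b
    down = axis if axis[1] < 0 else -axis
    head_position = lower + handle_below_marker * down

    return head_position, angle_with_ground
```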

Haptic Device.

Haptic device interaction was also tested by individuals with ASD in our VR4VR system. Haptic devices utilize force feedback to create a sense of touch. In this module, a Phantom Omni® haptic device [16] was used to interact with the virtual world. The Phantom Omni® creates a sense of weight for the users so that they can feel whether they are interacting with a light or heavy object. The job coaches suggested that this force feedback would help increase immersion. Although the job coaches were hesitant about the 3D nature of this device, they found the provided force feedback very promising for comfortable use by individuals with ASD. As an alternative, mouse interaction was also implemented in case users had difficulty with the haptic device interaction.

In the loading the back of a truck task, which used this interaction technique, the users were expected to fill up an empty truck with different sized 2D boxes in a virtual environment. The boxes had different properties such as weight, fragility and directional arrows. The aim of this module was to fit all of the boxes into a limited area by moving and rotating the boxes. The working area of the haptic device was restricted to a planar surface parallel to the display area. This helped the users relate the haptic device input to the visual output easily and also removed the ambiguity coming from the extra degree of freedom for the sake of this task. The buttons on the haptic device handle were assigned to specific commands (Fig. 2). One of the buttons was used to hold the boxes, similar to the widely used drag and drop mouse gesture. The other button was used to rotate the boxes by 90 degrees counterclockwise.
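The following sketch illustrates this input mapping. It is not the actual VR4VR or Phantom Omni® SDK code; the class, callback names, and the scene.box_at helper are hypothetical, and it only shows the planar projection of the stylus position and the two button assignments described above.

```python
# Illustrative sketch: planar constraint and button mapping for the truck
# loading task. All names are assumptions made for this example.
class TruckLoadingInput:
    def __init__(self, display_plane_z=0.0):
        self.display_plane_z = display_plane_z
        self.held_box = None

    def project_to_plane(self, stylus_pos):
        # Drop the depth component so the cursor stays on a plane parallel
        # to the display, removing the extra degree of freedom.
        x, y, _ = stylus_pos
        return (x, y, self.display_plane_z)

    def on_button1_down(self, cursor_pos, scene):
        # Button 1: pick up the box under the cursor (drag-and-drop style hold).
        self.held_box = scene.box_at(cursor_pos)   # hypothetical scene query

    def on_button1_up(self):
        self.held_box = None                       # release the held box

    def on_button2_down(self):
        # Button 2: rotate the held box by 90 degrees counterclockwise.
        if self.held_box is not None:
            self.held_box.rotation_deg = (self.held_box.rotation_deg + 90) % 360

    def on_update(self, stylus_pos):
        cursor = self.project_to_plane(stylus_pos)
        if self.held_box is not None:
            self.held_box.position = cursor        # dragged box follows the cursor
```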

Touch and Snap.

The touch and snap interaction technique is often used in virtual reality applications. In this technique, a virtual object is snapped to a moving object, which is usually the virtual hand of the user. To trigger the release of the snapped object, different techniques can be used, such as time triggering, position triggering or gesture triggering.

Fig. 2. Haptic device interaction technique. Left: the user holds boxes with the haptic device. Right: the haptic device.

This interaction technique was used in the litter collection task of our VR4VR system. In this task, the users were asked to collect randomly distributed litter objects from the ground and throw them into virtual trash bins located around the area. The user’s hands were equipped with reflective markers to be tracked in real time by the optical tracking cameras. A virtual litter object was snapped to the user’s virtual hand when the user bent down and the hand came close to the litter. Users carried the litter objects in the virtual world, and once the litter arrived in the vicinity of a trash bin, it disengaged from the hand and fell into the bin. The required actions in this task were moving the hands close to the litter in the virtual world by bending and reaching out in the real world, and then bringing the held litter close to a virtual trash bin to release it by extending the arm in the real world (Fig. 3).
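A minimal sketch of this proximity-triggered snap and release logic is given below. The distance thresholds, object attributes and function name are our own assumptions for illustration; the paper does not specify these values.

```python
# Illustrative sketch: proximity-triggered touch and snap for the litter task.
import numpy as np

SNAP_DISTANCE = 0.15      # metres; assumed pickup threshold
RELEASE_DISTANCE = 0.30   # metres; assumed drop threshold near a bin

def update_litter(hand_pos, litter, bins):
    """Update one litter object for the current frame.

    hand_pos: tracked virtual hand position (3D numpy array)
    litter:   object with .position (3D numpy array) and .held (bool)
    bins:     list of trash bin positions (3D numpy arrays)
    """
    if not litter.held:
        # Position trigger: snap the litter to the hand when it comes close.
        if np.linalg.norm(hand_pos - litter.position) < SNAP_DISTANCE:
            litter.held = True
    else:
        # While held, the litter follows the virtual hand.
        litter.position = hand_pos
        # Release trigger: disengage when the hand is near any trash bin.
        for bin_pos in bins:
            if np.linalg.norm(hand_pos - bin_pos) < RELEASE_DISTANCE:
                litter.held = False
                litter.position = bin_pos     # let it fall into the bin
                break
```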

Fig. 3. Touch and snap interaction technique. Left: the user tries to take the litter object from the ground with their right hand. Right: the user carries a litter object.

Touchscreen.

With the increasing number of mobile devices such as cell phones and tablet computers, touch interaction has become one of the most popular and prevalent interaction techniques. In our project, touchscreen interaction was used in a module related to cash register skills. Three tasks were used for testing: (1) recognizing money, (2) counting money, and (3) giving change. Currently, most digital cash registers use touchscreens to get input from the cashier, so this interaction technique was selected for this module to increase realism and immersion.

In this module, only single touch static interaction was used instead of the more complicated dynamic or multi-touch interactions. The tasks required the user: (1) to identify the given bill or coin amounts by touching the corresponding value among the given options, (2) to type the sum of the presented bill and coin amounts by touching the numbers on the touchscreen keypad, and (3) to touch the bill and coin visuals on the cash register to fetch the required change to be given to a customer in the shopping simulation. A touchscreen keypad similar to real cash register keypads was presented to the user. The only possible interaction technique in this module was touch interaction.
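For illustration, the sketch below shows the kind of logic the giving-change task can check the trainee's touches against. The denomination set and function name are assumptions; the system's actual validation logic is not described in this paper.

```python
# Illustrative sketch: the expected change breakdown for the giving-change task,
# using a standard greedy breakdown over assumed US denominations (in cents).
DENOMINATIONS_CENTS = [2000, 1000, 500, 100, 25, 10, 5, 1]  # $20 ... 1 cent

def expected_change(price_cents, paid_cents):
    """Return the bills/coins (in cents) the trainee is expected to fetch."""
    remaining = paid_cents - price_cents
    change = []
    for d in DENOMINATIONS_CENTS:
        while remaining >= d:
            change.append(d)
            remaining -= d
    return change

# Example: a $7.35 purchase paid with a $10 bill ->
# [100, 100, 25, 25, 10, 5], i.e. two dollars, two quarters, a dime and a nickel.
```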

3.2 Locomotion

Locomotion techniques are used to move the viewpoint (and the avatar, if used) of the user in the virtual world. There are many different locomotion techniques in virtual reality. In our study, we used two of these locomotion techniques: real walking and walk in place.

Real Walking.

In this locomotion technique, the user moves the virtual avatar by actually walking in the tracked area, as they would in real life. Although this is a very intuitive method, there is the significant restriction of the limited tracked area. The user is equipped with reflective markers on their hands and head, and the user’s real position is approximated from these tracked positions and transferred into the virtual world. The virtual world is viewed from a virtual camera attached to the position of the virtual head, and this view is rendered to the HMD. The movement and rotation of the real head drive the virtual camera’s position and rotation so that a realistic view of the virtual world can be displayed in the HMD.
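A minimal sketch of this head-to-camera mapping is shown below; the calibration offset and scale parameters are assumptions added for completeness, not details reported for VR4VR.

```python
# Illustrative sketch: copying the tracked real head pose onto the virtual
# camera used for HMD rendering in the real walking technique.
def update_virtual_camera(camera, tracked_head_position, tracked_head_rotation,
                          world_offset=(0.0, 0.0, 0.0), world_scale=1.0):
    """tracked_head_position: head position from the optical tracker (metres)
    tracked_head_rotation: head orientation from the tracker (e.g., a quaternion)
    world_offset/world_scale: assumed calibration aligning the tracked area
    with the virtual warehouse."""
    camera.position = tuple(world_scale * p + o
                            for p, o in zip(tracked_head_position, world_offset))
    camera.rotation = tracked_head_rotation    # 1:1 rotation, no redirection
```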

In our project, real walking interaction technique was evaluated in a virtual warehouse environment. Since this technique is restricted by a limited tracking area, the user was surrounded by two physical shelves and one desk. All the tasks were designed so that they could be performed inside that limited area.

Walk in Place.

If the real tracked area is smaller than the virtual world, the real walking technique becomes hard to use. To overcome this limitation, the walk in place technique is commonly used in virtual reality implementations. In this technique, the user marches in the same place while the virtual avatar walks in the virtual world in the direction the user faces. This way, the limitation of the tracked region can easily be overcome, but it comes at the cost of an additional walking in place gesture instead of the intuitive real walking motion. Our implementation of this technique included different walking speeds, depending on the speed of the walking in place gesture, so that the user could adjust the virtual speed of the avatar by modifying their real marching speed. The walking direction of the virtual avatar was controlled by the head direction of the user.
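The sketch below outlines one common way to implement such a walk in place controller; the step-detection threshold, per-step gain and use of foot markers are our assumptions and may differ from the VR4VR implementation.

```python
# Illustrative sketch: walk in place locomotion. Marching is detected from the
# vertical motion of tracked foot markers; stepping frequency scales the
# avatar's speed, and the avatar moves in the direction the head is facing.
import numpy as np

STEP_THRESHOLD = 0.05   # metres of foot lift that counts as a step (assumed)
STEP_GAIN = 0.7         # metres advanced per detected step (assumed)

class WalkInPlace:
    def __init__(self):
        self.foot_was_up = False

    def update(self, avatar, left_foot_y, right_foot_y, floor_y, head_yaw_rad):
        # A "step" is registered each time either foot rises above the
        # threshold and then comes back down.
        foot_up = max(left_foot_y, right_foot_y) - floor_y > STEP_THRESHOLD
        stepped = self.foot_was_up and not foot_up
        self.foot_was_up = foot_up

        if stepped:
            # Faster marching -> more frequent steps -> higher virtual speed.
            forward = np.array([np.sin(head_yaw_rad), 0.0, np.cos(head_yaw_rad)])
            avatar.position = avatar.position + STEP_GAIN * forward
```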

This technique was used in two modules. The first one was the cleaning module in which the users were required to go near the dirty areas to clean them or go near litter objects to collect them. The other module was the environmental awareness module. In this module, the users were required to walk to the specified check points in the parking lot of a virtual shopping mall environment. Both of these modules used only the “walk in place” technique for locomotion.

4 Preliminary Results

A user study is currently ongoing with individuals with ASD. So far, five individuals with ASD, aged 20 to 27, have participated. The participants had no secondary disabilities. All five participants were on the high functioning side of the autism spectrum. None of the participants had prior experience with virtual reality. These participants went through each module task of our VR4VR system. It took approximately four hours (two sessions of two hours on different days) for each participant to go through all six modules. Three job coaches accompanied the users during the testing sessions. Here, we present the preliminary results obtained during these testing sessions. Since the user study has not been finished yet, instead of providing statistical results, we provide our initial findings based on the statements of the users with ASD and the job coaches. Our testing with individuals with ASD yielded the following observations:

  • Our participants with ASD liked touchscreen interaction the most as compared to the other interaction techniques they tested in our VR4VR system.

  • Our participants with ASD stated that they found the haptic device difficult to use.

  • The users interacted with the tangible boxes and the broomstick very easily and stated that they enjoyed interacting with those.

  • Our participants with ASD liked real walking better than walk in place. It took time for the users to learn the walk in place technique, and even after learning it, some of our participants were frustrated when using it for locomotion.

  • Our participants with ASD reported more motion sickness in tasks that used the walk in place locomotion technique than in tasks that used the real walking locomotion technique.

  • We did not encounter any problems with the acceptance of HMD by our participants with ASD.

5 Discussion

Interaction techniques constitute a crucial part of the user experience in virtual reality. Individuals with autism have their own characteristics and preferences related to their cognitive information processing. This makes the effectiveness of virtual reality interaction techniques previously proven for neurotypical individuals speculative when it comes to individuals with autism.

In our preliminary testing sessions, individuals with ASD stated a preference for touchscreen interaction. This may stem from the users’ previous experiences, since touchscreens are commonly encountered in everyday life. The job coaches stated that the haptic device may have confused some individuals with ASD because of its third dimension of movement, compared with the commonly used mouse interaction. Even though mouse interaction was implemented as an alternative to haptic device interaction, the job coaches observed positive effects of the force feedback provided by the haptic device and suggested continuing to use the haptic device interaction.

Our participants with ASD stated a preference for real walking. Again, this may stem from the users’ previous everyday experiences. Some users got frustrated while using the walk in place technique for locomotion. It was hard for the users with ASD to comprehend walk in place and keep performing that locomotion gesture without actually walking forward. After their observations in the testing sessions, the job coaches made negative comments about the suitability of the walk in place technique for individuals with autism and asked us to look for alternative techniques that might work better for this population. Hence, we interpret these views of the users and the job coaches as an indication that walk in place is a questionable locomotion technique for individuals with ASD. On the other hand, the real walking locomotion technique was found very comfortable by the users and the job coaches, hence it might be incorporated into virtual reality implementations for individuals with autism. Of course, this is not easy to achieve due to the limitation imposed by the motion tracking cameras, but the tasks could be designed so that the users naturally do not need to go outside the tracking area. To sum up, our participants with ASD found the interaction techniques that are familiar to them from real life more comfortable to use in virtual reality.

6 Conclusion

This study aims at exploring the suitability of the interaction techniques in our immersive virtual reality vocational training system for individuals with ASD. Several interaction techniques were implemented after discussions with the job coaches, and testing sessions with five individuals with ASD were performed. Although the preliminary testing does not provide statistical results, the opinions of the users with ASD and the job coaches indicate a preference for the touchscreen and tangible interaction techniques and the real walking locomotion technique. These techniques were the ones that most resembled everyday real life interaction. Participants with ASD had more difficulty with gesture-based and more abstract interaction techniques.

Our future work will consist of finishing the ongoing user study and sharing statistically analyzed results. After that, we will implement several isolated modules for evaluating different virtual reality locomotion techniques for individuals with autism, which will also be evaluated with a user study.