Abstract
Although recent research has highlighted and demonstrated the applicability of activity and behavioral pattern analysis mechanisms in offering early windows of opportunity for the assessment of, and intervention for, individuals with autism spectrum disorder (ASD), the computational cost and sophistication of such behavioral modeling systems might prevent these automatic and semi-automatic systems from being deployed, which might in turn restrict their actual use. As such, in this paper we propose an easily deployable automatic system to train joint attention (JA) skills, characterizing and evaluating JA and reciprocity patterns (i.e. the frequency and degree of reciprocity, and of initiating and responding to JA bids). Our proposed approach differs from most earlier attempts in that we do not capitalize on sophisticated feature-space construction methodologies; instead, the simple design and in-game automatic data collection offer hassle-free benefits for individuals such as special education teachers and parents, who can use the system both in classrooms and at home.
1 Introduction
Imagine John, a four-year-old boy, and his mother, Alice, eating at a local McDonald's. John points to the ketchup on the table while looking at Alice and saying "here is the ketchup" (referred to as initiating a joint attention bid, IJA). Responding to John's initiation, Alice looks at the ketchup and then back at John, uttering "oh, yes, ketchup" (referred to as responding to a joint attention bid, RJA). The eye-shifting behaviors that Alice engages in between the ketchup and John are also part of such daily social interaction.
IJA and RJA are two key aspects of joint attention (JA) that occur in social interaction [1]. JA is regarded as an executive form of information processing from early infancy through adulthood: it is predictive of later language development [2,3,4], theory-of-mind abilities [5] and social communicative skills [6]. Raver's study on the social interaction between typically developing (TD) toddlers and their mothers revealed a link between JA and emotion regulation [7]. It is well known that children with autism spectrum disorder (ASD) often exhibit atypical JA behaviors [8]. Specifically, they engage in fewer joint attention behaviors, including eye-gaze shifting [9] and initiating and responding to joint attention bids [10]. Given the criticality of JA, and motivated by the following two facts, we propose the present study:
- The ecological validity of an intervention

White et al. argued that "Joint attention behaviors may vary across ethnicity, language, family structure, or socioeconomic status, and currently there is no assessment of how those vary" (p. 1293) [11]; hence, there is a need to assess (and characterize) such skills in a Chinese special education classroom.
- Current assessment protocols are more inclined to focus on the more abstract and higher-level social skills that JA skills precede

Examples include mutual planning and joint performance in [12], and turn-taking and negotiating in [13]. In our present study, the evaluation is measured in the context of a puzzle-making task in a loosely coupled collaborative play environment, designed to engage children with ASD while minimizing their cognitive load and without enforced collaboration (EC) [14].
The approach presented in this paper differs from most earlier attempts in that we do not capitalize on sophisticated feature-space construction methodologies; instead, the simple design and in-game automatic data collection offer hassle-free benefits for individuals such as special education teachers and parents, who can use the system both in classrooms and at home.
The organization of this paper is as follows. Section 2 reviews relevant research. Section 3 describes our training application in detail, including the defined IJA and RJA bids that can be utilized for behavioral pattern recognition. Section 4 presents the in-game pattern analysis module and a pilot test in the lab with two typically developing (TD) adults, conducted to evaluate the feasibility of our behavioral pattern modeling module. We conclude in Sect. 5 with a discussion of future research along this avenue.
2 Previous Works
Two indirect lines of past research are relevant to our present study.
2.1 IJA, RJA and Best Practices in Teaching JA Skills
Aligning with the two JA bids, Whalen and Schreibman [15] documented two phases of joint attention intervention in a non-computerized setting, an approach that has prevailed in such interventions: initiation training and response training. The former includes coordinated gaze shifting and protodeclarative pointing. The latter comprises levels of responses such as "response to hand on object", "response to showing of object", "eye contact", "response to object being tapped", "following a point", and "following a gaze". In addition, physical (i.e. touching a child's hand as a reminder), verbal (utterances such as "you can drag the puzzle to here") and gestural prompts were adopted to further assist children in engaging with others during the response training phase [15]. Both IJA and RJA behaviors have also been studied in a parent-child intervention setting (i.e. Parent-Mediated Communication-Focused Treatment in Children with Autism (PACT) [16]) and in caregiver- or parent-mediated behavioral interventions (i.e. Joint Attention-Mediated Learning (JAML) [17, 18]).
Over the past years, computerized JA training applications have emerged, many of them deployed in a collaborative play environment on a tabletop, which affords a larger space for joint performance [12,13,14, 19,20,21,22]. The majority of these earlier systems engage children in a tightly coupled collaborative play environment where only one workspace is deployed [1, 13, 19,20,21,22], except for [14], which does not enforce collaboration and instead provides a private workspace for each child. These earlier works investigated the feasibility, usability and usefulness of the play environment in training JA skills, while the present study focuses on automated pattern and data analysis to facilitate personalized training and intervention.
2.2 Behavioral Modeling and Pattern Analysis for Technology-Based ASD Intervention and Training
Users' interaction and engagement in virtual and physical spaces (including in a computerized application space) offer rich information for profiling and modeling users in user-centered computing. In the pattern recognition and computer vision communities, activity and behavioral analysis based on multimodal data has been studied for a long time (see, among many references, [23,24,25,26]). The majority of these prior works focus on the recognition of single-user activities and behaviors, which often span a considerable temporal duration. More recently, many works have focused on characterizing group activities at a coarse level [27,28,29,30,31, 34].
Among them, [28, 29, 31,32,33] targeted children with ASD. For example, Chong et al. [28] attempted to measure and predict eye contact of infants with ASD via eye-gaze tracking during interaction sessions with an examiner wearing a pair of commercially available glasses to capture the infants' faces and head poses; such an automatic system is beneficial and efficient for characterizing atypical gaze behavior of children with ASD in natural social settings. Anzulewicz et al. [29] focused on obtaining gesture data during ASD children's gameplay sessions on touch-sensitive tablets; the touch-sensitive screens and embedded inertial movement sensors were programmed to record movement kinematics and gesture forces. Winoto et al. [34] proposed to capture users' movements in a naturalistic space with a depth camera, in the form of temporal skeleton data; they argued that such data, if combined with other ambient sensing data, could serve as a social meter to predict social relationships. Prabhakar and Rehg [31] segmented and analyzed real-world social interaction videos to characterize turn-taking interactions between individuals. Rehg et al. [32] documented a detailed study on the computational analysis of children's social and communicative behaviors based on video and audio data from dyadic social interactions between adults and children with ASD.
These recent works demonstrated the applicability of activity and behavioral pattern analysis mechanisms from the computer vision and pattern recognition area to assist therapists, caregivers and individuals with developmental disorders, including ASD [32, 33].
Two recent studies focused on visualizing behavioral patterns (including eye-gaze direction) during the social interaction between a child with ASD and a therapist [35, 36]. In both studies, sophisticated data capture systems were deployed. For example, in [36], eye-gaze direction data were retrieved and analyzed via a high-definition video-recording system, followed by gaze analysis based on facial landmark and head movement data. The computational cost of the system is inherently high; nevertheless, the authors claimed that, compared with manual rating and evaluation of videos by a therapist, the system can facilitate the medical specialist's evaluation [36]. It remains unclear, however, whether such a behavioral visualization system can be easily deployed. Unlike the video-based data capture in [36], Kong et al. [35] utilized Abaris, which allows therapists to use Anoto digital pen and paper technology and Nexidia voice recognition to create meaningful indices into videos [37].
Despite these earlier efforts, the computational cost and sophistication of the behavioral modeling systems in most of these works might prevent such automatic and semi-automatic systems from being deployed, which might in turn restrict their actual use. Our proposed approach differs in that we do not capitalize on sophisticated feature-space construction; instead, the simple design and in-game automatic data collection offer hassle-free benefits for individuals such as special education teachers and parents, who can use the system both in classrooms and at home.
In the next two sections, we present a detailed description of our system and of the in-game data collection and automatic behavioral analysis module.
3 Our Joint Attention Training Application
3.1 The Training Application at a Glance
The two-player game is deployed on a 27-inch tabletop as a puzzle game (see Figs. 1 and 2 for two screenshots). Figure 2 shows the general application design.
Each child has his/her own workspace, where he/she needs to piece the puzzle together; the blue button with a question mark (i.e. the help button) can be tapped when either player cannot find a piece in his/her workspace (Figs. 1, 2 and 3). Upon tapping the help button, the needed piece will blink to alert the child in whose workspace it sits to pass the puzzle piece that does not belong to his/her space to the other player. While a piece is blinking in a child's workspace, he/she can ignore it or swipe it to the other space. The border color of each puzzle piece corresponds to the color of each player's workspace (Figs. 1, 2 and 3), which serves two purposes: (a) providing visual cues for each player; (b) initiating JA bids for a player when he/she points to a puzzle piece in another player's workspace (see Fig. 4).
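To make this mechanic concrete, the sketch below gives a minimal illustration in Python. It is not the deployed tabletop implementation; the names (Piece, on_help_pressed, on_piece_swiped) are ours.

```python
# Minimal sketch of the help-button and blinking-piece mechanic described
# above; an illustration under our assumptions, not the actual game code.
from dataclasses import dataclass

@dataclass
class Piece:
    index: str           # e.g. "L3" or "R7"
    owner: str           # "L" or "R": whose picture the piece belongs to
    workspace: str       # "L" or "R": where the piece currently sits
    blinking: bool = False

def on_help_pressed(presser: str, pieces: list) -> None:
    """Tapping the help button makes every piece that belongs to the
    presser but sits in the other player's workspace start blinking,
    prompting that player to pass it over."""
    for p in pieces:
        if p.owner == presser and p.workspace != presser:
            p.blinking = True

def on_piece_swiped(piece: Piece, to: str) -> None:
    """The holder may ignore a blinking piece or swipe it across;
    swiping moves the piece and clears the blink."""
    piece.workspace = to
    piece.blinking = False
```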
3.2 The IJA and RJA Bids Defined in Our Application
As discussed in the previous section, a help button is placed at the bottom right of the screen for children to ask for help (see Fig. 3). Once a child taps the button, a needed puzzle piece already in his/her own workspace will be automatically moved to the correct place, while a needed piece in the other workspace will blink to prompt the other player to share it. Such a blinking puzzle piece provides a visual cue prompting an RJA. While the piece is blinking, the other child can ignore it or take action to deliver it. As such, as a unique design in our training application, the act of tapping the help button is defined as an IJA bid. An RJA bid occurs when a puzzle piece is blinking and (a) the player notices it, and (b) the player passes the blinking piece to the other player. Of course, a child might notice a puzzle piece that does not belong to his/her workspace and swipe it to the other child without being prompted, which is regarded as indicative evidence of proactive help.
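Under this reading of the bid definitions, mapping logged game events to bid labels is straightforward. The sketch below is our own illustration; the event names are assumptions rather than the application's actual logging schema.

```python
from typing import Optional

def classify_event(event: dict) -> Optional[str]:
    """Map one raw in-game event to a JA-related label, following the
    definitions above: tapping the help button is an IJA bid; passing a
    blinking piece is an RJA bid; passing a piece that was not blinking
    counts as proactive help."""
    if event["type"] == "help_button_tapped":
        return "IJA"
    if event["type"] == "piece_passed":
        return "RJA" if event["was_blinking"] else "proactive_help"
    return None  # other events (e.g. placing a piece) carry no JA label
```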
Table 1 below lists the key IJA and RJA bids in the puzzle training game; the ones below the green bar are unique designs in our application for which such bids can be objectively assessed.
These bids can best be evaluated based on behavioral and speech analysis of the children's actions (recorded on video) during the interactions. In cases where either child fails to initiate or respond to the other's attention bids, reminders can come from the teacher/parents who are present, in the form of verbal and bodily cues [15].
4 Behavioral Modeling and Preliminary Analysis in Our Joint Attention Application
4.1 In-Game Data Collection Module
The application includes a built-in game data collection module to indirectly assess the quantitative degree of reciprocity as well as the overall performance of each child. The help button is specially designed as a visual cue and an objective measurement of proactive help.
Figure 5 shows these parameters. Each player has 12 puzzle pieces; the pieces for the left and right player are labelled L1 to L12 and R1 to R12, respectively. X and Y represent the 2D index of each puzzle piece (see Figs. 1 and 2 for the user interface of this version of the application).
Proactive help is essential for mutual planning and better joint performance. Each player's behavioral data are collected and stored in a data log. The data reflect the temporal movement of the puzzle pieces registered to each player (the left and right player, respectively). For each movement of a puzzle piece, the following data are automatically collected: the piece's index, the time stamp of the attempt, the duration of the operation, the final location of the piece and, if any, the wrong place where the piece was placed. Figure 6 shows an example of such data on a player's behaviors for further user modeling and analysis.
The data shown in Fig. 6 record the temporal manipulations of a puzzle piece by a given player (L or R). They are used to compute the overall performance of the task and to derive the behavioral patterns of both players.
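For illustration, one per-move record could be structured as follows. This is a minimal sketch: the field names are ours, but the fields mirror those listed above.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MoveRecord:
    """One manipulation of a puzzle piece, as logged by the module."""
    piece_index: str                            # e.g. "L5" or "R11"
    player: str                                 # "L" or "R"
    timestamp: float                            # when the attempt started
    duration: float                             # how long the operation took
    final_xy: Tuple[int, int]                   # (X, Y) index where the piece landed
    wrong_xy: Optional[Tuple[int, int]] = None  # a mis-placement, if any
```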
When all puzzle pieces have been placed in their correct places, our in-game data collector generates one row of data containing each player's performance at the given level, with the following additional behavioral data (Fig. 7).
Notice that, in order to reduce stress for low-functioning children with ASD, an advanced help button has been added: when it is pressed, the remaining puzzle-filling operations are finished automatically. Such a design is important because individuals with ASD (including children) are often reluctant to engage in eye contact and close face-to-face encounters [38]. Hence, when such automatic operation is observed, the child's JA skills might need further training.
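A per-level summary row could then be assembled as sketched below (reusing the hypothetical MoveRecord above; fields beyond those named in the text, such as the auto_finish flag, are our assumptions for illustration).

```python
def summarize_level(level: int, player: str, moves: list,
                    ija: int, rja: int, proactive: int,
                    auto_finish: bool) -> dict:
    """Aggregate one player's behavior on one completed level into a
    single row, as generated by the in-game data collector."""
    return {
        "level": level,
        "player": player,
        "total_moves": len(moves),
        "total_time": sum(m.duration for m in moves),
        "errors": sum(m.wrong_xy is not None for m in moves),
        "IJA": ija,
        "RJA": rja,
        "proactive_help": proactive,
        # pressing auto-finish may flag JA skills that need further training
        "auto_finish": auto_finish,
    }
```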
4.2 The Feasibility Study: Preliminary Analysis
In order to assess the usability of the module before deploying it in a special education classroom, we conducted an in-lab study involving two TD adults (see Fig. 8). The test environment is similar to that in [14].
Due to limited space, in this paper we focus on the RJA and IJA skills in terms of both proactive and non-proactive help. To this end, we measure the quantity of these skills (see Fig. 5).
Two students were invited to participate in the study. They completed six levels of the game: levels one through five consist of 6 puzzle pieces each, and level six consists of 12. We followed the study protocol of [14].
Figures 9 and 10 show the temporal JA bid patterns of the two players. Some quick assessments of the quantity of the JA bids can easily be drawn from the figures. For example, the IJAs and RJAs of the two players tend to show opposite patterns; overall, the left player is more socially active in terms of both IJAs and RJAs; and more joint attention and proactive help patterns can be observed once both players entered levels four, five and six.
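The kind of per-level tally behind Figs. 9 and 10 can be computed directly from the labeled event log, as in the sketch below (our illustration; `log` is assumed to hold (level, player, bid_label) tuples).

```python
from collections import Counter

def bid_counts(log: list) -> Counter:
    """Count each (level, player, bid_label) combination so that patterns
    such as 'the left player is more socially active' or 'more bids appear
    from level four onward' can be read off directly."""
    return Counter((level, player, bid) for level, player, bid in log)

# Example: bid_counts(log)[(4, "L", "IJA")] is the left player's
# IJA count at level four.
```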
These behavioral markers provide rich information for therapists to assess the appropriateness of the game activities as well as the social interaction patterns of the players. The data collection and analysis can be conducted automatically in the background to allow tele-therapy and to facilitate live behavioral marking by therapists in different physical locations [37, 39, 40]. Our system has an advantage over previous ones, including [28, 29, 31,32,33, 35, 36], in that it is lightweight and easily deployable, which makes it ideal for use at home.
4.3 Discussion
Our in-game data collection module has been carefully designed to assess task performance and to measure reciprocity, which is key to social interaction and JA skills [41].
We speculate that good performance on a given level could indicate an intact or typical JA skill set, or could instead reflect compensatory strategies such as pressing the 'auto-finish' button. The preliminary in-lab study demonstrated the high feasibility of such an automated system, from data collection to analysis.
Further, more sophisticated analysis, such as fine-grained eye tracking, is expected in the future. Nevertheless, a game design that facilitates compensatory strategies is necessary to avoid a child's meltdown. A more challenging research path to pursue is to provide adaptive and personalized visual support based on such behavioral pattern recognition and analysis, so as to enhance the quality of therapy and intervention [42].
5 Concluding Remarks and Future Works
Although recent research has highlighted and demonstrated the applicability of activity and behavioral pattern analysis mechanisms in offering early windows of opportunity for the assessment of, and intervention for, individuals with ASD [32, 33], the computational cost and sophistication of such behavioral modeling systems might prevent these automatic and semi-automatic systems from being deployed, which might in turn restrict their actual use.
Drawing on the findings of these earlier works, we proposed an easily deployable automatic system to train joint attention skills, assess the frequency and degree of reciprocity, and characterize IJA and RJA behaviors. Our proposed approach differs from most earlier attempts in that we do not capitalize on sophisticated feature-space construction methodologies; instead, the simple design and in-game automatic data collection offer hassle-free benefits for individuals such as special education teachers and parents, who can use the system both in classrooms and at home. The preliminary in-lab study demonstrated the high feasibility of such an automated system, from data collection to analysis.
The design of the game and activities followed our previous approach [14]; the revised system described in this paper (including the integrated automated data collection and analysis module) was developed based on our interviews with special education teachers.
We expect the system to be deployed in Chinese special education classrooms to evaluate its usability and applicability over an extended period of use.
References
Winoto, P., Tang, T.Y.: A multi-user tabletop application to train children with autism social attention coordination skills without forcing eye-gaze following. In: Proceedings of the 16th ACM Interaction Design and Children Conference (ACM IDC 2017), pp. 527–532. ACM Press (2017)
Brooks, R., Meltzoff, A.N.: The development of gaze following and its relation to language. Dev. Sci. 8, 535–543 (2005)
Mundy, P., Block, J., Delgado, C., Pomares, Y., Van Hecke, A.V., Parlade, M.: Individual differences and the development of joint attention in infancy. Child Dev. 78, 938–954 (2007)
Kasari, C., Paparella, T., Freeman, S., Jahromi, L.B.: Language outcome in autism: randomized comparison of joint attention and play interventions. J. Consult. Clin. Psychol. 76(1), 125–137 (2008)
Nelson, P.B., Adamson, L.B., Bakeman, R.: Toddlers’ joint engagement experience facilitates preschoolers’ acquisition of theory of mind. Dev. Sci. 11, 847–852 (2008)
Van Hecke, A.V., et al.: Infant joint attention, temperament, and social competence in preschool children. Child Dev. 78, 53–69 (2007)
Raver, C.: Relations between social contingency in mother–child interaction and 2-year-olds’ social competence. Dev. Psychol. 32, 850–859 (1996)
Bruinsma, Y., Koegel, R.L., Koegel, L.K.: Joint attention and children with autism: a review of the literature. Ment. Retard. Dev. Disabil. Res. Rev. 10, 169–175 (2004)
Carpenter, M., Pennington, B.F., Rogers, S.J.: Interrelations among social-cognitive skills in young children with autism. J. Autism Dev. Disord. 32, 91–106 (2002)
Stone, W., Ousley, O.Y., Yoder, P.J., Hogan, K.L., Hepburn, S.L.: Nonverbal communication in two- and three-year-old children with autism. J. Autism Dev. Disord. 27(6), 677–695 (1997)
White, P.J., et al.: Best practices for teaching joint attention: a systematic review of the intervention literature. Res. Autism Spectr. Disord. 5(4), 1283–1295 (2011)
Giusti, L., Zancanaro, M., Gal, E., Weiss, P.L.T.: Dimensions of collaboration on a tabletop interface for children with autism spectrum disorder. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2011), pp. 3295–3304 (2011)
Gal, E., Lamash, L., Bauminger-Zviely, N., Zancanaro, M., Weiss, P.L.T.: Using multitouch collaboration technology to enhance social interaction of children with high-functioning autism. Phys. Occup. Ther. Pediatr. 36(1), 46–58 (2016)
Winoto, P., Tang, T.Y., Guan, A.: “I will help you pass the puzzle piece to your partner if this is what you want me to”: the design of collaborative puzzle games to train Chinese children with autism spectrum disorder joint attention skills. In: Proceedings of the 15th ACM Interaction Design and Children Conference (ACM IDC 2016), pp. 601–606. ACM Press (2016)
Whalen, C., Schreibman, L.: Joint attention training for children with autism using behavior modification procedures. J. Child Psychol. Psychiatry 44(3), 456–468 (2003)
Green, J., Charman, T., McConachie, H., PACT Consortium, et al.: Parent-mediated communication-focused treatment in children with autism (PACT): a randomized controlled trial. Lancet 375, 2152–2160 (2010)
Kasari, C., et al.: Randomized controlled trial of parental responsiveness intervention for toddlers at high risk for autism. Infant Behav. Dev. 37(4), 711–721 (2014)
Schertz, H.H., Odom, S.L., Baggett, K.M., Sideris, J.H.: Effects of joint attention mediated learning for toddlers with autism spectrum disorders: an initial randomized controlled study. Early Child. Res. Q. 28(2), 249–258 (2013)
Battocchi, A., et al.: Collaborative Puzzle Game: a tabletop interactive game for fostering collaboration in children with Autism Spectrum Disorders (ASD). In: Proceedings of the ACM International Conference on Interactive Tabletops and Surfaces (ITS 2009), pp. 197–204 (2009)
Goh, W.B., Shou, W., Tan, J., Lum, G.T.: Interaction design patterns for multi-touch tabletop collaborative games. In: Proceedings of CHI 2012 Extended Abstracts on Human Factors in Computing Systems (CHI 2012), pp. 141–150 (2012)
Piper, A.M., O’Brien, E., Morris, M.R., Winograd, T.: SIDES: a cooperative tabletop computer game for social skills development. In: Proceedings of the 20th Conference on Computer Supported Cooperative Work (ACM CSCW 2006), pp. 1–10 (2006)
Silva, G.F.M., Raposo, A., Suplino, M.: Exploring collaboration patterns in a multitouch game to encourage social interaction and collaboration among users with autism spectrum disorder. J. Comput. Support. Coop. Work 24, 149–175 (2015)
Ke, Y., Sukthankar, R., Hebert, M.: Volumetric features for video event detection. Int. J. Comput. Vis. 88(3), 339–362 (2010)
Laptev, I., Marszalek, M., Schmid, C., Rozenfeld, B.: Learning realistic human actions from movies. In: Proceedings of 2008 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2008), pp. 1–8 (2008)
Messing, R., Pal, C., Kautz, H.: Activity recognition using the velocity histories of tracked keypoints. In: Proceedings of IEEE 12th International Conference on Computer Vision (ICCV 2009), pp. 104–111 (2009)
Tran, D., Sorokin, A.: Human activity recognition with metric learning. In: Forsyth, D., Torr, P., Zisserman, A. (eds.) ECCV 2008. LNCS, vol. 5302, pp. 548–561. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-540-88682-2_42
Choi, W., Shahid, K., Savarese, S.: Learning context for collective activity recognition. In: Proceedings of 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2012), pp. 3273–3280 (2012)
Chong, E., et al.: Detecting gaze towards eyes in natural social interactions and its use in child assessment. In: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 1, no. 3, Article No. 43 (2017)
Anzulewicz, A., Sobota, K., Delafield-Butt, J.T.: Toward the autism motor signature: gesture patterns during smart tablet gameplay identify children with autism. Sci. Rep. 6, 31107 (2016)
Lan, T., Wang, Y., Yang, W., Mori, G.: Beyond actions: discriminative models for contextual group activities. In: Proceedings of NIPS, pp. 1216–1224 (2010)
Prabhakar, K., Rehg, J.M.: Categorizing turn-taking interactions. In: Fitzgibbon, A., Lazebnik, S., Perona, P., Sato, Y., Schmid, C. (eds.) ECCV 2012. LNCS, vol. 7576, pp. 383–396. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-33715-4_28
Rehg, J.M., et al.: Decoding children’s social behavior. In: Proceedings of 2013 IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2013), pp. 3414–3421 (2013)
Rehg, J.M., Rozga, A., Abowd, G.D., Goodwin, M.S.: Behavioral imaging and autism. IEEE Pervasive Comput. 13(2), 84–87 (2014)
Winoto, P., Chen, C.G., Tang, T.Y.: The development of a kinect-based online socio-meter for users with social and communication skill impairments: a computational sensing approach. In: Proceedings of IEEE International Conference on Knowledge Engineering and Applications (ICKEA 2016), pp. 139–143 (2016)
Kong, H.K., Lee, J., Ding, J., Karahalios, K.: EnGaze: designing behavior visualizations with and for behavioral scientists. In: Proceedings of 2016 ACM Conference on Designing Interactive Systems (ACM DIS 2016), pp. 1185–1196 (2016)
Higuchi, K., et al.: Visualizing gaze direction to support video coding of social attention for children with autism spectrum disorder. In: Proceedings of 23rd International Conference on Intelligent User Interfaces (ACM IUI 2018), pp. 571–582 (2018)
Kientz, J.A., Boring, S., Abowd, G.D., Hayes, G.R.: Abaris: evaluating automated capture applied to structured autism interventions. In: Beigl, M., Intille, S., Rekimoto, J., Tokuda, H. (eds.) UbiComp 2005. LNCS, vol. 3660, pp. 323–339. Springer, Heidelberg (2005). https://doi.org/10.1007/11551201_19
Zwaigenbaum, L., et al.: Early identification and interventions for autism spectrum disorder: executive summary. Pediatrics 136(Suppl), S1–S9 (2015)
Cason, J., Richmond, T.: Telehealth opportunities in occupational therapy. In: Kumar, S., Cohn, E. (eds.) Telerehabilitation. Health Informatics, pp. 139–162. Springer, London (2013). https://doi.org/10.1007/978-1-4471-4198-3_10
Peretti, A., Amenta, F., Tayebati, S.K., Nittari, G., Mahdi, S.S.: Telerehabilitation: review of the state-of-the-art and areas of application. JMIR Rehabil. Assist. Technol. 4(2), e7 (2017)
Redcay, E., Kleiner, M., Saxe, R.: Look at this: the neural correlates of initiating and responding to bids for joint attention. Front. Hum. Neurosci. 6, 169 (2012)
Tang, T.Y., Winoto, P.: Providing adaptive and personalized visual support based on behavioural tracking of children with autism for assessing reciprocity and coordination skills in a joint attention training application. In: Proceedings of the 23rd International Conference on Intelligent User Interfaces Companion (ACM IUI 2018), Article No. 40. ACM Press (2018)
Acknowledgements
The authors gratefully acknowledge financial support from Zhejiang Provincial Natural Science Foundation of China (LGJ19F020001). Our thanks to Aonan Guan for implementing the system; Haoyu Yu for her design of the pictures used in the application. Thanks also go to Jie Chen, for participating in the preliminary test.