
Fixation Based Object Recognition in Autism Clinic Setting

  • Conference paper
  • Published in: Intelligent Robotics and Applications (ICIRA 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11743)


Abstract

With the increasing popularity of portable eye tracking devices, one can conveniently use them to find fixation points, i.e., the locations and regions a person is attending to and looking at. However, the region of interest alone is not enough to fully support further behavioral and psychological analysis, since it ignores the rich visual content that the person perceives. Rather than the raw coordinates, we are interested in knowing what visual content the person is looking at. In this work, we first collect a video dataset using a wearable eye tracker in an autism screening room equipped with 14 commonly used assessment tools. We then propose an improved fixation identification algorithm to select stable and reliable fixation points. These fixation points are combined with object proposal generation methods to localize and select object proposals. Moreover, we propose a cropping generation algorithm that determines the optimal bounding box of the viewed object from the input proposals and fixation points. The resulting cropped images form a dataset for the subsequent object recognition task, for which we adopt an AlexNet-based convolutional neural network. Our evaluation metrics include classification accuracy and intersection-over-union (IoU), and the proposed framework achieves 92.5% and 88.3% recognition accuracy on two different testing sessions, respectively.
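
Below is a minimal sketch, in Python, of two of the components the abstract describes: a dispersion-threshold fixation identifier that keeps only stable gaze clusters, and the IoU measure used in evaluation, together with a simple filter that keeps object proposals containing a fixation point. The thresholds, function names, and the dispersion-based grouping are illustrative assumptions for exposition only; the paper's improved fixation identification and cropping generation algorithms are not reproduced here.

```python
# Hedged sketch: dispersion-threshold fixation grouping, fixation-guided proposal
# filtering, and IoU. All parameter values below are illustrative assumptions.
from typing import List, Tuple

Point = Tuple[float, float]                    # (x, y) gaze sample in pixels
Box = Tuple[float, float, float, float]        # (x1, y1, x2, y2) bounding box


def _dispersion(window: List[Point]) -> float:
    """Spatial spread of a gaze window: (max_x - min_x) + (max_y - min_y)."""
    xs, ys = [p[0] for p in window], [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))


def identify_fixations(gaze: List[Point],
                       dispersion_thresh: float = 25.0,
                       min_samples: int = 6) -> List[Point]:
    """Group consecutive gaze samples whose dispersion stays below a threshold
    for at least `min_samples` frames; return each group's centroid as a
    fixation point. Fast-moving (saccadic) samples are discarded."""
    fixations, i = [], 0
    while i < len(gaze):
        j = i + min_samples
        if j > len(gaze):
            break
        if _dispersion(gaze[i:j]) <= dispersion_thresh:
            # Grow the window while the dispersion stays under the threshold.
            while j < len(gaze) and _dispersion(gaze[i:j + 1]) <= dispersion_thresh:
                j += 1
            window = gaze[i:j]
            cx = sum(p[0] for p in window) / len(window)
            cy = sum(p[1] for p in window) / len(window)
            fixations.append((cx, cy))
            i = j
        else:
            i += 1
    return fixations


def select_proposals(fixation: Point, proposals: List[Box]) -> List[Box]:
    """Keep only proposals that contain the fixation point, a plausible first
    step before choosing the final crop around the viewed object."""
    fx, fy = fixation
    return [p for p in proposals if p[0] <= fx <= p[2] and p[1] <= fy <= p[3]]


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union between two boxes, the localization metric
    mentioned in the abstract."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0
```

The cropped image around the selected proposal would then be passed to the recognition network (an AlexNet-style CNN in the paper); that training pipeline is standard supervised classification and is omitted here.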

Acknowledgement

This research was funded in part by the National Natural Science Foundation of China (61773413), Natural Science Foundation of Guangzhou City (201707010363), Six talent peaks project in Jiangsu Province (JY-074), and Science and Technology Program of Guangzhou City (201903010040).

Author information

Correspondence to Ming Li.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Sun, S., Li, S., Liu, W., Zou, X., Li, M. (2019). Fixation Based Object Recognition in Autism Clinic Setting. In: Yu, H., Liu, J., Liu, L., Ju, Z., Liu, Y., Zhou, D. (eds) Intelligent Robotics and Applications. ICIRA 2019. Lecture Notes in Computer Science, vol. 11743. Springer, Cham. https://doi.org/10.1007/978-3-030-27538-9_53

  • DOI: https://doi.org/10.1007/978-3-030-27538-9_53

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-27537-2

  • Online ISBN: 978-3-030-27538-9

  • eBook Packages: Computer Science, Computer Science (R0)
