
CrowdWatcher: an open-source platform to catch the eye of the crowd

Research Article · Quality and User Experience

Abstract

This paper presents CrowdWatcher, an open-source eye tracking platform. It enables researchers to measure gaze location and user engagement in a crowdsourcing context using traditional RGB webcams. The platform particularly advances the field of Quality of Experience (QoE) research, as it allows the experimenter to remotely collect, with very limited effort, novel information from crowds of participants, such as their commitment to a task, their attention, and their decision-making processes. Two experiments are described that demonstrate the platform's potential. The first addresses the measurement of participants' behavior while performing a movie selection task. Results show that, by taking gaze analysis into account, the platform provides information complementary to traditional self-reported data. This is of particular relevance since, in a crowdsourcing context, decision processes and attention are difficult to assess, and there is often limited control over the test user's engagement with the task. The second experiment is conducted in the scenario of a multimedia QoE test, where prediction accuracy is compared to that of a professional infrared eye tracker. While CrowdWatcher performs less well than the professional device, it is still able to collect valuable gaze information in the far more challenging environment of crowdsourcing. As an outlook towards further application domains, the platform's engagement measurements can be used to identify participants who do not pay attention to the task.
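As a concrete illustration of the engagement use case, the sketch below shows one way per-participant gaze samples could be screened for attention. It is a minimal example under assumed data structures, not CrowdWatcher's actual API; the `GazeSample` type, function names, and the 50% threshold are all illustrative choices.

```python
# A minimal sketch, not CrowdWatcher's actual API: screen per-participant
# gaze samples for task attention. Workers whose estimated gaze rarely
# falls inside the stimulus region are flagged for review.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GazeSample:
    x: float  # estimated gaze x-coordinate in screen pixels
    y: float  # estimated gaze y-coordinate in screen pixels

def attention_ratio(samples: List[GazeSample],
                    region: Tuple[float, float, float, float]) -> float:
    """Fraction of gaze samples inside the stimulus rectangle
    (x0, y0, x1, y1); returns 0.0 if no samples were recorded."""
    x0, y0, x1, y1 = region
    if not samples:
        return 0.0
    inside = sum(1 for s in samples if x0 <= s.x <= x1 and y0 <= s.y <= y1)
    return inside / len(samples)

def is_attentive(samples: List[GazeSample],
                 region: Tuple[float, float, float, float],
                 threshold: float = 0.5) -> bool:
    """Flag a worker as attentive if at least `threshold` of their gaze
    samples land on the stimulus (the 50% default is an assumption)."""
    return attention_ratio(samples, region) >= threshold
```

In a crowdsourced study, such a ratio could complement traditional self-reported attention checks by flagging workers whose gaze rarely meets the stimulus at all.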



Notes

  1. https://webrtc.org Accessed Feb. 2019.

  2. Note that this average viewing distance is a configurable parameter in CrowdWatcher and can easily be changed to other values depending on the required test setup; see the worked conversion example after these notes.

  3. https://github.com/Telecommunication-Telemedia-Assessment/CrowdWatcher Accessed Feb. 2019.

  4. http://www.imdb.com Accessed Feb. 2019.

  5. http://www.microworkers.com Accessed Feb. 2019.

  6. https://ffmpeg.org/ Accessed Feb. 2019.

  7. http://vq.kt.agh.edu.pl/metrics.html Accessed Feb. 2019.

  8. https://www.its.bldrdoc.gov/vqeg/projects/moavi/moavi.aspx Accessed Feb. 2019.
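Because gaze accuracy measured in screen pixels is only comparable across setups once expressed in degrees of visual angle, the following sketch works through that standard conversion. The 60 cm viewing distance stands in for the configurable parameter from note 2; the display dimensions and the function name are illustrative assumptions, not CrowdWatcher defaults.

```python
import math

def pixels_to_degrees(error_px: float,
                      viewing_distance_cm: float = 60.0,
                      screen_width_cm: float = 34.5,
                      screen_width_px: int = 1920) -> float:
    """Convert a gaze estimation error from pixels to degrees of visual
    angle using theta = 2 * atan(error / (2 * distance))."""
    pixel_pitch_cm = screen_width_cm / screen_width_px   # cm per pixel
    error_cm = error_px * pixel_pitch_cm
    return math.degrees(2.0 * math.atan2(error_cm / 2.0, viewing_distance_cm))

# Example: a 100 px error on the assumed 1920 px wide, 34.5 cm display,
# viewed from an assumed 60 cm, is roughly 1.7 degrees of visual angle.
print(f"{pixels_to_degrees(100.0):.2f} deg")
```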


Acknowledgements

The authors thank Microworkers.com for sponsoring some of the crowdsourcing experiments. The research leading to these results received funding from the Deutsche Forschungsgemeinschaft (DFG) under grants HO4770/2-2 and TR257/38-2.

Author information

Correspondence to Pierre Lebreton.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Lebreton, P., Hupont, I., Hirth, M. et al. CrowdWatcher: an open-source platform to catch the eye of the crowd. Qual User Exp 4, 1 (2019). https://doi.org/10.1007/s41233-019-0024-6
