Eyelid contour detection and tracking for startle research related eye-blink measurements from high-speed video records

https://doi.org/10.1016/j.cmpb.2013.06.003

Abstract

The positions of the eyelids provide an effective, contact-free measure of startle-induced eye-blinks, which play an important role in human psychophysiological research. To the best of our knowledge, no method exists for efficiently detecting and tracking the exact eyelid contours in image sequences captured at high speed that is conveniently usable by psychophysiological researchers.

In this publication, a semi-automatic, model-based eyelid contour detection and tracking algorithm for the analysis of high-speed video recordings from an eye tracker is presented. Because a large number of images had been acquired prior to method development, it was important that our technique be able to deal with images recorded without any special parametrisation of the eye tracker. The method entails pupil detection and specular reflection removal, and makes use of dynamic model adaption.

In a proof-of-concept study we achieved a correct detection rate of 90.6%. With this approach, we provide a feasible method to accurately assess eye-blinks from high-speed video recordings.

Introduction

Emotions are considered to be a readiness for action and a strong motivational force: emotional states activate the organism for a certain behaviour [1]. Furthermore, emotion is closely connected to other cognitive processes, influencing which environmental stimuli we attend to and how we process them [2]. For this reason, the study of emotion is of principal interest to psychological science. However, the measurement of emotion is associated with certain methodological limitations [3]. While self-report remains useful in many experimental fields, it also entails some profound disadvantages. In many situations people are not fully aware of their affective states or are unable to verbalise them. Furthermore, self-report may be biased by social desirability. These constraints led to the development of research tools that measure behavioural and physiological responses to emotional stimuli. Emotions can be inferred from vocal changes, facial expressions and body postures [3]. Reaction times and error rates in tasks that require fast behavioural responses to emotional stimuli allow conclusions about an individual's affective states and evaluations [4]. In addition, affective states can be measured via physiological markers, an object of research for a sub-discipline named psychophysiology. A typical experimental setting usually requires the participant to view or listen to emotional information while changes in physiological parameters are recorded. One such psychophysiological measure is the startle eye-blink response, which is modulated by the perceptual processing of affective information [5].


Background

In this section, the background of startle modulation and of the eye-blink response as a standard experimental measure is described. Then, advantages and limitations of existing methods for eye-blink measurement are given, followed by a description of the technical challenges addressed by our proposed method.

Related work

Eyelid detection and tracking for the recognition of the eye's position is a widely discussed research topic. Some of the possible applications are iris recognition techniques used in biometrics [22], [23], [24], [25], human–computer interaction [26], [27], [28], e.g. used in consumer electronics [21], or facial expression recognition [29], [30], [28].

However, the exact determination of the eyelid contour, needed to measure the eyelid distance during eye-blinks, which is of main …

Methods

This section describes the technical aspects of the developed algorithm for eye-blink measurement, including eyelid detection and tracking. First, the existing input data are described and the definition of a valid trial is given. Then the principal idea of the entire detection and tracking system is outlined, followed by an in-depth explanation.
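The overall processing order named in the abstract (pupil detection, specular reflection removal, model-based eyelid contour detection, and tracking with dynamic model adaption) can be sketched as a per-frame loop. All function names and bodies below are illustrative placeholders, not the paper's implementation; only the stage ordering is taken from the text.

```python
# Hypothetical per-trial pipeline; stage names and data shapes are
# assumptions made for illustration.

def detect_pupil(frame):
    return (32, 32)                    # placeholder: pupil centre (x, y)

def remove_specular_reflections(frame):
    return frame                       # placeholder: suppress bright glints

def detect_eyelid_contours(frame, pupil, model):
    return {"upper": [], "lower": []}  # placeholder: model-based contour fit

def adapt_model(model, contours):
    return model                       # placeholder: dynamic model adaption

def process_trial(frames, model=None):
    """Apply the stages in order to every frame of a trial and collect
    the per-frame eyelid contours."""
    contours_per_frame = []
    for frame in frames:
        pupil = detect_pupil(frame)
        frame = remove_specular_reflections(frame)
        contours = detect_eyelid_contours(frame, pupil, model)
        model = adapt_model(model, contours)   # carry adapted model forward
        contours_per_frame.append(contours)
    return contours_per_frame
```

Carrying the adapted model from one frame to the next is what makes the contour fit a tracking step rather than an independent per-frame detection.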

Ground-truth creation

For creating the ground truth, two points in an image were selected manually by the author. These points are chosen such that they lie on the contours of the upper and lower eyelid, respectively, and the vertical distance between the two points at image column cd is maximised. Thus, the horizontal position of the points is determined by the highest point on the upper eyelid contour and the lowest point on the lower eyelid contour, respectively. The …
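As a minimal sketch of this selection rule: assuming the two eyelid contours are available as per-column y-coordinates (an assumption about data layout, not the paper's annotation format), the column cd with the maximal vertical eyelid distance and the two ground-truth points on it can be found as follows.

```python
def ground_truth_points(upper_y, lower_y):
    """Given per-column y-coordinates of the upper and lower eyelid
    contours (image origin at top-left, y grows downwards), return the
    column c_d where the vertical eyelid distance is maximal, together
    with the two ground-truth points on that column."""
    distances = [lo - up for up, lo in zip(upper_y, lower_y)]
    c_d = max(range(len(distances)), key=distances.__getitem__)
    return c_d, (c_d, upper_y[c_d]), (c_d, lower_y[c_d])

# Toy contours: the eye is open widest at column 2.
c_d, top, bottom = ground_truth_points([12, 10, 8, 11], [20, 22, 25, 21])
print(c_d, top, bottom)  # 2 (2, 8) (2, 25)
```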

Practical value

The algorithm was implemented and evaluated in Matlab R2011b (64-bit) on a Core 2 Duo 2.4 GHz processor with 4 GB RAM. Processing 96 trials of 500 frames each took approximately 72 min, i.e. an average processing time of 45 s per trial. As trials are processed after data acquisition, there is no requirement for real-time processing, and a processing time of 45 s per trial is entirely acceptable. The processing time could be reduced further by processing …

Discussion

The main problems that occur during eyelid contour detection have been identified in Section 5.2 as the detection of wrong edges and eyelashes.

The detection of wrong edges as eyelid edges is a challenging problem, because our algorithm simply assumes that the strongest edge is the eyelid edge. We have addressed most of these cases by restricting the region in which the eyelid contour is expected. Fortunately, the remaining cases in which wrong edges are detected were rare, and thus no further …
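The strongest-edge assumption and its region-of-interest restriction can be illustrated with a simplified one-column sketch (the paper's actual edge detector and search region are more elaborate; this toy example only shows why restricting the search region suppresses a stronger distractor edge).

```python
def strongest_edge_row(column, row_min, row_max):
    """Return the index of the strongest vertical intensity step in a
    single image column, searched only within [row_min, row_max).
    An edge at index i separates rows i and i + 1."""
    gradient = [abs(column[i + 1] - column[i]) for i in range(len(column) - 1)]
    window = range(row_min, min(row_max, len(gradient)))
    return max(window, key=gradient.__getitem__)

# A weak eyelid edge between rows 2 and 3, and a stronger distractor
# (e.g. an eyelash shadow) between rows 6 and 7:
column = [100, 100, 100, 60, 60, 60, 60, 0, 0, 0]
print(strongest_edge_row(column, 0, len(column) - 1))  # 6: the distractor wins
print(strongest_edge_row(column, 0, 5))                # 2: restriction fixes it
```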

Conflicts of interest

There are no conflicts of interest.

Mode of availability of software

A prototype of the developed software can be obtained for non-commercial use upon request by email to the corresponding author.

References (38)

  • P.J. Lang, The emotion probe: studies of motivation and attention, The American Psychologist (1995)
  • P.J. Lang et al., Emotion, attention, and the startle reflex, Psychological Review (1990)
  • W.K. Berg et al., Startle elicitation: stimulus parameters, recording techniques, and quantification
  • T.D. Blumenthal, Committee report: guidelines for human startle eyeblink electromyographic studies, Psychophysiology (2005)
  • Y.H. Bang, The role of Müller's muscle reconsidered, Plastic and Reconstructive Surgery (1998)
  • C. Evinger, Eyelid movements: mechanisms and normal data, Investigative Ophthalmology & Visual Science (1991)
  • K. Schmidtke et al., Nervous control of eyelid function: a review of clinical, experimental and pathological data, Brain (1992)
  • L. Bour et al., Neurophysiological aspects of eye and eyelid movements during blinking in humans, Journal of Neurophysiology (2000)
  • D. Mas et al., Noninvasive measurement of eye retraction during blinking, Optics Letters (2010)