Abstract:
This paper investigates the discriminative capabilities of facial action units (AUs) exhibited by an individual while performing a task on a tablet computer in a semi-unconstrained environment. To that end, AUs are measured on a frame-by-frame basis from videos of 96 different subjects participating in a game-show-like quiz game that included a prize incentive. We propose a method that leverages the activation characteristics, as well as the temporal dynamics, of facial behavior. To demonstrate the discriminative capabilities of the proposed approach, we perform identity matching across all subject pairs. Overall, the rank-1 matching performance of our algorithm ranges from 55% to 85%, on scenarios where the emotional disparity between the reference and query samples is largest and smallest, respectively. We believe these results represent a significant improvement over existing work relying on AUs for human identification, in particular because the experimental settings guarantee that the facial expressions involved are spontaneous.
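The abstract does not specify the matcher behind its rank-1 figures, so the following is only a generic illustration of how rank-1 identification accuracy is typically computed: each query descriptor (here standing in for a per-subject AU feature vector) is matched to its nearest reference descriptor, and a hit is counted when the predicted identity equals the true one. All names and the toy data are hypothetical.

```python
import numpy as np

def rank1_accuracy(reference, query):
    """Rank-1 identification accuracy via nearest-neighbor matching.

    Rows are aligned by subject: row i of both arrays belongs to
    subject i. Each query is assigned the identity of its closest
    reference vector (Euclidean distance); a hit is a correct match.
    """
    # Pairwise distances: dists[i, j] = ||query_i - reference_j||
    dists = np.linalg.norm(query[:, None, :] - reference[None, :, :], axis=2)
    predicted = dists.argmin(axis=1)  # closest reference per query
    return float((predicted == np.arange(len(query))).mean())

# Toy example: 4 "subjects", 3-dimensional stand-in AU descriptors
ref = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 1.0, 0.0]])
qry = ref + 0.05  # slightly perturbed query samples
print(rank1_accuracy(ref, qry))  # → 1.0 on this well-separated toy set
```

Reporting the accuracy separately for query sets with large versus small emotional disparity from the reference, as the paper does, would produce the kind of 55%–85% range quoted above.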
Date of Conference: 25-28 September 2016
Date Added to IEEE Xplore: 19 August 2016
Electronic ISSN: 2381-8549