
High Performance and Efficient Real-Time Face Detector on Central Processing Unit Based on Convolutional Neural Network


Abstract:

Face detection is a crucial step in the development of face recognition, expression analysis, tracking, and classification. Conventional methods suffer accuracy limitations under several challenging conditions, including nonfrontal faces, occlusion, and complex backgrounds. Convolutional neural network (CNN) methods, by contrast, achieve high accuracy but demand a large amount of computation; they therefore require expensive hardware and are poorly suited to low-cost central processing units (CPUs). This article develops a lightweight architecture for a CNN-based real-time face detector. The proposed architecture consists of two main modules: a backbone that extracts distinctive facial features and a multilevel detection module that performs prediction at multiple scales. It further employs several approaches to improve the training result, including a balancing loss and tweaks to the training configuration. The proposed detector is single-stage and is trained on the WIDER FACE dataset, which contains more challenging images than other datasets. As a result, the detector achieves state-of-the-art performance on several benchmark datasets compared with other CPU-based models, and its efficiency is superior to that of its competitors, as it runs at 53 frames per second on a CPU for video graphics array (VGA) resolution images.
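
The two-module layout the abstract describes lends itself to a short illustration. Below is a minimal PyTorch sketch, not the authors' released model: a small backbone that downsamples the image while extracting facial features, plus one detection head per feature-map scale, so that faces of different sizes are predicted at suitable strides. All layer widths, strides, stage counts, and the anchor count here are illustrative assumptions.

```python
import torch
import torch.nn as nn


def conv_bn_relu(in_ch, out_ch, stride=1):
    # 3x3 convolution -> batch norm -> ReLU: a standard lightweight block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class TinyFaceDetector(nn.Module):
    """One-stage detector: backbone + per-scale heads (illustrative sketch)."""

    def __init__(self, num_anchors=2):
        super().__init__()
        # Backbone: three stages, each halving spatial resolution.
        self.stage1 = nn.Sequential(conv_bn_relu(3, 32, stride=2), conv_bn_relu(32, 32))
        self.stage2 = nn.Sequential(conv_bn_relu(32, 64, stride=2), conv_bn_relu(64, 64))
        self.stage3 = nn.Sequential(conv_bn_relu(64, 128, stride=2), conv_bn_relu(128, 128))
        # Multilevel detection: one 1x1 conv head per scale; each location
        # predicts num_anchors * (1 face score + 4 box offsets).
        self.heads = nn.ModuleList(
            [nn.Conv2d(ch, num_anchors * 5, 1) for ch in (32, 64, 128)]
        )

    def forward(self, x):
        f1 = self.stage1(x)   # stride-2 features: small faces
        f2 = self.stage2(f1)  # stride-4 features: medium faces
        f3 = self.stage3(f2)  # stride-8 features: large faces
        outs = []
        for feat, head in zip((f1, f2, f3), self.heads):
            p = head(feat)  # (B, num_anchors*5, H, W)
            b = p.shape[0]
            outs.append(p.permute(0, 2, 3, 1).reshape(b, -1, 5))
        # Concatenate predictions from all scales:
        # (B, total_anchors, [score, dx, dy, dw, dh]).
        return torch.cat(outs, dim=1)


if __name__ == "__main__":
    model = TinyFaceDetector().eval()
    with torch.no_grad():
        preds = model(torch.randn(1, 3, 480, 640))  # VGA-resolution input
    print(preds.shape)  # torch.Size([1, 201600, 5]) for this configuration
```

Predicting at several strides is what the abstract calls multilevel detection. A complete detector along these lines would additionally need anchor matching, the balancing loss between classification and regression terms mentioned in the abstract, and non-maximum suppression at inference; those details are specific to the paper and are omitted here.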
Published in: IEEE Transactions on Industrial Informatics (Volume: 17, Issue: 7, July 2021)
Page(s): 4449-4457
Date of Publication: 07 September 2020
