A Gated Peripheral-Foveal Convolutional Neural Network for Unified Image Aesthetic Prediction


Abstract:

Learning fine-grained details is a key issue in image aesthetic assessment. Most previous methods extract the fine-grained details via a random cropping strategy, which may undermine the integrity of semantic information. Extensive studies show that humans perceive fine-grained details with a mixture of foveal vision and peripheral vision. The fovea has the highest visual acuity and is responsible for seeing the details. Peripheral vision is used for perceiving the broad spatial scene and selecting the attended regions for the fovea. Inspired by these observations, we propose a gated peripheral-foveal convolutional neural network. It is a dedicated double-subnet neural network (i.e., a peripheral subnet and a foveal subnet). The former aims to mimic the functions of peripheral vision to encode the holistic information and provide the attended regions. The latter aims to extract fine-grained features on these key regions. Considering that peripheral vision and foveal vision play different roles in processing different visual stimuli, we further employ a gated information fusion network to weigh their contributions. The weights are determined through fully connected layers followed by a sigmoid function. We conduct comprehensive experiments on the standard Aesthetic Visual Analysis (AVA) dataset and the Photo.net dataset for unified aesthetic prediction tasks: 1) aesthetic quality classification; 2) aesthetic score regression; and 3) aesthetic score distribution prediction. The experimental results demonstrate the effectiveness of the proposed method.
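The gated fusion step described above can be sketched in a few lines. This is a minimal, hypothetical NumPy illustration, not the authors' implementation: it assumes each subnet emits a feature vector of the same size, and that a single fully connected layer over the concatenated features, followed by a sigmoid, produces per-dimension weights that blend the two streams.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

D = 4  # hypothetical feature size of each subnet's output
rng = np.random.default_rng(0)

# f_p: holistic features from the peripheral subnet (illustrative values).
f_p = rng.standard_normal(D)
# f_f: fine-grained features from the foveal subnet (illustrative values).
f_f = rng.standard_normal(D)

# Gated fusion: a fully connected layer over the concatenated features,
# followed by a sigmoid, yields per-dimension weights in (0, 1).
W = rng.standard_normal((D, 2 * D))  # stand-in for learned weights
b = np.zeros(D)
g = sigmoid(W @ np.concatenate([f_p, f_f]) + b)

# Weighted combination of the two feature streams.
fused = g * f_p + (1.0 - g) * f_f
print(fused.shape)  # (4,)
```

The sigmoid keeps each gate value strictly between 0 and 1, so the fused vector is a convex combination of the peripheral and foveal features in every dimension; in the actual model the weights would be learned end to end rather than random.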
Published in: IEEE Transactions on Multimedia ( Volume: 21, Issue: 11, November 2019)
Page(s): 2815 - 2826
Date of Publication: 14 April 2019

