Abstract:
Convolutional neural networks (CNNs) have been widely used in a range of tasks because of their robust convolutional feature transformation ability. In this paper, we propose a novel type of convolution called Boosting Attention Convolution (BAC) to improve the basic convolutional feature transformation process of CNNs. The proposed method is designed based on two principles: boosting and the attention mechanism. Specifically, we design a set of simple yet effective Boosting Attention Modules (BAMs) within grouped convolution, which progressively recalibrate the distribution of the feature map and enable later filters nested in a convolution layer to focus more on the feature regions left unactivated by previous filters. Thus, BAC helps CNNs generate more discriminative representations by explicitly incorporating richer information. Experimental results on various datasets verify that BAC outperforms state-of-the-art methods. More importantly, BAC is a general convolution that can be deployed in various modern networks without introducing many additional parameters or much computational complexity.
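The abstract does not specify the internal structure of the Boosting Attention Modules, so the following is only a minimal illustrative sketch of the stated idea: splitting a layer's filters into sequential groups and steering each later group toward spatial regions that earlier groups left weakly activated. The class name, the 1 - sigmoid attention form, and all hyperparameters below are assumptions for illustration, not the authors' implementation.

# Hypothetical sketch of the BAC idea described in the abstract (PyTorch).
import torch
import torch.nn as nn


class BoostingAttentionConv(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size=3, groups=4):
        super().__init__()
        assert out_channels % groups == 0
        # Each group holds an equal share of the layer's output filters.
        self.convs = nn.ModuleList(
            nn.Conv2d(in_channels, out_channels // groups, kernel_size,
                      padding=kernel_size // 2)
            for _ in range(groups)
        )

    def forward(self, x):
        outputs = []
        # Attention starts uniform over spatial locations.
        attention = torch.ones_like(x[:, :1])
        for conv in self.convs:
            y = conv(x * attention)
            outputs.append(y)
            # Recalibrate: locations with high mean activation so far get a low
            # weight, so the next group of filters focuses on regions that are
            # still unactivated (the "boosting" behaviour described above).
            activated = torch.sigmoid(
                torch.cat(outputs, dim=1).mean(dim=1, keepdim=True))
            attention = 1.0 - activated
        return torch.cat(outputs, dim=1)


if __name__ == "__main__":
    layer = BoostingAttentionConv(in_channels=16, out_channels=64, groups=4)
    feats = torch.randn(2, 16, 32, 32)
    print(layer(feats).shape)  # torch.Size([2, 64, 32, 32])

Because the attention map is a single channel multiplied onto the input, such a scheme adds no learnable parameters beyond the grouped convolutions themselves, which is consistent with the abstract's claim that BAC introduces little extra parameter count or computation.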
Date of Conference: 05-09 July 2021
Date Added to IEEE Xplore: 09 June 2021