Regularizing neural networks with adaptive local drop


Abstract:

Neural network (NN) models have shown good performance on many image recognition benchmarks. Given large image datasets, these models typically have millions or billions of parameters that can easily lead to over-fitting without regularization. Dropout and DropConnect have proven effective at regularizing large fully connected layers within neural networks. In Dropout, each neural activation within the network is randomly set to zero with some probability during training. In DropConnect, a generalization of Dropout, each connection weight within the network is instead randomly set to zero with some probability. In both methods, the drop probability is a predefined constant shared across the network. We propose Adaptive Local Drop (ALDrop), a novel regularization method that drops each connection weight within the network with a learned probability, adapted to the input image dataset through a locality-based measure. Experiments on several image recognition benchmarks show that our model outperforms Dropout and DropConnect.
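The two baseline schemes the abstract contrasts can be sketched as follows, for the training-time forward pass of a single fully connected layer (a minimal illustration with NumPy; the variable names and fixed drop probability `p` are assumptions, not the paper's notation):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                      # predefined constant drop probability
x = rng.normal(size=4)       # input activations to a fully connected layer
W = rng.normal(size=(3, 4))  # connection weights of the layer

# Dropout: zero each activation independently with probability p.
activation_mask = rng.random(x.shape) >= p
h_dropout = W @ (x * activation_mask)

# DropConnect: zero each connection weight independently with probability p.
weight_mask = rng.random(W.shape) >= p
h_dropconnect = (W * weight_mask) @ x

print(h_dropout.shape, h_dropconnect.shape)
```

ALDrop, as described in the abstract, would replace the constant `p` in the DropConnect branch with a per-weight probability learned from the data via a locality-based measure.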
Date of Conference: 12-17 July 2015
Date Added to IEEE Xplore: 01 October 2015
Conference Location: Killarney, Ireland
