
Comparison of auto-encoders with different sparsity regularizers


Abstract:

Generally, to learn sparse representations of raw inputs with an auto-encoder, the Kullback-Leibler (KL) divergence is introduced into the loss function as a sparsity regularizer that penalizes active code units. In fact, there exist other sparsity regularizers besides the KL divergence. This paper introduces several classical sparsity regularizers into auto-encoders and presents an empirical survey of auto-encoders with different sparsity regularizers. Specifically, we analyze two other sparsity regularizers that are commonly used in sparse coding. In addition, we consider the effect of different activation functions and different sparsity regularizers on the learning performance of auto-encoders. Our experiments are conducted on the MNIST and COIL datasets.
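The KL-divergence sparsity penalty the abstract refers to is typically computed between a target activation level and the mean activation of each code unit, and added to the reconstruction error. Below is a minimal NumPy sketch of such a sparse auto-encoder loss; the single sigmoid hidden layer, the target sparsity rho, the weight beta, and all shapes are illustrative assumptions rather than the paper's exact setup.

```python
import numpy as np

def kl_sparsity_penalty(rho, rho_hat):
    """Summed KL divergence between the target sparsity level rho
    and the mean activation rho_hat of each code (hidden) unit."""
    rho_hat = np.clip(rho_hat, 1e-8, 1 - 1e-8)  # avoid log(0)
    return np.sum(rho * np.log(rho / rho_hat)
                  + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))

def sparse_autoencoder_loss(x, W1, b1, W2, b2, rho=0.05, beta=3.0):
    """Reconstruction error plus a beta-weighted KL sparsity penalty
    for a single-hidden-layer auto-encoder with sigmoid units."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    h = sigmoid(x @ W1 + b1)            # hidden codes, shape (n, k)
    x_hat = sigmoid(h @ W2 + b2)        # reconstruction of the input
    recon = 0.5 * np.mean(np.sum((x_hat - x) ** 2, axis=1))
    rho_hat = h.mean(axis=0)            # mean activation per code unit
    return recon + beta * kl_sparsity_penalty(rho, rho_hat)

# Tiny usage example on random data
rng = np.random.default_rng(0)
x = rng.random((16, 64))
W1, b1 = rng.normal(0, 0.1, (64, 32)), np.zeros(32)
W2, b2 = rng.normal(0, 0.1, (32, 64)), np.zeros(64)
print(sparse_autoencoder_loss(x, W1, b1, W2, b2))
```

A common alternative from sparse coding is an L1 penalty on the codes, e.g. replacing the KL term with `beta * np.abs(h).sum(axis=1).mean()`; the abstract does not name the two sparse-coding regularizers it studies, so this is only an example of the general family.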
Date of Conference: 12-17 July 2015
Date Added to IEEE Xplore: 01 October 2015
Conference Location: Killarney
