Sparse hidden units activation in Restricted Boltzmann Machine

  • Conference paper

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 366)

Abstract

Sparsity has been a concept of interest in machine learning for many years. In deep learning, sparse solutions play a crucial role in obtaining robust and discriminative features. In this paper, we study a new regularization term for sparse hidden unit activation in the context of the Restricted Boltzmann Machine (RBM). Our proposal is based on the symmetric Kullback-Leibler divergence, applied to compare the actual and the desired distributions over the active hidden units. We compare our method against two other sparsity-enforcing regularization terms by evaluating the empirical classification error on two datasets: (i) image classification (MNIST) and (ii) document classification (20-newsgroups).
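
The exact form of the regularizer is given in the full text; the sketch below is only a plausible illustration of a symmetric Kullback-Leibler sparsity penalty, assuming Bernoulli-distributed hidden units whose mean activations are compared against a desired sparsity level. The function name symmetric_kl_sparsity_penalty and the parameters target_sparsity and weight are hypothetical, not taken from the paper.

    import numpy as np

    def symmetric_kl_sparsity_penalty(hidden_probs, target_sparsity=0.1,
                                      weight=0.1, eps=1e-8):
        """Symmetric KL divergence between each hidden unit's mean activation
        and a desired Bernoulli(target_sparsity) distribution.

        hidden_probs: array of shape (n_examples, n_hidden) holding
        P(h_j = 1 | x_n) for a mini-batch.
        """
        # Mean activation of every hidden unit over the mini-batch,
        # clipped away from 0 and 1 to keep the logarithms finite.
        q = np.clip(hidden_probs.mean(axis=0), eps, 1.0 - eps)
        p = target_sparsity  # assumed desired activation level
        # KL(p || q_j) and KL(q_j || p) for Bernoulli distributions.
        kl_pq = p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))
        kl_qp = q * np.log(q / p) + (1 - q) * np.log((1 - q) / (1 - p))
        return weight * np.sum(kl_pq + kl_qp)

    # Usage: activation probabilities for a batch of 64 examples, 100 hidden units.
    probs = np.random.rand(64, 100)
    penalty = symmetric_kl_sparsity_penalty(probs)

In training, a penalty of this kind would typically be added to the (approximate) negative log-likelihood objective, so its gradient with respect to the hidden biases and weights enters the contrastive-divergence update.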

Notes

  1. In [5] such an approach is called selectivity.

  2. A_{i·} denotes the i-th row of matrix A, A_{·j} denotes the j-th column of matrix A, and A_{ij} is the (i, j)-th element of matrix A.

  3. http://yann.lecun.com/exdb/mnist/

  4. In the experiments we used the small version of the original dataset: http://www.cs.nyu.edu/~roweis/data.html.

References

  1. Bengio, Y.: Learning Deep Architectures for AI. Foundations and Trends in Machine Learning 2(1):1-127. (2009).

  2. Bishop, C.M.: Pattern Recognition and Machine Learning. Springer, New York. (2006).

  3. Cho, K., Ilin, A., & Raiko, T.: Tikhonov-type regularization for Restricted Boltzmann Machines. In Artificial Neural Networks and Machine Learning (ICANN 2012). pp. 81-88. Springer, Berlin Heidelberg. (2012).

  4. Glorot, X., Bordes, A., & Bengio, Y.: Deep Sparse Rectifier Neural Networks. In Proceedings of the 14th International Conference on Artificial Intelligence and Statistics (AISTATS 2011). Journal of Machine Learning Research Workshop & Conference Proceedings 15:315-323. (2011).

  5. Goh, H., Thome, N., & Cord, M.: Biasing restricted Boltzmann machines to manipulate latent selectivity and sparsity. In NIPS Workshop on Deep Learning and Unsupervised Feature Learning. (2010).

  6. Hinton, G. E.: Training products of experts by minimizing contrastive divergence. Neural Computation 14:1771-1800. (2002).

  7. Hinton, G. E.: A practical guide to training Restricted Boltzmann Machines. In Neural Networks: Tricks of the Trade. pp. 599-619. Springer, Berlin Heidelberg. (2012).

  8. Hinton, G. E., Srivastava, N., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. R.: Improving neural networks by preventing co-adaptation of feature detectors. arXiv preprint arXiv:1207.0580. (2012).

  9. Larochelle, H., & Bengio, Y.: Classification using discriminative restricted Boltzmann machines. In Proceedings of the 25th International Conference on Machine Learning (ICML 2008). pp. 536-543. (2008).

  10. Lee, H., Ekanadham, C., & Ng, A.: Sparse deep belief net model for visual area V2. In Advances in Neural Information Processing Systems (NIPS 2007). pp. 873-880. (2007).

  11. Le, Q. V., Ngiam, J., Coates, A., Lahiri, A., Prochnow, B., & Ng, A.: On optimization methods for deep learning. In Proceedings of the 28th International Conference on Machine Learning (ICML 2011). pp. 265-272. (2011).

  12. Le Roux, N., & Bengio, Y.: Representational power of Restricted Boltzmann Machines and deep belief networks. Neural Computation 20(6):1631-1649. (2008).

  13. Lewicki, M. S., & Sejnowski, T. J.: Learning overcomplete representations. Neural Computation 12(2):337-365. (2000).

  14. Marlin, B. M., Swersky, K., Chen, B., & Freitas, N. D.: Inductive principles for restricted Boltzmann machine learning. In International Conference on Artificial Intelligence and Statistics (AISTATS 2010). pp. 509-516. (2010).

  15. Martens, J., Chattopadhya, A., Pitassi, T., & Zemel, R.: On the Expressive Power of Restricted Boltzmann Machines. In Advances in Neural Information Processing Systems (NIPS 2013). pp. 2877-2885. (2013).

  16. Nair, V., & Hinton, G. E.: 3D object recognition with deep belief nets. In Advances in Neural Information Processing Systems (NIPS 2009). pp. 1339-1347. (2009).

  17. Nair, V., & Hinton, G. E.: Rectified linear units improve Restricted Boltzmann Machines. In Proceedings of the 27th International Conference on Machine Learning (ICML 2010). pp. 807-814. (2010).

  18. Smolensky, P.: Information processing in dynamical systems: foundations of harmony theory. In: Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations. pp. 194-281. MIT Press, Cambridge, MA, USA. (1986).

Author information

Corresponding author

Correspondence to Jakub M. Tomczak.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Tomczak, J.M., Gonczarek, A. (2015). Sparse hidden units activation in Restricted Boltzmann Machine. In: Selvaraj, H., Zydek, D., Chmaj, G. (eds) Progress in Systems Engineering. Advances in Intelligent Systems and Computing, vol 366. Springer, Cham. https://doi.org/10.1007/978-3-319-08422-0_27

  • DOI: https://doi.org/10.1007/978-3-319-08422-0_27

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-08421-3

  • Online ISBN: 978-3-319-08422-0

  • eBook Packages: Engineering, Engineering (R0)
