
Text Classification Based on ReLU Activation Function of SAE Algorithm

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 10261)

Abstract

When training deep autoencoder neural networks, the back-propagated gradient of the Sigmoid activation function tends to vanish. To address this problem, a method based on the ReLU activation function is proposed for training stacked autoencoder (SAE) networks. This paper analyzes the performance of different activation functions, comparing ReLU with the traditional Tanh and Sigmoid functions in experiments on the Reuters-21578 standard test set. The experimental results show that using ReLU as the activation function improves both the network's convergence speed and its classification accuracy.
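The contrast the abstract draws can be illustrated with a minimal sketch (not taken from the paper): the derivative of Sigmoid is at most 0.25 and near zero for large inputs, so a gradient multiplied by it across many layers shrinks toward zero, while the ReLU derivative is exactly 1 for any positive input. The example below models this with repeated multiplication by the activation derivative alone, ignoring weights for simplicity; the values `x = 2.5` and `depth = 10` are illustrative choices, not figures from the paper.

```python
import numpy as np

def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)          # at most 0.25; near 0 when |x| is large

def tanh_grad(x):
    return 1.0 - np.tanh(x) ** 2  # at most 1.0; near 0 when |x| is large

def relu_grad(x):
    return np.where(x > 0, 1.0, 0.0)  # exactly 1 for any positive input

# Repeated multiplication by the activation derivative models a gradient
# passing back through `depth` layers (weights ignored for simplicity).
x, depth = 2.5, 10
print("sigmoid:", float(sigmoid_grad(x)) ** depth)  # shrinks toward 0
print("tanh:   ", float(tanh_grad(x)) ** depth)     # also shrinks
print("relu:   ", float(relu_grad(x)) ** depth)     # stays exactly 1.0
```

With a saturating input, the Sigmoid and Tanh gradients decay exponentially with depth, while the ReLU gradient passes through unchanged, which is the mechanism behind the faster convergence the paper reports.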

This work is supported by the National Natural Science Foundation (61373067); the Inner Mongolia Autonomous Region 2013 "Prairie Talent Project"; the Autonomous Region "Higher School Youth Science and Technology Talents" program (NJYT-14-A09); the Inner Mongolia Natural Science Foundation (2013MS0911); the Jilin Province Science and Technology Development Fund (20140101195JC); the Inner Mongolia Autonomous Region Higher School Science and Technology Research program (NJZY16177); and the Inner Mongolia Autonomous Region Natural Science Fund (2016MS0624).




Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Cui, Jl., Qiu, S., Jiang, My., Pei, Zl., Lu, Yn. (2017). Text Classification Based on ReLU Activation Function of SAE Algorithm. In: Cong, F., Leung, A., Wei, Q. (eds) Advances in Neural Networks - ISNN 2017. ISNN 2017. Lecture Notes in Computer Science, vol 10261. Springer, Cham. https://doi.org/10.1007/978-3-319-59072-1_6


  • DOI: https://doi.org/10.1007/978-3-319-59072-1_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-59071-4

  • Online ISBN: 978-3-319-59072-1

  • eBook Packages: Computer Science (R0)
