Cooperation of Neural Networks for Spoken Digit Classification

Published: 17 December 2020
DOI: 10.1145/3426826.3426830

Abstract

Gradient descent is by far the most widely used and successful approach for training neural network models. However, it is a greedy algorithm, and it runs into some of the biggest open problems in neural networks: gradient descent gives no guarantee that a better solution cannot be found. This article presents an empirical study of the performance of neural networks with two hidden layers and describes a practical method for improving their accuracy: cooperation of neural networks. In this study, we applied data augmentation by adding noise to the training data set and compared three training methods: batch gradient descent (BGD), stochastic gradient descent (SGD), and batch stochastic gradient descent (BSGD). With cooperation, the performance of the networks improved over the baseline networks by 47% (the generalization classification error probability, PEG, of a cooperation of 9 neural networks is 0.071). Finally, real-time classification using the cooperation method achieved a PEG of 0.04 (versus 0.104 for a single neural network), further confirming that cooperation improves the performance of neural networks.
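
To make the approach concrete, below is a minimal sketch (not the authors' implementation) of the pipeline described above: several small two-hidden-layer networks are trained independently on noise-augmented feature vectors using mini-batch gradient descent (one plausible reading of the abstract's "batch stochastic gradient descent", BSGD), and their outputs are then combined. Assumptions not stated in the abstract: spoken-digit inputs are represented as fixed-length feature vectors, hidden layers use ReLU activations, "cooperation" is implemented as averaging the member networks' softmax outputs, and all data, layer sizes, and hyperparameters below are illustrative placeholders.

# Hedged sketch of the "cooperation of neural networks" idea from the abstract.
import numpy as np

rng = np.random.default_rng(0)

def init_net(n_in, n_hidden, n_out):
    """Two-hidden-layer MLP with small random weights."""
    def layer(fan_in, fan_out):
        return rng.normal(0, np.sqrt(2.0 / fan_in), (fan_in, fan_out)), np.zeros(fan_out)
    return [layer(n_in, n_hidden), layer(n_hidden, n_hidden), layer(n_hidden, n_out)]

def forward(net, X):
    """Return softmax class probabilities and the cached layer inputs."""
    acts, h = [X], X
    for i, (W, b) in enumerate(net):
        z = h @ W + b
        h = z if i == len(net) - 1 else np.maximum(z, 0.0)  # ReLU on hidden layers
        acts.append(h)
    e = np.exp(h - h.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True), acts

def sgd_step(net, X, y, lr=0.05):
    """One mini-batch gradient-descent update under cross-entropy loss."""
    P, acts = forward(net, X)
    n = X.shape[0]
    delta = P.copy()
    delta[np.arange(n), y] -= 1.0          # dL/dz for softmax + cross-entropy
    delta /= n
    for i in range(len(net) - 1, -1, -1):
        W, b = net[i]
        gW = acts[i].T @ delta              # acts[i] is the input to layer i
        gb = delta.sum(axis=0)
        if i > 0:
            delta = (delta @ W.T) * (acts[i] > 0)  # backprop through ReLU
        net[i] = (W - lr * gW, b - lr * gb)

def train(net, X, y, epochs=20, batch=32, noise=0.05):
    """Mini-batch training with additive-noise data augmentation."""
    for _ in range(epochs):
        order = rng.permutation(len(X))
        for start in range(0, len(X), batch):
            idx = order[start:start + batch]
            Xb = X[idx] + rng.normal(0, noise, X[idx].shape)  # noise augmentation
            sgd_step(net, Xb, y[idx])

def cooperate(nets, X):
    """Cooperation, assumed here to mean averaging the member networks' outputs."""
    probs = np.mean([forward(net, X)[0] for net in nets], axis=0)
    return probs.argmax(axis=1)

# Toy usage with synthetic placeholder "spoken digit" feature vectors.
X = rng.normal(size=(500, 64))
y = rng.integers(0, 10, size=500)
nets = [init_net(64, 32, 10) for _ in range(9)]  # 9 cooperating networks, as in the abstract
for net in nets:
    train(net, X, y)
pred = cooperate(nets, X)
print("training-set error:", np.mean(pred != y))

A majority vote over the member networks' predicted labels would be an equally plausible reading of "cooperation"; the abstract does not specify the combination rule.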

Published In

MLMI '20: Proceedings of the 2020 3rd International Conference on Machine Learning and Machine Intelligence
September 2020
138 pages
ISBN: 9781450388344
DOI: 10.1145/3426826

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Batch stochastic gradient descent
  2. Cooperation of neural network
  3. Machine learning

Qualifiers

  • Research-article
  • Research
  • Refereed limited
