DOI: 10.1145/3379247.3379253
research-article

A Weight Initialization Method Associated with Samples for Deep Feedforward Neural Network

Published: 07 March 2020

ABSTRACT

Artificial neural networks are an important driving force in the development of artificial intelligence, but they usually need to be trained before use. Random initialization is the most widely used way to set weights when training neural networks; however, randomly assigned initial weights are independent of the samples. This paper proposes a weight initialization method associated with the samples for deep feedforward neural networks (DFFNNs). The initial weights set by this method combine the original randomly assigned weights with the weights obtained after the first training epoch, so they retain random characteristics while also being closely related to the samples to be trained. The proposed method is tested on the bearing data provided by the Case Western Reserve University (CWRU) Bearing Data Center. The test results show that the proposed method can accelerate the training of a DFFNN to some extent.
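The abstract describes the initialization in three steps: assign random weights, train for one epoch on the samples, and then combine the two sets of weights. The exact combination rule is not given in the abstract, so the sketch below assumes a simple convex combination with a hypothetical mixing coefficient `alpha`; the single-layer sigmoid network and the toy data standing in for the CWRU bearing samples are also illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_random(n_in, n_out):
    """Conventional random initialization (small uniform weights)."""
    return rng.uniform(-0.1, 0.1, size=(n_in, n_out))

def one_epoch_sgd(W, X, y, lr=0.1):
    """Train a single-layer sigmoid network for one epoch of plain SGD."""
    for xi, yi in zip(X, y):
        a = 1.0 / (1.0 + np.exp(-(xi @ W)))           # sigmoid activation
        grad = np.outer(xi, (a - yi) * a * (1.0 - a))  # squared-error gradient
        W = W - lr * grad
    return W

# Toy data standing in for the CWRU bearing samples (hypothetical).
X = rng.normal(size=(32, 8))
y = (X[:, :1] > 0).astype(float)

W0 = init_random(8, 1)                # step 1: random initial weights
W1 = one_epoch_sgd(W0.copy(), X, y)   # step 2: weights after one epoch on the samples

alpha = 0.5                           # hypothetical mixing coefficient (not from the paper)
W_init = alpha * W0 + (1.0 - alpha) * W1   # step 3: blended, sample-associated initial weights
```

With this reading, `W_init` keeps part of the random component (via `W0`) while already reflecting the gradient signal of the training samples (via `W1`), which is consistent with the abstract's claim that the initial weights "not only have random characteristics, but are also closely related to the samples."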

References

  1. Hinton, G. E. and Salakhutdinov, R. R. 2006. Reducing the dimensionality of data with neural networks. Science 313, 504--507.
  2. Wang, L. N., Yang, Y., Min, R. Q., and Chakradhar, S. 2017. Accelerating deep neural network training with inconsistent stochastic gradient descent. Neural Networks 93, 219--229.
  3. Nur, A. S., Radzi, N. H. M., and Ibrahim, A. O. 2014. Artificial neural network weight optimization: A review. TELKOMNIKA Indonesian Journal of Electrical Engineering 12, 6897--6902.
  4. Combes, R. T. D., Pezeshki, M., Shabanian, S., Courville, A., and Bengio, Y. 2018. On the learning dynamics of deep neural networks. arXiv:1809.06848v1.
  5. Ramos, E. Z., Nakakuni, M., and Yfantis, E. 2017. Quantitative measures to evaluate neural network weight initialization strategies. In IEEE 7th Annual Computing and Communication Workshop and Conference (CCWC), Las Vegas, USA.
  6. Kumar, S. K. 2017. On weight initialization in deep neural networks. arXiv:1704.08863v2.
  7. Daniely, A., Frostig, R., and Singer, Y. 2016. Toward deeper understanding of neural networks: The power of initialization and a dual view on expressivity. In 30th Conference on Neural Information Processing Systems (NIPS 2016), Barcelona, Spain, 1--9.
  8. Lee, A., Geem, Z. W., and Suh, K. D. 2016. Determination of optimal initial weights of an artificial neural network by using the harmony search algorithm: Application to breakwater armor stones. Applied Sciences 6, 1--17.
  9. Schmidhuber, J. 2015. Deep learning in neural networks: An overview. Neural Networks 61, 85--117.
  10. Case Western Reserve University Bearing Data Center website. <http://csegroups.case.edu/bearingdatacenter/home>.
  11. DeepLearnToolbox website. <https://github.com/rasmusbergpalm/DeepLearnToolbox>.
  12. Yang, Y. L., Fu, P. Y., and He, Y. C. 2018. Bearing faults automatic classification based on deep learning. IEEE Access 6, 71540--71554.
  13. Yang, Y. L. and Fu, P. Y. 2018. Rolling-element bearing fault data automatic clustering based on wavelet and deep neural network. Shock and Vibration 2018, Article ID 3047830, 11 pages.

Published in:
ICCDE '20: Proceedings of 2020 6th International Conference on Computing and Data Engineering
January 2020, 279 pages
ISBN: 9781450376730
DOI: 10.1145/3379247
Copyright © 2020 ACM
Publisher: Association for Computing Machinery, New York, NY, United States
