DOI: 10.1145/3424978.3425009
research-article

Classifying a Limited Number of the Bamboo Species by the Transformation of Convolution Groups

Published: 20 October 2020

ABSTRACT

For special agricultural species such as bamboo, labeling large-scale samples is costly, so only a limited number of labeled bamboo images are available for supervised learning. Fine-tuning is an important transfer-learning strategy for deep neural networks: it reuses the weights of a deep model trained on a source domain and can mitigate the problem of insufficient samples, making the model more stable and robust. This manuscript proposes a novel fine-tuning strategy for low-shot classification of bamboo species images: transferring the convolutional-group features of deep convolutional models. Deep models trained with this fine-tuning method and each of three optimizers (stochastic gradient descent, Adaptive Moment estimation, and Adadelta) are evaluated by accuracy and by the expected calibration error to analyze model generalization. The results show that, with a training set comprising only 30% of the data, the proposed strategy for bamboo species classification achieved an accuracy of 0.82 and an expected calibration error of 0.16, exhibiting better stability and generalization than other fine-tuning strategies. Consequently, the novel fine-tuning strategy proposed in this manuscript, which transfers the features of deep convolutional groups, improves the accuracy and generalizability of the model and alleviates the problem of insufficient bamboo species samples for low-shot classification.
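The abstract reports model calibration via the expected calibration error (ECE). As a minimal sketch (not the paper's own code), ECE can be computed by binning predictions by their confidence and averaging the gap between each bin's accuracy and its mean confidence, weighted by bin size:

```python
def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: weighted mean of |accuracy - mean confidence| over confidence bins.

    confidences: predicted probability of the chosen class, per sample.
    correct: whether each prediction was right (bool), per sample.
    """
    n = len(confidences)
    bins = [[] for _ in range(n_bins)]
    for conf, ok in zip(confidences, correct):
        # Equal-width bins over [0, 1]; clamp confidence 1.0 into the top bin.
        idx = min(int(conf * n_bins), n_bins - 1)
        bins[idx].append((conf, ok))
    ece = 0.0
    for b in bins:
        if not b:
            continue
        avg_conf = sum(c for c, _ in b) / len(b)
        accuracy = sum(1 for _, ok in b if ok) / len(b)
        ece += (len(b) / n) * abs(accuracy - avg_conf)
    return ece
```

A lower ECE means confidence scores track actual accuracy more closely; a perfectly calibrated model (e.g. 95%-confident predictions that are right 95% of the time) has an ECE of 0.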


Published in

CSAE '20: Proceedings of the 4th International Conference on Computer Science and Application Engineering
October 2020, 1038 pages
ISBN: 9781450377720
DOI: 10.1145/3424978

Copyright © 2020 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • research-article
      • Research
      • Refereed limited

      Acceptance Rates

CSAE '20 paper acceptance rate: 179 of 387 submissions, 46%. Overall acceptance rate: 368 of 770 submissions, 48%.
