
Comparisons of Deep Neural Networks in Multi-label Classification for Chinese Recipes

  • Conference paper
Big Data (BigData 2020)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1320)


Abstract

With rapidly increasing demand, the multi-label classification task has attracted growing attention in recent years. However, most traditional methods require tedious handcrafted features. Motivated by the remarkably strong performance of deep neural networks on important natural language processing tasks, we adopted several popular models, including CNN, RNN, and RCNN, to perform multi-label classification of Chinese recipes. Using real Chinese recipe data extracted from websites, we compared the performance of these deep neural networks on multi-label classification, and also compared them with baseline models such as Naive Bayes, ML-KNN, and fastText. To improve the performance of these models, we applied a data augmentation method and then conducted extensive experiments comparing the different models on our task. The results show that the RCNN model performs best and achieves the highest score, and that all deep neural network models outperform the baselines. The results also show that data augmentation is a practical way to improve the performance of all models.
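The abstract does not detail the augmentation procedure used. As a rough illustration only, text-level augmentation of the kind commonly applied in such comparisons (e.g., the random-swap and random-deletion operations popularized by EDA) might look like the following sketch; the recipe tokens, function name, and parameter defaults here are illustrative assumptions, not taken from the paper.

```python
import random

def augment(tokens, n_swaps=1, p_delete=0.1, rng=None):
    """EDA-style augmentation: random swap plus random deletion.

    Synonym replacement/insertion (the other EDA operations) would
    require a thesaurus, so they are omitted from this sketch.
    """
    rng = rng or random.Random()
    out = list(tokens)
    # Randomly swap two token positions, n_swaps times.
    for _ in range(n_swaps):
        if len(out) >= 2:
            i, j = rng.sample(range(len(out)), 2)
            out[i], out[j] = out[j], out[i]
    # Independently delete each token with probability p_delete;
    # if everything is deleted, keep one random original token.
    out = [t for t in out if rng.random() > p_delete] or [rng.choice(list(tokens))]
    return out

# Hypothetical tokenized recipe text.
recipe = ["tomato", "egg", "stir-fry", "scallion", "salt"]
rng = random.Random(0)  # seeded for reproducibility
variants = [augment(recipe, rng=rng) for _ in range(3)]
```

Each variant is a slightly perturbed copy of the original token list, which can then be added to the training set to enlarge it without new labeling effort.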




Acknowledgment

This work was supported by the project of Natural Science Foundation of China (No. 61402329, No. 61972456) and the Natural Science Foundation of Tianjin (No. 19JCYBJC15400).

Author information

Corresponding author

Correspondence to Chuitian Rong.

Copyright information

© 2021 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Liu, Z., Rong, C., Zhang, X. (2021). Comparisons of Deep Neural Networks in Multi-label Classification for Chinese Recipes. In: Mei, H., et al. Big Data. BigData 2020. Communications in Computer and Information Science, vol 1320. Springer, Singapore. https://doi.org/10.1007/978-981-16-0705-9_12

  • DOI: https://doi.org/10.1007/978-981-16-0705-9_12

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-16-0704-2

  • Online ISBN: 978-981-16-0705-9

  • eBook Packages: Computer Science, Computer Science (R0)
