Abstract
In this work we describe a multi-input Convolutional Neural Network for text classification that combines text preprocessed at the word level, the byte pair encoding (BPE) level and the character level. We conduct experiments on different datasets and compare the results with those obtained by other classifiers. We apply the developed model to two practical use cases: (1) classifying ingredients into their corresponding classes by means of a corpus provided by Northfork; and (2) classifying texts according to the English level of their writers by means of a corpus provided by ProvenWord. Additionally, we perform experiments on a standard classification task using the Yahoo! Answers and GermEval 2017 task A datasets. We show that the developed architecture obtains satisfactory results on these corpora, and a comparison with different state-of-the-art approaches on each dataset yields very promising results.
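The multi-input architecture described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: all dimensions, vocabulary sizes, token ids and weights below are invented for the example. Each input branch (word, BPE, character) applies an embedding lookup, a 1D convolution with ReLU, and global max pooling; the pooled features are then concatenated and fed to a softmax output layer.

```python
import numpy as np

rng = np.random.default_rng(0)

def branch(ids, vocab, emb_dim=16, filters=8, width=3):
    """One input branch: embedding lookup -> 1D conv + ReLU -> global max pool."""
    E = rng.normal(size=(vocab, emb_dim))           # embedding table
    W = rng.normal(size=(width, emb_dim, filters))  # convolution filters
    x = E[ids]                                      # (seq_len, emb_dim)
    L = len(ids) - width + 1                        # valid convolution positions
    conv = np.array([np.tensordot(x[i:i + width], W, axes=([0, 1], [0, 1]))
                     for i in range(L)])            # (L, filters)
    conv = np.maximum(conv, 0.0)                    # ReLU
    return conv.max(axis=0)                         # global max pool -> (filters,)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Three token-id sequences for the same sentence, one per preprocessing level.
word_ids = np.array([4, 17, 9, 2])
bpe_ids  = np.array([3, 8, 8, 1, 5, 7])
char_ids = np.array([1, 2, 3, 1, 4, 2, 5, 1, 6])

# Concatenate the pooled features of the three branches.
features = np.concatenate([branch(word_ids, vocab=50),
                           branch(bpe_ids,  vocab=30),
                           branch(char_ids, vocab=20)])  # (24,)

n_classes = 4
W_out = rng.normal(size=(features.size, n_classes))
probs = softmax(features @ W_out)                   # class distribution, sums to 1
```

In a real model the embedding tables, filters and output weights would of course be trained jointly; the point of the sketch is only the data flow of the three parallel branches into a single classifier.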
Notes
- 1.
- 2. https://bit.ly/2DwXyME – Yahoo! Answers Comprehensive Questions and Answers version 1.0 (multi part).
- 3. Note that the maximum lengths of the BPE and character sequences coincide. However, this does not mean that BPE splits sequences into characters, since the BPE implementation used adds extra tokens that allow the original words to be recovered.
- 4.
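Note 3 above relies on the segmentation being reversible. One widespread convention (used, for instance, by the subword-nmt implementation of BPE) marks every non-final subword unit with a trailing "@@" so that the original words can be recovered losslessly; the sentence and segmentation below are invented for the example:

```python
# A BPE-segmented sentence using the "@@" continuation-marker convention.
segmented = "the low@@ est temp@@ er@@ ature"

tokens = segmented.split()          # subword units fed to the model
restored = segmented.replace("@@ ", "")  # undo the segmentation

print(tokens)    # ['the', 'low@@', 'est', 'temp@@', 'er@@', 'ature']
print(restored)  # "the lowest temperature"
```

Because the marker records where a word was split, even a segmentation that happens to reach character granularity remains distinguishable from genuine character-level input.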
Acknowledgments
Work partially supported by MINECO under grant DI-15-08169 and by Sciling under its R+D programme. The authors would like to thank NVIDIA for the donation of a Titan Xp GPU, which made it possible to conduct this research.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Parcheta, Z., Sanchis-Trilles, G., Casacuberta, F., Redahl, R. (2019). Multi-input CNN for Text Classification in Commercial Scenarios. In: Rojas, I., Joya, G., Catala, A. (eds) Advances in Computational Intelligence. IWANN 2019. Lecture Notes in Computer Science, vol 11506. Springer, Cham. https://doi.org/10.1007/978-3-030-20521-8_49
DOI: https://doi.org/10.1007/978-3-030-20521-8_49
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-20520-1
Online ISBN: 978-3-030-20521-8
eBook Packages: Computer Science (R0)