Abstract
The analysis and understanding of spoken texts is an important task in artificial intelligence and natural language processing. However, spoken texts contain many verbose expressions (such as mantras, nonsense words, and modal particles), which pose great challenges for downstream tasks. This paper is devoted to detecting verbose expressions in spoken texts. Considering the correlation among verbose words/characters in spoken texts, we adopt sequence models to detect them in an end-to-end manner. Moreover, we propose a model that combines long short-term memory (LSTM) with a modified restricted attention (MRA) mechanism, which is able to exploit the mutual influence between long-distance and local words in a sentence. In addition, we propose a compare mechanism to model repetitive verbose expressions. Experimental results show that, compared with rule-based and direct classification methods, our proposed model improves the F1 measure by 54.08% and 18.91%, respectively.
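To make the notion of "repetitive verbose expressions" concrete, the toy function below marks tokens that immediately repeat a preceding n-gram (e.g. "I I want to to go"). This is a simplified rule-based stand-in for illustration only, not the learned compare mechanism proposed in the paper; the function name and tagging scheme (0 = keep, 1 = verbose copy) are our own.

```python
def mark_repetitions(tokens, max_n=3):
    """Tag tokens that immediately repeat the preceding n-gram (n <= max_n).

    Returns one 0/1 tag per token; 1 marks the repeated (verbose) copy.
    A toy heuristic for illustration, not the paper's learned model.
    """
    tags = [0] * len(tokens)
    i = 0
    while i < len(tokens):
        matched = False
        # Prefer the longest repeated n-gram ending just before position i.
        for n in range(max_n, 0, -1):
            if i >= n and tokens[i - n:i] == tokens[i:i + n]:
                for j in range(i, i + n):
                    tags[j] = 1  # mark the second copy as verbose
                i += n
                matched = True
                break
        if not matched:
            i += 1
    return tags

print(mark_repetitions(["I", "I", "want", "to", "to", "go"]))
# → [0, 1, 0, 0, 1, 0]
```

A learned compare mechanism, by contrast, can score soft similarity between a token and its context rather than requiring exact n-gram matches, which is why the sequence model outperforms rule-based baselines of this kind.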
Acknowledgments
The research work is supported by the National Key Research and Development Program of China under Grant No. 2017YFB1002101, the Natural Science Foundation of China (Nos. 61533018 and 61702512), and the independent research project of the National Laboratory of Pattern Recognition.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Liu, Q., He, S., Liu, K., Liu, S., Zhao, J. (2018). Learning to Detect Verbose Expressions in Spoken Texts. In: Sun, M., Liu, T., Wang, X., Liu, Z., Liu, Y. (eds.) Chinese Computational Linguistics and Natural Language Processing Based on Naturally Annotated Big Data. CCL NLP-NABD 2018. Lecture Notes in Computer Science, vol. 11221. Springer, Cham. https://doi.org/10.1007/978-3-030-01716-3_30
DOI: https://doi.org/10.1007/978-3-030-01716-3_30
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-01715-6
Online ISBN: 978-3-030-01716-3