
A Coarse-to-Fine Text Matching Framework for Customer Service Question Answering

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13734)

Abstract

Customer service question answering has recently seen increased interest in NLP due to its potential commercial value. However, existing methods are largely based on Deep Neural Networks (DNNs) that are computationally expensive and memory intensive, which hinders their deployment in many real-world scenarios. In addition, customer service dialogue data is highly domain-specific, and it is difficult to achieve high matching accuracy without model optimization tailored to it. In this paper, we propose CFTM, a Coarse-to-Fine Text Matching framework, which consists of fastText coarse-grained classification and Roformer-sim fine-grained sentence-vector matching. This coarse-to-fine structure effectively reduces the number of model parameters and speeds up system inference. We also use the CoSENT loss function to optimize the Roformer-sim model for the characteristics of customer service dialogue data, which effectively improves the matching accuracy of the framework. We conduct extensive experiments on the CHUZHOU and EIP customer service question answering datasets from KONKA. The results show that CFTM outperforms the baselines across all metrics, achieving a 2.5-point improvement in F1-score and a 30% improvement in inference time, which demonstrates that CFTM delivers higher response accuracy and faster interaction in customer service question answering.
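The two-stage design described above can be sketched as follows. Everything here is an illustrative assumption, not the paper's implementation: the real system uses a trained fastText classifier for the coarse stage and Roformer-sim sentence embeddings for the fine stage, whereas this toy version routes a query to a category by centroid similarity, then picks the best-matching question inside that category's bucket by cosine similarity. A CoSENT-style ranking loss is included to show the shape of the objective the paper uses for fine-tuning.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def coarse_classify(query_vec, category_centroids):
    """Coarse stage: cheaply route the query to one category, so the
    fine stage only scores candidates inside that bucket."""
    scores = {c: cosine(query_vec, v) for c, v in category_centroids.items()}
    return max(scores, key=scores.get)

def fine_match(query_vec, bucket):
    """Fine stage: rank the bucket's questions by sentence-vector cosine."""
    best_q, best_s = None, -1.0
    for question, vec in bucket.items():
        s = cosine(query_vec, vec)
        if s > best_s:
            best_q, best_s = question, s
    return best_q, best_s

def cosent_loss(pos_cos, neg_cos, lam=20.0):
    """CoSENT-style ranking loss: log(1 + sum_{p,n} exp(lam * (n - p))),
    which pushes every positive-pair cosine above every negative-pair one."""
    diffs = [lam * (n - p) for p in pos_cos for n in neg_cos]
    return float(np.log1p(np.exp(diffs).sum()))

# Toy knowledge base: two categories, each with one stored question vector.
rng = np.random.default_rng(0)
centroids = {"billing": rng.normal(size=8), "repair": rng.normal(size=8)}
kb = {
    "billing": {"how do I pay my bill?": centroids["billing"] + 0.1},
    "repair": {"my TV will not turn on": centroids["repair"] + 0.1},
}

q = centroids["repair"] + 0.05  # a query embedding close to "repair"
cat = coarse_classify(q, centroids)
answer, score = fine_match(q, kb[cat])
print(cat, answer, round(score, 3))
```

The point of the structure is visible even in the sketch: the fine-grained (expensive) matcher only sees the one bucket the coarse classifier selected, which is why the framework reduces inference time relative to matching against the full question pool.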

A. Li and X. Liang—These authors contributed equally to this work.



Acknowledgments

We thank all anonymous reviewers for their helpful comments. This work was partially supported by the National Natural Science Foundation of China (62006062, 62176076), Shenzhen Foundational Research Funding JCYJ20200109113441941, Shenzhen Key Technology Project JSGG20210802154400001, and Joint Lab of HITSZ and Konka.

Corresponding author

Correspondence to Ruifeng Xu.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Li, A. et al. (2022). A Coarse-to-Fine Text Matching Framework for Customer Service Question Answering. In: Yang, Y., Wang, X., Zhang, LJ. (eds) Cognitive Computing – ICCC 2022. ICCC 2022. Lecture Notes in Computer Science, vol 13734. Springer, Cham. https://doi.org/10.1007/978-3-031-23585-6_4


  • DOI: https://doi.org/10.1007/978-3-031-23585-6_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-23584-9

  • Online ISBN: 978-3-031-23585-6

  • eBook Packages: Computer Science; Computer Science (R0)
