
A Simple Baseline for Cross-Domain Few-Shot Text Classification

  • Conference paper
Natural Language Processing and Chinese Computing (NLPCC 2021)

Part of the book series: Lecture Notes in Computer Science ((LNAI,volume 13028))

Abstract

Few-shot text classification has been extensively explored owing to its remarkable generalization ability to in-domain novel classes. Yet the generalization ability of existing models to cross-domain novel classes has seldom been studied. To fill this gap, we investigate a new task, cross-domain few-shot text classification (XFew), and present a simple baseline that exhibits an appealing cross-domain generalization capability while retaining a strong in-domain generalization capability. Experiments are conducted on two datasets under both in-domain and cross-domain settings. The results show that current few-shot text classification models lack a mechanism to account for the potential domain shift in the XFew task. In contrast, our proposed simple baseline achieves surprisingly superior results compared with other models in cross-domain scenarios, confirming the need for further research on the XFew task and providing insights into possible directions. (The code and datasets are available at https://github.com/GeneZC/XFew.)
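The in-domain versus cross-domain episodic evaluation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the domain dictionaries, class names, and the `sample_episode` helper are all hypothetical, and only the episode structure (N-way K-shot support and query sets, with target-domain classes disjoint from the source domain) reflects the task setup.

```python
import random

# Hypothetical toy data: each domain maps class names to example texts.
# In the XFew setting, training episodes come from a source domain, while
# evaluation episodes come from a shifted target domain with novel classes.
SOURCE_DOMAIN = {
    "refund": ["i want my money back", "please refund my order"],
    "greeting": ["hello there", "good morning"],
    "cancel": ["cancel my booking", "i need to cancel"],
}
TARGET_DOMAIN = {
    "symptom": ["i have a headache", "my throat hurts"],
    "appointment": ["book me a doctor visit", "schedule a checkup"],
    "dosage": ["how many pills per day", "is two tablets too much"],
}

def sample_episode(domain, n_way=2, k_shot=1, n_query=1, seed=None):
    """Sample an N-way K-shot episode: a support set and a query set."""
    rng = random.Random(seed)
    classes = rng.sample(sorted(domain), n_way)
    support, query = [], []
    for label in classes:
        texts = rng.sample(domain[label], k_shot + n_query)
        support += [(t, label) for t in texts[:k_shot]]
        query += [(t, label) for t in texts[k_shot:]]
    return support, query

# In-domain episode (same distribution as training):
support, query = sample_episode(SOURCE_DOMAIN, seed=0)
# Cross-domain episode (novel classes AND a shifted domain):
support_x, query_x = sample_episode(TARGET_DOMAIN, seed=0)
print(len(support), len(query))  # 2 2
```

A model is meta-trained on source-domain episodes and then evaluated on target-domain episodes; the gap between the two accuracies is what the XFew task measures.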


Notes

  1. The layer is a fully connected layer equipped with softmax.

  2. MAML can, however, be simplified with first-order gradients.

  3. Some literature regards different scenarios in the dataset as separate domains. However, we consider the domain shifts among them insufficiently large, so in this work we do not treat them as different domains.
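The first-order simplification of MAML mentioned in Note 2 (often called FOMAML) drops the second-order term: the query-set gradient, evaluated at the adapted parameters, updates the meta-parameters directly. A minimal sketch on a toy one-dimensional regression family, purely illustrative and unrelated to the paper's implementation:

```python
import numpy as np

# First-order MAML (FOMAML) on a toy 1-D linear regression task family.
# The Jacobian d(theta')/d(theta) is ignored: the query gradient at the
# adapted parameter is applied to the meta-parameter as-is.

rng = np.random.default_rng(0)

def loss_grad(w, x, y):
    """Gradient of mean squared error for the model y ≈ w * x."""
    return np.mean(2 * (w * x - y) * x)

def sample_task():
    """Each task is y = a * x for a task-specific slope a in [0.5, 2.0]."""
    a = rng.uniform(0.5, 2.0)
    x_s, x_q = rng.normal(size=5), rng.normal(size=5)
    return (x_s, a * x_s), (x_q, a * x_q)

w = 0.0                        # meta-parameter
inner_lr, meta_lr = 0.1, 0.05
for _ in range(500):
    (x_s, y_s), (x_q, y_q) = sample_task()
    w_adapted = w - inner_lr * loss_grad(w, x_s, y_s)  # inner adaptation step
    w -= meta_lr * loss_grad(w_adapted, x_q, y_q)      # first-order meta step
print(round(w, 2))  # the meta-parameter settles within the task slope range
```

Full MAML would backpropagate through the inner update; the first-order variant trades that second-order computation for speed at a modest accuracy cost.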


Acknowledgement

This work is supported by the National Key Research and Development Program of China (grant No. 2018YFC0831704) and Natural Science Foundation of China (grant No. U1636203).

Author information


Corresponding author

Correspondence to Dawei Song.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, C., Song, D. (2021). A Simple Baseline for Cross-Domain Few-Shot Text Classification. In: Wang, L., Feng, Y., Hong, Y., He, R. (eds) Natural Language Processing and Chinese Computing. NLPCC 2021. Lecture Notes in Computer Science, vol 13028. Springer, Cham. https://doi.org/10.1007/978-3-030-88480-2_56


  • DOI: https://doi.org/10.1007/978-3-030-88480-2_56

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-88479-6

  • Online ISBN: 978-3-030-88480-2

  • eBook Packages: Computer Science, Computer Science (R0)
