A two-generation based method for few-shot learning with few-shot instance-level privileged information

Abstract

Few-shot Learning (FSL) aims to recognize novel classes from only a few samples. Recently, many methods have improved FSL performance by introducing privileged information. However, these methods use class names or class-level descriptions produced by tools such as WordNet as the privileged information, and they are all one-generation based, integrating the visual and privileged-information modalities through a simple convex combination. Moreover, the classic FSL benchmark miniImageNet provides no labels for few-shot instance-level privileged information. In this paper, we argue that few-shot instance-level privileged information derived from the few-shot visual image samples is more concrete and more diverse, and better reflects real-world conditions, than class-level privileged information, which is an abstract concept summarized from a large number of visual image samples. To exploit it, we propose a novel Two-Generation based FSL method (2G-FSL) that transfers prior knowledge from a prior model to a posterior model. After the few-shot instance-level privileged information is introduced, 2G-FSL learns the meta-knowledge of preserving correct prior knowledge and self-correcting erroneous prior knowledge, growing into a more robust posterior model. Within 2G-FSL, a novel Latent Feature Augmentation (LFA) module in the posterior model learns episode-related augmentation and integration of the latent features of the visual and privileged-information modalities instead of simple convex integration. LFA generates diverse modality-integration strategies that enhance the diversity and robustness of the latent features, alleviating the data-scarcity problem of FSL. We also release the few-shot instance-level privileged information annotations for miniImageNet to support subsequent research on FSL with few-shot instance-level privileged information. Experimental results demonstrate the effectiveness and superiority of 2G-FSL with LFA on FSL with few-shot instance-level privileged information.
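The paper's method is only described at a high level here. As a rough, hedged illustration of one point the abstract makes, the PyTorch-style sketch below contrasts a fixed convex integration of visual and privileged-information features with a learned, episode-conditioned integration loosely in the spirit of the LFA idea. All module and variable names (ConvexFusion, EpisodeConditionedFusion, dim, and so on) are assumptions for illustration only and do not reflect the authors' actual code.

```python
# Hedged sketch, not the authors' implementation: it only illustrates the
# contrast between a fixed convex integration of two modality features and a
# learned, episode-conditioned integration. All names are illustrative.
import torch
import torch.nn as nn


class ConvexFusion(nn.Module):
    """Baseline: fixed convex combination z = alpha * v + (1 - alpha) * p."""

    def __init__(self, alpha: float = 0.5):
        super().__init__()
        self.alpha = alpha

    def forward(self, v: torch.Tensor, p: torch.Tensor) -> torch.Tensor:
        return self.alpha * v + (1.0 - self.alpha) * p


class EpisodeConditionedFusion(nn.Module):
    """Learned fusion: per-dimension gates predicted from an episode-level
    context vector, so each episode gets its own integration strategy."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(4 * dim, dim),
            nn.ReLU(inplace=True),
            nn.Linear(dim, dim),
            nn.Sigmoid(),
        )

    def forward(self, v: torch.Tensor, p: torch.Tensor) -> torch.Tensor:
        # Episode context: mean of the concatenated modalities over the episode.
        ctx = torch.cat([v, p], dim=-1).mean(dim=0, keepdim=True)
        ctx = ctx.expand(v.size(0), -1)
        # Gates in (0, 1) decide, per feature dimension, how to mix the modalities.
        g = self.gate(torch.cat([v, p, ctx], dim=-1))
        return g * v + (1.0 - g) * p


if __name__ == "__main__":
    n, dim = 5, 64                   # e.g. a 5-shot episode with 64-d latent features
    v = torch.randn(n, dim)          # visual features
    p = torch.randn(n, dim)          # privileged-information (e.g. text) features
    print(ConvexFusion()(v, p).shape)                 # torch.Size([5, 64])
    print(EpisodeConditionedFusion(dim)(v, p).shape)  # torch.Size([5, 64])
```

Per-dimension gates conditioned on an episode-level context are just one way a model could produce episode-specific integration strategies; the paper's LFA module may differ substantially in architecture and training.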

Notes

  1. https://github.com/yaoyao-liu/mini-imagenet-tools.

  2. http://www.vision.caltech.edu/visipedia/CUB-200-2011.html.

  3. The miniImageNet FSL dataset annotated with few-shot instance-level privileged information is available at https://github.com/FlyGreyWolf/FSL-FSPI-dataset.

Author information

Contributions

Jian Xu: Writing - original draft, Conceptualization, Methodology. Jinghui He: Data annotation. Bo Liu: Writing - review & editing, Conceptualization, Methodology. Fan Cao: Validation. Yanshan Xiao: Validation.

Corresponding author

Correspondence to Bo Liu.

Ethics declarations

Competing interests

The authors have no competing interests to declare that are relevant to the content of this article. The authors have no relevant financial or non-financial interests to disclose. All authors certify that they have no affiliations with or involvement in any organization or entity with any financial or non-financial interest in the subject matter or materials discussed in this manuscript. The authors have no financial or proprietary interests in any material discussed in this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Xu, J., He, J., Liu, B. et al. A two-generation based method for few-shot learning with few-shot instance-level privileged information. Appl Intell 54, 4077–4094 (2024). https://doi.org/10.1007/s10489-024-05388-z
