Abstract
As a typical dense prediction task, semantic segmentation remains challenging in industrial automation, since achieving a good trade-off between performance and efficiency is non-trivial. Knowledge distillation has therefore been applied to reduce the computational cost of semantic segmentation. However, existing knowledge distillation methods for semantic segmentation mainly mimic the teacher's behaviour using well-designed knowledge variants extracted from a single image, failing to exploit the discrepancy knowledge between different images. Considering that a large pre-trained teacher network usually forms a more robust discrepancy space than the small student, we propose a new inter-image discrepancy knowledge distillation (IIDKD) method for semantic segmentation. Extensive experiments on two popular semantic segmentation datasets demonstrate the efficiency and effectiveness of distilling inter-image discrepancy knowledge.
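To make the idea concrete, below is a minimal PyTorch sketch of one plausible way to distill inter-image discrepancy knowledge: per-image pooled features are compared across a mini-batch to build pairwise similarity (discrepancy) matrices for teacher and student, and the student is trained to match the teacher's structure. The pooling, cosine similarity, and KL objective here are illustrative assumptions, not the exact IIDKD formulation from the paper.

```python
import torch
import torch.nn.functional as F

def inter_image_discrepancy_loss(f_s, f_t, tau=1.0):
    """Hypothetical sketch of an inter-image discrepancy KD loss.

    Assumes discrepancy knowledge is modeled as pairwise similarities
    between per-image pooled features within a mini-batch; the actual
    IIDKD formulation may differ.

    f_s: student features, shape (B, C_s, H, W)
    f_t: teacher features, shape (B, C_t, H, W)
    """
    # Pool each image's feature map to a single embedding vector.
    s = F.normalize(f_s.mean(dim=(2, 3)), dim=1)  # (B, C_s)
    t = F.normalize(f_t.mean(dim=(2, 3)), dim=1)  # (B, C_t)

    # Pairwise inter-image similarity matrices (B x B),
    # softened by a temperature tau.
    sim_s = s @ s.t() / tau
    sim_t = t @ t.t() / tau

    # Align the student's discrepancy structure with the teacher's
    # via a row-wise KL divergence over softened similarities.
    p_t = F.softmax(sim_t, dim=1)
    log_p_s = F.log_softmax(sim_s, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean")
```

In practice, a term like this would be added with a weighting coefficient to the standard cross-entropy and per-image distillation losses, so that the student learns both within-image and between-image knowledge from the teacher.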
Acknowledgement
This work was supported in part by the National Natural Science Foundation of China (Grant Nos. 61976107 and 61502208).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Chen, K., Gou, J., Li, L. (2024). Inter-image Discrepancy Knowledge Distillation for Semantic Segmentation. In: Liu, Q., et al. Pattern Recognition and Computer Vision. PRCV 2023. Lecture Notes in Computer Science, vol 14427. Springer, Singapore. https://doi.org/10.1007/978-981-99-8435-0_22
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-8434-3
Online ISBN: 978-981-99-8435-0