Self-Improving Teacher Cultivates Better Student: Distillation Calibration for Multimodal Large Language Models
Recommendations
Multi-target Knowledge Distillation via Student Self-reflection
Abstract: Knowledge distillation is a simple yet effective technique for deep model compression, which aims to transfer the knowledge learned by a large teacher model to a small student model. To mimic how the teacher teaches the student, existing knowledge ...
Student-friendly knowledge distillation
Abstract: In knowledge distillation, the knowledge from the teacher model is often too complex for the student model to thoroughly process. However, good teachers in real life always simplify complex material before teaching it to students. Inspired by ...
Highlights:
- Simplify the knowledge difficulty for the student model in knowledge distillation.
- Use softening and attention mechanism to simplify the knowledge.
- Achieve state-of-the-art performance while maintaining high training efficiency.
Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation
MM '24: Proceedings of the 32nd ACM International Conference on Multimedia
Abstract: Knowledge distillation has become widely recognized for its ability to transfer knowledge from a large teacher network to a compact and more streamlined student network. Traditional knowledge distillation methods primarily follow a teacher-oriented ...
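All three recommended papers build on the standard teacher-student distillation objective, in which the student mimics the teacher's temperature-softened output distribution while also fitting the ground-truth labels. The sketch below is a minimal PyTorch rendering of that generic objective (Hinton-style knowledge distillation), not the specific method of this paper or of any recommendation; the function name and hyperparameter defaults are illustrative.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft term: KL divergence between the temperature-softened teacher
        # and student distributions; the T^2 factor keeps gradient magnitudes
        # comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)
        # Hard term: ordinary cross-entropy against the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        # alpha balances imitating the teacher against fitting the labels.
        return alpha * soft + (1 - alpha) * hard

Raising T is one common form of the "softening" the second recommendation's highlights mention: it flattens the teacher distribution so the student sees the teacher's relative preferences over incorrect classes rather than only its top prediction. Student-oriented variants such as the third recommendation instead refine the teacher side of the soft term before distilling.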
Information
Published In
SIGIR '24: Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval
- General Chairs: Grace Hui Yang, Hongning Wang, Sam Han
- Program Chairs: Claudia Hauff, Guido Zuccon, Yi Zhang
Publisher
Association for Computing Machinery
New York, NY, United States
Qualifiers
- Research-article
Funding Sources
- Science and Technology Innovation 2030 Major Project
- Natural Science Foundation of Jiangsu Province
Article Metrics
- Total Citations: 0
- Total Downloads: 347
- Downloads (last 12 months): 347
- Downloads (last 6 weeks): 40