CLIP-KD: An Empirical Study of CLIP Model Distillation.
Chuanguang Yang, Zhulin An, Libo Huang, Junyu Bi, Xinqiang Yu, Han Yang, Boyu Diao, Yongjun Xu.
In IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR-2024)
CCF-A, Acceptance rate: 2719/11532=23.6%
[Paper] [Code]
We propose several distillation strategies spanning relation, feature, gradient, and contrastive paradigms to examine the effectiveness of CLIP knowledge distillation.
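As a rough illustration of the feature and contrastive paradigms, the sketch below shows a student mimicking teacher embeddings with an MSE loss and an InfoNCE-style contrastive loss against teacher text embeddings. This is a minimal sketch under assumed tensor names and loss forms, not the paper's exact formulation; see the code release for the actual implementation.

```python
import torch
import torch.nn.functional as F


def feature_distillation_loss(student_feat, teacher_feat):
    # Feature paradigm (assumed form): the student imitates the teacher's
    # L2-normalized embeddings via mean-squared error.
    s = F.normalize(student_feat, dim=-1)
    t = F.normalize(teacher_feat, dim=-1)
    return F.mse_loss(s, t)


def contrastive_distillation_loss(student_img, teacher_txt, temperature=0.07):
    # Contrastive paradigm (assumed form): student image embeddings are pulled
    # toward the teacher's text embeddings of the matching caption (InfoNCE).
    s = F.normalize(student_img, dim=-1)
    t = F.normalize(teacher_txt, dim=-1)
    logits = s @ t.t() / temperature
    targets = torch.arange(s.size(0), device=s.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Hypothetical usage: random features stand in for CLIP encoder outputs.
    batch, dim = 8, 512
    student_img = torch.randn(batch, dim)
    teacher_img = torch.randn(batch, dim)
    teacher_txt = torch.randn(batch, dim)
    loss = (feature_distillation_loss(student_img, teacher_img)
            + contrastive_distillation_loss(student_img, teacher_txt))
    print(loss.item())
```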