Awesome License: MIT

Knowledge-Distillation-Paper

This repository maintains a collection of important papers on knowledge distillation.

Pioneering Papers
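The pioneering formulation of knowledge distillation trains a small student network to match a large teacher's temperature-softened output distribution alongside the ground-truth labels. Below is a minimal PyTorch-style sketch of that loss; the function name, hyperparameter values, and loss weighting are illustrative assumptions, not code from any paper listed here.

```python
# Minimal sketch of the classic soft-target distillation loss:
# temperature-scaled KL between teacher and student predictions,
# combined with the usual cross-entropy on ground-truth labels.
# All names and hyperparameter values are illustrative.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy with the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```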

Survey Papers

  • Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks, TPAMI 2022

  • Knowledge Distillation: A Survey, IJCV 2021

  • A Comprehensive Survey on Knowledge Distillation

    • https://arxiv.org/abs/2503.12067
    • Amir M. Mansourian, Rozhan Ahmadi, Masoud Ghafouri, Amir Mohammad Babaei, Elaheh Badali Golezani, Zeynab Yasamani Ghamchi, Vida Ramezanian, Alireza Taherian, Kimia Dinashi, Amirali Miri, Shohreh Kasaei.

Distillation Meets Diffusion Models

Extremely promising!

Feature Distillation

Online Knowledge Distillation

Multi-Teacher Knowledge Distillation

Data-Free Knowledge Distillation

Distillation for Segmentation

  • Structured Knowledge Distillation for Dense Prediction, CVPR 2019, TPAMI 2020 [Pytorch]

  • Channel-wise Knowledge Distillation for Dense Prediction, ICCV 2021 [Pytorch] (see the sketch after this list)

  • Cross-Image Relational Knowledge Distillation for Semantic Segmentation, CVPR 2022 [Pytorch]

  • Holistic Weighted Distillation for Semantic Segmentation, ICME 2023 [Pytorch]

    • Wujie Sun, Defang Chen, Can Wang, Deshi Ye, Yan Feng, Chun Chen.
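For the channel-wise approach referenced above, here is a rough sketch of the idea as it is commonly implemented: each channel of a dense-prediction feature map is normalized into a spatial probability distribution with a temperature-scaled softmax, and a KL term pulls the student's per-channel distributions toward the teacher's. Shapes, names, and hyperparameters are illustrative assumptions, not the official implementation.

```python
# Rough sketch of channel-wise distillation for dense prediction:
# each channel's activation map becomes a spatial probability
# distribution via a temperature-scaled softmax, and the student is
# trained to match the teacher's per-channel distributions with KL.
import torch
import torch.nn.functional as F

def channel_wise_kd(student_feat, teacher_feat, T=4.0):
    # student_feat, teacher_feat: (N, C, H, W) feature or logit maps.
    n, c, h, w = student_feat.shape
    s = F.log_softmax(student_feat.view(n, c, -1) / T, dim=2)  # softmax over H*W
    t = F.softmax(teacher_feat.view(n, c, -1) / T, dim=2)
    # KL divergence summed over spatial positions, averaged over batch and channels.
    return F.kl_div(s, t, reduction="sum") * (T * T) / (n * c)
```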

Useful Resources

  • Acceptance rates of the main AI conferences [Link]
  • AI conference deadlines [Link]
  • CCF conference deadlines [Link]
