
StableKD

This is the official repository of the paper:

StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation
Shiu-hong Kao*, Jierun Chen*, S.H. Gary Chan
arXiv preprint arXiv:2312.13223, 2023


We reveal the issue of Inter-block Optimization Entanglement (IBOE) in end-to-end KD training and propose StableKD to stabilize optimization. Extensive experiments show that StableKD achieves high accuracy, fast convergence, and high data efficiency.
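
As background for the end-to-end KD setup the paper revisits, here is a minimal sketch of the standard knowledge distillation loss (softened-softmax KL divergence, following Hinton et al.). This is illustrative only; it does not reproduce StableKD's block-wise training scheme, whose official code has not yet been released.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp((z - m) / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between softened teacher and student distributions,
    scaled by T^2 as in the standard KD formulation."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))
    return T * T * kl
```

When the student matches the teacher exactly, the loss is zero; any mismatch in the softened distributions yields a positive penalty.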

Feel free to contact me at [email protected]. The code will be released soon!

Citation

If you find this paper/repository helpful, please consider citing:

@article{kao2023stablekd,
  title={StableKD: Breaking Inter-block Optimization Entanglement for Stable Knowledge Distillation}, 
  author={Kao, Shiu-hong and Chen, Jierun and Chan, S-H Gary},
  journal={arXiv preprint arXiv:2312.13223},
  year={2023}
}

