Results 1-1 of 1
Journal Article
IEEE ACCESS
2018
[ABSTRACT] Knowledge distillation (KD) is a powerful technique that enables a well-trained large model to assist a small model. However, KD is constrained in ...
[KEYWORDS] Bidirectional model assistance; collaborative learning; deep neural networks; mutual knowledge base; NEURAL-NETWORKS; DEEP; SIMILARITY
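The abstract describes knowledge distillation (KD), where a large teacher model guides a small student. As a point of reference, below is a minimal sketch of the standard temperature-scaled distillation loss in the style of Hinton et al.'s formulation; this is illustrative only and is not the bidirectional method proposed in the listed article, and all function names are placeholders.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T softens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) on temperature-softened outputs,
    scaled by T^2 so gradients keep a consistent magnitude."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's softened predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))
```

When the student's logits match the teacher's exactly, the loss is zero; any divergence makes it positive, which is what drives the student toward the teacher's soft targets.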