https://arxiv.org/abs/2007.01476

Interactive Knowledge Distillation (Shipeng Fu, Zhen Li, Jun Xu, Ming-Ming Cheng, Zitao Liu, Xiaomin Yang)

KD done by swapping student blocks out for teacher blocks during training, instead of feature matching. Seems to have real advantages in implementation simplicity and efficiency. The teacher becomes me! #distillation
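
A minimal sketch of the block-swap idea described above, not the authors' implementation: at each training step some student blocks are randomly replaced by the corresponding frozen teacher blocks, and the model is trained with an ordinary task loss, so no feature-matching term is needed. The swap probability, block shapes, and class names here are assumptions for illustration.

```python
# Sketch only: block-level teacher/student swapping for distillation.
# Assumes student and teacher blocks have matching input/output shapes.
import random
import torch
import torch.nn as nn

class BlockSwapKD(nn.Module):
    def __init__(self, student_blocks, teacher_blocks, swap_prob=0.5):
        super().__init__()
        assert len(student_blocks) == len(teacher_blocks)
        self.student_blocks = nn.ModuleList(student_blocks)
        self.teacher_blocks = nn.ModuleList(teacher_blocks)
        for p in self.teacher_blocks.parameters():
            p.requires_grad_(False)  # teacher stays frozen
        self.swap_prob = swap_prob  # assumed hyperparameter

    def forward(self, x):
        for s_block, t_block in zip(self.student_blocks, self.teacher_blocks):
            # During training, occasionally run the teacher block in place of
            # the student block; at eval time, always use the student.
            if self.training and random.random() < self.swap_prob:
                x = t_block(x)
            else:
                x = s_block(x)
        return x

# Usage sketch: wrap matching stages of a student and a pretrained teacher,
# then train with a plain task loss (e.g. cross-entropy on the final output).
student_stages = [nn.Sequential(nn.Conv2d(16, 16, 3, padding=1), nn.ReLU()) for _ in range(3)]
teacher_stages = [nn.Sequential(nn.Conv2d(16, 16, 3, padding=1), nn.ReLU()) for _ in range(3)]
model = BlockSwapKD(student_stages, teacher_stages, swap_prob=0.5)
out = model(torch.randn(2, 16, 8, 8))  # hybrid teacher/student forward pass
```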