Revisiting Architecture-aware Knowledge Distillation: Smaller Models and Faster Search

Conference and Journal: ICML 2022 Workshop
Author: 1st author
Keyword: Architecture Search, Bayesian Optimization, Knowledge Distillation, Orthogonality Regularization
Specification: International, Top-tier, Under Review