Best Poster Awards, KAIST AI Workshop 2021/2022
FINE Samples for Learning with Noisy Labels.
Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation.
8th Place in the NeurIPS 2020 Black-Box Optimization Challenge.
Organizers: Twitter, Facebook, Valohai, and SigOpt; hosted at NeurIPS 2020.
Subjects: AutoML, Bayesian Learning, Hyperparameter Optimization.
Role: Team lead, development, and research (Bayesian optimization, multi-armed bandits).
Ranked higher than teams from IBM and Oxford University.
Other participants: NVIDIA, Huawei, JetBrains.
2nd & 3rd Place in the NeurIPS 2019 MicroNet Challenge, CIFAR-100 Track.
Organizers: Google Research, OpenAI, DeepMind, and Facebook; hosted at NeurIPS 2019.
Subjects: Image Classification, Model Compression.
Role: Team lead, development, and research (parameter regularization, data augmentation, knowledge distillation, pruning).