
Taehyeon Kim

 ML Research Scientist
 Ph.D. Candidate @ KAIST AI [OSI Lab]
 Advised by Prof. Se-Young Yun
 Located in Seoul, South Korea
Education
 M.S. in Data Science (KAIST)
 B.S. in Mathematical Science (KAIST)
I'm looking for a research internship / visiting scholar position starting in 2023. I can relocate for an on-site internship/visit of up to six months. If you are interested, feel free to contact me. My expected graduation is Feb. 2024.  Contact: potter32@kaist.ac.kr, kimtaehyeon610@gmail.com (permanent)

 About

My research investigates trustworthy, real-world AI/ML challenges, and has been presented at several conferences and journals. Specifically, my interests include:
Distillation & Learning with Noisy Labels
Federated Learning & AutoML (Neural Architecture Search)
Semi-Supervised Learning
Optimization for training deep neural networks (Efficient Deep Learning)
Game-Changing Research

News

Jan. 2023
 Working with the DynamoFL Research Team as a Research Scientist
Dec. 2022
 4th Place Award in the NeurIPS 2022 Weather4Cast Competition [Link]
 Two papers accepted to NeurIPS 2022 workshops
 Invited talk at the HyperConnect Research Team
 Invited talk at KSC Conference
 Invited to Twelve Labs Networking
Oct. 2022
 Qualcomm Innovation Fellowship Winner in 2022 [Link]
Jul. 2022
 Two papers accepted to ICML 2022 workshops (including an oral presentation)
May 2022
 One paper accepted to the IEEE Access journal
Mar. 2022
 General Manager for the National Institute of Meteorological Sciences
Jan. 2022
 21/22 KAIST AI Workshop: Two Best Paper Awards

Working at/with

DynamoFL

(YC W22)
U.S.A. (Remote)
Feb. 2023 - May 2023

Research Scientist

Research on real-world federated learning problems.

OpenMined

England (Remote)
Jun. 2022 - Present

Research Scientist

Social-focused AI research: privacy, fairness, accountability, and transparency.

Qualcomm

Seoul, South Korea
Jun. 2021 - Dec. 2021

CV & ML Ph.D. Internship for Autonomous Driving

Designed a resource-efficient and accurate backbone for ADAS
1 paper and 1 US patent
Knowledge Distillation, Neural Architecture Search

Data Mining Lab

Daejeon, South Korea
Mar. 2017 - Feb. 2018

Undergraduate Research Internship

Data Science, KAIST (Advisor: Jae-Gil Lee)
Trajectory prediction based on user card data

 Publications & Technical Reports

Region-Conditioned Orthogonal 3D U-Net for Weather4Cast Competition
Competition
NeurIPS 2022

Preprints (Under Review)

 Efficient Framework for Knowledge Distillation with Trust Region Aware Architecture Search
 Label Noise-Resistant Federated Learning with Adaptive Regularization
 Federated Supernet Training: A Framework for Addressing Heterogeneity in Federated Learning
 A Survey of Supernet Optimization and its Applications: Spatial and Temporal Optimization for Neural Architecture Search
 Task Adaptive Distillation in Cross Domain Few Shot Learning
