Taehyeon Kim

 ML Research Scientist
 Ph.D. Candidate @ KAIST AI [OSI Lab]
 Advised by Prof. Se-Young Yun
 Located in Seoul, South Korea
Education
 M.S. in Data Science (KAIST)
 B.S. in Mathematical Science (KAIST)
 Minor in Intellectual Property (KAIST)
I'm looking for a research internship / visiting scholar position starting in 2023. I can relocate for an on-site internship/visit for up to six months. If you are interested, feel free to contact me. My expected graduation is Feb. 2024. Contact: potter32@kaist.ac.kr, kimtaehyeon610@gmail.com (permanent)

About

My research investigates trustworthy and real-world AI/ML challenges, and has been presented at several conferences and journals. Specifically, my interests include:
Federated learning
Automated neural architecture search
Learning with noisy labels
Precipitation nowcasting
Optimization for training deep neural networks
Automated hyperparameter search
Efficient deep learning
Visual representation learning (e.g., Vision-Language Model, Video Representation Learning)

News

Jan. 2023
 Working with the DynamoFL Research Team as a Research Scientist
Dec. 2022
 4th place award in the NeurIPS 2022 Weather4cast competition [Link]
 Two papers accepted to NeurIPS 2022 Workshop
 Invited talk to the HyperConnect Research Team
 Invited talk at the KSC Conference
 Invited to a Twelve Labs networking event
Oct. 2022
 Winner of the 2022 Qualcomm Innovation Fellowship [Link]
Jul. 2022
 Two papers accepted to ICML 2022 Workshop (including an oral presentation)
May 2022
 One paper accepted to IEEE Access Journal
Mar. 2022
 General Manager for the National Institute of Meteorological Sciences
Jan. 2022
 21/22 KAIST AI Workshop: Two Best Paper Awards

Working at/with

DynamoFL

(YC W22)
U.S.A. (Remote)
Feb. 2023 - Apr. 2023

Research Scientist

Research on real-world federated learning problems.

OpenMined

England (Remote)
Jun. 2022 - Present

Research Scientist

Social-focused AI research: privacy, fairness, accountability, and transparency.

Qualcomm

Seoul, South Korea
Jun. 2021 - Dec. 2021

CV & ML Ph.D. Internship for Autonomous Driving [Report]

Designing a resource-efficient and accurate backbone for ADAS
1 Paper and 1 US Patent
Knowledge Distillation, Neural Architecture Search

Data Mining Lab

Daejeon, South Korea
Mar. 2017 - Feb. 2018

Undergraduate Research Internship

Data Science, KAIST (Advisor: Jae-Gil Lee)
Trajectory prediction based on user card data

Publications & Technical Reports


Preprints (Under Review)

 Efficient Framework for Knowledge Distillation with Trust Region Aware Architecture Search
Taehyeon Kim, Heesoo Myeong, Se-Young Yun
 Label Noise-Resistant Federated Learning with Adaptive Regularization
Taehyeon Kim*, Donggyu Kim*, Se-Young Yun
 Federated Supernet Training: A Framework for Addressing Heterogeneity in Federated Learning
Taehyeon Kim, Se-Young Yun
 A Survey of Supernet Optimization and its Applications: Spatial and Temporal Optimization for Neural Architecture Search
Stephen Cha, Taehyeon Kim, Hayeon Lee, Se-Young Yun
