Education
Expected graduation: Aug. 2024.
Contact: potter32@kaist.ac.kr, kimtaehyeon610@gmail.com (permanent)
About
My goal is to tackle trustworthy and real-world AI/ML challenges. Specifically, my interests include:
• Knowledge Distillation & Learning with Noisy Labels
• Federated Learning & AutoML & Semi-Supervised Learning
• Optimization for training deep neural networks (Efficient Deep Learning)
• Language Model & Instruction Tuning
• Game-Changing Research
News
Oct. 2023
May. 2023
Jan. 2023
Dec. 2022
Oct. 2022
Jul. 2022
Working at/with
Google Research
NYC, U.S.A.
Oct. 2023 - Dec. 2023
Research Intern
• Research on Language Models

DynamoFL (YC W22)
U.S.A. (Remote)
Jan. 2023 - May. 2023
Research Intern
• Research on real-world federated learning problems
• Semi-supervised federated learning
OpenMined
England (Remote)
Jun. 2022 - Jun. 2023
Research Scientist
• Social-focused AI research: privacy, fairness, accountability, and transparency
Qualcomm
Seoul, South Korea
Jun. 2021 - Dec. 2021
CV & ML Ph.D. Internship for Autonomous Driving
• Designing a resource-efficient and accurate backbone for ADAS
• 1 paper and 1 US patent
• Knowledge Distillation, Neural Architecture Search
Data Mining Lab
Daejeon, South Korea
Mar. 2017 - Feb. 2018
Undergraduate Research Internship
• Data Science, KAIST (Advisor: Jae-Gil Lee)
• Trajectory prediction based on user card data
Publications & Technical Reports
Preprints (Under Review)
Taehyeon Kim*, Donggyu Kim*, Se-Young Yun
Taehyeon Kim, Se-Young Yun
Stephen Cha, Taehyeon Kim, Hayeon Lee, Se-Young Yun
Yongjin Yang, Taehyeon Kim, Se-Young Yun