Prev: Google Research (NYC), Qualcomm AI, DynamoFL (YC W22)
Education
PhD - KAIST AI
MS - KAIST Data Science
BS - KAIST Mathematical Science (Minor: Intellectual Property)
I am a Ph.D. student in the Optimization & Statistical Inference (OSI) Lab @ KAIST, advised by Prof. Se-Young Yun. I have worked as a PhD intern @ Google Research, Qualcomm AI, and DynamoFL (YC W22). My expected graduation date is Feb. 2025.
Contact: potter32 [at] kaist.ac.kr, kimtaehyeon610 [at] gmail.com (permanent)
About
My goal is to tackle trustworthy, real-world AI/ML challenges:
• LLM Inference: Instructive Decoding, Speculative Decoding, Parallel Decoding
• Alignment Algorithms: Knowledge Distillation & Learning with Noisy Labels
• Data Heterogeneity: Federated Learning, AutoML & Semi-Supervised Learning
• Game-Changing Research
CV (Updated: Oct 04, 2024)
News
Sep. 2024
Successfully passed my PhD Proposal!
1 Accepted @ EMNLP 2024 Main - Specialized Speculative Decoding!
2 Accepted @ NeurIPS 2024 - Speculative Decoding and Block Transformer!
May. 2024
Attending ICLR 2024 @ Vienna, Austria
Jan. 2024
1 Accepted @ ICLR 2024 (Spotlight): Instruction Following on Large Language Models
Dec. 2023
1 Accepted @ NeurIPS 2023 Workshop: Instruction Tuning & Instruction Following
1 Accepted @ NeurIPS 2023: Semi-Supervised Federated Object Detection
Attending NeurIPS 2023 @ New Orleans, US
1 Accepted @ AAAI 2024: Few-shot & Domain Generalization
Oct. 2023
Working with Google Research NYC
Working at/with
Publications & Technical Reports
Preprints (Under Review)
Marc Bartholet*, Taehyeon Kim*, Ami Beuret, Se-Young Yun, Joachim M. Buhmann
FLR: Label-Mixture Regularization for Federated Learning with Noisy Labels
Taehyeon Kim, Donggyu Kim, Se-Young Yun
Leadership, Awards & Activities
Research Projects
Invited Talks
Services & Others