
Taehyeon Kim

 Prev: Google Research (NYC), Qualcomm AI, DynamoFL (YC W22)
 PhD Student @ KAIST AI [OSI Lab]
 Located in Seoul, South Korea
Education
 PhD in Graduate School of AI (KAIST)
 MS in Data Science (KAIST)
 BS in Mathematical Science (KAIST), Minor in Intellectual Property
I am a PhD student in the Optimization & Statistical Inference (OSI) Lab @ KAIST, advised by Prof. Se-Young Yun. I have worked as a PhD intern @ Google Research, Qualcomm AI, and DynamoFL (YC W22 startup). My expected graduation date is Feb. 2025. Contact: potter32 [at] kaist.ac.kr, kimtaehyeon610 [at] gmail.com (permanent)

 About

My goal is to tackle trustworthy and real-world AI/ML challenges:
LLM: Instruction Tuning, Instructive Decoding, Speculative Decoding, Parallel Decoding
CV/NLP: Knowledge Distillation & Learning with Noisy Labels
Data Heterogeneity: Federated Learning & AutoML & Semi-Supervised Learning
Efficiency: Efficient Deep Learning
Game-Changing Research

News

Jul. 2024
Jun. 2024
 1 Accepted @ ICML2024W: Blockwise Parallel Decoding (Speculative Decoding)
May. 2024
 Attending ICLR 2024 @ Vienna, Austria
Jan. 2024
 1 Accepted @ ICLR2024 (Spotlight): Instruction Following on Large Language Model
Dec. 2023
 1 Accepted @ NeurIPS2023W: Instruction Tuning & Instruction Following
 1 Accepted @ NeurIPS2023: Semi-Supervised Federated Object Detection
 Attending NeurIPS 2023 @ New Orleans, USA
 1 Accepted @ AAAI2024: Few-shot & Domain Generalization
Oct. 2023
 Working with Google Research NYC

Working at/with

When | Research | Advisor/Coworker
2023.01 - 2023.05 | Semi-Supervised Object Detection, Federated Learning | Eric Lin
2021.06 - 2021.12 | Neural Architecture Search, Knowledge Distillation | Heesoo Myeong
2017.03 - 2018.02 | Trajectory Prediction | Jaegil Lee

 Publications & Technical Reports

Preprints (Under Review)

 Exploring and Improving Drafts in Blockwise Parallel Decoding
Taehyeon Kim, Ananda Theertha Suresh, Kishore Papineni, Michael Riley, Sanjiv Kumar, Adrian Benton

Marc Bartholet*, Taehyeon Kim*, Ami Beuret, Se-Young Yun, Joachim M. Buhmann

 FLR: Label-Mixture Regularization for Federated Learning with Noisy Labels
Taehyeon Kim, Donggyu Kim, Se-Young Yun

 Towards Fast Multilingual LLM Inference: Speculative Decoding and Specialized Drafters
Euiin Yi*, Taehyeon Kim*, Hongseok Jeung, Du-Seong Chang, Se-Young Yun

Namgyu Ho*, Sangmin Bae*, Taehyeon Kim, Hyunjik Jo, Yireun Kim, Tal Schuster, Adam Fisch, James Thorne, Se-Young Yun

 Leadership, Awards & Activities

 Research Projects

 Invited Talks

 Services & Others

 Others