Prev: Google Research (NYC), Qualcomm AI, DynamoFL (YC W22)
Education
PhD - KAIST AI
MS - KAIST Data Science
BS - KAIST Mathematical Science
Minor: Intellectual Property
I am a final-year Ph.D. student in the Optimization & Statistical Inference (OSI) Lab @ KAIST, advised by Prof. Se-Young Yun. I have worked as a PhD intern @ Google Research, Qualcomm AI, and DynamoFL (YC W22). My expected graduation date is Feb. 2025.
Contact: potter32 [at] kaist.ac.kr, kimtaehyeon610 [at] gmail.com (permanent)
Short Bio
Taehyeon Kim is a Ph.D. candidate at the Korea Advanced Institute of Science and Technology (KAIST) AI, South Korea, supervised by Prof. Se-Young Yun. During his time at KAIST, he has gained diverse experience working at/with Google Research (2023), Dynamo AI (2023), the Korean National Institute of Meteorological Sciences (2022), and Qualcomm AI (2021). His work has been recognized with multiple spotlight and oral presentation awards (e.g., ICLR, ICML Workshop), as well as several NeurIPS competition awards and leadership roles.
His research focuses on efficient and effective inference strategies for large language models (LLMs), including speculative decoding, instructive decoding, and collaborative LLM inference. More broadly, his research aims to enhance deep learning systems in challenging real-world scenarios. This includes designing distributed algorithms for heterogeneous data (e.g., knowledge distillation and semi-supervised federated object detection), automating hyperparameter search, developing test-time strategies for instruction-following and user-aligned LLM behavior, and building weather forecast models tailored to South Korea.
Looking ahead, Taehyeon plans to focus on collaborative decoding among multiple LLMs via prompt optimization, efficient speculative test-time reasoning, and client-centric test-time adaptation that incorporates user preferences. He also brings a theoretical foundation in matrix bounds and empirical optimization to this work.
Featured Publications (1st Authored)
CV (Updated: Dec. 2024)
News
Oct. 2024
1 Accepted @ NeurIPS2024W: Speculative Decoding with multiple drafters
1 Accepted @ TMLR 2024: Federated Learning with Noisy Labels
Google Conference Scholarship - NeurIPS 2024
Sep. 2024
Successfully passed my PhD Proposal!
1 Accepted @ EMNLP 2024 Main: Specialized Speculative Decoding!
2 Accepted @ NeurIPS 2024 - Speculative Decoding and Block Transformer!
May. 2024
Attending ICLR 2024 @ Vienna, Austria
Jan. 2024
1 Accepted @ ICLR 2024 (Spotlight): Instruction Following in Large Language Models
Dec. 2023
1 Accepted @ NeurIPS2023W: Instruction Tuning & Instruction Following
1 Accepted @ NeurIPS2023: Semi-Supervised Federated Object Detection
Attending NeurIPS 2023 @ New Orleans, US
1 Accepted @ AAAI2024: Few-shot & Domain Generalization
Oct. 2023
Working with Google Research NYC
Working at/with
Publications & Technical Reports
Preprints (Under Review)
Marc Bartholet*, Taehyeon Kim*, Ami Beuret, Se-Young Yun, Joachim M. Buhmann
Leadership, Awards & Activities
Research Projects
Invited Talks
Services & Others