Prev: Google Research (NYC), Qualcomm AI, DynamoFL (YC W22)
Located in Seoul, South Korea
Education
PhD in Graduate School of AI (KAIST)
MS in Data Science (KAIST)
BS in Mathematical Science (KAIST)
(Minor: Intellectual Property)
I am a Ph.D. student in the Optimization & Statistical Inference (OSI) Lab @ KAIST, advised by Prof. Se-Young Yun. I have worked as a PhD intern @ Google Research, Qualcomm AI, and DynamoFL (YC W22 startup). My expected graduation date is Aug. 2024 (or Feb. 2025).
Contact: potter32 [at] kaist.ac.kr, kimtaehyeon610 [at] gmail.com (permanent)
About
My goal is to tackle trustworthy and real-world AI/ML challenges:
• NLP: Instruction Tuning and Following, Parallel Decoding
• CV/NLP: Knowledge Distillation & Learning with Noisy Labels
• Data Heterogeneity: Federated Learning & AutoML & Semi-Supervised Learning
• Efficiency: Efficient Deep Learning
• Game-Changing Research
News
Jan. 2024
1 paper accepted @ ICLR 2024 (Spotlight): Instruction Following on Large Language Models
Dec. 2023
1 paper accepted @ NeurIPS 2023 Workshop: Instruction Tuning & Instruction Following
1 paper accepted @ NeurIPS 2023: Semi-Supervised Federated Object Detection
Attending NeurIPS 2023 in New Orleans, US
1 paper accepted @ AAAI 2024: Few-shot & Domain Generalization
Oct. 2023
Working with Google Research NYC
Working at/with
Publications & Technical Reports
Preprints (Under Review)
Towards Fast Inference: Exploring and Improving Blockwise Parallel Drafts
Taehyeon Kim, Ananda Theertha Suresh, Kishore Papineni, Michael Riley, Sanjiv Kumar, Adrian Benton
Non-linear Fusion in Federated Learning: A Hypernetwork Approach to Federated Domain Generalization
Marc Bartholet, Taehyeon Kim, Ami Beuret, Se-Young Yun, Joachim M. Buhmann
Revisiting Early-Learning Regularization: When Federated Learning Meets Noisy Labels
Taehyeon Kim, Donggyu Kim, Se-Young Yun
Leadership, Awards & Activities
Research Projects
Invited Talks
Services & Others