Gyuwan Kim and Kyunghyun Cho

By Saehee Jong, Communications & Special Events Assistant

First-year Ph.D. student Gyuwan Kim was awarded Best Paper in November 2021 for his paper, "Length-Adaptive Transformer: Train Once with Length Drop, Use Anytime with Search," at the SustaiNLP 2021 Workshop at EMNLP 2021, along with his co-author Kyunghyun Cho, Associate Professor of Computer Science and Data Science at New York University.

Kim and Cho are interested in improving current transformer models, which are widely used in natural language processing and machine learning more broadly but are often computationally costly. In their paper, they propose a new framework that trains a transformer, called a Length-Adaptive Transformer, once and then uses it for inference under a variety of computational budgets, achieving a superior accuracy-efficiency trade-off.

Congratulations to Gyuwan and Kyunghyun!