Prof. William Wang seeks to close the gaps in human-computer communication through NLP research
“Machines need to not just understand human language, but learn how to generate human language,” contends Assistant Professor William Wang. Improving the ability of computers to converse with humans, and vice versa, is the focus of Wang's research. Think iPhone's Siri, Amazon's Alexa, Microsoft's Cortana...and beyond.
That work on improving human-to-computer and computer-to-human communication is also fueling increased student interest in natural language processing (NLP). Case in point: UCSB's first-ever graduate-level course on deep learning for NLP, recently introduced by Prof. Wang, had a cap of 25 students, but 65 showed up on the first day of class.
Taking his work a step further, Prof. Wang is conducting research not only into teaching machines to learn, but also, in some sense, into teaching machines to teach themselves. “In reinforcement learning, the idea is how can we teach the machine to make incremental decisions, without having a lot of human annotations,” Wang said. With this method of machine learning, a computer explores options and takes actions in order to maximize a reward, with minimal human supervision.
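To make that reward-driven loop concrete, here is a minimal, illustrative sketch of tabular Q-learning on a made-up five-position "corridor" task. It is not drawn from Wang's research, and every name and parameter below is a hypothetical choice for illustration: the agent receives no labeled examples, it simply explores, observes rewards, and incrementally updates its estimate of which action is best in each position.

```python
# Illustrative reinforcement-learning sketch (tabular Q-learning) on a toy
# "corridor" task: the agent starts at position 0 and earns a reward only
# when it reaches the final position. No human annotations are used; the
# agent improves purely by exploring and observing rewards.
import random

N_STATES = 5          # positions 0..4; position 4 yields the reward
ACTIONS = [-1, +1]    # step left or step right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration rate

# Q-table: estimated future reward for each (state, action) pair
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def choose_action(state):
    """Epsilon-greedy: mostly exploit the best-known action, sometimes explore."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

for episode in range(500):
    state = 0
    while state != N_STATES - 1:
        action = choose_action(state)
        next_state = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if next_state == N_STATES - 1 else 0.0
        # Incremental update toward the reward plus discounted future value
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = next_state

# After training, the greedy policy should step right (+1) from every position.
print([max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)])
```

In practice, including in NLP applications such as dialogue systems, the lookup table would typically be replaced by a neural network that estimates these action values, but the reward-driven update loop is the same basic idea.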
Read the full story here.