Nidhi Ravi Hiremath MS Project Defense

Wednesday, May 29, 2019 - 4:00pm to 5:00pm
HFH 1132
Multi-hop Reasoning for Question Answering with Self-attention Networks
Nidhi Ravi Hiremath
William Wang (Chair), Tao Yang

Answering questions about text frequently requires aggregating information from multiple places in that text. Self-attention language models, such as Google's BERT, have achieved state-of-the-art results on question answering tasks over a single sentence, i.e., those that do not require multi-hop reasoning. In this work, we report the findings and feasibility of building end-to-end neural models on top of self-attention based language models for a variety of question answering settings, including knowledge-graph-based and text-based data, and varying answer-availability settings.
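For readers unfamiliar with the mechanism the title refers to, a minimal sketch of scaled dot-product self-attention (the building block of models like BERT) is below. The projection matrices and dimensions are illustrative, untrained assumptions, not part of the work being presented:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence.

    X:            (seq_len, d_model) token representations
    Wq, Wk, Wv:   (d_model, d_k) projections (random here, learned in practice)
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len) similarity scores
    weights = softmax(scores, axis=-1)        # each row is a distribution over tokens
    return weights @ V, weights               # contextualized token representations

# Toy example: 4 tokens, model dim 8, head dim 4.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
print(out.shape, weights.shape)
```

Because every token attends to every other token, each output row mixes information from the whole sequence, which is why such models handle single-passage questions well; multi-hop questions, the focus of this talk, require chaining such aggregation across passages.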

Everyone welcome!