Date of Award

Spring 5-15-2020

Author's School

McKelvey School of Engineering

Author's Department

Computer Science & Engineering

Degree Name

Master of Science (MS)

Degree Type

Thesis

Abstract

In this thesis, we consider neural network approaches to the semantic role labeling task in semantic parsing. Recent state-of-the-art results for semantic role labeling are achieved by combining LSTM neural networks and pre-trained features. This work offers a simple BERT-based model which shows that, contrary to the popular belief that more complexity means better performance, removing the LSTM improves the state of the art for span-based semantic role labeling. This model improves F1 scores on both the test set of CoNLL-2012 and the Brown test set of CoNLL-2005 by at least 3 percentage points.

In addition to this refinement of existing architectures, we also propose a new mechanism. There has been an active line of research focusing on incorporating syntax information into the attention mechanism for semantic parsing. However, the existing models do not make use of which sub-clause a given token belongs to or where the boundary of the sub-clause lies. In this thesis, we propose a predicate-aware attention mechanism that explicitly incorporates the portion of the parse spanning from the predicate. The proposed Syntax-Guidance (SG) mechanism further improves the model performance. We compare the predicate-informed method with three other SG mechanisms in a detailed error analysis, showing the advantages and potential research directions of the proposed method.
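
The predicate-aware attention described above can be pictured, at a very rough level, as a bias applied to attention scores so that tokens inside the sub-clause containing the predicate are favored before the softmax. The Python sketch below is only an illustration under assumed details: the function names, the inclusive-span convention, and the fixed penalty value are hypothetical and do not reflect the thesis's actual implementation.

# Hypothetical sketch of a syntax-guided (predicate-aware) attention bias.
# Tokens inside the predicate's sub-clause span keep their raw scores;
# tokens outside the span receive a large negative bias before softmax.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def predicate_aware_attention(scores, span_start, span_end, penalty=-1e4):
    # scores: (seq_len, seq_len) raw attention scores.
    # span_start/span_end: inclusive boundaries of the sub-clause containing
    # the predicate, e.g. taken from a constituency parse (assumed input).
    seq_len = scores.shape[-1]
    bias = np.full(seq_len, penalty)
    bias[span_start:span_end + 1] = 0.0
    return softmax(scores + bias, axis=-1)

# Toy example: 6 tokens, predicate's sub-clause covers tokens 2..4.
scores = np.random.randn(6, 6)
attn = predicate_aware_attention(scores, span_start=2, span_end=4)

In this toy form the attention weights concentrate almost entirely on positions 2 through 4, which is the intended effect of informing the attention mechanism with the predicate's sub-clause boundary.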

Language

English (en)

Chair

Brendan Juba, Michael Brent, Ayan Chakrabarti

Committee Members

Brendan Juba, Michael Brent, Ayan Chakrabarti

Comments

Permanent URL: https://doi.org/10.7936/hvst-sf33

Included in

Engineering Commons
