Latent Compositional Representations for English Function Word Comprehension


Authors

Barnes, Megan

Abstract

This thesis investigates whether biasing natural language models toward tree-compositional structure and systematic token representations can improve performance on tasks that require understanding function words. The method treats tree structure as latent and therefore requires no gold parse labels. Across four function-word-focused NLI probing tasks, tree-compositional models perform as well as LSTMs but lag behind BERT to varying degrees depending on the task. The context-dependent behavior of tree-compositional models highlights a potential weakness of the architecture in the absence of grounding information.

Description

Thesis (Master's)--University of Washington, 2021
