Latent Compositional Representations for English Function Word Comprehension

dc.contributor.advisor: Steinert-Threlkeld, Shane
dc.contributor.author: Barnes, Megan
dc.date.accessioned: 2022-01-26T23:25:26Z
dc.date.available: 2022-01-26T23:25:26Z
dc.date.issued: 2022-01-26
dc.date.submitted: 2021
dc.description: Thesis (Master's)--University of Washington, 2021
dc.description.abstract: This paper investigates whether biasing natural language models toward tree-compositional structure and systematic token representation can improve performance on tasks that require the use of function words. The method used treats tree structure as latent and thus requires no gold parse labels. Results show that across four function-word-focused NLI probing tasks, tree-compositional models perform as well as LSTMs but lag behind BERT to varying degrees across tasks. Context-dependent behavior of tree-compositional models highlights a potential weakness of the architecture in the absence of grounding information.
dc.embargo.terms: Open Access
dc.format.mimetype: application/pdf
dc.identifier.other: Barnes_washington_0250O_23738.pdf
dc.identifier.uri: http://hdl.handle.net/1773/48282
dc.language.iso: en_US
dc.rights: CC BY
dc.subject: Linguistics
dc.subject: Computer science
dc.subject.other: Linguistics
dc.title: Latent Compositional Representations for English Function Word Comprehension
dc.type: Thesis

Files

Original bundle

Name: Barnes_washington_0250O_23738.pdf
Size: 604.52 KB
Format: Adobe Portable Document Format
