Latent Compositional Representations for English Function Word Comprehension
| Field | Value |
| --- | --- |
| dc.contributor.advisor | Steinert-Threlkeld, Shane |
| dc.contributor.author | Barnes, Megan |
| dc.date.accessioned | 2022-01-26T23:25:26Z |
| dc.date.available | 2022-01-26T23:25:26Z |
| dc.date.issued | 2022-01-26 |
| dc.date.submitted | 2021 |
| dc.description | Thesis (Master's)--University of Washington, 2021 |
| dc.description.abstract | This paper investigates whether biasing natural language models toward tree-compositional structure and systematic token representation can improve performance on tasks that require the use of function words. The method used treats tree structure as latent and thus requires no gold parse labels. Results show that across four function-word-focused NLI probing tasks, tree-compositional models perform as well as LSTMs but lag behind BERT to varying degrees between tasks. Context-dependent behavior of tree-compositional models highlights a potential weakness of the architecture in the absence of grounding information. |
| dc.embargo.terms | Open Access |
| dc.format.mimetype | application/pdf |
| dc.identifier.other | Barnes_washington_0250O_23738.pdf |
| dc.identifier.uri | http://hdl.handle.net/1773/48282 |
| dc.language.iso | en_US |
| dc.rights | CC BY |
| dc.subject | Linguistics |
| dc.subject | Computer science |
| dc.subject.other | Linguistics |
| dc.title | Latent Compositional Representations for English Function Word Comprehension |
| dc.type | Thesis |
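The abstract describes models that treat tree structure as latent, composing sentence representations bottom-up without gold parse labels. A minimal sketch of this idea, under the assumption of a greedy easy-first composition scheme (the thesis's actual architecture and parameters are not specified here, so the scorer and composition function below are illustrative stand-ins):

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # embedding size (illustrative)

# Hypothetical learned parameters: a composition matrix and a merge scorer.
W = rng.normal(scale=0.1, size=(D, 2 * D))  # composes two child vectors
v = rng.normal(scale=0.1, size=D)           # scores candidate merges

def compose(left, right):
    """Compose two child vectors into one parent vector."""
    return np.tanh(W @ np.concatenate([left, right]))

def encode(tokens):
    """Greedy bottom-up composition with no gold parse: at each step,
    score every adjacent pair of nodes and merge the highest-scoring one.
    The induced merge order defines a latent binary tree over the sentence."""
    nodes = list(tokens)
    while len(nodes) > 1:
        cands = [compose(nodes[i], nodes[i + 1]) for i in range(len(nodes) - 1)]
        best = int(np.argmax([v @ c for c in cands]))
        nodes[best:best + 2] = [cands[best]]  # replace the pair with its parent
    return nodes[0]

sentence = [rng.normal(size=D) for _ in range(5)]  # 5 random "token" vectors
root = encode(sentence)
print(root.shape)  # (8,)
```

In trainable versions of this idea, the discrete argmax is typically relaxed (e.g. with a softmax or straight-through estimator) so gradients can flow through the choice of merge; this sketch shows only the forward pass.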
Files
Original bundle
- Name: Barnes_washington_0250O_23738.pdf
- Size: 604.52 KB
- Format: Adobe Portable Document Format
