Title: Latent Compositional Representations for English Function Word Comprehension
Author: Barnes, Megan
Advisor: Steinert-Threlkeld, Shane
Date issued: 2021
Date available: 2022-01-26
Description: Thesis (Master's)--University of Washington, 2021
File: Barnes_washington_0250O_23738.pdf
URI: http://hdl.handle.net/1773/48282
Format: application/pdf
Language: en-US
Rights: CC BY
Subjects: Linguistics; Computer science
Type: Thesis

Abstract: This paper investigates whether biasing natural language models toward tree-compositional structure and systematic token representation can improve performance on tasks that require the use of function words. The method treats tree structure as latent and thus requires no gold parse labels. Results show that across four function-word-focused NLI probing tasks, tree-compositional models perform as well as LSTMs but lag behind BERT to varying degrees across tasks. The context-dependent behavior of tree-compositional models highlights a potential weakness of the architecture in the absence of grounding information.