Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01h415pd38k
Full metadata record

DC Field | Value | Language
dc.contributor.advisor | Narasimhan, Karthik | -
dc.contributor.author | Arora, Karan | -
dc.date.accessioned | 2019-07-24T17:51:13Z | -
dc.date.available | 2019-07-24T17:51:13Z | -
dc.date.created | 2019-05-10 | -
dc.date.issued | 2019-07-24 | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp01h415pd38k | -
dc.description.abstract | We consider a setting in which a language model, given access to some information about an input's domain, is trained to learn a task over an entire distribution of domains, with the goal of generalizing to inputs from domains not in its training data. Drawing inspiration from existing methods outside our problem setting, we develop a mechanism that conditions an operation in a language model to modify its representation of an input based on information about its domain. This mechanism is meant to be trained jointly with the task-performing model and makes few assumptions about the model architecture. We perform experiments on language modeling and sentiment analysis tasks, comparing a model augmented with our mechanism to an unaugmented baseline. While the conditioning mechanism does not currently provide a performance improvement on real data, experiments with synthetic data suggest that it is capable of doing so, and that fine-tuning and further experimentation may enable it to work better. | en_US
dc.format.mimetype | application/pdf | -
dc.language.iso | en | en_US
dc.title | Conditioning Language Models for Domain | en_US
dc.type | Princeton University Senior Theses | -
pu.date.classyear | 2019 | en_US
pu.department | Computer Science | en_US
pu.pdf.coverpage | SeniorThesisCoverPage | -
pu.contributor.authorid | 960978935 | -
Appears in Collections: Computer Science, 1988-2020
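The abstract describes a mechanism that conditions an operation in the language model on information about the input's domain. The thesis's actual mechanism is not reproduced here, but one common realization of this idea is feature-wise modulation (in the style of FiLM), where a small conditioning network maps a domain descriptor to per-feature scale and shift vectors that are applied to a hidden representation. A minimal sketch in plain Python, with illustrative names and toy parameters (all assumptions, not taken from the thesis):

```python
# Hypothetical sketch of domain-conditioned feature modulation.
# The function names, shapes, and parameters below are illustrative only.

def linear(weights, bias, x):
    """Affine map: one output per row of `weights`."""
    return [sum(w * xi for w, xi in zip(row, x)) + b
            for row, b in zip(weights, bias)]

def condition(hidden, domain, scale_params, shift_params):
    """Modulate a hidden representation element-wise with scale (gamma)
    and shift (beta) vectors computed from the domain descriptor."""
    gamma = linear(*scale_params, domain)  # per-feature scale
    beta = linear(*shift_params, domain)   # per-feature shift
    return [g * h + b for g, h, b in zip(gamma, hidden, beta)]

# Toy example: a 2-dim domain descriptor modulating a 3-dim hidden state.
hidden = [1.0, 2.0, 3.0]
domain = [0.5, -0.5]
scale_params = ([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]], [1.0, 1.0, 1.0])
shift_params = ([[0.0, 1.0], [1.0, 0.0], [0.5, 0.5]], [0.0, 0.0, 0.0])
modulated = condition(hidden, domain, scale_params, shift_params)
```

In training, the conditioning network's weights would be learned jointly with the task model, as the abstract indicates; because the modulation touches only one intermediate representation, it makes few assumptions about the surrounding architecture.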

Files in This Item:

File | Size | Format
ARORA-KARAN-THESIS.pdf | 857.02 kB | Adobe PDF (Request a copy)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.