Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01v692t8641
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Fellbaum, Christiane | - |
dc.contributor.author | Madge, Saahil | - |
dc.date.accessioned | 2016-06-29T14:22:39Z | - |
dc.date.available | 2016-06-29T14:22:39Z | - |
dc.date.created | 2016-04-29 | - |
dc.date.issued | 2016-06-29 | - |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp01v692t8641 | - |
dc.description.abstract | We present a general approach for machine comprehension tasks by converting the text to a knowledge graph and the questions to queries on the graph. We extend [19] and use the Stanford NLP Toolkit’s Dependency Parser [17, 9] to transform each sentence into a set of entity-relation triples. We use word2vec [18] to convert the questions into queries on the graph. We present a tensor decomposition approach to answering queries by adding Semantically Smooth Embedding [11] to RESCAL [20]. We also generalize the Memory Networks [28, 25] architecture to take any knowledge graph as input. We evaluate these models on three full SAT reading comprehension tests. The models presented here outperform their respective baselines. Both models demonstrate the ability to capture the semantic and structural information in the text and answer questions using that information. | en_US |
dc.format.extent | 57 | en_US |
dc.language.iso | en_US | en_US |
dc.title | Tensor Decomposition and Memory Networks for SAT Reading Comprehension | en_US |
dc.type | Princeton University Senior Theses | - |
pu.date.classyear | 2016 | en_US |
pu.department | Computer Science | en_US |
pu.pdf.coverpage | SeniorThesisCoverPage | - |
Appears in Collections: | Computer Science, 1988-2020 |
Files in This Item:
File | Size | Format | Access |
---|---|---|---|
null | 905.23 kB | Adobe PDF | Request a copy |
Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.