Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01xd07gw74f
Full metadata record
dc.contributor.advisor: Narasimhan, Karthik
dc.contributor.author: Hong, Katherine
dc.date.accessioned: 2020-10-02T21:30:19Z
dc.date.available: 2020-10-02T21:30:19Z
dc.date.issued: 2020-10-02
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/dsp01xd07gw74f
dc.description.abstract: I programmed Long Short-Term Memory (LSTM) models to generate poems using Walt Whitman’s poetry collection. I built two different models: a character-level model, which generates text character by character, and a word-level model, which generates text word by word. Within each model, I also experimented with different parameters. I wrote a baseline model; a wide model, which has twice as many cells in each LSTM layer as the baseline; a deep model, which adds one more LSTM layer; and a wide-and-deep model, which combines both features. I used perplexity to measure the predictive ability of the generative models. By evaluating the generated poems and their perplexities, I conclude that the word-level model is far superior to the character-level model. Among the word-level models, the wide-and-deep model produces the highest-quality poems, although its perplexity is sometimes slightly higher. After sufficient training, the poems generated by the word-level model are meaningful, expressive, and thematic. (An illustrative sketch of these model variants appears after the metadata record below.)
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.title: Programming A Poet: Poetry Text Generation Using LSTM
dc.type: Princeton University Senior Theses
pu.date.classyear: 2020
pu.department: Electrical Engineering
pu.pdf.coverpage: SeniorThesisCoverPage
pu.contributor.authorid: 920088756
pu.certificate: None
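
The model variants named in the abstract can be made concrete with a minimal sketch. The following is an illustrative Keras-style implementation assuming a word-level next-word prediction setup; the vocabulary size, embedding dimension, baseline depth, and unit counts are assumptions for illustration, not values taken from the thesis. Only the two relationships stated in the abstract are encoded: the wide variant doubles the cells in each LSTM layer, and the deep variant adds one more LSTM layer.

    import math

    import tensorflow as tf
    from tensorflow.keras import layers, models

    VOCAB_SIZE = 5000   # assumed vocabulary size (not from the thesis)
    EMBED_DIM = 100     # assumed embedding dimension
    BASE_UNITS = 128    # assumed baseline LSTM width
    BASE_LAYERS = 2     # assumed baseline number of LSTM layers

    def build_model(wide: bool = False, deep: bool = False) -> tf.keras.Model:
        """Word-level LSTM language model.

        wide: double the number of cells in each LSTM layer (per the abstract).
        deep: add one more LSTM layer (per the abstract).
        """
        units = BASE_UNITS * 2 if wide else BASE_UNITS
        n_layers = BASE_LAYERS + 1 if deep else BASE_LAYERS
        model = models.Sequential()
        model.add(layers.Embedding(VOCAB_SIZE, EMBED_DIM))
        for i in range(n_layers):
            # All but the last LSTM layer return full sequences so they stack.
            model.add(layers.LSTM(units, return_sequences=(i < n_layers - 1)))
        # Softmax over the vocabulary predicts the next word.
        model.add(layers.Dense(VOCAB_SIZE, activation="softmax"))
        model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
        return model

    def perplexity(mean_cross_entropy_nats: float) -> float:
        # Perplexity is the exponential of the average per-word cross-entropy,
        # so lower values indicate better next-word prediction.
        return math.exp(mean_cross_entropy_nats)

    baseline = build_model()
    wide_and_deep = build_model(wide=True, deep=True)

On this reading, the wide-and-deep variant simply combines both changes, and a character-level model would differ mainly in tokenization (characters instead of words) and a much smaller vocabulary.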
Appears in Collections: Electrical Engineering, 1932-2020

Files in This Item:
File: HONG-KATHERINE-THESIS.pdf (268.96 kB, Adobe PDF)


Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.