Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01qr46r3852
Title: | Pseudo-Random Weight Initialization in Deep Neural Networks |
Authors: | Draper, Jack |
Advisors: | Adams, Ryan P.; Sly, Allan |
Department: | Mathematics |
Class Year: | 2020 |
Abstract: | Deep neural networks are highly complex learning models whose performance depends strongly on the random initialization of their connecting weights. Training two neural networks of the same architecture from different weight initializations almost inevitably produces two completely different networks, each learning to perform its task in a unique way. This thesis investigates the use of an elliptical sampling technique to produce pseudo-random initializations for a rudimentary deep neural network. By using this technique to change the network's weight initialization gradually, we hope to gain a better understanding of the complicated mechanics underlying its learning process. Additionally, we present the method of iterative weight refinement, which uses elliptical sampling to optimize a given performance metric by repeatedly improving the choice of weight initialization. While this technique has certain limitations, it offers a clear and systematic method for manipulating weight initializations to improve the performance of a neural network. |
URI: | http://arks.princeton.edu/ark:/88435/dsp01qr46r3852 |
Type of Material: | Princeton University Senior Theses |
Language: | en |
Appears in Collections: | Mathematics, 1934-2020 |
Files in This Item:
File | Description | Size | Format | Access
---|---|---|---|---
DRAPER-JACK-THESIS.pdf | | 1.4 MB | Adobe PDF | Request a copy
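The thesis itself is available only by request, so this page does not spell out the elliptical construction. As a minimal sketch of the kind of elliptical sampling the abstract describes, the snippet below assumes the standard construction used in elliptical slice sampling (Murray, Adams & MacKay, 2010): two independent Gaussian weight draws w0 and w1 are blended as cos(θ)·w0 + sin(θ)·w1, which stays inside the same Gaussian initialization distribution for every angle θ, so sweeping θ changes the initialization gradually. The `iterative_refinement` helper is likewise only one plausible reading of the abstract's "iterative weight refinement"; all names, shapes, and the toy metric are illustrative assumptions, not code from the thesis.

```python
import numpy as np


def elliptical_init(w0, w1, theta):
    """Blend two independent Gaussian draws along an ellipse.

    If w0 and w1 are i.i.d. N(0, sigma^2), then
    cos(theta)*w0 + sin(theta)*w1 is also N(0, sigma^2)
    (since cos^2 + sin^2 = 1), so every angle theta yields a
    valid initialization from the same distribution, and sweeping
    theta from 0 to pi/2 moves gradually from w0 to w1.
    """
    return np.cos(theta) * w0 + np.sin(theta) * w1


def iterative_refinement(metric, shape, sigma, rng, steps=10):
    """Hypothetical reading of 'iterative weight refinement': keep
    the best initialization found so far and repeatedly try
    elliptical blends with fresh Gaussian draws, accepting any
    blend that improves the metric."""
    best = rng.normal(0.0, sigma, shape)
    best_score = metric(best)
    for _ in range(steps):
        fresh = rng.normal(0.0, sigma, shape)  # new random direction
        for theta in np.linspace(0.0, np.pi / 2, num=8):
            candidate = elliptical_init(best, fresh, theta)
            score = metric(candidate)
            if score > best_score:
                best, best_score = candidate, score
    return best


def toy_metric(w):
    # Purely illustrative stand-in: in the setting the abstract
    # describes, this would be a performance metric measured after
    # training the network from the candidate initialization.
    return -abs(float(w.mean()))


rng = np.random.default_rng(seed=0)
fan_in, fan_out = 784, 128          # hypothetical layer of a small network
sigma = 1.0 / np.sqrt(fan_in)       # a common variance-scaling choice
w0 = rng.normal(0.0, sigma, (fan_in, fan_out))
w1 = rng.normal(0.0, sigma, (fan_in, fan_out))

# The blend keeps the initialization scale intact at every angle.
for theta in np.linspace(0.0, np.pi / 2, num=5):
    w = elliptical_init(w0, w1, theta)
    print(f"theta = {theta:.3f}, empirical std = {w.std():.5f}")

w_refined = iterative_refinement(toy_metric, (fan_in, fan_out), sigma, rng)
print(f"refined-init metric value: {toy_metric(w_refined):.6f}")
```

In this sketch the metric is trivial to evaluate; in the thesis's setting, each evaluation would presumably require training the network from the candidate initialization, which is what would make the refinement loop expensive and is a likely source of the limitations the abstract mentions.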
Items in DataSpace are protected by copyright, with all rights reserved, unless otherwise indicated.