Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01zg64tp23b
Title: | COMPUTATION OF INFORMATION MEASURES |
Authors: | Zhan, Shuxin |
Advisors: | Verdu, Sergio |
Contributors: | Lieb, Elliott |
Department: | Mathematics |
Class Year: | 2015 |
Abstract: | For well-behaved distributions, mutual information can be computed using a simple identity involving the distribution's marginal and conditional entropies. However, when these entropies are ill-defined, more powerful methods are required. This thesis aims to calculate the mutual information of one such distribution, given by p(x) = 1/(x log²(x)). This is the first known attempt to approximate the mutual information of such distributions. While I was able to numerically approximate the mutual information of this distribution and to find meaningful lower bounds, proving the existence of an upper bound remains an open problem. |
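For context, the identity the abstract alludes to is presumably the standard decomposition I(X;Y) = H(X) + H(Y) − H(X,Y) (equivalently H(Y) − H(Y|X)). A minimal sketch for a discrete joint distribution is below; this is an illustrative toy example, not the thesis's continuous, heavy-tailed case, where these entropies can be ill-defined.

```python
import numpy as np

# Hypothetical discrete joint distribution p(x, y); rows index x, columns y.
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Marginals of the joint distribution.
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy.flatten())
print(mi)  # X and Y are independent here, so the MI is 0.0
```

For distributions like p(x) = 1/(x log²(x)), the marginal entropy diverges, so this identity breaks down and the numerical bounding methods the thesis develops are needed instead.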
Extent: | 26 pages |
URI: | http://arks.princeton.edu/ark:/88435/dsp01zg64tp23b |
Type of Material: | Princeton University Senior Theses |
Language: | en_US |
Appears in Collections: | Mathematics, 1934-2020 |
Files in This Item:
File | Size | Format | |
---|---|---|---|
PUTheses2015-Zhan_Shuxin.pdf | 689.96 kB | Adobe PDF | Request a copy |