Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01zg64tp23b
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor | Lieb, Elliott | - |
dc.contributor.advisor | Verdu, Sergio | - |
dc.contributor.author | Zhan, Shuxin | - |
dc.date.accessioned | 2015-06-12T20:01:23Z | - |
dc.date.available | 2015-06-12T20:01:23Z | - |
dc.date.created | 2015-05-04 | - |
dc.date.issued | 2015-06-12 | - |
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp01zg64tp23b | - |
dc.description.abstract | For well-behaved distributions, mutual information can be computed using a simple identity involving the two distributions' marginal and conditional entropies. However, when these entropies are ill-defined, more powerful methods are required. This thesis aims to calculate the mutual information of one such distribution, given by p(x) = 1/(x log²(x)). This is the first known attempt to approximate the mutual information of such distributions. While I was able to numerically approximate the mutual information of this distribution and to find meaningful lower bounds, proving the existence of an upper bound remains an open problem. | en_US |
dc.format.extent | 26 pages | en_US |
dc.language.iso | en_US | en_US |
dc.title | COMPUTATION OF INFORMATION MEASURES | en_US |
dc.type | Princeton University Senior Theses | - |
pu.date.classyear | 2015 | en_US |
pu.department | Mathematics | en_US |
pu.pdf.coverpage | SeniorThesisCoverPage | - |
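
As background for the abstract above: the "simple identity" it alludes to is presumably the standard decomposition of mutual information into marginal and conditional entropies, and the stated density is the classic example of a distribution whose differential entropy diverges. The LaTeX sketch below restates these textbook facts under assumed conventions (natural logarithm, support x > e, chosen because they make the density integrate to 1); it is not material quoted from the thesis.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Standard identity relating mutual information to marginal and
% conditional entropies (presumably the "simple identity" the
% abstract refers to; textbook information theory, not quoted
% from the thesis):
\[
  I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)
         = H(X) + H(Y) - H(X,Y).
\]

% The density named in the abstract, read here as 1/(x log^2 x) with
% the natural logarithm and support x > e -- an assumed convention,
% chosen because it makes the density integrate to 1:
\[
  p(x) = \frac{1}{x \log^{2} x}, \qquad x > e,
  \qquad \int_{e}^{\infty} \frac{dx}{x \log^{2} x} = 1.
\]

% Why the identity above cannot be applied directly: the differential
% entropy integral diverges, since -\log p(x) grows like \log x and
% \int_e^\infty (\log x)/(x \log^2 x)\,dx = \int_e^\infty dx/(x \log x)
% is infinite.
\[
  h(X) = \int_{e}^{\infty} p(x) \log \frac{1}{p(x)} \, dx = \infty.
\]

\end{document}
```

The divergence of h(X) is presumably what the abstract means by the entropies being "ill-defined": the entropy-difference identity becomes an indeterminate ∞ − ∞ form, so the mutual information has to be estimated by other means.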
Appears in Collections: Mathematics, 1934-2020
Files in This Item:
File | Size | Format | |
---|---|---|---|
PUTheses2015-Zhan_Shuxin.pdf | 689.96 kB | Adobe PDF | Request a copy |
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.