Please use this identifier to cite or link to this item: http://arks.princeton.edu/ark:/88435/dsp01zg64tp23b
Full metadata record
DC Field | Value | Language
dc.contributor | Lieb, Elliott | -
dc.contributor.advisor | Verdu, Sergio | -
dc.contributor.author | Zhan, Shuxin | -
dc.date.accessioned | 2015-06-12T20:01:23Z | -
dc.date.available | 2015-06-12T20:01:23Z | -
dc.date.created | 2015-05-04 | -
dc.date.issued | 2015-06-12 | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/dsp01zg64tp23b | -
dc.description.abstract | For well-behaved distributions, mutual information can be computed using a simple identity involving the marginal and conditional entropies of the joint distribution. However, when these entropies are ill-defined, more powerful methods are required. This thesis aims to calculate the mutual information of one such distribution, given by p(x) = 1/(x log²(x)). This is the first known attempt to approximate the mutual information of such distributions. While I was able to numerically approximate the mutual information of this distribution and to find meaningful lower bounds, proving the existence of an upper bound remains an open problem. | en_US
dc.format.extent | 26 pages | en_US
dc.language.iso | en_US | en_US
dc.title | COMPUTATION OF INFORMATION MEASURES | en_US
dc.type | Princeton University Senior Theses | -
pu.date.classyear | 2015 | en_US
pu.department | Mathematics | en_US
pu.pdf.coverpage | SeniorThesisCoverPage | -
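The abstract refers to the standard entropy identity I(X;Y) = H(Y) - H(Y|X) and to its failure when the entropies involved are ill-defined, as for p(x) = 1/(x log²(x)). The sketch below is not the thesis's code; it assumes the natural logarithm and the support [e, ∞), under which the density normalizes exactly to 1, and it illustrates the obstruction: the density integrates to 1, yet truncated estimates of its differential entropy grow without bound.

# A minimal sketch (assumptions: natural log, support [e, inf)).
# p(x) = 1/(x log^2 x) is a valid density, but its differential
# entropy diverges, so I(X;Y) = H(Y) - H(Y|X) cannot be applied directly.
import numpy as np
from scipy.integrate import quad

def p(x):
    return 1.0 / (x * np.log(x) ** 2)

# Normalization: substituting u = log(x) turns the integral into
# the integral of 1/u^2 over [1, inf), which equals 1.
total, _ = quad(p, np.e, np.inf)
print(f"integral of p over [e, inf): {total:.6f}")  # ~= 1.0

# Truncated differential entropy -int p log p up to x = e^U. Under the
# same substitution it becomes int_1^U (u + 2 log u)/u^2 du, which grows
# like log U, so the entropy diverges as the cutoff increases.
def truncated_entropy(U):
    val, _ = quad(lambda u: (u + 2.0 * np.log(u)) / u ** 2, 1.0, U)
    return val

for U in (10.0, 100.0, 1000.0, 10000.0):
    print(f"entropy truncated at x = e^{U:g}: {truncated_entropy(U):.3f}")

The substitution u = log(x) is a numerical convenience: it maps the heavy tail onto a rapidly decaying integrand, keeping the quadrature well-behaved.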
Appears in Collections: Mathematics, 1934-2020

Files in This Item:
File | Size | Format
PUTheses2015-Zhan_Shuxin.pdf | 689.96 kB | Adobe PDF

