Measuring Dependence via Mutual Information
Authors
Lu, Shan
Date
2011-10-03
Type
thesis
Language
eng
Keyword
Mutual Information, Dependence Measure
Abstract
Considerable research has been done on measuring dependence between random variables. The correlation coefficient is the most widely studied measure of dependence, but because it captures only linear dependence, its applicability is limited. The informational coefficient of correlation, defined in terms of mutual information, avoids this restriction, yet it too has deficiencies: for example, it is normalized only for continuous random variables.
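The informational coefficient of correlation mentioned above is Linfoot's r1 = sqrt(1 - exp(-2 I(X;Y))), which maps mutual information onto the [0, 1] scale of a correlation coefficient. A minimal sketch (the function name is ours, not the thesis's notation) using the known closed form of mutual information for a bivariate Gaussian:

```python
import numpy as np

def informational_coefficient(mutual_info):
    """Linfoot's informational coefficient of correlation:
    r1 = sqrt(1 - exp(-2 * I(X; Y))), with I in nats.
    r1 = 0 iff I = 0 (independence); r1 -> 1 as I -> infinity."""
    return np.sqrt(1.0 - np.exp(-2.0 * mutual_info))

# For a bivariate Gaussian with correlation rho, the mutual
# information is I = -0.5 * ln(1 - rho**2), so r1 recovers |rho|.
rho = 0.8
mi_gaussian = -0.5 * np.log(1.0 - rho**2)
print(informational_coefficient(mi_gaussian))  # -> 0.8
```

The Gaussian case illustrates the normalization issue the abstract raises: this mapping behaves well for continuous variables, while for discrete variables the mutual information is bounded by the finite entropies, so r1 cannot reach 1.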
Building on the informational coefficient of correlation, this work proposes a new dependence measure, which we call the L-measure, that generalizes Linfoot's measure to both continuous and discrete random variables. Its properties are elucidated through simulated models, and estimation algorithms are discussed. Furthermore, a related measure derived from the L-measure, which we call the intrinsic L-measure, is studied for the purpose of characterizing nonlinear dependence. Based on Renyi's criteria for a dependence measure and the simulation results in this thesis, we believe the L-measure is satisfactory as a dependence measure.
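The thesis's own estimation algorithms are not reproduced in this record; as an illustration of the kind of estimation involved, a standard plug-in estimate of mutual information from paired discrete samples (empirical joint and marginal frequencies) can be sketched as follows:

```python
import numpy as np

def plugin_mutual_info(x, y):
    """Plug-in (empirical) estimate of I(X; Y) in nats for paired
    discrete samples: sum over cells of p_xy * log(p_xy / (p_x * p_y))."""
    xi = np.unique(np.asarray(x), return_inverse=True)[1]
    yi = np.unique(np.asarray(y), return_inverse=True)[1]
    joint = np.zeros((xi.max() + 1, yi.max() + 1))
    np.add.at(joint, (xi, yi), 1.0)          # contingency table of counts
    joint /= joint.sum()                     # empirical joint distribution
    px = joint.sum(axis=1, keepdims=True)    # marginal of X (column vector)
    py = joint.sum(axis=0, keepdims=True)    # marginal of Y (row vector)
    mask = joint > 0                         # skip empty cells (0 * log 0 = 0)
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

# Perfectly dependent samples give I = log 2; independent ones give I = 0.
print(plugin_mutual_info([0, 0, 1, 1], [0, 0, 1, 1]))  # -> 0.693...
print(plugin_mutual_info([0, 0, 1, 1], [0, 1, 0, 1]))  # -> 0.0
```

Such a plug-in estimate could then be passed through a Linfoot-style transformation to obtain a normalized dependence score; the thesis's L-measure and intrinsic L-measure refine this idea, but their exact definitions are given in the full text, not here.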
Description
Thesis (Master, Mathematics & Statistics) -- Queen's University, 2011-09-30 14:29:35.153
License
This publication is made available by the authority of the copyright owner solely for the purpose of private study and research and may not be copied or reproduced except as permitted by the copyright laws without written authority from the copyright owner.
