Toulias, T. L. and Kitsos, C. P. (2019) Information Distances and Divergences for the Generalized Normal Distribution. In: Advances in Mathematics and Computer Science Vol. 3. B P International, pp. 29-45. ISBN 978-93-89562-49-1
Full text not available from this repository.

Abstract
The study of relative measures of information between two distributions that characterize an
Input/Output System is important for the investigation of the informational ability and behaviour
of that system. The most important measures of information distance and divergence are briefly
presented and grouped. In Statistical Geometry, and for the study of statistical manifolds, relative
measures of information are needed that are also distance metrics. The Hellinger distance metric is
studied, providing a “compact” measure of informational “proximity” between two distributions.
Certain formulations of the Hellinger distance between two generalized normal distributions are
given and discussed. Some results for the Bhattacharyya distance are also given. Moreover, the
symmetry of the Kullback-Leibler divergence between a generalized normal and a t-distribution
is examined for this key measure of information divergence.
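For reference, the standard definitions behind these measures are sketched below, stated in the common exponential-power form of the generalized normal density; the parametrization used in the chapter itself (e.g. a γ-order generalized normal) may differ.

\[
f(x;\mu,\alpha,\beta)=\frac{\beta}{2\alpha\,\Gamma(1/\beta)}\exp\!\left\{-\left(\frac{|x-\mu|}{\alpha}\right)^{\!\beta}\right\},
\]
\[
H^2(p,q)=\tfrac{1}{2}\int\big(\sqrt{p}-\sqrt{q}\big)^2\,dx=1-\int\sqrt{p\,q}\,dx=1-e^{-D_B(p,q)},\qquad
D_B(p,q)=-\ln\int\sqrt{p\,q}\,dx,
\]
\[
D_{\mathrm{KL}}(p\,\|\,q)=\int p\,\ln\frac{p}{q}\,dx .
\]

The Kullback-Leibler divergence is asymmetric in general, which is what makes the question of its symmetry between particular distribution pairs non-trivial. A minimal numerical sketch of the Hellinger and Bhattacharyya distances between two generalized normal densities follows, assuming SciPy's `gennorm` parametrization (shape `beta`, location, scale) and hypothetical parameter values:

```python
import numpy as np
from scipy.stats import gennorm
from scipy.integrate import quad

# Two generalized normal (exponential-power) densities with hypothetical
# parameters; beta=2 recovers the ordinary normal distribution.
p = gennorm(beta=2.0, loc=0.0, scale=1.0)
q = gennorm(beta=1.5, loc=0.5, scale=1.2)

# Bhattacharyya coefficient BC(p, q) = ∫ sqrt(p(x) q(x)) dx by quadrature.
bc, _ = quad(lambda x: np.sqrt(p.pdf(x) * q.pdf(x)), -np.inf, np.inf)

hellinger = np.sqrt(1.0 - bc)   # Hellinger distance: H = sqrt(1 - BC)
bhattacharyya = -np.log(bc)     # Bhattacharyya distance: D_B = -ln(BC)

print(f"BC = {bc:.6f}, H = {hellinger:.6f}, D_B = {bhattacharyya:.6f}")
```

Quadrature is used here because closed-form expressions for these distances between two arbitrary generalized normals involve special-function terms; the chapter gives explicit formulations for particular cases.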
| Item Type: | Book Section |
| --- | --- |
| Subjects: | Universal Eprints > Computer Science |
| Depositing User: | Managing Editor |
| Date Deposited: | 17 Nov 2023 03:32 |
| Last Modified: | 17 Nov 2023 03:32 |
| URI: | http://journal.article2publish.com/id/eprint/3210 |