Divergence (statistics)


{{distinguish|Deviance (statistics)|Deviation (statistics)|discrepancy (disambiguation)#Statistics{{!}}Discrepancy (statistics)}}

In statistics and information geometry, a divergence or a contrast function is a function which establishes the "distance" of one probability distribution from another on a statistical manifold. A divergence is a weaker notion than a distance: in particular, a divergence need not be symmetric (that is, in general the divergence from p to q is not equal to the divergence from q to p), and it need not satisfy the triangle inequality.

Definition

Suppose S is a space of all probability distributions with common support. Then a divergence on S is a function {{nowrap|D(· {{!}}{{!}} ·): S×S → R}} satisfying [1]

  1. D(p || q) ≥ 0 for all p, q ∈ S,
  2. D(p || q) = 0 if and only if p = q.

The dual divergence D* is defined as

:<math>D^*(p \parallel q) = D(q \parallel p).</math>
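These conditions, and the dual divergence, can be illustrated numerically. The following is a minimal sketch (not part of the article; the function names and the test vectors are purely illustrative), using the Kullback–Leibler divergence on finite discrete distributions:

<syntaxhighlight lang="python">
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) = sum_i p_i * log(p_i / q_i),
    assuming strictly positive probability vectors with common support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def kl_dual(p, q):
    """Dual divergence D*(p || q) = D(q || p)."""
    return kl(q, p)

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

print(kl(p, q) >= 0)           # condition 1: non-negativity
print(abs(kl(p, p)) < 1e-12)   # condition 2: D(p || p) = 0
print(kl(p, q), kl(q, p))      # the two directions differ: no symmetry is required
</syntaxhighlight>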

Geometrical properties

{{details|Information geometry}}

Many properties of divergences can be derived if we restrict S to be a statistical manifold, meaning that it can be parametrized with a finite-dimensional coordinate system θ, so that for a distribution {{nowrap|p ∈ S}} we can write {{nowrap|1=p = p(θ)}}.

For a pair of points {{nowrap|p, q ∈ S}} with coordinates θ<sub>p</sub> and θ<sub>q</sub>, denote the partial derivatives of D(p || q) as

:<math>\begin{align}
D((\partial_i)_p \parallel q) \quad &\overset{\mathrm{def}}{=} \quad \frac{\partial}{\partial\theta^i_p}\, D(p \parallel q), \\
D((\partial_i\partial_j)_p \parallel (\partial_k)_q) \quad &\overset{\mathrm{def}}{=} \quad \frac{\partial}{\partial\theta^i_p}\, \frac{\partial}{\partial\theta^j_p}\, \frac{\partial}{\partial\theta^k_q}\, D(p \parallel q), \quad \text{etc.}
\end{align}</math>

Now we restrict these functions to the diagonal {{nowrap|1=p = q}}, and denote [2]

:<math>\begin{align}
D[\partial_i \parallel \cdot] \ &:\ p \mapsto D((\partial_i)_p \parallel p), \\
D[\partial_i \parallel \partial_j] \ &:\ p \mapsto D((\partial_i)_p \parallel (\partial_j)_p), \quad \text{etc.}
\end{align}</math>

By definition, the function D(p || q) is minimized at {{nowrap|1=p = q}}, and therefore

:<math>\begin{align}
& D[\partial_i \parallel \cdot] = D[\cdot \parallel \partial_i] = 0, \\
& D[\partial_i\partial_j \parallel \cdot] = D[\cdot \parallel \partial_i\partial_j] = -D[\partial_i \parallel \partial_j] \ \equiv\ g^{(D)}_{ij},
\end{align}</math>

where matrix g<sup>(D)</sup> is positive semi-definite and defines a unique Riemannian metric on the manifold S.
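To make the construction concrete, the sketch below (illustrative, not part of the article; the Bernoulli family, the step size and the function names are assumptions of the example) estimates g<sup>(D)</sup> = −∂²D/∂θ<sub>p</sub>∂θ<sub>q</sub> on the diagonal for the Kullback–Leibler divergence on the one-parameter Bernoulli family, and compares it with the Fisher information 1/(θ(1−θ)):

<syntaxhighlight lang="python">
import numpy as np

def kl_bernoulli(tp, tq):
    """KL divergence between Bernoulli(tp) and Bernoulli(tq)."""
    return tp * np.log(tp / tq) + (1 - tp) * np.log((1 - tp) / (1 - tq))

def induced_metric(theta, h=1e-4):
    """g(theta) = -d^2 D / (d theta_p d theta_q) at theta_p = theta_q = theta,
    approximated with a central finite difference for the mixed derivative."""
    mixed = (kl_bernoulli(theta + h, theta + h) - kl_bernoulli(theta + h, theta - h)
             - kl_bernoulli(theta - h, theta + h) + kl_bernoulli(theta - h, theta - h)) / (4 * h * h)
    return -mixed

theta = 0.3
print(induced_metric(theta))      # approximately 4.76
print(1 / (theta * (1 - theta)))  # Fisher information: 4.7619...
</syntaxhighlight>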

Divergence D(· || ·) also defines a unique torsion-free affine connection ∇<sup>(D)</sup> with coefficients

:<math>\Gamma^{(D)}_{ij,k} = -D[\partial_i\partial_j \parallel \partial_k],</math>

and the dual to this connection ∇* is generated by the dual divergence D*.
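The connection coefficients can be approximated in the same way. The sketch below (again illustrative; the names, step size and test point are not from the article) estimates Γ<sup>(D)</sup><sub>11,1</sub> = −D[∂₁∂₁ || ∂₁] by finite differences for the Kullback–Leibler divergence and for its dual on the Bernoulli family; the two results differ, illustrating that a divergence and its dual generally induce different connections:

<syntaxhighlight lang="python">
import numpy as np

def kl_bernoulli(tp, tq):
    return tp * np.log(tp / tq) + (1 - tp) * np.log((1 - tp) / (1 - tq))

def kl_dual_bernoulli(tp, tq):
    return kl_bernoulli(tq, tp)

def connection_coefficient(D, theta, h=1e-3):
    """Gamma(theta) = - d^2/d(theta_p)^2 d/d(theta_q) D(p || q) on the diagonal,
    approximated with central finite differences."""
    def dq(tp):  # first derivative in theta_q, evaluated at theta_q = theta
        return (D(tp, theta + h) - D(tp, theta - h)) / (2 * h)
    second_in_p = (dq(theta + h) - 2 * dq(theta) + dq(theta - h)) / (h * h)
    return -second_in_p

theta = 0.3
print(connection_coefficient(kl_bernoulli, theta))       # close to 0 for this divergence and chart
print(connection_coefficient(kl_dual_bernoulli, theta))  # a different, nonzero value for the dual
</syntaxhighlight>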

Thus, a divergence D(· || ·) generates on a statistical manifold a unique dualistic structure (g<sup>(D)</sup>, ∇<sup>(D)</sup>, ∇<sup>(D*)</sup>). The converse is also true: every torsion-free dualistic structure on a statistical manifold is induced from some globally defined divergence function (which however need not be unique).[3]

For example, when D is an f-divergence for some function ƒ(·), then it generates the metric {{nowrap|1=g<sup>(D<sub>f</sub>)</sup> = c·g}} and the connection {{nowrap|1=∇<sup>(D<sub>f</sub>)</sup> = ∇<sup>(α)</sup>}}, where g is the canonical Fisher information metric, ∇<sup>(α)</sup> is the α-connection, {{nowrap|1=c = ƒ′′(1)}}, and {{nowrap|1=α = 3 + 2ƒ′′′(1)/ƒ′′(1)}}.
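The relation {{nowrap|1=c = ƒ′′(1)}} can be checked numerically. The sketch below (an illustration under assumptions: the Bernoulli family, the convention D<sub>f</sub>(p || q) = ∫ p ƒ(q/p) dx used below in the Examples section, and the generator ƒ(u) = (√u − 1)², a Hellinger-type generator with ƒ′′(1) = 1/2) compares the induced metric with c times the Fisher information:

<syntaxhighlight lang="python">
import numpy as np

def f_div_bernoulli(tp, tq, f):
    """D_f(p || q) = sum_x p(x) f(q(x)/p(x)) for Bernoulli(tp) and Bernoulli(tq)."""
    return tp * f(tq / tp) + (1 - tp) * f((1 - tq) / (1 - tp))

f = lambda u: (np.sqrt(u) - 1) ** 2   # convex, f(1) = 0, f''(1) = 1/2

def induced_metric(theta, h=1e-4):
    """-(mixed second derivative on the diagonal), by central finite differences."""
    D = lambda a, b: f_div_bernoulli(a, b, f)
    mixed = (D(theta + h, theta + h) - D(theta + h, theta - h)
             - D(theta - h, theta + h) + D(theta - h, theta - h)) / (4 * h * h)
    return -mixed

theta = 0.3
fisher = 1 / (theta * (1 - theta))
print(induced_metric(theta))   # approximately 0.5 * fisher, i.e. c = f''(1)
print(0.5 * fisher)
</syntaxhighlight>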

Examples

The two most important divergences are the relative entropy (Kullback–Leibler divergence, KL divergence), which is central to information theory and statistics, and the squared Euclidean distance (SED). Minimizing these two divergences is the main way that linear inverse problems are solved, via the principle of maximum entropy and least squares, notably in logistic regression and linear regression.{{sfn|Csiszár|1991}}

The two most important classes of divergences are the f-divergences and Bregman divergences; however, other types of divergence functions are also encountered in the literature. The only divergence that is both an f-divergence and a Bregman divergence is the Kullback–Leibler divergence; the squared Euclidean divergence is a Bregman divergence (corresponding to the function {{tmath|x^2}}), but not an f-divergence.

f-divergences

{{Main|f-divergence}}

This family of divergences is generated through functions f(u), convex on {{nowrap|u > 0}} and such that {{nowrap|1=f(1) = 0}}. Then an f-divergence is defined as

:<math>D_f(p \parallel q) = \int p(x)\, f\!\left(\frac{q(x)}{p(x)}\right) dx.</math>

Kullback–Leibler divergence: <math>D_\mathrm{KL}(p \parallel q) = \int p(x) \ln\frac{p(x)}{q(x)}\, dx</math>
squared Hellinger distance: <math>H^2(p,\, q) = 2 \int \Bigl(\sqrt{p(x)} - \sqrt{q(x)}\,\Bigr)^2 dx</math>
Jeffreys divergence: <math>D_J(p \parallel q) = \int \bigl(p(x) - q(x)\bigr) \ln\frac{p(x)}{q(x)}\, dx</math>
Chernoff's α-divergence: <math>D^{(\alpha)}(p \parallel q) = \frac{4}{1-\alpha^2}\left(1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\, dx\right)</math>
exponential divergence: <math>D_e(p \parallel q) = \int p(x) \left(\ln\frac{p(x)}{q(x)}\right)^2 dx</math>
Kagan's divergence: <math>D_{\chi^2}(p \parallel q) = \frac{1}{2} \int \frac{(p(x) - q(x))^2}{p(x)}\, dx</math>
(α,β)-product divergence: <math>D_{\alpha,\beta}(p \parallel q) = \frac{2}{(1-\alpha)(1-\beta)} \int \left(1 - \left(\frac{q(x)}{p(x)}\right)^{\!\frac{1-\alpha}{2}}\right) \left(1 - \left(\frac{q(x)}{p(x)}\right)^{\!\frac{1-\beta}{2}}\right) p(x)\, dx</math>
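The sketch below (illustrative, not from the article) evaluates a few of these divergences on finite discrete distributions directly from generator functions ƒ, using the definition D<sub>f</sub>(p || q) = Σ<sub>i</sub> p<sub>i</sub> ƒ(q<sub>i</sub>/p<sub>i</sub>) from above; the particular generators and test vectors are assumptions of the example, chosen to match the formulas listed:

<syntaxhighlight lang="python">
import numpy as np

def f_divergence(p, q, f):
    """D_f(p || q) = sum_i p_i * f(q_i / p_i) for strictly positive p, q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * f(q / p)))

# generator functions f(u): convex on u > 0 with f(1) = 0
generators = {
    "Kullback-Leibler":  lambda u: -np.log(u),                 # sum p ln(p/q)
    "squared Hellinger": lambda u: 2 * (np.sqrt(u) - 1) ** 2,  # 2 sum (sqrt p - sqrt q)^2
    "Jeffreys":          lambda u: (u - 1) * np.log(u),        # sum (p - q) ln(p/q)
    "Kagan":             lambda u: 0.5 * (1 - u) ** 2,         # (1/2) sum (p - q)^2 / p
}

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
for name, f in generators.items():
    print(name, f_divergence(p, q, f))
</syntaxhighlight>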

Bregman divergences

{{Main|Bregman divergence}}

Bregman divergences correspond to convex functions on convex sets. Given a strictly convex, continuously differentiable function {{math|F}} on a convex set, known as the Bregman generator, the Bregman divergence measures the error made by the linear approximation of {{math|F}} taken at {{math|q}} when it is used to approximate the value at {{math|p}}:

:<math>B_F(p,\, q) = F(p) - F(q) - \bigl\langle \nabla F(q),\, p - q \bigr\rangle.</math>

The dual divergence to a Bregman divergence is the divergence generated by the convex conjugate {{math|F*}} of the Bregman generator of the original divergence. For example, for the squared Euclidean distance, the generator is {{tmath|x^2}}, while for the relative entropy the generator is the negative entropy {{tmath|x \log x}}.
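A minimal sketch of this construction (the function names and test vectors are illustrative, not from the article): the generator F and its gradient are supplied by hand, and the two generators just mentioned reproduce the squared Euclidean distance and the relative entropy on probability vectors:

<syntaxhighlight lang="python">
import numpy as np

def bregman(p, q, F, gradF):
    """B_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(F(p) - F(q) - np.dot(gradF(q), p - q))

# generator F(x) = sum x_i^2  ->  squared Euclidean distance
sq, grad_sq = (lambda x: float(np.sum(x * x))), (lambda x: 2 * x)
# generator F(x) = sum x_i log x_i (negative entropy)  ->  relative entropy
negH, grad_negH = (lambda x: float(np.sum(x * np.log(x)))), (lambda x: np.log(x) + 1)

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.4, 0.4, 0.2])

print(bregman(p, q, sq, grad_sq), float(np.sum((p - q) ** 2)))           # squared Euclidean distance
print(bregman(p, q, negH, grad_negH), float(np.sum(p * np.log(p / q))))  # KL(p || q)
</syntaxhighlight>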

M-divergences

{{Empty section|date=January 2011}}

S-divergences

{{Empty section|date=January 2011}}

History

The term "divergence" for a statistical distance was used informally in various contexts from c. 1910 to c. 1940. Its formal use dates at least to {{harvtxt|Bhattacharyya|1943}}, entitled "On a measure of divergence between two statistical populations defined by their probability distributions", which defined the Bhattacharyya distance, and {{harvtxt|Bhattacharyya|1946}}, entitled "On a Measure of Divergence between Two Multinomial Populations", which defined the Bhattacharyya angle. The term was popularized by its use for the Kullback–Leibler divergence in {{harvtxt|Kullback|Leibler|1951}}, its use in the textbook {{harvtxt|Kullback|1959}}, and then by {{harvtxt|Ali|Silvey|1966}} generally, for the class of f-divergences. The term "Bregman distance" is still found, but "Bregman divergence" is now preferred. In information geometry, alternative terms were initially used, including "quasi-distance" {{harvtxt|Amari|1982|p=369}} and "contrast function" {{harvtxt|Eguchi|1985}}, though "divergence" was used in {{harvtxt|Amari|1985}} for the {{math|α}}-divergence, and has become standard (e.g., {{harvtxt|Amari|Cichocki|2010}}).

See also

  • Statistical distance

References

1. ^{{harvtxt|Eguchi|1985}}
2. ^{{harvtxt|Eguchi|1992}}
3. ^{{harvtxt|Matumoto|1993}}
{{refbegin}}
  • {{cite book | last1 = Amari | first1 = Shun-ichi | authorlink = Shun'ichi Amari | last2 = Nagaoka | first2 = Hiroshi | title = Methods of information geometry | year = 2000 | publisher = Oxford University Press | isbn = 0-8218-0531-2 | ref = CITEREFAmariNagaoka2000 }}
  • {{cite journal | last = Eguchi | first = Shinto | title = A differential geometric approach to statistical inference on the basis of contrast functionals | year = 1985 | journal = Hiroshima Mathematical Journal | volume = 15 | issue = 2 | pages = 341–391 | url = http://projecteuclid.org/euclid.hmj/1206130775 | ref = CITEREFEguchi1985 }}
  • {{cite journal | last = Eguchi | first = Shinto | title = Geometry of minimum contrast | year = 1992 | journal = Hiroshima Mathematical Journal | volume = 22 | issue = 3 | pages = 631–647 | url = http://projecteuclid.org/euclid.hmj/1206128508 | ref = CITEREFEguchi1992 }}
  • {{cite journal | last = Matumoto | first = Takao | title = Any statistical manifold has a contrast function — on the C³-functions taking the minimum at the diagonal of the product manifold | year = 1993 | journal = Hiroshima Mathematical Journal | volume = 23 | issue = 2 | pages = 327–332 | url = http://projecteuclid.org/euclid.hmj/1206128255 | ref = CITEREFMatumoto1993 }}
{{refend}}
{{Statistics|inference|collapsed}}
{{DEFAULTSORT:Divergence (Statistics)}}

[[Category:Statistical distance]]
[[Category:F-divergences]]
