
Gibbs' inequality

  1. Gibbs' inequality

  2. Proof

  3. Alternative proofs

  4. Corollary

  5. See also

  6. References

In information theory, Gibbs' inequality is a statement about the mathematical entropy of a discrete probability distribution. Several other bounds on the entropy of probability distributions are derived from Gibbs' inequality, including Fano's inequality.

It was first presented by J. Willard Gibbs in the 19th century.

Gibbs' inequality

Suppose that

$$ P = \{ p_1, \ldots, p_n \} $$

is a probability distribution. Then for any other probability distribution

$$ Q = \{ q_1, \ldots, q_n \} $$

the following inequality between positive quantities (since the $p_i$ and $q_i$ are positive numbers less than one) holds:[1]:68

$$ -\sum_{i=1}^{n} p_i \log_2 p_i \;\leq\; -\sum_{i=1}^{n} p_i \log_2 q_i $$

with equality if and only if

$$ p_i = q_i $$

for all $i$. Put in words, the information entropy of a distribution $P$ is less than or equal to its cross entropy with any other distribution $Q$.
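As a quick numerical illustration (a minimal sketch in Python; the distributions P and Q below are arbitrary examples, not taken from the source):

  import math

  # Two hypothetical distributions over the same three outcomes.
  P = [0.5, 0.25, 0.25]
  Q = [0.6, 0.3, 0.1]

  # Entropy of P and cross entropy of P with Q, both in bits.
  entropy = -sum(p * math.log2(p) for p in P)
  cross_entropy = -sum(p * math.log2(q) for p, q in zip(P, Q))

  print(entropy, cross_entropy)     # 1.5 and approximately 1.633
  assert entropy <= cross_entropy   # Gibbs' inequality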

The difference between the two quantities is the Kullback–Leibler divergence or relative entropy, so the inequality can also be written:[2]:34

$$ D_{\mathrm{KL}}(P \| Q) \equiv \sum_{i=1}^{n} p_i \log_2 \frac{p_i}{q_i} \;\geq\; 0 $$

Note that the use of base-2 logarithms is optional, and allows one to refer to the quantity on each side of the inequality as an "average surprisal" measured in bits.

Proof

For simplicity, we prove the statement using the natural logarithm (ln), since the particular logarithm we choose only scales the relationship.

Let $I$ denote the set of all indices $i$ for which $p_i$ is non-zero. Then, since $\ln x \leq x - 1$ for all $x > 0$, with equality if and only if $x = 1$, we have:

$$ -\sum_{i \in I} p_i \ln \frac{q_i}{p_i} \;\geq\; -\sum_{i \in I} p_i \left( \frac{q_i}{p_i} - 1 \right) $$

Then,

$$ -\sum_{i \in I} p_i \left( \frac{q_i}{p_i} - 1 \right) \;=\; -\sum_{i \in I} q_i + \sum_{i \in I} p_i \;=\; -\sum_{i \in I} q_i + 1 \;\geq\; 0 $$

The last inequality is a consequence of the $p_i$ and $q_i$ being part of a probability distribution, so the sum of all their values is unity. Specifically, the sum over all non-zero $p_i$ is also unity; however, some non-zero $q_i$ may be excluded from the sum, since the choice of indices is conditioned upon the $p_i$ being non-zero. Therefore the sum of the $q_i$ over $I$ may be less than unity.
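The elementary bound $\ln x \leq x - 1$ used above is easy to spot-check numerically (a trivial sketch; the sample points are arbitrary):

  import math

  # ln x <= x - 1 for all x > 0, with equality only at x = 1.
  for x in [0.1, 0.5, 1.0, 2.0, 10.0]:
      assert math.log(x) <= x - 1 + 1e-12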

Combining the two displays above gives $-\sum_{i \in I} p_i \ln \frac{q_i}{p_i} \geq 0$. Expanding the logarithm of the quotient, we now have:

$$ \sum_{i \in I} p_i \ln q_i \;\leq\; \sum_{i \in I} p_i \ln p_i $$

Since the $p_i$ and $q_i$ are probabilities no greater than one, their logarithms are non-positive, so both sums are non-positive. Negating both sides, which reverses the inequality, gives:

$$ -\sum_{i \in I} p_i \ln p_i \;\leq\; -\sum_{i \in I} p_i \ln q_i $$

Since the logarithm of zero is negative infinity, restoring the indices $i$ for which $p_i = 0$ requires some care. Each restored term has the form $0 \cdot \ln 0$ (on the right-hand side, only when $q_i = 0$ as well). We notice that:

$$ \lim_{x \to 0^{+}} x \ln x = 0 $$

The product $0 \cdot \ln 0$ is an indeterminate form. Typically it is assigned the limiting value of the expression in its neighborhood, or, if no such limit exists, a convention convenient to the case at hand is adopted. In this context, the usual convention is to take the indeterminate form to be identically zero. This gives us:

$$ -\sum_{i=1}^{n} p_i \ln p_i \;\leq\; -\sum_{i=1}^{n} p_i \ln q_i $$

The left-hand side does not grow by our convention, and the right-hand side does not grow either, because zero times any finite number is zero, or by the same convention when $q_i$ is also zero.
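In numerical code the same convention is implemented by simply skipping (or zeroing) the terms with $p_i = 0$; a minimal sketch:

  import math

  def entropy_nats(P):
      # Convention: 0 * ln 0 = 0, so terms with p == 0 are skipped.
      return -sum(p * math.log(p) for p in P if p > 0)

  print(entropy_nats([0.5, 0.5, 0.0]))   # ln 2, about 0.693; the zero entry adds nothing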

For equality to hold, we require:

  1. $\frac{q_i}{p_i} = 1$ for all $i \in I$, so that the approximation $\ln \frac{q_i}{p_i} = \frac{q_i}{p_i} - 1$ is exact.
  2. $\sum_{i \in I} q_i = 1$, so that equality continues to hold between the third and fourth lines of the proof.

This can happen if and only if

$$ p_i = q_i $$

for $i = 1, \ldots, n$.

Alternative proofs

The result can alternatively be proved using Jensen's inequality or the log sum inequality. Below we give a proof based on Jensen's inequality:

Because log is a concave function, we have that:

$$ \sum_{i} p_i \log \frac{q_i}{p_i} \;\leq\; \log \sum_{i} p_i \frac{q_i}{p_i} \;=\; \log \sum_{i} q_i \;=\; 0 $$

where the first inequality is due to Jensen's inequality, and the last equality holds because $Q$ is a probability distribution.

Furthermore, since $\log$ is strictly concave, the equality condition of Jensen's inequality is met only when the ratio $\frac{q_i}{p_i}$ is constant. Suppose that this constant is $c$; then we have

$$ 1 \;=\; \sum_{i} q_i \;=\; \sum_{i} c\, p_i \;=\; c $$

where we use the fact that $P$ and $Q$ are probability distributions. Therefore equality holds when $q_i = p_i$ for all $i$.
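The equality case is easy to confirm numerically (a minimal sketch; the distribution is an arbitrary example): the divergence of a distribution from itself vanishes.

  import math

  P = [0.5, 0.25, 0.25]
  kl_self = sum(p * math.log2(p / q) for p, q in zip(P, P))
  assert abs(kl_self) < 1e-12   # D_KL(P || P) = 0: equality exactly when Q = P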

Corollary

The entropy of $P$ is bounded by:[1]:68

$$ H(p_1, \ldots, p_n) \;\leq\; \log_2 n $$

The proof is trivial – simply set $q_i = \frac{1}{n}$ for all $i$.
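For instance, the hypothetical three-outcome $P$ from the earlier sketch has $H(P) = 1.5$ bits, below $\log_2 3 \approx 1.585$:

  import math

  P = [0.5, 0.25, 0.25]
  n = len(P)
  entropy = -sum(p * math.log2(p) for p in P)
  # Cross entropy with the uniform distribution q_i = 1/n equals log2(n) exactly.
  cross_uniform = -sum(p * math.log2(1 / n) for p in P)

  assert abs(cross_uniform - math.log2(n)) < 1e-12
  assert entropy <= math.log2(n)   # corollary: H(P) <= log2(n)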

See also

  • Information entropy

References

1. ^ Pierre Bremaud (6 December 2012). An Introduction to Probabilistic Modeling. Springer Science & Business Media. ISBN 978-1-4612-1046-7.
2. ^ David J. C. MacKay. Information Theory, Inference and Learning Algorithms. Cambridge University Press. ISBN 978-0-521-64298-9.

