Entropy rate

  1. Entropy rates for Markov chains

  2. See also

  3. References


In the mathematical theory of probability, the entropy rate or source information rate of a stochastic process is, informally, the time density of the average information in the process. For stochastic processes with a countable index, the entropy rate $H(X)$ is the limit of the joint entropy of the first $n$ members of the process divided by $n$, as $n$ tends to infinity:

$$H(X) = \lim_{n \to \infty} \frac{1}{n} H(X_1, X_2, \dots, X_n)$$

when the limit exists. An alternative, related quantity is the limit of the conditional entropy of the latest member given all earlier members:

$$H'(X) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, X_{n-2}, \dots, X_1)$$

For strongly stationary stochastic processes, $H(X) = H'(X)$. The entropy rate can be thought of as a general property of stochastic sources; this is the asymptotic equipartition property. The entropy rate may be used to estimate the complexity of stochastic processes. It is used in diverse applications ranging from characterizing the complexity of languages and blind source separation to optimizing quantizers and data compression algorithms. For example, a maximum entropy rate criterion may be used for feature selection in machine learning.[1]
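As an illustrative sketch (the Bernoulli source and parameter value are an assumption, not from the article), the limit defining the entropy rate can be checked numerically for an i.i.d. binary source, where the joint entropy of $n$ symbols divided by $n$ equals the single-symbol entropy for every $n$:

```python
import itertools
import math

def binary_entropy(p):
    """Entropy in bits of a single Bernoulli(p) symbol."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def joint_entropy_iid(p, n):
    """Joint entropy H(X1,...,Xn) of n i.i.d. Bernoulli(p) bits,
    computed by brute force over all 2**n outcomes."""
    h = 0.0
    for seq in itertools.product([0, 1], repeat=n):
        prob = math.prod(p if s == 1 else 1 - p for s in seq)
        if prob > 0:
            h -= prob * math.log2(prob)
    return h

p = 0.3  # illustrative source parameter
for n in (1, 2, 5):
    # H(X1,...,Xn)/n stays equal to the per-symbol entropy for an i.i.d. source
    print(n, joint_entropy_iid(p, n) / n, binary_entropy(p))
```

For a source with memory the per-block average would instead decrease toward the limit as $n$ grows.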

Entropy rates for Markov chains

Since a stochastic process defined by a Markov chain that is irreducible, aperiodic and positive recurrent has a stationary distribution, the entropy rate is independent of the initial distribution.

For example, for such a Markov chain defined on a countable number of states, given the transition matrix $P_{ij}$, $H(X)$ is given by:

$$H(X) = -\sum_{i,j} \mu_i P_{ij} \log P_{ij}$$

where $\mu_i$ is the asymptotic distribution of the chain.
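A minimal sketch of this formula for a two-state chain (the transition probabilities are illustrative assumptions, not from the article): find the stationary distribution $\mu$ by power iteration, then weight each row's entropy by $\mu_i$.

```python
import math

# Transition matrix of a two-state chain (rows sum to 1); values are illustrative.
P = [[0.9, 0.1],
     [0.4, 0.6]]

def stationary_distribution(P, iters=10_000):
    """Approximate the stationary distribution mu (mu = mu P) by power iteration."""
    n = len(P)
    mu = [1.0 / n] * n
    for _ in range(iters):
        mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]
    return mu

def entropy_rate(P):
    """H(X) = -sum_{i,j} mu_i P[i][j] log2 P[i][j], in bits per symbol."""
    mu = stationary_distribution(P)
    n = len(P)
    return -sum(mu[i] * P[i][j] * math.log2(P[i][j])
                for i in range(n) for j in range(n)
                if P[i][j] > 0)

print(entropy_rate(P))
```

For this chain the stationary distribution is $\mu = (0.8, 0.2)$, so the rate is the $\mu$-weighted average of the two row entropies.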

A simple consequence of this definition is that an i.i.d. stochastic process has an entropy rate that is the same as the entropy of any individual member of the process.
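This consequence can be seen directly from the Markov-chain formula: an i.i.d. process is a chain whose rows all equal the marginal distribution, so the double sum collapses to the single-symbol entropy. A small sketch (the three-symbol distribution is an illustrative assumption):

```python
import math

def entropy(dist):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# An i.i.d. process viewed as a Markov chain: every row of the transition
# matrix equals the marginal distribution, and that marginal is also the
# stationary distribution mu.
marginal = [0.2, 0.5, 0.3]   # illustrative distribution
P = [marginal] * 3           # identical rows -> i.i.d. process
mu = marginal                # stationary distribution of this chain

rate = -sum(mu[i] * P[i][j] * math.log2(P[i][j])
            for i in range(3) for j in range(3) if P[i][j] > 0)
print(rate, entropy(marginal))  # the two printed values agree
```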

See also

  • Information source (mathematics)
  • Markov information source
  • Asymptotic equipartition property
  • Maximal Entropy Random Walk - chosen to maximize entropy rate

References

1. ^ Einicke, G. A. (2018). "Maximum-Entropy Rate Selection of Features for Classifying Changes in Knee and Ankle Dynamics During Running". IEEE Journal of Biomedical and Health Informatics. 28 (4): 1097–1103. doi:10.1109/JBHI.2017.2711487.
  • Cover, T. and Thomas, J. (1991). Elements of Information Theory. John Wiley and Sons, Inc. ISBN 0-471-06259-6. [https://archive.today/20121216133431/http://www3.interscience.wiley.com/cgi-bin/bookhome/110438582?CRETRY=1&SRETRY=0]

Categories: Information theory | Entropy | Markov models | Temporal rates
