Entry | Method of moments (probability theory) |
Definition |
In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of a sequence of moment sequences.[1] Suppose X is a random variable and that all of its moments E(X^k) exist. Further suppose that the probability distribution of X is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). If, for a sequence X_1, X_2, … of random variables whose moments all exist,

    lim_{n→∞} E(X_n^k) = E(X^k)

for all values of k, then the sequence {X_n} converges to X in distribution.

The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé.[2] More recently, it has been applied by Eugene Wigner to prove Wigner's semicircle law, and it has since found numerous applications in the theory of random matrices.[3]

Notes
1. Prokhorov, A.V. "Moments, method of (in probability theory)". In M. Hazewinkel (ed.), Encyclopaedia of Mathematics (online). ISBN 1-4020-0609-8. http://eom.springer.de/m/m064610.htm
2. Fischer, H. (2011). "4. Chebyshev's and Markov's Contributions". A History of the Central Limit Theorem: From Classical to Modern Probability Theory. Sources and Studies in the History of Mathematics and Physical Sciences. New York: Springer. ISBN 978-0-387-87856-0.
3. Anderson, G.W.; Guionnet, A.; Zeitouni, O. (2010). Chapter 2.1 in An Introduction to Random Matrices. Cambridge: Cambridge University Press. ISBN 978-0-521-19452-5. |
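The convergence criterion lim E(X_n^k) = E(X^k) can be observed numerically in the central-limit setting that motivated Chebyshev: the moments of a standardized Binomial(n, p) random variable approach those of the standard normal, which are 0 for odd k and (k−1)!! for even k. The following Python sketch is illustrative only; the parameters n = 2000, p = 0.3 and the helper-function names are assumptions, not taken from the source.

```python
import math

def binomial_pmf(n, p, s):
    # P(S = s) for S ~ Binomial(n, p), computed in log space via lgamma
    # to avoid overflow in math.comb for large n.
    log_pmf = (math.lgamma(n + 1) - math.lgamma(s + 1) - math.lgamma(n - s + 1)
               + s * math.log(p) + (n - s) * math.log(1 - p))
    return math.exp(log_pmf)

def standardized_moment(n, p, k):
    # E[X_n^k] where X_n = (S - n*p) / sqrt(n*p*(1-p)) is the
    # standardized binomial; an exact finite sum over the support.
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    return sum(binomial_pmf(n, p, s) * ((s - mu) / sigma) ** k
               for s in range(n + 1))

def normal_moment(k):
    # k-th moment of the standard normal: 0 for odd k, (k-1)!! for even k.
    if k % 2 == 1:
        return 0.0
    result = 1
    for j in range(1, k, 2):  # product of odd numbers 1, 3, ..., k-1
        result *= j
    return float(result)

if __name__ == "__main__":
    # Illustrative parameters (assumed, not from the source).
    n, p = 2000, 0.3
    for k in range(1, 7):
        print(f"k={k}: E[X_n^k] = {standardized_moment(n, p, k):+.5f}, "
              f"normal moment = {normal_moment(k):+.5f}")
```

For each k the binomial moment sits close to the corresponding normal moment (e.g. near 1 for k = 2 and near 3 for k = 4), and the gap shrinks as n grows, which is exactly what the method of moments turns into a proof of convergence in distribution.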