Method of moments (probability theory)

{{Dablink|This article is about the method of moments in probability theory. See method of moments (disambiguation) for other techniques bearing the same name.}}

In probability theory, the method of moments is a way of proving convergence in distribution by showing that the moments of a sequence of random variables converge to the moments of a limiting random variable.{{cite book|last=Prokhorov|first=A.V.|chapter=Moments, method of (in probability theory)|title=Encyclopaedia of Mathematics (online)|isbn=1-4020-0609-8|url=http://encyclopediaofmath.org/index.php?title=Moments,_method_of_(in_probability_theory)&oldid=47882|mr=1375697|editor=M. Hazewinkel}} Suppose X is a random variable and that all of the moments

:\operatorname{E}(X^k)\,

exist. Further suppose the probability distribution of X is completely determined by its moments, i.e., there is no other probability distribution with the same sequence of moments (cf. the problem of moments). If X_1, X_2, \dots is a sequence of random variables such that

:\lim_{n\to\infty}\operatorname{E}(X_n^k) = \operatorname{E}(X^k)\,

for every positive integer k, then the sequence {X_n} converges to X in distribution.

The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé.{{cite book|mr=2743162|last=Fischer|first=H.|title=A history of the central limit theorem. From classical to modern probability theory.|series= Sources and Studies in the History of Mathematics and Physical Sciences|publisher=Springer|location=New York|year=2011|isbn=978-0-387-87856-0|chapter=4. Chebyshev's and Markov's Contributions.}} More recently, it has been applied by Eugene Wigner to prove Wigner's semicircle law, and has since found numerous applications in the theory of random matrices.{{cite book|last=Anderson|first=G.W.|last2=Guionnet|first2=A.|last3=Zeitouni|first3=O.|title=An introduction to random matrices.|year=2010|publisher=Cambridge University Press|location=Cambridge|isbn=978-0-521-19452-5|chapter=2.1}}

Notes