Mean square

{{Short description|Average of squared values of a sample}}

In mathematics and its applications, the mean square is normally defined as the arithmetic mean of the squares of a set of numbers or of a random variable.{{cite web |title=Noise and Noise Rejection |url=https://engineering.purdue.edu/ME365/Textbook/chapter11.pdf |website=engineering.purdue.edu/ME365/Textbook/chapter11 |accessdate=6 January 2020}}

It may also be defined as the arithmetic mean of the squares of the deviations between a set of numbers and a reference value (e.g., the mean or an assumed mean of the data),{{cite web |title=OECD Glossary of Statistical Terms |url=https://stats.oecd.org/glossary/detail.asp?ID=3714 |website=oecd.org |accessdate=6 January 2020}} in which case it may be known as the mean square deviation.
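
Both quantities can be computed directly from a set of numbers. The following Python sketch illustrates the definitions; the data values and the reference value 5.0 are arbitrary examples, not taken from the sources above.

<syntaxhighlight lang="python">
# Illustrative sketch: mean square and mean square deviation of a data set.
def mean_square(values):
    """Arithmetic mean of the squares of the values."""
    return sum(x * x for x in values) / len(values)

def mean_square_deviation(values, reference):
    """Arithmetic mean of the squared deviations from a reference value."""
    return sum((x - reference) ** 2 for x in values) / len(values)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]   # arbitrary example data
print(mean_square(data))                  # 29.0
print(mean_square_deviation(data, 5.0))   # 4.0 (deviations about the mean, 5.0)
</syntaxhighlight>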

When the reference value is the assumed true value, the result is known as mean squared error.

A typical estimate of the sample variance from a set of sample values x_i uses a divisor of n-1, the number of values minus one, rather than the divisor n used by a simple quadratic mean; the result is nevertheless still called the "mean square" (e.g. in analysis of variance):

:s^2=\textstyle\frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x})^2
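
The distinction between the two divisors can be seen numerically. In the following Python sketch (with arbitrary example data), the same sum of squared deviations is divided by n-1 and by n.

<syntaxhighlight lang="python">
# Illustrative sketch: "mean square" with divisor n - 1 versus the simple
# mean square deviation with divisor n.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]   # arbitrary example data
n = len(data)
xbar = sum(data) / n                       # sample mean: 5.0
ss = sum((x - xbar) ** 2 for x in data)    # sum of squared deviations: 32.0

mean_square = ss / (n - 1)        # divisor n - 1, as in analysis of variance (about 4.571)
mean_square_deviation = ss / n    # divisor n, the simple mean square deviation (4.0)
print(mean_square, mean_square_deviation)
</syntaxhighlight>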

The second moment of a random variable, E(X^{2}), is also called the mean square.
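
For a random variable with finite mean and variance, the mean square is related to the variance and the mean by the standard identity

:\operatorname{E}(X^{2}) = \operatorname{Var}(X) + [\operatorname{E}(X)]^{2}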

The square root of a mean square is known as the root mean square (RMS or rms), and can be used as an estimate of the standard deviation of a random variable when the random variable is zero-mean.
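
As a numerical illustration, the following Python sketch assumes a simulated zero-mean Gaussian sample (the sample size and standard deviation of 2.0 are arbitrary); the RMS of such a sample approaches the standard deviation of the distribution that generated it.

<syntaxhighlight lang="python">
# Illustrative sketch: RMS of a zero-mean sample as an estimate of its
# standard deviation.
import math
import random

random.seed(0)
# Simulated zero-mean Gaussian sample with true standard deviation 2.0
sample = [random.gauss(0.0, 2.0) for _ in range(100_000)]

rms = math.sqrt(sum(x * x for x in sample) / len(sample))
print(rms)   # close to 2.0, the standard deviation of the generating distribution
</syntaxhighlight>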

== References ==

{{reflist}}

[[Category:Means]]

{{math-stub}}