Taylor expansions for the moments of functions of random variables
{{Short description|Concept in probability theory}}
{{Multiple issues|
{{More citations needed|date=November 2014}}
{{Technical|date=December 2021}}
}}
In probability theory, it is possible to approximate the moments of a function <math>f</math> of a random variable <math>X</math> using Taylor expansions, provided that <math>f</math> is sufficiently differentiable and that the moments of <math>X</math> are finite.
A simulation-based alternative to this analytical approximation is Monte Carlo simulation.
==First moment==
Given <math>\mu_X</math> and <math>\sigma^2_X</math>, the mean and the variance of <math>X</math>, respectively,<ref>Haym Benaroya, Seon Mi Han, and Mark Nagurka. ''Probability Models in Engineering and Science''. CRC Press, 2005, p. 166.</ref> a Taylor expansion of the expected value of <math>f(X)</math> can be found via
:
\begin{align}
\operatorname{E}\left[f(X)\right] & {} = \operatorname{E}\left[f\left(\mu_X + \left(X - \mu_X\right)\right)\right] \\
& {} \approx \operatorname{E}\left[f(\mu_X) + f'(\mu_X)\left(X-\mu_X\right) + \frac{1}{2}f''(\mu_X) \left(X - \mu_X\right)^2 \right] \\
& {} = f(\mu_X) + f'(\mu_X) \operatorname{E} \left[ X-\mu_X \right] + \frac{1}{2}f''(\mu_X) \operatorname{E} \left[ \left(X - \mu_X\right)^2 \right].
\end{align}
Since <math>\operatorname{E}\left[X - \mu_X\right] = 0</math>, the second term vanishes. Moreover, <math>\operatorname{E}\left[\left(X - \mu_X\right)^2\right]</math> is <math>\sigma_X^2</math>. Therefore,
:
\operatorname{E}\left[f(X)\right] \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\sigma_X^2.
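This first-moment approximation is easy to check numerically. The sketch below is an illustrative example (not part of the article): it takes <math>f = \exp</math> and <math>X \sim N(\mu, \sigma^2)</math>, for which the exact mean <math>\operatorname{E}\left[e^X\right] = e^{\mu + \sigma^2/2}</math> is known in closed form.

```python
import math

# Illustrative check (assumed example, f = exp): compare the second-order
# approximation E[f(X)] ≈ f(mu) + f''(mu)/2 * sigma^2 with the exact
# lognormal mean E[exp(X)] = exp(mu + sigma^2/2) for X ~ N(mu, sigma^2).
mu, sigma = 0.5, 0.1

exact = math.exp(mu + sigma**2 / 2)
# f(mu) + f''(mu)/2 * sigma^2, with f = f' = f'' = exp
approx = math.exp(mu) + 0.5 * math.exp(mu) * sigma**2

print(f"exact={exact:.6f}  approx={approx:.6f}")  # agree to ~1e-4 here
```

For small <math>\sigma</math> the neglected terms are <math>O(\sigma^3)</math> and higher, which is why the agreement improves rapidly as the variance shrinks.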
It is possible to generalize this to functions of more than one variable using multivariate Taylor expansions. For example,
:
\operatorname{E}\left[\frac{X}{Y}\right] \approx \frac{\mu_X}{\mu_Y} - \frac{\operatorname{cov}\left[X,Y\right]}{\mu_Y^2} + \frac{\mu_X}{\mu_Y^3}\operatorname{var}\left[Y\right].
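The ratio approximation can be sanity-checked by simulation. The sketch below uses illustrative parameters (independent normals, so the covariance term vanishes) and assumes numpy is available:

```python
import numpy as np

# Monte Carlo check (illustrative parameters) of the second-order ratio
# approximation E[X/Y] ≈ mu_X/mu_Y - cov(X,Y)/mu_Y^2 + mu_X*var(Y)/mu_Y^3
# for independent X ~ N(3, 0.2^2) and Y ~ N(2, 0.2^2), where cov(X, Y) = 0.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(3.0, 0.2, n)
y = rng.normal(2.0, 0.2, n)

mc = np.mean(x / y)                          # simulation estimate of E[X/Y]
taylor = 3.0 / 2.0 + 3.0 * 0.2**2 / 2.0**3   # covariance term is zero here

print(mc, taylor)
```

Note that the naive first-order guess <math>\mu_X/\mu_Y = 1.5</math> is biased low; the correction term <math>\mu_X \operatorname{var}[Y]/\mu_Y^3</math> accounts for most of the gap to the simulated mean.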
==Second moment==
Similarly,
:
\operatorname{var}\left[f(X)\right] \approx \left(f'(\operatorname{E}\left[X\right])\right)^2\operatorname{var}\left[X\right] = \left(f'(\mu_X)\right)^2\sigma_X^2.
The above is obtained using a second-order approximation, following the method used in estimating the first moment. It will be a poor approximation in cases where <math>f</math> is highly non-linear. This is a special case of the [[delta method]].
Indeed, we take <math>\operatorname{E}\left[f(X)\right] \approx f(\mu_X) + \frac{f''(\mu_X)}{2}\sigma_X^2</math>.
With <math>f(X) = g(X)^2</math>, we get <math>\operatorname{E}\left[Y^2\right]</math> for <math>Y = g(X)</math>. The variance is then computed using the formula <math>\operatorname{var}\left[Y\right] = \operatorname{E}\left[Y^2\right] - \mu_Y^2</math>.
:
\operatorname{var}\left[\frac{X}{Y}\right] \approx \frac{\operatorname{var}\left[X\right]}{\mu_Y^2} - \frac{2\mu_X}{\mu_Y^3}\operatorname{cov}\left[X,Y\right] + \frac{\mu_X^2}{\mu_Y^4}\operatorname{var}\left[Y\right].
The second-order approximation, when <math>X</math> follows a normal distribution, is:<ref>{{cite web|last1=Hendeby|first1=Gustaf|last2=Gustafsson|first2=Fredrik|title=On Nonlinear Transformations of Gaussian Distributions|url=http://users.isy.liu.se/en/rt/fredrik/reports/07SSPut.pdf|access-date=5 October 2017}}</ref>
:
\operatorname{var}\left[f(X)\right] \approx \left(f'(\mu_X)\right)^2\sigma_X^2 + \frac{1}{2}\left(f''(\mu_X)\right)^2\sigma_X^4.
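As an illustrative check (an assumed example, not from the article), take <math>f = \exp</math> with <math>X \sim N(\mu, \sigma^2)</math>; the exact variance of the lognormal <math>e^X</math> is <math>\left(e^{\sigma^2} - 1\right)e^{2\mu + \sigma^2}</math>, against which the Gaussian second-order formula can be compared:

```python
import math

# Illustrative check (f = exp, X ~ N(mu, sigma^2)): compare
# var[f(X)] ≈ (f'(mu))^2 sigma^2 + (1/2)(f''(mu))^2 sigma^4
# with the exact lognormal variance (exp(sigma^2) - 1) * exp(2*mu + sigma^2).
mu, sigma = 0.0, 0.1

exact_var = (math.exp(sigma**2) - 1.0) * math.exp(2 * mu + sigma**2)
# f'(mu) = f''(mu) = exp(mu)
approx_var = (math.exp(mu) ** 2) * sigma**2 + 0.5 * (math.exp(mu) ** 2) * sigma**4

print(exact_var, approx_var)
```

The first-order term alone already captures most of the variance; the <math>\tfrac{1}{2}(f'')^2\sigma^4</math> correction matters only when <math>\sigma</math> is not small relative to the curvature of <math>f</math>.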
==First product moment==
To find a second-order approximation for the covariance of functions of two random variables (with the same function applied to both), one can proceed as follows. First, note that <math>\operatorname{cov}\left[f(X),f(Y)\right] = \operatorname{E}\left[f(X)f(Y)\right] - \operatorname{E}\left[f(X)\right]\operatorname{E}\left[f(Y)\right]</math>. Since a second-order expansion for <math>\operatorname{E}\left[f(X)\right]</math> has already been derived above, it only remains to find <math>\operatorname{E}\left[f(X)f(Y)\right]</math>. Treating <math>f(X)f(Y)</math> as a two-variable function, the second-order Taylor expansion is as follows:
:
\begin{align}
f(X)f(Y) & {} \approx f(\mu_X) f(\mu_Y) + (X-\mu_X) f'(\mu_X)f(\mu_Y) + (Y - \mu_Y)f(\mu_X)f'(\mu_Y) + \frac{1}{2}\left[(X-\mu_X)^2 f''(\mu_X)f(\mu_Y) + 2(X-\mu_X)(Y-\mu_Y)f'(\mu_X)f'(\mu_Y) + (Y-\mu_Y)^2 f(\mu_X)f''(\mu_Y) \right]
\end{align}
Taking expectation of the above and simplifying, making use of the identities <math>\operatorname{E}\left[(X-\mu_X)(Y-\mu_Y)\right] = \operatorname{cov}(X,Y)</math> and <math>\operatorname{E}\left[(X-\mu_X)^2\right] = \operatorname{var}(X)</math>, leads to <math>\operatorname{E}\left[f(X)f(Y)\right] \approx f(\mu_X)f(\mu_Y) + f'(\mu_X)f'(\mu_Y)\operatorname{cov}(X,Y) + \frac{1}{2}f''(\mu_X)f(\mu_Y)\operatorname{var}(X) + \frac{1}{2}f(\mu_X)f''(\mu_Y)\operatorname{var}(Y)</math>. Hence,
:
\begin{align}
\operatorname{cov}\left[f(X),f(Y)\right] & {} \approx f(\mu_X)f(\mu_Y)+f'(\mu_X)f'(\mu_Y)\operatorname{cov}(X,Y)+\frac{1}{2}f''(\mu_X)f(\mu_Y)\operatorname{var}(X)+\frac{1}{2}f(\mu_X)f''(\mu_Y)\operatorname{var}(Y) - \left[f(\mu_X)+\frac{1}{2}f''(\mu_X)\operatorname{var}(X)\right] \left[f(\mu_Y)+\frac{1}{2}f''(\mu_Y)\operatorname{var}(Y) \right] \\
& {} = f'(\mu_X)f'(\mu_Y) \operatorname{cov}(X,Y) - \frac{1}{4}f''(\mu_X)f''(\mu_Y)\operatorname{var}(X)\operatorname{var}(Y)
\end{align}
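For <math>f = \exp</math> and bivariate normal <math>(X, Y)</math>, the covariance <math>\operatorname{cov}\left(e^X, e^Y\right)</math> has the closed form <math>e^{\mu_X + \mu_Y + (\sigma_X^2 + \sigma_Y^2)/2}\left(e^{\operatorname{cov}(X,Y)} - 1\right)</math>, which allows a direct check of the approximation. A sketch with illustrative parameters (not from the article):

```python
import math

# Illustrative check (f = exp) of the covariance approximation against the
# closed form cov(e^X, e^Y) = exp(mu_X + mu_Y + (s2_x + s2_y)/2) * (exp(c) - 1)
# for a bivariate normal (X, Y) with variances s2 and covariance c.
mu_x = mu_y = 0.0
s2 = 0.1**2          # var(X) = var(Y)
c = 0.5 * s2         # cov(X, Y), i.e. correlation 0.5

exact_cov = math.exp(mu_x + mu_y + s2) * (math.exp(c) - 1.0)
# f'(mu)f'(mu) cov - (1/4) f''(mu)f''(mu) var var, all derivatives = exp(0) = 1
approx_cov = c - 0.25 * s2 * s2

print(exact_cov, approx_cov)
```

The leading term <math>f'(\mu_X)f'(\mu_Y)\operatorname{cov}(X,Y)</math> dominates; the quartic correction is negligible at these small variances.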
==Random vectors==
If <math>X</math> is a random vector, the approximations for the mean and variance of <math>f(X)</math> are given by<ref>{{Cite journal |last=Rego |first=Bruno V. |last2=Weiss |first2=Dar |last3=Bersi |first3=Matthew R. |last4=Humphrey |first4=Jay D. |date=14 December 2021 |title=Uncertainty quantification in subject-specific estimation of local vessel mechanical properties |url=https://onlinelibrary.wiley.com/doi/10.1002/cnm.3535 |journal=International Journal for Numerical Methods in Biomedical Engineering |language=en |volume=37 |issue=12 |pages=e3535 |doi=10.1002/cnm.3535 |issn=2040-7939 |pmc=9019846 |pmid=34605615}}</ref>
:
\begin{align}
\operatorname{E}(f(X)) &\approx f(\mu_X) + \frac{1}{2} \operatorname{trace}(H_f(\mu_X) \Sigma_X) \\
\operatorname{var}(f(X)) &\approx \nabla f(\mu_X)^t \Sigma_X \nabla f(\mu_X) + \frac{1}{2} \operatorname{trace} \left( H_f(\mu_X) \Sigma_X H_f(\mu_X) \Sigma_X \right).
\end{align}
Here <math>\nabla f</math> and <math>H_f</math> denote the gradient and the Hessian matrix of <math>f</math> respectively, and <math>\Sigma_X</math> is the covariance matrix of <math>X</math>.
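For a quadratic function such as <math>f(x) = x_1 x_2</math> the second-order expansion is exact, so the vector formulas above reproduce the closed-form Gaussian moments of <math>X_1 X_2</math> exactly. A sketch with illustrative numbers, assuming numpy is available:

```python
import numpy as np

# Illustrative check (assumed example f(x) = x1*x2, which has a constant
# Hessian, so the second-order expansion is exact for Gaussian X).
mu = np.array([1.0, 2.0])
Sigma = np.array([[0.5, 0.2],
                  [0.2, 0.8]])

grad = np.array([mu[1], mu[0]])           # gradient of f at mu
H = np.array([[0.0, 1.0], [1.0, 0.0]])    # Hessian of f(x) = x1*x2

mean_approx = mu[0] * mu[1] + 0.5 * np.trace(H @ Sigma)
var_approx = grad @ Sigma @ grad + 0.5 * np.trace(H @ Sigma @ H @ Sigma)

# Closed-form moments of X1*X2 for a bivariate normal:
mean_exact = mu[0] * mu[1] + Sigma[0, 1]
var_exact = (mu[0]**2 * Sigma[1, 1] + mu[1]**2 * Sigma[0, 0]
             + 2 * mu[0] * mu[1] * Sigma[0, 1]
             + Sigma[0, 0] * Sigma[1, 1] + Sigma[0, 1]**2)

print(mean_approx, mean_exact)  # equal: the expansion is exact for quadratics
print(var_approx, var_exact)
```

For non-quadratic <math>f</math> the same code structure applies, but the trace formulas then carry a truncation error from the neglected third- and higher-order terms.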
==See also==
* [[Delta method]]
==Notes==
{{reflist}}
==Further reading==
* {{cite book |first=Kirk M. |last=Wolter |chapter=Taylor Series Methods |title=Introduction to Variance Estimation |location=New York |publisher=Springer |year=1985 |isbn=0-387-96119-4 |pages=221–247 |chapter-url=https://books.google.com/books?id=EadxTw0t2dMC&pg=PA221 }}
{{DEFAULTSORT:Taylor Expansions For The Moments Of Functions Of Random Variables}}
[[Category:Statistical approximations]]