Distribution of the product of two random variables
{{Short description|Probability distribution}}
A product distribution is a probability distribution constructed as the distribution of the product of random variables having two other known distributions. Given two statistically independent random variables X and Y, the distribution of the random variable Z that is formed as the product <math>Z = XY</math> is a product distribution.
The product distribution is the PDF of the product of sample values. This is not the same as the product of their PDFs, yet the two concepts are often ambiguously conflated, as in the phrase "product of Gaussians".
Algebra of random variables
{{main|Algebra of random variables}}
The product is one type of algebra for random variables: related to the product distribution are the ratio distribution, sum distribution (see List of convolutions of probability distributions) and difference distribution. More generally, one may talk of combinations of sums, differences, products and ratios.
Many of these distributions are described in Melvin D. Springer's 1979 book ''The Algebra of Random Variables''.{{Cite book
| last = Springer
| first = Melvin Dale
| title = The Algebra of Random Variables
| url = https://archive.org/details/algebraofrandomv0000spri
| url-access = registration
| access-date = 24 September 2012
| publisher = Wiley
| year = 1979
| isbn = 978-0-471-01406-5
}}
Derivation for independent random variables
If <math>X</math> and <math>Y</math> are two independent, continuous random variables, described by probability density functions <math>f_X</math> and <math>f_Y</math>, then the probability density function of <math>Z = XY</math> is{{cite book|last=Rohatgi|first=V. K.|title=An Introduction to Probability Theory and Mathematical Statistics|year=1976|publisher=Wiley|location=New York|isbn=978-0-19-853185-2|doi=10.1002/9781118165676|series=Wiley Series in Probability and Statistics}}
: f_Z(z) = \int^\infty_{-\infty} f_X(x) f_Y(z/x) \frac{1}{|x|}\, dx
=Proof=
We first write the cumulative distribution function of <math>Z</math>, starting with its definition:
:
\begin{align}
F_Z(z) & \, \stackrel{\text{def}}{=}\ \mathbb{P}(Z\leq z) \\
& = \mathbb{P}(XY\leq z) \\
& = \mathbb{P}(XY\leq z , X \geq 0) + \mathbb{P}(XY\leq z , X \leq 0)\\
& = \mathbb{P}(Y\leq z/X , X \geq 0) + \mathbb{P}(Y\geq z/X , X \leq 0)\\
& = \int^\infty_0 f_X(x) \int^{z/x}_{-\infty} f_Y(y)\, dy \,dx
+\int^0_{-\infty} f_X(x) \int^\infty_{z/x} f_Y(y)\, dy \,dx
\end{align}
We find the desired probability density function by taking the derivative of both sides with respect to <math>z</math>. Since <math>z</math> appears only in the integration limits on the right-hand side, the derivative is easily performed using the fundamental theorem of calculus and the chain rule. (Note the negative sign that is needed when the variable occurs in the lower limit of the integration.)
:
\begin{align}
f_Z(z) & = \int^\infty_0 f_X(x) f_Y(z/x) \frac{1}{x}\,dx
-\int^0_{-\infty} f_X(x) f_Y(z/x) \frac{1}{x} \,dx \\
& = \int^\infty_0 f_X(x) f_Y(z/x) \frac{1}{|x|}\,dx
+ \int_{-\infty}^0 f_X(x) f_Y(z/x) \frac{1}{|x|}\,dx \\
& = \int^\infty_{-\infty} f_X(x) f_Y(z/x) \frac{1}{|x|}\,dx
\end{align}
where the absolute value is used to conveniently combine the two terms.{{cite book|last1=Grimmett|first1=G. R. |last2=Stirzaker |first2=D.R.|title=Probability and Random Processes |year=2001|publisher=Oxford University Press|location=Oxford|isbn=978-0-19-857222-0|url=http://ukcatalogue.oup.com/product/9780198572220.do?keyword=grimmett+stirzaker&sortby=bestMatches|access-date=4 October 2015}}
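This formula lends itself to a direct numerical check. The sketch below is an illustration, not part of the cited derivation: the choice of two Exponential(1) variables is arbitrary, and since both densities vanish for negative arguments the integral runs over <math>(0, \infty)</math>.
<syntaxhighlight lang="python">
# Minimal sketch: compare a quadrature of the product-density formula
# f_Z(z) = integral of f_X(x) f_Y(z/x) / |x| dx against a Monte Carlo
# histogram estimate, for two independent Exponential(1) variables
# (an arbitrary illustrative choice).
import numpy as np
from scipy import integrate, stats

def product_pdf(z, f_x, f_y):
    # Both densities live on x > 0 here, so integrate over (0, inf).
    integrand = lambda x: f_x(x) * f_y(z / x) / abs(x)
    val, _ = integrate.quad(integrand, 0, np.inf, limit=200)
    return val

rng = np.random.default_rng(0)
z_samples = rng.exponential(size=500_000) * rng.exponential(size=500_000)

for z in (0.1, 0.5, 1.0):
    formula = product_pdf(z, stats.expon.pdf, stats.expon.pdf)
    mc = np.mean(np.abs(z_samples - z) < 0.005) / 0.01  # local histogram
    print(f"z={z}: quadrature {formula:.4f}  Monte Carlo {mc:.4f}")
</syntaxhighlight>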
=Alternate proof=
A faster, more compact proof begins with the same step of writing the cumulative distribution function of <math>Z</math>, starting with its definition:
:
\begin{align}
F_Z(z) & \overset{\underset{\mathrm{def}}{}}{=} \ \ \mathbb{P}(Z\leq z) \\
& = \mathbb{P}(XY\leq z) \\
& = \int^\infty_{-\infty} \int^\infty_{-\infty} f_X(x) f_Y(y) u(z-xy) \, dy \,dx
\end{align}
where <math>u(\cdot)</math> is the Heaviside step function, which serves to limit the region of integration to values of <math>x</math> and <math>y</math> satisfying <math>xy \le z</math>.
We find the desired probability density function by taking the derivative of both sides with respect to <math>z</math>.
:
\begin{align}
f_Z(z) & = \int^\infty_{-\infty} \int^\infty_{-\infty} f_X(x) f_Y(y) \delta(z-xy) \, dy \,dx\\
& = \int^\infty_{-\infty} f_X(x) \left[\int^\infty_{-\infty} f_Y(y) \delta(z-xy) \, dy \right]\,dx\\
& = \int^\infty_{-\infty} f_X(x) f_Y(z/x) \frac{1}{|x|}\,dx
\end{align}
where we utilize the translation and scaling properties of the Dirac delta function <math>\delta</math>.
A more intuitive description of the procedure is illustrated in the figure below. The joint pdf <math>f_X(x) f_Y(y)</math> exists in the <math>x</math>-<math>y</math> plane and an arc of constant <math>z</math> value is shown as the shaded line. To find the marginal probability <math>f_Z(z)</math> on this arc, integrate over increments of area <math>dx\,dy \, f(x,y)</math> on this contour.
Starting with <math>y = \frac{z}{x}</math>, we have <math>dy = -\frac{z}{x^2}\,dx = -\frac{y}{x}\,dx</math>. So the probability increment is <math>\delta p = f(x,y)\,dx\,|dy| = f_X(x) f_Y(z/x) \frac{y}{|x|}\,dx\,dx</math>. Since <math>z = yx</math> implies <math>dz = y\,dx</math>, we can relate the probability increment to the <math>z</math>-increment, namely <math>\delta p = f_X(x) f_Y(z/x) \frac{1}{|x|}\,dx\,dz</math>. Then integration over <math>x</math> yields <math>f_Z(z) = \int f_X(x) f_Y(z/x) \frac{1}{|x|}\,dx</math>.
=A Bayesian interpretation=
Let <math>X</math> be a random sample drawn from probability distribution <math>f_X(x)</math>. Scaling <math>X</math> by <math>\theta</math> generates a sample from the scaled distribution <math>\theta X \sim \frac{1}{|\theta|} f_X\left(\frac{x}{\theta}\right)</math>, which can be written as a conditional distribution <math>g_X(x \mid \theta) = \frac{1}{|\theta|} f_X\left(\frac{x}{\theta}\right)</math>.
Letting <math>\theta</math> be a random variable with pdf <math>f_\theta(\theta)</math>, the distribution of the scaled sample becomes <math>f_X(\theta x) = g_X(x \mid \theta) f_\theta(\theta)</math> and integrating out <math>\theta</math> we get <math>h_X(x) = \int_{-\infty}^\infty g_X(x \mid \theta) f_\theta(\theta) \, d\theta</math>, so <math>\theta X</math> is drawn from this distribution <math>\theta X \sim h_X(x)</math>. However, substituting the definition of <math>g</math> we also have
<math>h_X(x) = \int_{-\infty}^\infty \frac{1}{|\theta|} f_X\left(\frac{x}{\theta}\right) f_\theta(\theta) \, d\theta</math>,
which has the same form as the product distribution above. Thus the Bayesian posterior distribution <math>h_X(x)</math> is the distribution of the product of the two independent random samples <math>\theta</math> and <math>X</math>.
For the case of one variable being discrete, let <math>\theta</math> have probability <math>P_i</math> at levels <math>\theta_i</math> with <math>\sum_i P_i = 1</math>. The conditional density is <math>f_X(x \mid \theta_i) = \frac{1}{|\theta_i|} f_X\left(\frac{x}{\theta_i}\right)</math>. Therefore <math>f_X(\theta x) = \sum_i \frac{P_i}{|\theta_i|} f_X\left(\frac{x}{\theta_i}\right)</math>.
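The discrete mixture above is easy to verify numerically. In the following sketch the levels <math>\theta_i</math>, the probabilities <math>P_i</math> and the choice of a standard normal <math>f_X</math> are all illustrative assumptions, not values from the source.
<syntaxhighlight lang="python">
# Sketch: scaling X by a discrete random theta produces the mixture
# density sum_i (P_i/|theta_i|) f_X(x/theta_i).
import numpy as np
from scipy import stats

theta_levels = np.array([0.5, 1.0, 2.0])   # assumed example levels
probs = np.array([0.2, 0.5, 0.3])          # assumed example probabilities

rng = np.random.default_rng(1)
theta = rng.choice(theta_levels, p=probs, size=1_000_000)
x = theta * rng.standard_normal(1_000_000)

def mixture_pdf(t):
    return np.sum(probs / np.abs(theta_levels) * stats.norm.pdf(t / theta_levels))

for t in (0.0, 1.0, 3.0):
    mc = np.mean(np.abs(x - t) < 0.01) / 0.02
    print(f"x={t}: mixture {mixture_pdf(t):.4f}  Monte Carlo {mc:.4f}")
</syntaxhighlight>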
Expectation of product of random variables
When two random variables are statistically independent, the expectation of their product is the product of their expectations. This can be proved from the law of total expectation:
: \operatorname{E}(XY) = \operatorname{E}(\operatorname{E}(XY \mid Y))
In the inner expression, {{mvar|Y}} is a constant. Hence:
: \operatorname{E}(XY \mid Y) = Y \operatorname{E}[X \mid Y]
: \operatorname{E}(XY) = \operatorname{E}(Y \operatorname{E}[X \mid Y])
This is true even if {{mvar|X}} and {{mvar|Y}} are statistically dependent, in which case <math>\operatorname{E}[X \mid Y]</math> is a function of {{mvar|Y}}. In the special case in which {{mvar|X}} and {{mvar|Y}} are statistically independent, it is a constant independent of {{mvar|Y}}. Hence:
: \operatorname{E}(XY) = \operatorname{E}(Y \operatorname{E}[X])
: \operatorname{E}(XY) = \operatorname{E}(X) \operatorname{E}(Y)
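A quick numerical illustration of this identity, with arbitrarily chosen example distributions:
<syntaxhighlight lang="python">
# Sketch: E(XY) = E(X)E(Y) for independent X, Y (example distributions
# are illustrative choices, not prescribed by the text).
import numpy as np

rng = np.random.default_rng(2)
x = rng.gamma(shape=3.0, size=1_000_000)   # E[X] = 3
y = rng.uniform(0, 2, size=1_000_000)      # E[Y] = 1
print(np.mean(x * y), np.mean(x) * np.mean(y))  # both close to 3
</syntaxhighlight>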
Variance of the product of independent random variables
Let <math>X, Y</math> be uncorrelated random variables with means <math>\mu_X, \mu_Y</math> and variances <math>\sigma_X^2, \sigma_Y^2</math>.
If, additionally, the random variables <math>X^2</math> and <math>Y^2</math> are uncorrelated, then the variance of the product XY is{{cite journal |last1=Goodman |first1=Leo A. |author-link1=Leo Goodman |date=1960 |title=On the Exact Variance of Products |jstor=2281592 |journal=Journal of the American Statistical Association |volume=55 |issue=292 |pages=708–713 |doi=10.2307/2281592}}
: \operatorname{Var}(XY) = (\sigma_X^2 + \mu_X^2)(\sigma_Y^2 + \mu_Y^2) - \mu_X^2 \mu_Y^2
In the case of the product of more than two variables, if <math>X_1, \ldots, X_n, \; n > 2,</math> are statistically independent then{{Cite web |url=https://stats.stackexchange.com/q/52699 |title=Variance of product of multiple random variables |last=Sarwate |first=Dilip |date=March 9, 2013 |work=Stack Exchange }} the variance of their product is
: \operatorname{Var}(X_1 X_2 \cdots X_n) = \prod_{i=1}^n (\sigma_i^2 + \mu_i^2) - \prod_{i=1}^n \mu_i^2
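Goodman's two-variable formula can be checked by simulation; the parameter values below are arbitrary illustrative choices.
<syntaxhighlight lang="python">
# Sketch: Var(XY) = (sx^2 + mx^2)(sy^2 + my^2) - mx^2 my^2 for
# independent X, Y (parameters are arbitrary).
import numpy as np

rng = np.random.default_rng(3)
mx, sx = 2.0, 1.0        # mean and standard deviation of X
my, sy = -1.0, 0.5       # mean and standard deviation of Y
x = rng.normal(mx, sx, size=2_000_000)
y = rng.normal(my, sy, size=2_000_000)

formula = (sx**2 + mx**2) * (sy**2 + my**2) - mx**2 * my**2
print(np.var(x * y), formula)  # both close to 2.25
</syntaxhighlight>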
Characteristic function of product of random variables
Assume X, Y are independent random variables. The characteristic function of X is <math>\varphi_X(t)</math>, and the distribution of Y is known. Then from the law of total expectation, we have{{cite web |title=How to find characteristic function of product of random variables |date=January 3, 2013 |work=Stack Exchange |url=https://math.stackexchange.com/q/269579 }}
:
\begin{align} \varphi_Z(t) & = \operatorname{E} (e^{itXY})
\\ & = \operatorname{E} ( \operatorname{E} (e^{itX Y} \mid Y))
\\ & = \operatorname{E} ( \varphi_X(tY))
\end{align}
If the characteristic functions and distributions of both X and Y are known, then alternatively,
<math>\varphi_Z(t) = \operatorname{E}(\varphi_Y(tX))</math>
also holds.
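As a numerical sketch of this identity, take X ~ Normal(0,1), whose characteristic function is <math>e^{-t^2/2}</math>, and Y ~ Uniform(0,1); both choices are illustrative, not from the cited discussion.
<syntaxhighlight lang="python">
# Sketch: phi_Z(t) = E[phi_X(tY)] for Z = XY with independent
# X ~ N(0,1) (phi_X(t) = exp(-t^2/2)) and Y ~ Uniform(0,1).
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(1_000_000)
y = rng.uniform(size=1_000_000)

for t in (0.5, 1.0, 2.0):
    direct = np.mean(np.exp(1j * t * x * y))       # E[exp(itXY)]
    via_cf = np.mean(np.exp(-(t * y) ** 2 / 2))    # E[phi_X(tY)]
    print(f"t={t}: direct {direct.real:.4f}  E[phi_X(tY)] {via_cf:.4f}")
</syntaxhighlight>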
Mellin transform
The Mellin transform of a distribution <math>f(x)</math> with support only on <math>x \ge 0</math> and having a random sample <math>X</math> is
: \mathcal{M}f(x) = \varphi(s) = \int_0^\infty x^{s-1} f(x) \, dx = \operatorname{E}[X^{s-1}].
The inverse transform is
: \mathcal{M}^{-1}\varphi(s) = f(x) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} x^{-s} \varphi(s) \, ds.
If <math>X</math> and <math>Y</math> are two independent random samples from different distributions, then the Mellin transform of their product is equal to the product of their Mellin transforms:
:
\mathcal{M}_{XY}(s) = \mathcal{M}_X(s)\mathcal{M}_Y(s)
If s is restricted to integer values, a simpler result is
: \operatorname{E}[(XY)^n] = \operatorname{E}[X^n] \; \operatorname{E}[Y^n]
Thus the moments of the random product <math>XY</math> are the product of the corresponding moments of <math>X</math> and <math>Y</math>, and this extends to non-integer moments, for example
: \operatorname{E}[(XY)^{1/p}] = \operatorname{E}[X^{1/p}] \; \operatorname{E}[Y^{1/p}].
The pdf of a random variable can be reconstructed from its moments using the saddlepoint approximation method.
A further result is that for independent X, Y
: \operatorname{E}[X^p Y^q] = \operatorname{E}[X^p] \; \operatorname{E}[Y^q]
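These moment identities are straightforward to check by simulation; the sketch below uses two Uniform(0,1) samples, an illustrative choice for which <math>\operatorname{E}[X^p] = 1/(p+1)</math>.
<syntaxhighlight lang="python">
# Sketch: E[(XY)^p] = E[X^p] E[Y^p] = (1/(p+1))^2 for independent
# X, Y ~ Uniform(0,1), including non-integer p.
import numpy as np

rng = np.random.default_rng(5)
x, y = rng.uniform(size=(2, 1_000_000))

for p in (0.5, 1.0, 2.5):
    print(f"p={p}: E[(XY)^p] = {np.mean((x * y) ** p):.4f}, "
          f"exact {(1 / (p + 1)) ** 2:.4f}")
</syntaxhighlight>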
'''Gamma distribution example''' To illustrate how the product of moments yields a much simpler result than finding the moments of the distribution of the product, let <math>X, Y</math> be sampled from two Gamma distributions with unit scale and shape parameters <math>\alpha</math> and <math>\beta</math>,
: f_{\text{Gamma}}(x; \theta, 1) = \Gamma(\theta)^{-1} x^{\theta - 1} e^{-x},
whose moments are
: \operatorname{E}[X^p] = \int_0^\infty x^p \Gamma(\theta)^{-1} x^{\theta - 1} e^{-x} \, dx = \frac{\Gamma(\theta + p)}{\Gamma(\theta)}.
Multiplying the corresponding moments gives the Mellin transform result
: \operatorname{E}[(XY)^p] = \operatorname{E}[X^p] \; \operatorname{E}[Y^p] = \frac{\Gamma(\alpha + p)\,\Gamma(\beta + p)}{\Gamma(\alpha)\,\Gamma(\beta)}
Independently, it is known that the product of two independent Gamma-distributed samples (<math>X \sim \text{Gamma}(\alpha,1)</math> and <math>Y \sim \text{Gamma}(\beta,1)</math>) has a K-distribution:
: f_Z(z) = \frac{2}{\Gamma(\alpha)\,\Gamma(\beta)} z^{\frac{\alpha+\beta}{2}-1} K_{\alpha-\beta}\left(2\sqrt{z}\right), \;\; z \ge 0
To find the moments of this, make the change of variable <math>y = 2\sqrt{z}</math>, simplifying similar integrals to:
: \int_0^\infty z^p K_\nu\left(2\sqrt{z}\right) \, dz = 2^{-2p-1} \int_0^\infty y^{2p+1} K_\nu(y) \, dy
thus
:
\int_0^\infty z^{\frac{\alpha+\beta}{2}+p-1} K_{\alpha-\beta}\left(2\sqrt{z}\right) \, dz =
2^{-(\alpha+\beta) - 2p+1} \int_0^\infty y^{(\alpha+\beta) + 2p -1} K_{\alpha-\beta}(y) \, dy
The definite integral
: \int_0^\infty y^\mu K_\nu(y) \, dy = 2^{\mu-1} \Gamma\left(\frac{1+\mu+\nu}{2}\right) \Gamma\left(\frac{1+\mu-\nu}{2}\right)
is well documented and we have finally
:
\begin{align}
E[Z^p] & = \frac {2^{-(\alpha+\beta) - 2p+1} \; 2^{(\alpha+\beta) + 2p -1} }{\Gamma(\alpha) \; \Gamma(\beta)} \Gamma \left ( \frac{(\alpha+\beta+2p)+(\alpha-\beta)}{2} \right ) \Gamma \left( \frac{(\alpha+\beta+2p)-(\alpha-\beta)}{2} \right ) \\ \\
& = \frac{\Gamma( \alpha+ p) \, \Gamma(\beta + p)} {\Gamma(\alpha) \, \Gamma(\beta)}
\end{align}
which, after some effort, agrees with the moment-product result above.
If X, Y are drawn independently from Gamma distributions with shape parameters <math>\alpha, \beta</math> then
: \operatorname{E}[X^p Y^q] = \operatorname{E}[X^p] \; \operatorname{E}[Y^q] = \frac{\Gamma(\alpha + p)}{\Gamma(\alpha)} \; \frac{\Gamma(\beta + q)}{\Gamma(\beta)}
This type of result is universally true, since for bivariate independent variables <math>f(x,y) = f_X(x) f_Y(y)</math>; thus
:
\begin{align} \operatorname{E}[X^p Y^q] & = \int_{x=-\infty}^\infty \int_{y=-\infty}^\infty x^p y^q f(x,y) \, dy \, dx
\\ & = \int_{x=-\infty}^\infty x^p \Big [ \int_{y=-\infty}^\infty y^q f_Y(y)\, dy \Big ] f_X(x) \, dx
\\ & = \int_{x=-\infty}^\infty x^p f_X(x) \, dx \int_{y=-\infty}^\infty y^q f_Y(y) \, dy
\\ & = \operatorname{E}[X^p] \; \operatorname{E}[Y^q]
\end{align}
or equivalently, it is clear that <math>X^p</math> and <math>Y^q</math> are independent variables.
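Returning to the Gamma example, the closed-form moments can be compared with simulation; the shape parameters below are arbitrary illustrative choices.
<syntaxhighlight lang="python">
# Sketch: for independent X ~ Gamma(alpha,1), Y ~ Gamma(beta,1), Z = XY,
# E[Z^p] = Gamma(alpha+p)Gamma(beta+p) / (Gamma(alpha)Gamma(beta)).
import numpy as np
from scipy.special import gamma as G

alpha, beta = 2.0, 3.5           # arbitrary illustrative shapes
rng = np.random.default_rng(6)
z = rng.gamma(alpha, size=1_000_000) * rng.gamma(beta, size=1_000_000)

for p in (0.5, 1.0, 2.0):
    exact = G(alpha + p) * G(beta + p) / (G(alpha) * G(beta))
    print(f"p={p}: Monte Carlo {np.mean(z ** p):.3f}  exact {exact:.3f}")
</syntaxhighlight>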
Special cases
=Lognormal distributions=
The distribution of the product of two random variables which have lognormal distributions is again lognormal. This is itself a special case of a more general set of results where the logarithm of the product can be written as the sum of the logarithms. Thus, in cases where a simple result can be found in the list of convolutions of probability distributions, where the distributions to be convolved are those of the logarithms of the components of the product, the result might be transformed to provide the distribution of the product. However, this approach is only useful where the logarithms of the components of the product are in some standard families of distributions.
=Uniformly distributed independent random variables=
Let <math>Z = X_1 X_2</math> be the product of two independent variables, each uniformly distributed on the interval [0,1], possibly the outcome of a copula transformation. As noted in "Lognormal distributions" above, PDF convolution operations in the log domain correspond to the product of sample values in the original domain. Thus, making the transformation <math>u = \ln(x)</math>, such that <math>p_U(u)\,|du| = p_X(x)\,|dx|</math>, each variate is distributed independently on <math>u</math> as
: p_U(u) = \frac{p_X(x)}{|du/dx|} = \frac{1}{1/x} = e^u, \;\; -\infty < u \le 0.
and the convolution of the two distributions is the autoconvolution
: c(y) = \int_{u=y}^0 e^u e^{y-u} \, du = -y e^y, \;\; -\infty < y \le 0
Next retransform the variable to <math>z = e^y</math>, yielding the distribution
: c_2(z) = \frac{c(y)}{|dz/dy|} = \frac{-y e^y}{e^y} = -\ln(z) \text{ on the interval } [0,1]
For the product of multiple (> 2) independent samples the characteristic function route is favorable. If we define <math>\tilde{u} = -u</math> then <math>\tilde{u}</math> above is distributed as a Gamma distribution of shape 1 and scale factor 1, <math>p_{\tilde{U}}(\tilde{u}) = e^{-\tilde{u}}</math>, and its known CF is <math>(1 - it)^{-1}</math>. Note that <math>|d\tilde{u}| = |du|</math> so the Jacobian of the transformation is unity.
The convolution of <math>n</math> independent samples from <math>\tilde{U}</math> therefore has CF <math>(1 - it)^{-n}</math>, which is known to be the CF of a Gamma distribution of shape <math>n</math>:
: p_{\tilde{Y}}(\tilde{y}) = \Gamma(n)^{-1} \tilde{y}^{\,n-1} e^{-\tilde{y}}.
Make the inverse transformation <math>z = e^y = e^{-\tilde{y}}</math> to extract the PDF of the product of the n samples:
: f_Z(z) = \frac{p_{\tilde{Y}}(\tilde{y})}{|dz/d\tilde{y}|} = \frac{\Gamma(n)^{-1} \tilde{y}^{\,n-1} e^{-\tilde{y}}}{e^{-\tilde{y}}} = \frac{(-\ln z)^{n-1}}{(n-1)!}, \;\; 0 < z \le 1
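A numerical sketch of this result, here for <math>n = 4</math> samples (an arbitrary choice):
<syntaxhighlight lang="python">
# Sketch: f_Z(z) = (-ln z)^(n-1) / (n-1)! for the product of n
# independent Uniform(0,1) samples.
import numpy as np
from math import factorial

n = 4
rng = np.random.default_rng(7)
z = rng.uniform(size=(1_000_000, n)).prod(axis=1)

for zv in (0.01, 0.1, 0.5):
    pdf = (-np.log(zv)) ** (n - 1) / factorial(n - 1)
    mc = np.mean(np.abs(z - zv) < 0.002) / 0.004
    print(f"z={zv}: formula {pdf:.3f}  Monte Carlo {mc:.3f}")
</syntaxhighlight>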
The following, more conventional, derivation from Stackexchange{{Cite web |url=https://math.stackexchange.com/q/659278 |title=product distribution of two uniform distribution, what about 3 or more |author=heropup |date=1 February 2014 |work=Stack Exchange }} is consistent with this result.
First of all, letting <math>Z_2 = X_1 X_2</math>, its CDF is
:
\begin{align} F_{Z_2}(z) = \Pr \Big [Z_2 \le z \Big ] & = \int_{x=0}^1 \Pr \Big [ X_2 \le \frac{z}{x} \Big ] f_{X_1}(x) \, dx
\\ & = \int_{x=0}^z 1 dx + \int_{x=z}^1 \frac{z}{x} \, dx
\\ & = z - z\log z, \;\; 0 < z \le 1
\end{align}
The density of <math>Z_2</math> is then <math>f_{Z_2}(z) = -\log(z), \;\; 0 < z \le 1.</math>
Multiplying by a third independent sample <math>X_3</math> gives the distribution function
:
\begin{align} F_{Z_3}(z) = \Pr \Big [Z_3 \le z \Big ] & = \int_{x=0}^1 \Pr \Big [ X_3 \le \frac{z}{x} \Big] f_{Z_2}(x) \, dx
\\ & = -\int_{x=0}^z \log(x) \, dx - \int_{x=z}^1 \frac{z}{x} \log(x) \,dx
\\ & = -z \Big (\log(z) - 1 \Big ) + \frac{1}{2}z \log^2 (z)
\end{align}
Taking the derivative yields
: f_{Z_3}(z) = \frac{1}{2} \log^2 (z), \;\; 0 < z \le 1.
The author of the note conjectures that, in general, <math>f_{Z_n}(z) = \frac{(-\log z)^{n-1}}{(n-1)!}, \;\; 0 < z \le 1.</math>
The figure illustrates the nature of the integrals above. The area of the selection within the unit square and below the line z = xy represents the CDF of z. This divides into two parts. The first is for 0 < x < z, where the increment of area in the vertical slot is just equal to dx. The second part lies below the xy line, has y-height z/x, and incremental area dx z/x.
=Independent central-normal distributions=
The product of two independent Normal samples follows a modified Bessel function. Let <math>x, y</math> be independent samples from a Normal(0,1) distribution and <math>z = xy</math>. Then
: f_Z(z) = \frac{K_0(|z|)}{\pi}, \;\;\; -\infty < z < +\infty
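This closed form can be checked against simulation:
<syntaxhighlight lang="python">
# Sketch: the central-normal product density f_Z(z) = K_0(|z|)/pi
# compared with a Monte Carlo estimate.
import numpy as np
from scipy.special import k0

rng = np.random.default_rng(8)
z = rng.standard_normal(1_000_000) * rng.standard_normal(1_000_000)

for zv in (0.25, 1.0, 2.0):
    pdf = k0(abs(zv)) / np.pi
    mc = np.mean(np.abs(z - zv) < 0.005) / 0.01
    print(f"z={zv}: K0(|z|)/pi {pdf:.4f}  Monte Carlo {mc:.4f}")
</syntaxhighlight>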
The variance of this distribution could be determined, in principle, by a definite integral from Gradshteyn and Ryzhik,{{Cite book|title=Tables of Integrals, Series and Products|last1=Gradshteyn|first1=I S|last2=Ryzhik|first2=I M|publisher=Academic Press|year=1980|pages=section 6.561}}
:
\int_0^\infty x^\mu K_\nu (ax) \, dx = 2^{\mu - 1} a^{-\mu - 1} \Gamma \Big ( \frac{1 + \mu + \nu}{2} \Big ) \Gamma \Big ( \frac{1 + \mu - \nu}{2} \Big), \;\; a>0, \;\nu + 1 \pm \mu >0
thus
: \operatorname{E}[Z^2] = \int_{-\infty}^\infty \frac{z^2 K_0 (|z|)}{\pi} \, dz = \frac{4}{\pi} \; \Gamma^2 \Big (\frac{3}{2} \Big ) = 1
A much simpler result, stated in a section above, is that the variance of the product of zero-mean independent samples is equal to the product of their variances. Since the variance of each Normal sample is one, the variance of the product is also one.
The product of two Gaussian samples is often confused with the product of two Gaussian PDFs. The latter simply results in a bivariate Gaussian distribution.
=Independent complex-valued central-normal distributions=
== Product of two variables ==
Let <math>u_1, v_1, u_2, v_2</math> be independent samples from a Normal(0,1) distribution.
Setting <math>z_1 = u_1 + i v_1</math> and <math>z_2 = u_2 + i v_2</math>, then <math>z_1, z_2</math> are independent zero-mean complex normal samples with circular symmetry and <math>\operatorname{E}|z_i|^2 = 2</math>.
The density functions of the moduli <math>r_i \equiv |z_i| = (u_i^2 + v_i^2)^{1/2}, \; i = 1, 2,</math> are Rayleigh distributions defined as:
: f_r(r) = r e^{-r^2/2} \text{ of mean } \sqrt{\frac{\pi}{2}} \text{ and variance } \frac{4-\pi}{2}
The variable <math>y_i \equiv r_i^2</math> is Chi-squared with two degrees of freedom and has PDF
: f_{y_i}(y_i) = \tfrac{1}{2} e^{-y_i/2}
Wells et al.{{Cite journal|last1=Wells|first1=R T|last2=Anderson|first2=R L|last3=Cell|first3=J W|date=1962|title=The Distribution of the Product of Two Central or Non-Central Chi-Square Variates|journal=The Annals of Mathematical Statistics|volume=33| issue = 3|pages=1016–1020|doi=10.1214/aoms/1177704469|doi-access=free}} show that the density function of <math>s \equiv |z_1 z_2|</math> is
: f_s(s) = s K_0(s), \;\; s \ge 0
and the cumulative distribution function of <math>s</math> is
: P(a) = \Pr[s \le a] = \int_0^a s K_0(s) \, ds = 1 - a K_1(a)
Thus the polar representation of the product of two uncorrelated complex Gaussian samples is
: f_{s,\theta}(s,\theta) = f_s(s) p_\theta(\theta) \text{ where } p_\theta(\theta) \text{ is uniform on } [0, 2\pi].
The first and second moments of this distribution can be found from the integral in "Independent central-normal distributions" above
: m_1 = \int_0^\infty s^2 K_0(s) \, ds = 2 \Gamma^2 \Big (\frac{3}{2} \Big ) = \frac{\pi}{2}
: m_2 = \int_0^\infty s^3 K_0(s) \, ds = 2^2 \Gamma^2 (2) = 4
Thus its variance is <math>\operatorname{Var}(s) = m_2 - m_1^2 = 4 - \frac{\pi^2}{4}.</math>
Further, the density of <math>z \equiv s^2 = |r_1 r_2|^2 = |r_1|^2 |r_2|^2 = y_1 y_2</math> corresponds to the product of two independent Chi-squared samples <math>y_i</math>, each with two degrees of freedom. Writing these as scaled Gamma distributions <math>f_y(y_i) = \tfrac{1}{\theta} e^{-y_i/\theta}</math> with <math>\theta = 2</math>, then, from the Gamma products below, the density of the product is
: f_Z(z) = \tfrac{1}{2} K_0\left(\sqrt{z}\right)
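The distribution of the magnitude <math>s = |z_1 z_2|</math> can be checked through its cumulative distribution function <math>1 - a K_1(a)</math>:
<syntaxhighlight lang="python">
# Sketch: Pr[|z1 z2| <= a] = 1 - a K_1(a) for independent complex
# Gaussians whose real and imaginary parts are N(0,1).
import numpy as np
from scipy.special import k1

rng = np.random.default_rng(9)
z1 = rng.standard_normal(500_000) + 1j * rng.standard_normal(500_000)
z2 = rng.standard_normal(500_000) + 1j * rng.standard_normal(500_000)
s = np.abs(z1 * z2)

for a in (0.5, 1.0, 2.0):
    print(f"a={a}: empirical {np.mean(s <= a):.4f}  "
          f"1 - a*K1(a) {1 - a * k1(a):.4f}")
</syntaxhighlight>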
== Sum of the product of two variables ==
Let <math>z_{1,i}, z_{2,i}, \; i = 1, \ldots, N,</math> be independent zero-mean complex Gaussian samples as above, and let <math>s</math> be the magnitude of the sum of their products,
: s = \left| \sum_{i=1}^N z_{1,i} z_{2,i} \right|.
The density function of <math>s</math> is
: f_s(s) = \frac{4}{\Gamma(N)\sigma_s^{N+1}} s^{N} K_{N-1}\left(\frac{2s}{\sigma_s}\right), \;\; s \ge 0
where <math>\sigma_s</math> is a scale parameter set by the variances of the component samples; for <math>N = 1</math> and unit-variance components (<math>\sigma_s = 2</math>) this reduces to the single-product density <math>s K_0(s)</math> given above. The first moment of this distribution, i.e. the mean of <math>s</math>, follows from the definite integral given earlier:
: \operatorname{E}[s] = \frac{\sigma_s \, \Gamma\left(N + \tfrac{1}{2}\right) \Gamma\left(\tfrac{3}{2}\right)}{\Gamma(N)}.
=Independent complex-valued noncentral normal distributions=
The product of non-central independent complex Gaussians is described by O’Donoughue and Moura{{Cite journal|last1=O’Donoughue|first1=N|last2=Moura|first2=J M F|date=March 2012|title=On the Product of Independent Complex Gaussians|journal=IEEE Transactions on Signal Processing|volume=60|issue=3|pages=1050–1063|doi=10.1109/TSP.2011.2177264|bibcode=2012ITSP...60.1050O|s2cid=1069298}} and forms a double infinite series of modified Bessel functions of the first and second types.
=Gamma distributions=
The product of two independent Gamma samples, <math>z = x_1 x_2</math>, defining <math>\Gamma(x; k_i, \theta_i) = \frac{x^{k_i - 1} e^{-x/\theta_i}}{\Gamma(k_i) \theta_i^{k_i}}</math>, follows
:
\begin{align}
f_Z(z) & = \frac{2}{\Gamma(k_1) \Gamma(k_2)} \frac{z^{\frac{k_1 + k_2}{2} - 1}}{(\theta_1 \theta_2)^{\frac{k_1 + k_2}{2}}}
K_{k_1 -k_2 } \left( 2 \sqrt{ \frac {z}{\theta_1 \theta_2} } \right) \\ \\
& = \frac {2}{\Gamma(k_1) \Gamma(k_2) } \frac {y^{\frac {k_1 + k_2}{2 }-1}}
{\theta_1 \theta_2 } K_{k_1 -k_2 } \left(2 \sqrt y \right) \text { where } y = \frac {z}{\theta_1 \theta_2}\\
\end{align}
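A numerical sketch of this density; the shape and scale parameters below are arbitrary illustrative choices.
<syntaxhighlight lang="python">
# Sketch: gamma-product density for X1 ~ Gamma(k1, theta1),
# X2 ~ Gamma(k2, theta2) and Z = X1 X2, compared with simulation.
import numpy as np
from scipy.special import gamma as G, kv

shape1, scale1 = 2.0, 1.5   # k_1, theta_1 (arbitrary)
shape2, scale2 = 3.0, 0.5   # k_2, theta_2 (arbitrary)
rng = np.random.default_rng(10)
z = rng.gamma(shape1, scale1, 1_000_000) * rng.gamma(shape2, scale2, 1_000_000)

def product_pdf(zv):
    # f_Z(z) = 2/(Gamma(k1)Gamma(k2)) * y^((k1+k2)/2 - 1)/(theta1 theta2)
    #          * K_{k1-k2}(2 sqrt(y)),  with y = z/(theta1 theta2)
    y = zv / (scale1 * scale2)
    return (2 / (G(shape1) * G(shape2)) * y ** ((shape1 + shape2) / 2 - 1)
            / (scale1 * scale2) * kv(shape1 - shape2, 2 * np.sqrt(y)))

for zv in (0.5, 2.0, 5.0):
    mc = np.mean(np.abs(z - zv) < 0.01) / 0.02
    print(f"z={zv}: formula {product_pdf(zv):.4f}  Monte Carlo {mc:.4f}")
</syntaxhighlight>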
=Beta distributions=
Nagar et al.{{Cite journal|last1=Nagar|first1=D K|last2=Orozco-Castañeda|first2=J M|last3=Gupta|first3=A K|date=2009|title=Product and quotient of correlated beta variables|journal=Applied Mathematics Letters|volume=22|pages=105–109|doi=10.1016/j.aml.2008.02.014|doi-access=free}} define a correlated bivariate beta distribution
:
f_{X,Y}(x,y) = \frac{x^{a-1} y^{b-1} (1-x)^{b+c-1} (1-y)^{a+c-1}}{B(a,b,c) (1 - xy)^{a+b+c}}, \;\; 0 < x, y < 1
where
: B(a,b,c) = \frac{\Gamma(a)\,\Gamma(b)\,\Gamma(c)}{\Gamma(a+b+c)}.
Then the pdf of Z = XY is given by
:
f_Z(z) = \frac{\Gamma(a+c)\,\Gamma(b+c)}{B(a,b,c)\,\Gamma(a+b+2c)}\, z^{a-1} (1-z)^{c-1} \;
{_2F_1} (a+c,a+c; a+b+2c; 1-z), \;\;\; 0< z <1
where <math>{_2F_1}</math> is the Gauss hypergeometric function, defined by Euler's integral
:
{_2F_1}(a, b; c; z) = \frac{\Gamma(c)}{\Gamma(a)\,\Gamma(c-a)}
\int_0^1 v^{a-1} (1-v)^{c-a-1} (1-vz)^{-b} \, dv
Note that multivariate distributions are not generally unique, apart from the Gaussian case, and there may be alternatives.
=Uniform and gamma distributions=
The distribution of the product of a random variable having a uniform distribution on (0,1) with a random variable having a gamma distribution with shape parameter equal to 2 is an exponential distribution.
{{cite book
|last1=Johnson |first1=Norman L.
|last2=Kotz |first2=Samuel
|last3=Balakrishnan |first3= N.
|title=Continuous Univariate Distributions Volume 2, Second edition
|url=http://www.wiley.com/WileyCDA/WileyTitle/productCd-0471584940.html
|access-date=24 September 2012
|page=306
|year=1995
|publisher=Wiley
|isbn=978-0-471-58494-0
}} A more general case of this concerns the distribution of the product of a random variable having a beta distribution with a random variable having a gamma distribution: for some cases where the parameters of the two component distributions are related in a certain way, the result is again a gamma distribution but with a changed shape parameter.
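A numerical sketch of the uniform-gamma result, comparing sample quantiles of the product with those of an Exponential(1) distribution (unit scale is an illustrative choice):
<syntaxhighlight lang="python">
# Sketch: if U ~ Uniform(0,1) and G ~ Gamma(shape 2, scale 1) are
# independent, then U*G should follow an Exponential(1) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
z = rng.uniform(size=1_000_000) * rng.gamma(2.0, size=1_000_000)

# Compare a few quantiles with the Exponential(1) distribution.
for q in (0.25, 0.5, 0.9):
    print(f"q={q}: sample {np.quantile(z, q):.4f}  "
          f"exact {stats.expon.ppf(q):.4f}")
</syntaxhighlight>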
The K-distribution is an example of a non-standard distribution that can be defined as a product distribution (where both components have a gamma distribution).
=Gamma and Pareto distributions=
The product of n Gamma and m Pareto independent samples was derived by Nadarajah.{{Cite journal|last=Nadarajah|first=Saralees|date=June 2011|title=Exact distribution of the product of n gamma and m Pareto random variables|journal=Journal of Computational and Applied Mathematics|volume=235|issue=15|pages=4496–4512|doi=10.1016/j.cam.2011.04.018|doi-access=free}}
See also
Notes
{{Reflist}}
References
- {{cite journal
| doi = 10.1137/0118065
| title = The distribution of products of beta, gamma and Gaussian random variables
| last1 = Springer
| first1 = Melvin Dale
| last2 = Thompson
| first2 = W. E.
| journal = SIAM Journal on Applied Mathematics
| volume = 18
| issue = 4
| pages = 721–737
| year = 1970
| jstor = 2099424
}}
- {{cite journal
| title = The distribution of products of independent random variables
| last1 = Springer
| first1 = Melvin Dale
| last2 = Thompson
| first2 = W. E.
| journal = SIAM Journal on Applied Mathematics
| pages = 511–526
| year = 1966
| jstor = 2946226
| volume = 14
| issue = 3
| doi = 10.1137/0114046
}}
{{DEFAULTSORT:Product Distribution}}