Generalized inverse Gaussian distribution
{{Short description|Family of continuous probability distributions}}
{{Probability distribution|
name =Generalized inverse Gaussian|
type =density|
pdf_image =Image:GIG distribution pdf.svg|
cdf_image =|
parameters =a > 0, b > 0, p real|
support =x > 0|
pdf =|
cdf =|
mean =|
median =|
mode =|
variance =|
skewness =|
kurtosis =|
entropy =|
mgf =|
char =|
}}
In probability theory and statistics, the generalized inverse Gaussian distribution (GIG) is a three-parameter family of continuous probability distributions with probability density function

:<math>f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} x^{(p-1)} e^{-(ax + b/x)/2},\qquad x>0,</math>

where ''K''<sub>''p''</sub> is a modified Bessel function of the second kind, ''a'' > 0, ''b'' > 0 and ''p'' is a real parameter. It is used extensively in geostatistics, statistical linguistics, finance, etc. This distribution was first proposed by Étienne Halphen.<ref>{{Cite book
| last = Seshadri
| first = V.
| contribution = Halphen's laws
| editor-last = Kotz
| editor-first = S.
| editor2-last = Read
| editor2-first = C. B.
| editor3-last = Banks
| editor3-first = D. L.
| title = Encyclopedia of Statistical Sciences, Update Volume 1
| pages = 302–306
| publisher = Wiley
| place = New York
| year = 1997
}}</ref><ref>{{Cite journal | last1 = Perreault | first1 = L. | last2 = Bobée | first2 = B. | last3 = Rasmussen | first3 = P. F. | doi = 10.1061/(ASCE)1084-0699(1999)4:3(189) | title = Halphen Distribution System. I: Mathematical and Statistical Properties | journal = Journal of Hydrologic Engineering | volume = 4 | issue = 3 | pages = 189 | year = 1999 }}</ref><ref group=note>Étienne Halphen was the grandson of the mathematician Georges Henri Halphen.</ref>
It was rediscovered and popularised by Ole Barndorff-Nielsen, who called it the generalized inverse Gaussian distribution. Its statistical properties are discussed in Bent Jørgensen's lecture notes.<ref>{{cite book
| last = Jørgensen
| first = Bent
| title = Statistical Properties of the Generalized Inverse Gaussian Distribution
| publisher = Springer-Verlag
| year = 1982
| location = New York–Berlin
| series = Lecture Notes in Statistics
| volume = 9
| isbn = 0-387-90665-7
| mr = 0648107
}}</ref>
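The density above can be checked numerically; the following is a minimal sketch using SciPy's modified Bessel function of the second kind (the helper name <code>gig_pdf</code> and the parameter values are illustrative, not from the source):

```python
# Density of GIG(a, b, p) as written above, via scipy.special.kv,
# the modified Bessel function of the second kind.
import numpy as np
from scipy.integrate import quad
from scipy.special import kv

def gig_pdf(x, a, b, p):
    # f(x) = (a/b)^(p/2) / (2 K_p(sqrt(ab))) * x^(p-1) * exp(-(a x + b/x)/2)
    norm = (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

# A proper density should integrate to 1 over x > 0 (illustrative parameters).
total, _ = quad(gig_pdf, 0, np.inf, args=(2.0, 3.0, 1.5))
```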
==Properties==
===Alternative parametrization===
By setting <math>\theta = \sqrt{ab}</math> and <math>\eta = \sqrt{b/a}</math>, we can alternatively express the GIG distribution as

:<math>f(x) = \frac{1}{2\eta K_p(\theta)} \left(\frac{x}{\eta}\right)^{p-1} e^{-\frac{\theta}{2}\left(\frac{x}{\eta} + \frac{\eta}{x}\right)},</math>

where <math>\theta</math> is the concentration parameter while <math>\eta</math> is the scaling parameter.
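This parametrization lines up with SciPy's <code>scipy.stats.geninvgauss</code>, whose standard-form density is <math>x^{p-1} e^{-b(x + 1/x)/2} / (2 K_p(b))</math>: <math>\theta</math> plays the role of SciPy's shape parameter <code>b</code> and <math>\eta</math> enters as the scale. A sketch of the correspondence, with arbitrary illustrative values:

```python
# Check that GIG(a, b, p) equals scipy.stats.geninvgauss with
# shape theta = sqrt(ab) and scale eta = sqrt(b/a).
import numpy as np
from scipy.special import kv
from scipy.stats import geninvgauss

a, b, p = 2.0, 3.0, 1.5                     # illustrative parameters
theta, eta = np.sqrt(a * b), np.sqrt(b / a)

def gig_pdf(x, a, b, p):
    # density in the (a, b, p) parametrization
    return ((a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
            * x ** (p - 1) * np.exp(-(a * x + b / x) / 2))

x = np.linspace(0.1, 5.0, 50)
max_err = np.max(np.abs(gig_pdf(x, a, b, p)
                        - geninvgauss.pdf(x, p, theta, scale=eta)))
```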
=== Summation ===
Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible.<ref>{{cite journal |first=O. |last=Barndorff-Nielsen |first2=Christian |last2=Halgreen |title=Infinite Divisibility of the Hyperbolic and Generalized Inverse Gaussian Distributions |journal=Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete |year=1977 |volume=38 |pages=309–311 |doi=10.1007/BF00533162 }}</ref>
=== Entropy ===
The entropy of the generalized inverse Gaussian distribution is given as{{citation needed|date=February 2012}}
:<math>
\begin{align}
H = \frac{1}{2} \log \left( \frac b a \right) & {} + \log \left(2 K_p\left(\sqrt{ab} \right)\right) - (p-1) \frac{\left[\frac{d}{d\nu}K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p}}{K_p\left(\sqrt{ab}\right)} \\
& {} + \frac{\sqrt{ab}}{2 K_p\left(\sqrt{ab}\right)}\left( K_{p+1}\left(\sqrt{ab}\right) + K_{p-1}\left(\sqrt{ab}\right)\right)
\end{align}
</math>
where <math>\left[\frac{d}{d\nu}K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p}</math> is the derivative of the modified Bessel function of the second kind with respect to the order <math>\nu</math>, evaluated at <math>\nu = p</math>.
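The expression can be verified numerically: the order derivative of <math>K_\nu</math> is approximated by a central difference, and the result is compared with the defining integral <math>H = -\int_0^\infty f \log f \, dx</math>. A sketch with illustrative parameter values:

```python
# Numerical check of the entropy formula against -E[log f(X)].
import numpy as np
from scipy.integrate import quad
from scipy.special import kv

a, b, p = 2.0, 3.0, 1.5        # illustrative parameters
w = np.sqrt(a * b)

def gig_pdf(x):
    return ((a / b) ** (p / 2) / (2 * kv(p, w))
            * x ** (p - 1) * np.exp(-(a * x + b / x) / 2))

# d/dnu K_nu(w) at nu = p, by central difference (no closed form needed)
h = 1e-6
dK = (kv(p + h, w) - kv(p - h, w)) / (2 * h)

H_formula = (0.5 * np.log(b / a) + np.log(2 * kv(p, w))
             - (p - 1) * dK / kv(p, w)
             + w / (2 * kv(p, w)) * (kv(p + 1, w) + kv(p - 1, w)))

# Definition of differential entropy; the floor inside log avoids log(0)
# where the density underflows near x = 0.
def integrand(x):
    f = gig_pdf(x)
    return -f * np.log(np.maximum(f, 1e-300))

H_numeric, _ = quad(integrand, 0, np.inf)
```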
=== Characteristic Function ===
The characteristic function of a random variable <math>X \sim \operatorname{GIG}(a, b, p)</math> is given as (for a derivation of the characteristic function, see supplementary materials of <ref>{{cite journal |last1=Pal |first1=Subhadip |last2=Gaskins |first2=Jeremy |title=Modified Pólya-Gamma data augmentation for Bayesian analysis of directional data |journal=Journal of Statistical Computation and Simulation |date=23 May 2022 |volume=92 |issue=16 |pages=3430–3451 |doi=10.1080/00949655.2022.2067853 |s2cid=249022546 |url=https://www.tandfonline.com/doi/abs/10.1080/00949655.2022.2067853?journalCode=gscs20 |issn=0094-9655}}</ref>)

:<math>E\left(e^{itX}\right) = \left(\frac{a}{a - 2it}\right)^{\frac{p}{2}} \frac{K_p\left(\sqrt{(a - 2it)b}\right)}{K_p\left(\sqrt{ab}\right)}</math>

for <math>t \in \mathbb{R}</math>, where <math>i</math> denotes the imaginary number.
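This closed form can be sanity-checked against direct integration of <math>E(e^{itX})</math>; <code>scipy.special.kv</code> accepts complex arguments, and for real <math>t</math> the principal branches of the square root and the power suffice. A sketch with illustrative values:

```python
# Compare the closed-form characteristic function with numerical
# integration of E[exp(itX)] for a GIG(a, b, p) density.
import numpy as np
from scipy.integrate import quad
from scipy.special import kv

a, b, p = 2.0, 3.0, 1.5        # illustrative parameters

def gig_pdf(x):
    return ((a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
            * x ** (p - 1) * np.exp(-(a * x + b / x) / 2))

def cf_closed(t):
    # (a/(a-2it))^(p/2) * K_p(sqrt((a-2it) b)) / K_p(sqrt(ab))
    z = a - 2j * t
    return (a / z) ** (p / 2) * kv(p, np.sqrt(z * b)) / kv(p, np.sqrt(a * b))

def cf_numeric(t):
    # real and imaginary parts of E[exp(itX)] by quadrature
    re, _ = quad(lambda x: np.cos(t * x) * gig_pdf(x), 0, np.inf)
    im, _ = quad(lambda x: np.sin(t * x) * gig_pdf(x), 0, np.inf)
    return re + 1j * im

err = abs(cf_closed(0.7) - cf_numeric(0.7))
```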
==Related distributions==
===Special cases===
The inverse Gaussian and gamma distributions are special cases of the generalized inverse Gaussian distribution for ''p'' = −1/2 and ''b'' = 0, respectively. Specifically, an inverse Gaussian distribution of the form

:<math>f(x;\mu,\lambda) = \left[\frac{\lambda}{2 \pi x^3}\right]^{1/2} \exp\left(\frac{-\lambda (x-\mu)^2}{2 \mu^2 x}\right)</math>

is a GIG with <math>a = \lambda/\mu^2</math>, <math>b = \lambda</math>, and <math>p = -1/2</math>. A Gamma distribution of the form

:<math>g(x;\alpha,\beta) = \beta^\alpha \frac{1}{\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x}</math>

is a GIG with <math>a = 2\beta</math>, <math>b = 0</math>, and <math>p = \alpha</math>.
Other special cases include the inverse-gamma distribution, for a = 0.
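The inverse Gaussian case can be checked numerically. In <code>scipy.stats.invgauss</code>, <math>\operatorname{IG}(\mu, \lambda)</math> corresponds to shape <math>\mu/\lambda</math> with scale <math>\lambda</math>; that reparametrization, and the parameter values below, are assumptions of this sketch:

```python
# GIG(a = lambda/mu^2, b = lambda, p = -1/2) should reproduce the
# inverse Gaussian density IG(mu, lambda).
import numpy as np
from scipy.special import kv
from scipy.stats import invgauss

def gig_pdf(x, a, b, p):
    return ((a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
            * x ** (p - 1) * np.exp(-(a * x + b / x) / 2))

mu, lam = 1.3, 2.0             # illustrative IG parameters
x = np.linspace(0.05, 6.0, 60)
err = np.max(np.abs(gig_pdf(x, lam / mu**2, lam, -0.5)
                    - invgauss.pdf(x, mu / lam, scale=lam)))
```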
===Conjugate prior for Gaussian===
The GIG distribution is conjugate to the normal distribution when serving as the mixing distribution in a normal variance-mean mixture.<ref>{{cite journal |first=Dimitris |last=Karlis |title=An EM type algorithm for maximum likelihood estimation of the normal–inverse Gaussian distribution |journal=Statistics & Probability Letters |volume=57 |issue=1 |year=2002 |pages=43–52 |doi=10.1016/S0167-7152(02)00040-8 }}</ref><ref>{{cite journal |last=Barndorff-Nielsen |first=O. E. |year=1997 |title=Normal Inverse Gaussian Distributions and stochastic volatility modelling |journal=Scand. J. Statist. |volume=24 |issue=1 |pages=1–13 |doi=10.1111/1467-9469.00045 }}</ref> Let the prior distribution for some hidden variable, say <math>z</math>, be GIG:

:<math>P(z\mid a,b,p) = \operatorname{GIG}(z\mid a,b,p)</math>
and let there be <math>T</math> observed data points, <math>X = x_1, \ldots, x_T</math>, with normal likelihood function, conditioned on <math>z:</math>

:<math>P(X\mid z,\alpha,\beta) = \prod_{i=1}^T N(x_i\mid\alpha+\beta z,z)</math>
where <math>N(x\mid\mu,v)</math> is the normal distribution, with mean <math>\mu</math> and variance <math>v</math>. Then the posterior for <math>z</math>, given the data, is also GIG:

:<math>P(z\mid X,a,b,p,\alpha,\beta) = \operatorname{GIG}\left(z\mid a+T\beta^2,b+S,p-\frac T 2 \right)</math>
where <math>\textstyle S = \sum_{i=1}^T (x_i-\alpha)^2</math>.<ref group=note>Due to the conjugacy, these details can be derived without solving integrals, by noting that

:<math>P(z\mid X,a,b,p,\alpha,\beta) \propto P(X\mid z,\alpha,\beta)P(z\mid a,b,p).</math>

Omitting all factors independent of <math>z</math>, the right-hand side can be simplified to give an un-normalized GIG distribution, from which the posterior parameters can be identified.</ref>
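The update can be sketched as follows: simulate <math>T</math> observations, then check on a grid that likelihood × prior is proportional to the stated GIG posterior (all parameter values below are illustrative):

```python
# Verify the conjugate update GIG(a, b, p) -> GIG(a + T*beta^2, b + S, p - T/2)
# by checking that likelihood * prior has a constant ratio to the posterior.
import numpy as np
from scipy.special import kv
from scipy.stats import norm

def gig_pdf(x, a, b, p):
    return ((a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
            * x ** (p - 1) * np.exp(-(a * x + b / x) / 2))

rng = np.random.default_rng(0)
a, b, p = 2.0, 3.0, 1.5                     # prior parameters (illustrative)
alpha, beta, T, z_true = 0.5, 1.2, 20, 0.8
x = rng.normal(alpha + beta * z_true, np.sqrt(z_true), size=T)
S = np.sum((x - alpha) ** 2)

z = np.linspace(0.2, 3.0, 30)
# unnormalized posterior: prior times product of normal likelihoods
unnorm = gig_pdf(z, a, b, p) * np.prod(
    norm.pdf(x[:, None], loc=alpha + beta * z, scale=np.sqrt(z)), axis=0)
post = gig_pdf(z, a + T * beta**2, b + S, p - T / 2)
ratio = unnorm / post
spread = ratio.max() / ratio.min() - 1.0    # ~0 if exactly proportional
```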
===Sichel distribution===
The Sichel distribution results when the GIG is used as the mixing distribution for the Poisson parameter <math>\lambda</math>.<ref>{{cite journal |last=Sichel |first=Herbert S. |year=1975 |title=On a distribution law for word frequencies |journal=Journal of the American Statistical Association |volume=70 |issue=351a |pages=542–547 |doi=10.1080/01621459.1975.10482469 }}</ref><ref>{{cite journal |last=Stein |first=Gillian Z. |first2=Walter |last2=Zucchini |first3=June M. |last3=Juritz |year=1987 |title=Parameter estimation for the Sichel distribution and its multivariate extension |journal=Journal of the American Statistical Association |volume=82 |issue=399 |pages=938–944 |doi=10.1080/01621459.1987.10478520 }}</ref>
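The resulting mass function can be sketched as a one-dimensional integral of the Poisson pmf against a GIG density for the rate (the helper <code>sichel_pmf</code> and the parameter values are illustrative):

```python
# Sichel pmf as a Poisson-GIG mixture, computed by quadrature over the rate.
import numpy as np
from scipy.integrate import quad
from scipy.special import kv
from scipy.stats import poisson

a, b, p = 2.0, 3.0, 1.5        # illustrative mixing-GIG parameters

def gig_pdf(lam):
    return ((a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b)))
            * lam ** (p - 1) * np.exp(-(a * lam + b / lam) / 2))

def sichel_pmf(k):
    # P(K = k) = integral of Poisson(k | lam) * GIG(lam) over lam > 0
    val, _ = quad(lambda lam: poisson.pmf(k, lam) * gig_pdf(lam), 0, np.inf)
    return val

total = sum(sichel_pmf(k) for k in range(60))   # should be close to 1
```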
==Notes==
{{reflist|group=note}}
==References==
{{reflist|refs=
{{Citation | last1=Johnson | first1=Norman L. | last2=Kotz | first2=Samuel | last3=Balakrishnan | first3=N. | title=Continuous univariate distributions. Vol. 1 | publisher=John Wiley & Sons | location=New York | edition=2nd | series=Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics | isbn=978-0-471-58495-7 | mr= 1299979| year=1994 |pages=284–285}}
}}
==See also==
{{ProbDistributions|continuous-semi-infinite}}
{{DEFAULTSORT:Generalized Inverse Gaussian Distribution}}