Generalized inverse Gaussian distribution

{{Short description|Family of continuous probability distributions}}

{{Probability distribution|

name =Generalized inverse Gaussian|

type =density|

pdf_image =Image:GIG distribution pdf.svg|

cdf_image =|

parameters =a > 0, b > 0, p real|

support =x > 0|

pdf =f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} x^{(p-1)} e^{-(ax + b/x)/2}|

cdf =|

mean =\operatorname{E}[x]=\frac{\sqrt{b}\ K_{p+1}(\sqrt{a b}) }{ \sqrt{a}\ K_{p}(\sqrt{a b})}
\operatorname{E}[x^{-1}]=\frac{\sqrt{a}\ K_{p+1}(\sqrt{a b}) }{ \sqrt{b}\ K_{p}(\sqrt{a b})}-\frac{2p}{b}
\operatorname{E}[\ln x]=\ln \frac{\sqrt{b}}{\sqrt{a}}+\frac{\partial}{\partial p} \ln K_{p}(\sqrt{a b})|

median =|

mode =\frac{(p-1)+\sqrt{(p-1)^2+ab}}{a}|

variance =\left(\frac{b}{a}\right)\left[\frac{K_{p+2}(\sqrt{ab})}{K_p(\sqrt{ab})}-\left(\frac{K_{p+1}(\sqrt{ab})}{K_p(\sqrt{ab})}\right)^2\right]|

skewness =|

kurtosis =|

entropy =|

mgf =\left(\frac{a}{a-2t}\right)^{\frac{p}{2}}\frac{K_p(\sqrt{b(a-2t)})}{K_p(\sqrt{ab})}|

char =\left(\frac{a}{a-2it}\right)^{\frac{p}{2}}\frac{K_p(\sqrt{b(a-2it)})}{K_p(\sqrt{ab})}|

}}

In probability theory and statistics, the generalized inverse Gaussian distribution (GIG) is a three-parameter family of continuous probability distributions with probability density function

:f(x) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} x^{(p-1)} e^{-(ax + b/x)/2},\qquad x>0,

where K_p is a modified Bessel function of the second kind, a > 0, b > 0, and p is a real parameter. It is used extensively in geostatistics, statistical linguistics, finance, and other fields. This distribution was first proposed by Étienne Halphen.
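The density above can be evaluated directly with SciPy's modified Bessel function of the second kind. A minimal sketch (the function name gig_pdf is illustrative, not a library routine), with a quadrature check that the density integrates to one:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv  # modified Bessel function of the second kind, K_p

def gig_pdf(x, a, b, p):
    """GIG density f(x) = (a/b)^(p/2) / (2 K_p(sqrt(ab))) * x^(p-1) * exp(-(a x + b/x)/2),
    for x > 0 with a > 0, b > 0 and real p."""
    norm = (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

# Sanity check: the density should integrate to 1 over (0, inf).
total, _ = quad(lambda t: gig_pdf(t, a=2.0, b=3.0, p=0.5), 0, np.inf)
```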

{{Cite book

| last = Seshadri

| first = V.

| contribution = Halphen's laws

| editor-last = Kotz

| editor-first = S.

| editor2-last = Read

| editor2-first = C. B.

| editor3-last = Banks

| editor3-first = D. L.

| title = Encyclopedia of Statistical Sciences, Update Volume 1

| pages = 302–306

| publisher = Wiley

| place = New York

| year = 1997

}}

{{Cite journal | last1 = Perreault | first1 = L. | last2 = Bobée | first2 = B. | last3 = Rasmussen | first3 = P. F. | doi = 10.1061/(ASCE)1084-0699(1999)4:3(189) | title = Halphen Distribution System. I: Mathematical and Statistical Properties | journal = Journal of Hydrologic Engineering | volume = 4 | issue = 3 | pages = 189 | year = 1999 }}Étienne Halphen was the grandson of the mathematician Georges Henri Halphen.

It was rediscovered and popularised by Ole Barndorff-Nielsen, who called it the generalized inverse Gaussian distribution. Its statistical properties are discussed in Bent Jørgensen's lecture notes.

{{cite book

| last = Jørgensen

| first = Bent

| title = Statistical Properties of the Generalized Inverse Gaussian Distribution

| publisher = Springer-Verlag

| year = 1982

| location = New York–Berlin

| series = Lecture Notes in Statistics

| volume = 9

| isbn = 0-387-90665-7

|mr=0648107}}

Properties

= Alternative parametrization =

By setting \theta = \sqrt{ab} and \eta = \sqrt{b/a}, we can alternatively express the GIG distribution as

:f(x) = \frac{1}{2\eta K_p(\theta)} \left(\frac{x}{\eta}\right)^{p-1} e^{-\theta(x/\eta + \eta/x)/2},

where \theta is the concentration parameter while \eta is the scaling parameter.
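The two parametrizations describe the same density, as a quick numerical check confirms. In this sketch the conversion helper and pdf names are illustrative:

```python
import numpy as np
from scipy.special import kv  # modified Bessel function of the second kind

def ab_to_theta_eta(a, b):
    """Map (a, b) to the concentration/scale parametrization."""
    return np.sqrt(a * b), np.sqrt(b / a)

def gig_pdf_ab(x, a, b, p):
    return (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b))) \
        * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

def gig_pdf_theta_eta(x, theta, eta, p):
    return (x / eta) ** (p - 1) * np.exp(-theta * (x / eta + eta / x) / 2) \
        / (2 * eta * kv(p, theta))

a, b, p = 2.0, 3.0, 0.5
theta, eta = ab_to_theta_eta(a, b)
x = 1.7
diff = abs(gig_pdf_ab(x, a, b, p) - gig_pdf_theta_eta(x, theta, eta, p))
```

The identity follows because \theta(x/\eta + \eta/x) = ax + b/x and \eta^{-p} = (a/b)^{p/2}.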

= Summation =

Barndorff-Nielsen and Halgreen proved that the GIG distribution is infinitely divisible.{{cite journal |first=O. |last=Barndorff-Nielsen |first2=Christian |last2=Halgreen |title=Infinite Divisibility of the Hyperbolic and Generalized Inverse Gaussian Distributions |journal=Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete |year=1977 |volume=38 |issue= |pages=309–311 |doi=10.1007/BF00533162 }}

= Entropy =

The entropy of the generalized inverse Gaussian distribution is given as{{citation needed|date=February 2012}}

:

\begin{align}

H = \frac{1}{2} \log \left( \frac b a \right) & {} +\log \left(2 K_p\left(\sqrt{ab} \right)\right) - (p-1) \frac{\left[\frac{d}{d\nu}K_\nu\left(\sqrt{ab}\right)\right]_{\nu=p}}{K_p\left(\sqrt{a b}\right)} \\

& {} + \frac{\sqrt{a b}}{2 K_p\left(\sqrt{a b}\right)}\left( K_{p+1}\left(\sqrt{ab}\right) + K_{p-1}\left(\sqrt{a b}\right)\right)

\end{align}

where \left[\frac{d}{d\nu}K_\nu\left(\sqrt{a b}\right)\right]_{\nu=p} is the derivative of the modified Bessel function of the second kind with respect to the order \nu, evaluated at \nu=p.

= Characteristic function =

The characteristic function of a random variable X\sim GIG(p, a, b) is given by (for a derivation of the characteristic function, see the supplementary materials of {{cite journal |last1=Pal |first1=Subhadip |last2=Gaskins |first2=Jeremy |title=Modified Pólya-Gamma data augmentation for Bayesian analysis of directional data |journal=Journal of Statistical Computation and Simulation |date=23 May 2022 |volume=92 |issue=16 |pages=3430–3451 |doi=10.1080/00949655.2022.2067853 |s2cid=249022546 |url=https://www.tandfonline.com/doi/abs/10.1080/00949655.2022.2067853?journalCode=gscs20 |issn=0094-9655}})

: E(e^{itX}) = \left(\frac{a }{a-2it }\right)^{\frac{p}{2}} \frac{K_{p}\left( \sqrt{(a-2it)b} \right)}{ K_{p}\left( \sqrt{ab} \right) }

for t \in \mathbb{R}, where i denotes the imaginary unit.
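The closed form can be checked against a direct numerical Fourier integral of the density. In this sketch (function names are illustrative) SciPy's kv is evaluated at a complex argument, taking the principal branch of the square root:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import kv  # supports complex arguments

def gig_cf(t, a, b, p):
    """Closed-form characteristic function E[exp(itX)] of GIG(p, a, b)."""
    z = np.sqrt(b * (a - 2j * t))  # principal branch
    return (a / (a - 2j * t)) ** (p / 2) * kv(p, z) / kv(p, np.sqrt(a * b))

def gig_cf_quad(t, a, b, p):
    """Reference value: integrate exp(itx) f(x) dx, real and imaginary parts."""
    s = np.sqrt(a * b)
    c = (a / b) ** (p / 2) / (2 * kv(p, s))
    f = lambda x: c * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)
    re, _ = quad(lambda x: np.cos(t * x) * f(x), 0, np.inf)
    im, _ = quad(lambda x: np.sin(t * x) * f(x), 0, np.inf)
    return re + 1j * im

t, a, b, p = 0.7, 2.0, 3.0, 0.5
diff = abs(gig_cf(t, a, b, p) - gig_cf_quad(t, a, b, p))
```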

Related distributions

= Special cases =

The inverse Gaussian and gamma distributions are special cases of the generalized inverse Gaussian distribution for p = −1/2 and b = 0, respectively. Specifically, an inverse Gaussian distribution of the form

: f(x;\mu,\lambda) = \left[\frac{\lambda}{2 \pi x^3}\right]^{1/2} \exp{ \left( \frac{-\lambda (x-\mu)^2}{2 \mu^2 x} \right)}

is a GIG with a = \lambda/\mu^2, b = \lambda, and p=-1/2. A Gamma distribution of the form

:

g(x;\alpha,\beta) = \beta^\alpha \frac 1 {\Gamma(\alpha)} x^{\alpha-1} e^{-\beta x}

is a GIG with a = 2 \beta, b = 0, and p = \alpha.

Other special cases include the inverse-gamma distribution, which arises for a = 0 (with p < 0).
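The inverse Gaussian special case can be verified numerically by substituting a = \lambda/\mu^2, b = \lambda, p = -1/2 into the GIG density and comparing against the explicit inverse Gaussian formula above (function names in this sketch are illustrative; the gamma case b = 0 cannot be plugged in directly because K_p(0) diverges, so it is a limit rather than a substitution):

```python
import numpy as np
from scipy.special import kv  # modified Bessel function of the second kind

def gig_pdf(x, a, b, p):
    return (a / b) ** (p / 2) / (2 * kv(p, np.sqrt(a * b))) \
        * x ** (p - 1) * np.exp(-(a * x + b / x) / 2)

def inv_gauss_pdf(x, mu, lam):
    """Inverse Gaussian density in the (mu, lambda) form used in the text."""
    return np.sqrt(lam / (2 * np.pi * x ** 3)) \
        * np.exp(-lam * (x - mu) ** 2 / (2 * mu ** 2 * x))

mu, lam, x = 1.5, 2.0, 0.8
diff_ig = abs(gig_pdf(x, a=lam / mu ** 2, b=lam, p=-0.5)
              - inv_gauss_pdf(x, mu, lam))
```

The agreement is exact because K_{-1/2}(z) = K_{1/2}(z) = \sqrt{\pi/(2z)}\,e^{-z}.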

= Conjugate prior for Gaussian =

The GIG distribution is conjugate to the normal distribution when serving as the mixing distribution in a normal variance-mean mixture.{{cite journal |first=Dimitris |last=Karlis |title=An EM type algorithm for maximum likelihood estimation of the normal–inverse Gaussian distribution |journal=Statistics & Probability Letters |volume=57 |issue=1 |year=2002 |pages=43–52 |doi=10.1016/S0167-7152(02)00040-8 }}{{cite journal |last=Barndorff-Nielsen |first=O. E. |year=1997 |title=Normal Inverse Gaussian Distributions and stochastic volatility modelling |journal=Scand. J. Statist. |volume=24 |issue=1 |pages=1–13 |doi=10.1111/1467-9469.00045 }} Let the prior distribution for some hidden variable, say z, be GIG:

:

P(z\mid a,b,p) = \operatorname{GIG}(z\mid a,b,p)

and let there be T observed data points, X=x_1,\ldots,x_T, with normal likelihood function, conditioned on z:

:

P(X\mid z,\alpha,\beta) = \prod_{i=1}^T N(x_i\mid\alpha+\beta z,z)

where N(x\mid\mu,v) is the normal distribution, with mean \mu and variance v. Then the posterior for z, given the data is also GIG:

:

P(z\mid X,a,b,p,\alpha,\beta) = \text{GIG}\left(z\mid a+T\beta^2,b+S,p-\frac T 2 \right)

where \textstyle S = \sum_{i=1}^T (x_i-\alpha)^2.

Due to the conjugacy, these details can be derived without solving integrals, by noting that

:P(z\mid X,a,b,p,\alpha,\beta)\propto P(z\mid a,b,p)P(X\mid z,\alpha,\beta).

Omitting all factors independent of z, the right-hand-side can be simplified to give an un-normalized GIG distribution, from which the posterior parameters can be identified.
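Because the update only shifts the three GIG parameters, it is a few lines of arithmetic in code. A minimal sketch of the posterior update described above (the function name is illustrative):

```python
import numpy as np

def gig_posterior(x, a, b, p, alpha, beta):
    """Posterior GIG parameters for z given data x under the likelihood
    prod_i N(x_i | alpha + beta*z, z); no integration is needed."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    S = np.sum((x - alpha) ** 2)
    return a + T * beta ** 2, b + S, p - T / 2

a_post, b_post, p_post = gig_posterior([1.0, 2.0, 0.5],
                                       a=1.0, b=1.0, p=0.5,
                                       alpha=0.0, beta=1.0)
```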

= Sichel distribution =

The Sichel distribution results when the GIG is used as the mixing distribution for the Poisson parameter \lambda.{{cite journal |last=Sichel |first=Herbert S. |year=1975 |title=On a distribution law for word frequencies |journal=Journal of the American Statistical Association |volume=70 |issue=351a |pages=542–547 |doi=10.1080/01621459.1975.10482469 }}{{cite journal |last=Stein |first=Gillian Z. |first2=Walter |last2=Zucchini |first3=June M. |last3=Juritz |year=1987 |title=Parameter estimation for the Sichel distribution and its multivariate extension |journal=Journal of the American Statistical Association |volume=82 |issue=399 |pages=938–944 |doi=10.1080/01621459.1987.10478520 }}
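The mixture construction translates directly into a two-stage sampler: draw \lambda from the GIG, then a count from Poisson(\lambda). A sketch using SciPy's geninvgauss, whose shape arguments are (p, \theta) with \eta passed as the scale, so the (a, b) parameters used here are converted first (the function name is illustrative):

```python
import numpy as np
from scipy.special import kv
from scipy.stats import geninvgauss, poisson

rng = np.random.default_rng(0)

def sichel_sample(p, a, b, size, rng):
    """Sichel variates via the mixture: lambda ~ GIG(p, a, b), N ~ Poisson(lambda).
    SciPy's geninvgauss is parametrized by (p, theta) with scale eta."""
    theta, eta = np.sqrt(a * b), np.sqrt(b / a)
    lam = geninvgauss.rvs(p, theta, scale=eta, size=size, random_state=rng)
    return poisson.rvs(lam, random_state=rng)

draws = sichel_sample(p=0.5, a=2.0, b=3.0, size=20000, rng=rng)

# By the mixture structure E[N] = E[lambda], which the GIG mean formula gives
# as sqrt(b/a) * K_{p+1}(sqrt(ab)) / K_p(sqrt(ab)).
s = np.sqrt(2.0 * 3.0)
mean_lam = np.sqrt(3.0 / 2.0) * kv(1.5, s) / kv(0.5, s)
```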

Notes

{{reflist|group=note}}

References

{{reflist|refs=

{{Citation | last1=Johnson | first1=Norman L. | last2=Kotz | first2=Samuel | last3=Balakrishnan | first3=N. | title=Continuous univariate distributions. Vol. 1 | publisher=John Wiley & Sons | location=New York | edition=2nd | series=Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics | isbn=978-0-471-58495-7 | mr= 1299979| year=1994 |pages=284–285}}

}}

See also

{{ProbDistributions|continuous-semi-infinite}}

{{DEFAULTSORT:Generalized Inverse Gaussian Distribution}}

Category:Continuous distributions

Category:Exponential family distributions