q-Gaussian distribution

{{short description|Probability distribution}}

{{About|the Tsallis q-Gaussian|a different q-analog|Gaussian q-distribution}}

{{DISPLAYTITLE:q-Gaussian distribution}}

{{Probability distribution |

name =q-Gaussian|

type =density|

pdf_image =File:The PDF of QGaussian.svg|

parameters =q < 3 shape (real)
\beta > 0 (real) |

support =x \in (-\infty; +\infty)\! for 1\le q < 3
x \in \left[\pm {1 \over \sqrt{\beta(1-q)}}\right] for q < 1 |

pdf ={\sqrt{\beta} \over C_q} e_q({-\beta x^2}) |

cdf = see text |

mean =0\text{ for }q<2, otherwise undefined|

median =0|

mode =0|

variance = { 1 \over {\beta (5-3q)}} \text{ for } q < {5 \over 3}

\infty \text{ for } {5 \over 3} \le q < 2
\text{Undefined for }2 \le q <3|

skewness = 0 \text{ for } q < {3 \over 2} |

kurtosis = 6{q-1 \over 7-5q} \text{ for } q < {7 \over 5} |

entropy =|

mgf =|

cf =|

}}

The q-Gaussian is a probability distribution arising from the maximization of the Tsallis entropy under appropriate constraints. It is one example of a Tsallis distribution. The q-Gaussian is a generalization of the Gaussian in the same way that Tsallis entropy is a generalization of standard Boltzmann–Gibbs entropy or Shannon entropy.<ref>Tsallis, C. Nonadditive entropy and nonextensive statistical mechanics – an overview after 20 years. Braz. J. Phys. 2009, 39, 337–356.</ref> The normal distribution is recovered as q → 1.

The q-Gaussian has been applied to problems in the fields of statistical mechanics, geology, anatomy, astronomy, economics, finance, and machine learning.{{cn|date=October 2024}} The distribution is often favored for its heavy tails in comparison to the Gaussian for 1 < q < 3. For q < 1 the q-Gaussian distribution is the PDF of a bounded random variable, which makes it more suitable than the Gaussian distribution for modeling the effect of external stochasticity in biology and other domains.<ref>d'Onofrio A. (ed.) Bounded Noises in Physics, Biology, and Engineering. Birkhauser (2013).</ref> A generalized q-analog of the classical central limit theorem<ref>{{cite journal |last1=Umarov |first1=Sabir |author2=Tsallis, Constantino |author3=Steinberg, Stanly |year=2008 |title=On a q-Central Limit Theorem Consistent with Nonextensive Statistical Mechanics |journal=Milan J. Math. |volume=76 |pages=307–328 |publisher=Birkhauser Verlag |doi=10.1007/s00032-008-0087-y |s2cid=55967725 |url=http://www.cbpf.br/GrupPesq/StatisticalPhys/pdftheo/UmarovTsallisSteinberg2008.pdf |access-date=2011-07-27}}</ref> was proposed in 2008, in which the independence constraint for the i.i.d. variables is relaxed to an extent defined by the q parameter, with independence being recovered as q → 1. However, a proof of such a theorem is still lacking.<ref>{{Citation |last1=Hilhorst |first1=H.J. |year=2010 |title=Note on a q-modified central limit theorem |journal=Journal of Statistical Mechanics: Theory and Experiment |volume=2010 |issue=10 |pages=10023 |doi=10.1088/1742-5468/2010/10/P10023 |arxiv=1008.4259 |postscript=. |bibcode=2010JSMTE..10..023H |s2cid=119316670}}</ref>

In the heavy tail regions, the distribution is equivalent to the Student's t-distribution, with a direct mapping between q and the degrees of freedom. A practitioner using one of these distributions can therefore parameterize the same distribution in two different ways. The choice of the q-Gaussian form may arise if the system is non-extensive, or if there is no natural connection to small sample sizes.

Characterization

=Probability density function=

The standard q-Gaussian has the probability density function

: f(x) = {\sqrt{\beta} \over C_q} e_q(-\beta x^2)

where

:e_q(x) = [1+(1-q)x]_+^{1 \over 1-q}

is the q-exponential and the normalization factor C_q is given by

:C_q = {{2 \sqrt{\pi} \Gamma\left({1 \over 1-q}\right)} \over {(3-q) \sqrt{1-q} \Gamma\left({3-q \over 2(1-q)}\right)}} \text{ for } -\infty < q < 1

: C_q = \sqrt{\pi} \text{ for } q = 1 \,

:C_q = { {\sqrt{\pi} \Gamma\left({3-q \over 2(q-1)}\right)} \over {\sqrt{q-1} \Gamma\left({1 \over q-1}\right)}} \text{ for }1 < q < 3 .

Note that for q < 1 the q-Gaussian distribution is the PDF of a bounded random variable.
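For illustration, the density above can be evaluated numerically. The following sketch in Python (using NumPy and SciPy; the function names q_exponential, normalization_Cq and q_gaussian_pdf are chosen for this example and are not standard library routines) follows the definitions directly:

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import gamma

def q_exponential(x, q):
    """q-exponential e_q(x) = [1 + (1-q)x]_+^(1/(1-q)), reducing to exp(x) as q -> 1."""
    if q == 1:
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    return base ** (1.0 / (1.0 - q))

def normalization_Cq(q):
    """Normalization constant C_q for q < 3, using the three cases given above."""
    if q < 1:
        return (2.0 * np.sqrt(np.pi) * gamma(1.0 / (1.0 - q))
                / ((3.0 - q) * np.sqrt(1.0 - q) * gamma((3.0 - q) / (2.0 * (1.0 - q)))))
    if q == 1:
        return np.sqrt(np.pi)
    return (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
            / (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))))

def q_gaussian_pdf(x, q, beta):
    """Density f(x) = sqrt(beta)/C_q * e_q(-beta x^2), for q < 3 and beta > 0."""
    return np.sqrt(beta) / normalization_Cq(q) * q_exponential(-beta * np.asarray(x) ** 2, q)
</syntaxhighlight>

Integrating this density over its support returns 1 up to quadrature error, which provides a quick sanity check of the normalization constant.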

=Cumulative distribution function=

For 1 < q < 3, the cumulative distribution function is<ref>{{Cite web | title=TsallisQGaussianDistribution—Wolfram Language Documentation | url=https://reference.wolframcloud.com/language/ref/TsallisQGaussianDistribution.html | access-date=2025-02-15 | website=reference.wolframcloud.com}}</ref>

: F(x)= \frac{1}{2} + \frac{\sqrt{q-1}\, \Gamma\left({1 \over q-1}\right) x \sqrt{\beta} \, {}_2F_1\left (\tfrac{1}{2},\tfrac{1}{q-1};\tfrac{3}{2};-(q-1)\beta x^2 \right)}{\sqrt{\pi}\, \Gamma\left({3-q \over 2(q-1)}\right)} ,

where {}_2F_1(a,b;c;z) is the hypergeometric function. As the hypergeometric series converges only for {{math|{{!}}z{{!}} < 1}} while x is unbounded, the Pfaff transformation can be used to evaluate the function for all x.

For q < 1,

:F(x)=

\begin{cases}

0 & x < - \frac{1}{\sqrt{\beta(1-q)}}, \\

\frac{1}{2} + \frac{\sqrt{1-q}\, \Gamma\left({5-3q \over 2(1-q)}\right) x \sqrt{\beta} \, {}_2F_1\left (\tfrac{1}{2},\tfrac{1}{q-1};\tfrac{3}{2};-(q-1)\beta x^2 \right)}{\sqrt{\pi}\, \Gamma\left({2-q \over 1-q}\right)} & - \frac{1}{\sqrt{\beta(1-q)}} < x < \frac{1}{\sqrt{\beta(1-q)}}, \\

1 & x > \frac{1}{\sqrt{\beta(1-q)}}.

\end{cases}
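As a numerical illustration of the 1 < q < 3 case, the closed form above can be cross-checked against direct integration of the density. The following sketch (Python with SciPy; SciPy's hyp2f1 analytically continues the hypergeometric function to negative arguments, so the Pfaff transformation does not need to be applied explicitly, and the parameter values are arbitrary examples) assumes \beta > 0:

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import gamma, hyp2f1
from scipy.integrate import quad

def q_gaussian_cdf(x, q, beta):
    """Closed-form CDF for 1 < q < 3, following the hypergeometric expression above."""
    num = (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0)) * x * np.sqrt(beta)
           * hyp2f1(0.5, 1.0 / (q - 1.0), 1.5, -(q - 1.0) * beta * x * x))
    den = np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
    return 0.5 + num / den

# Cross-check against numerical integration of f(t) = sqrt(beta)/C_q * e_q(-beta t^2).
q, beta, x = 1.5, 0.7, 1.3
Cq = (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
      / (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))))
pdf = lambda t: np.sqrt(beta) / Cq * (1.0 + (q - 1.0) * beta * t * t) ** (-1.0 / (q - 1.0))
print(q_gaussian_cdf(x, q, beta), 0.5 + quad(pdf, 0.0, x)[0])  # the two values should agree
</syntaxhighlight>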

Entropy

Just as the normal distribution is the maximum information entropy distribution for fixed values of the first moment \operatorname{E}(X) and second moment \operatorname{E}(X^2) (with the fixed zeroth moment \operatorname{E}(X^0)=1 corresponding to the normalization condition), the q-Gaussian distribution is the maximum Tsallis entropy distribution for fixed values of these three moments.

Related distributions

=Student's ''t''-distribution=

While it can be justified by an interesting alternative form of entropy, statistically it is a scaled reparametrization of the Student's t-distribution introduced by W. Gosset in 1908 to describe small-sample statistics. In Gosset's original presentation the degrees of freedom parameter ν was constrained to be a positive integer related to the sample size, but it is readily observed that Gosset's density function is valid for all real values of ν.{{citation needed|date=February 2012}} The scaled reparametrization introduces the alternative parameters q and β which are related to ν.

Given a Student's t-distribution with ν degrees of freedom, the equivalent q-Gaussian has

:q = \frac{\nu+3}{\nu+1}\text{ with }\beta = \frac{1}{3-q}

with inverse

:\nu = \frac{3-q}{q-1},\text{ but only if }\beta = \frac{1}{3-q}.

Whenever \beta \ne {1 \over {3-q}}, the function is simply a scaled version of Student's t-distribution.
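This correspondence can be verified numerically. The sketch below (Python with NumPy and SciPy; the choice q = 1.4, which corresponds to \nu = 4, is only an example) evaluates both densities on a grid and confirms that they coincide when \beta = 1/(3-q):

<syntaxhighlight lang="python">
import numpy as np
from scipy.special import gamma
from scipy.stats import t as student_t

q = 1.4
nu = (3.0 - q) / (q - 1.0)   # equivalent degrees of freedom; here nu = 4
beta = 1.0 / (3.0 - q)       # the scale at which the two densities coincide exactly

# q-Gaussian density for 1 < q < 3 at this particular beta.
Cq = (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
      / (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))))
xs = np.linspace(-5.0, 5.0, 11)
q_gauss = np.sqrt(beta) / Cq * (1.0 + (q - 1.0) * beta * xs**2) ** (-1.0 / (q - 1.0))

print(np.allclose(q_gauss, student_t.pdf(xs, df=nu)))  # expected output: True
</syntaxhighlight>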

It is sometimes argued that the distribution is a generalization of Student's t-distribution to negative and/or non-integer degrees of freedom. However, the theory of Student's t-distribution extends trivially to all real degrees of freedom, with the support of the distribution becoming compact rather than infinite when ν < 0.{{citation needed|date=February 2012}}

=Three-parameter version=

As with many distributions centered on zero, the q-Gaussian can be trivially extended to include a location parameter μ. The density then becomes defined by

:{\sqrt{\beta} \over C_q} e_q({-\beta (x-\mu)^2}) .

Generating random deviates

The Box–Muller transform has been generalized to allow random sampling from q-Gaussians.<ref>W. Thistleton, J.A. Marsh, K. Nelson and C. Tsallis, Generalized Box–Muller method for generating q-Gaussian random deviates, IEEE Transactions on Information Theory 53, 4805 (2007).</ref> The standard Box–Muller technique generates pairs of independent normally distributed variables from equations of the following form.

:Z_1 = \sqrt{-2 \ln(U_1)} \cos(2 \pi U_2)

:Z_2 = \sqrt{-2 \ln(U_1)} \sin(2 \pi U_2)

The generalized Box–Muller technique can generate pairs of q-Gaussian deviates that are not independent. In practice, only a single deviate will be generated from a pair of uniformly distributed variables. The following formula generates deviates from a q-Gaussian with specified parameter q and \beta = {1 \over {3-q}}:

:Z = \sqrt{-2 \ln_{q'}(U_1)} \cos(2 \pi U_2)

where \ln_q is the q-logarithm and q' = { {1+q} \over {3-q}}.

These deviates can be transformed to generate deviates from an arbitrary q-Gaussian by

: Z' = \mu + {Z \over \sqrt{\beta (3-q)}}
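A minimal sampler along these lines, written as a sketch in Python/NumPy (the function names q_log and q_gaussian_deviates and the chosen parameter values are illustrative and not part of the cited reference implementation), is:

<syntaxhighlight lang="python">
import numpy as np

def q_log(x, q):
    """q-logarithm ln_q(x) = (x^(1-q) - 1)/(1-q), reducing to ln(x) as q -> 1."""
    if q == 1:
        return np.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_gaussian_deviates(q, beta, mu=0.0, size=1, rng=None):
    """Draw q-Gaussian deviates (q < 3) via the generalized Box-Muller method."""
    rng = np.random.default_rng() if rng is None else rng
    q_prime = (1.0 + q) / (3.0 - q)
    u1 = rng.random(size)
    u2 = rng.random(size)
    # Standard deviate with beta = 1/(3-q), keeping only one member of each pair ...
    z = np.sqrt(-2.0 * q_log(u1, q_prime)) * np.cos(2.0 * np.pi * u2)
    # ... then rescaled and shifted to the requested beta and location mu.
    return mu + z / np.sqrt(beta * (3.0 - q))

# Example: for q = 1.2 and beta = 1 the sample variance should be close to
# 1/(beta (5 - 3q)) = 1/1.4, since the variance is finite for q < 5/3.
samples = q_gaussian_deviates(q=1.2, beta=1.0, size=100_000)
print(samples.var())
</syntaxhighlight>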

Applications

= Physics =

It has been shown that the momentum distribution of cold atoms in dissipative optical lattices is a q-Gaussian.<ref>{{Cite journal | last1 = Douglas | first1 = P. | last2 = Bergamini | first2 = S. | last3 = Renzoni | first3 = F. | title = Tunable Tsallis Distributions in Dissipative Optical Lattices | doi = 10.1103/PhysRevLett.96.110601 | journal = Physical Review Letters | volume = 96 | issue = 11 | year = 2006 | pmid = 16605807 | bibcode = 2006PhRvL..96k0601D | page=110601 | url = http://discovery.ucl.ac.uk/142750/1/142750.pdf }}</ref>

The q-Gaussian distribution is also obtained as the asymptotic probability density function of the position of the unidimensional motion of a mass subject to two forces: a deterministic force of the type F_1(x) = - 2 x/(1-x^2) (determining an infinite potential well) and a stochastic white noise force F_2(t)= \sqrt{2(1-q)} \xi(t), where \xi(t) is a white noise. Note that in the overdamped/small-mass approximation the above-mentioned convergence fails for q < 0, as recently shown.<ref>{{cite journal | last1=Domingo | first1=Dario | last2=d'Onofrio | first2=Alberto | last3=Flandoli | first3=Franco | title=Boundedness vs unboundedness of a noise linked to Tsallis q-statistics: The role of the overdamped approximation | journal=Journal of Mathematical Physics | publisher=AIP Publishing | volume=58 | issue=3 | year=2017 | issn=0022-2488 | doi=10.1063/1.4977081 | page=033301 | arxiv=1709.08260 | bibcode=2017JMP....58c3301D | s2cid=84178785 | url=https://zenodo.org/record/889716 }}</ref>

=Finance=

Financial return distributions in the New York Stock Exchange, NASDAQ and elsewhere have been interpreted as q-Gaussians.<ref>{{cite journal | last=Borland | first=Lisa | title=Option Pricing Formulas Based on a Non-Gaussian Stock Price Model | journal=Physical Review Letters | publisher=American Physical Society (APS) | volume=89 | issue=9 | date=2002-08-07 | issn=0031-9007 | doi=10.1103/physrevlett.89.098701 | page=098701 | pmid=12190447 | arxiv=cond-mat/0204331 | bibcode=2002PhRvL..89i8701B | s2cid=5740827 }}</ref><ref>L. Borland, The pricing of stock options, in Nonextensive Entropy – Interdisciplinary Applications, eds. M. Gell-Mann and C. Tsallis (Oxford University Press, New York, 2004).</ref>


Notes

{{reflist}}

Further reading

  • Juniper, J. (2007) {{cite web |url=http://e1.newcastle.edu.au/coffee/pubs/wp/2007/07-10.pdf |title=The Tsallis Distribution and Generalised Entropy: Prospects for Future Research into Decision-Making under Uncertainty |url-status=dead |access-date=2011-06-24 |archive-date=2011-07-06 |archive-url=https://web.archive.org/web/20110706114736/http://e1.newcastle.edu.au/coffee/pubs/wp/2007/07-10.pdf }}, Centre of Full Employment and Equity, The University of Newcastle, Australia