mixed Poisson distribution
{{Infobox probability distribution
| name = mixed Poisson distribution
| type = mass
| pdf_image =
| pdf_caption =
| cdf_image =
| cdf_caption =
| notation =
| parameters =
| support =
| pdf =
| cdf =
| mean =
| median =
| mode =
| variance =
| skewness =
| kurtosis =
| entropy =
| pgf =
| mgf = <math>M_\pi(e^s - 1)</math>, with <math>M_\pi</math> the MGF of {{pi}}
| char =
| fisher =
}}
A mixed Poisson distribution is a univariate discrete probability distribution in stochastics. It results from assuming that the conditional distribution of a random variable, given the value of the rate parameter, is a Poisson distribution, while the rate parameter itself is treated as a random variable. Hence it is a special case of a compound probability distribution. Mixed Poisson distributions can be found in actuarial mathematics as a general approach for the distribution of the number of claims and are also examined as an epidemiological model.{{Citation |last1=Willmot |first1=Gordon E. |title=Mixed Poisson distributions |date=2001 |url=http://link.springer.com/10.1007/978-1-4613-0111-0_3 |work=Lundberg Approximations for Compound Distributions with Insurance Applications |volume=156 |pages=37–49 |place=New York, NY |publisher=Springer New York |doi=10.1007/978-1-4613-0111-0_3 |isbn=978-0-387-95135-5 |access-date=2022-07-08 |last2=Lin |first2=X. Sheldon|series=Lecture Notes in Statistics }} It should not be confused with the compound Poisson distribution or the compound Poisson process.{{Cite journal |last=Willmot |first=Gord |date=1986 |title=Mixed Compound Poisson Distributions |journal=ASTIN Bulletin |language=en |volume=16 |issue=S1 |pages=S59–S79 |doi=10.1017/S051503610001165X |issn=0515-0361|doi-access=free }}
Definition
A random variable X satisfies the mixed Poisson distribution with density {{pi}}(λ) if it has the probability distribution{{Cite journal |last=Willmot |first=Gord |date=2014-08-29 |title=Mixed Compound Poisson Distributions |journal=Astin Bulletin |volume=16 |pages=5–7 |doi=10.1017/S051503610001165X|s2cid=17737506 |doi-access=free }}
:<math>\operatorname{P}(X=k) = \int_0^\infty \frac{\lambda^k}{k!} e^{-\lambda} \,\pi(\lambda)\, d\lambda.</math>
If we denote the probabilities of the Poisson distribution by <math>q_\lambda(k)</math>, then
:<math>\operatorname{P}(X=k) = \int_0^\infty q_\lambda(k) \,\pi(\lambda)\, d\lambda.</math>
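The defining integral can be evaluated numerically. The following minimal sketch is an illustration only (the gamma mixing density, the parameter values and the use of SciPy are assumptions, not part of the definition):
<syntaxhighlight lang="python">
# Sketch: evaluate P(X = k) = int_0^inf lambda^k/k! * e^(-lambda) * pi(lambda) dlambda
# for an assumed gamma mixing density pi(lambda); parameters are illustrative only.
import numpy as np
from scipy import integrate, stats

r, rate = 3.0, 2.0                                   # shape and rate of the gamma mixing density
mixing_pdf = stats.gamma(a=r, scale=1.0 / rate).pdf  # pi(lambda)

def mixed_poisson_pmf(k):
    """Numerically integrate the defining formula for P(X = k)."""
    integrand = lambda lam: stats.poisson.pmf(k, lam) * mixing_pdf(lam)
    value, _ = integrate.quad(integrand, 0, np.inf)
    return value

pmf = np.array([mixed_poisson_pmf(k) for k in range(60)])
print(pmf[:5])        # P(X = 0), ..., P(X = 4)
print(pmf.sum())      # close to 1
</syntaxhighlight>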
Properties
- The variance is always larger than the expected value (unless the mixing density is degenerate, in which case the distribution reduces to a plain Poisson distribution). This property is called overdispersion and contrasts with the Poisson distribution, where mean and variance are equal.
- In practice, the densities used for {{pi}}(λ) are almost exclusively those of gamma, log-normal and inverse Gaussian distributions. If the density of the gamma distribution is chosen, the result is the negative binomial distribution, which is why it is also called the Poisson gamma distribution.
In the following let <math>\mu_\pi = \int_0^\infty \lambda \,\pi(\lambda)\, d\lambda</math> be the expected value of the density <math>\pi(\lambda)</math> and <math>\sigma_\pi^2 = \int_0^\infty (\lambda - \mu_\pi)^2 \,\pi(\lambda)\, d\lambda</math> be the variance of the density.
= Expected value =
The expected value of the mixed Poisson distribution is
:<math>\operatorname{E}(X) = \mu_\pi.</math>
= Variance =
For the variance one gets
:<math>\operatorname{Var}(X) = \mu_\pi + \sigma_\pi^2.</math>
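As an illustrative check of the mean and variance formulas (a sketch assuming a gamma mixing density and arbitrary parameter values, not part of the original text), one can simulate the two-stage mechanism directly:
<syntaxhighlight lang="python">
# Sketch: Monte Carlo check of E(X) = mu_pi and Var(X) = mu_pi + sigma_pi^2
# for an assumed gamma mixing density; parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
r, rate = 3.0, 2.0                          # gamma mixing density: shape r, rate `rate`
mu_pi, var_pi = r / rate, r / rate**2       # mean and variance of the mixing density

lam = rng.gamma(shape=r, scale=1.0 / rate, size=1_000_000)  # lambda ~ pi
x = rng.poisson(lam)                                        # X | lambda ~ Poisson(lambda)

print(x.mean(), mu_pi)              # both approximately 1.5
print(x.var(), mu_pi + var_pi)      # both approximately 2.25 (> mean: overdispersion)
</syntaxhighlight>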
= Skewness =
The skewness can be represented as
:<math>v(X) = \operatorname{Var}(X)^{-\frac{3}{2}} \left[ \int_0^\infty (\lambda - \mu_\pi)^3 \,\pi(\lambda)\, d\lambda + 3\sigma_\pi^2 + \mu_\pi \right].</math>
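A quick numerical cross-check of this representation (an illustrative sketch; the gamma mixing density and parameter values are assumptions) compares it with the skewness of the equivalent negative binomial distribution as reported by SciPy:
<syntaxhighlight lang="python">
# Sketch: compare the skewness representation with SciPy's value for the negative
# binomial distribution arising from a gamma mixing density with rate p/(1-p);
# parameter values are illustrative only.
from scipy import stats

r, p = 3.0, 2.0 / 3.0                       # X ~ NegBin(r, p)
b = p / (1.0 - p)                           # rate of the gamma mixing density
mu_pi, var_pi = r / b, r / b**2             # mean and variance of the mixing density
third_central = 2.0 * r / b**3              # third central moment of the gamma density

skew_formula = (third_central + 3.0 * var_pi + mu_pi) / (mu_pi + var_pi) ** 1.5
skew_nbinom = stats.nbinom.stats(r, p, moments='s')
print(skew_formula, float(skew_nbinom))     # both approximately 1.3333
</syntaxhighlight>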
= Characteristic function =
The characteristic function has the form
:<math>\varphi_X(s) = M_\pi(e^{is} - 1),</math>
where <math>M_\pi</math> is the moment generating function of the density.
= Probability generating function =
For the probability generating function, one obtains
:<math>m_X(s) = M_\pi(s - 1).</math>
= Moment-generating function =
The moment-generating function of the mixed Poisson distribution is
:<math>M_X(s) = M_\pi(e^s - 1).</math>
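The relation between the generating functions can also be checked numerically. The sketch below is illustrative only and assumes a gamma mixing density with shape <math>r</math> and rate <math>b</math>, whose MGF is <math>M_\pi(t) = (1 - t/b)^{-r}</math>; it compares a Monte Carlo estimate of <math>\operatorname{E}(e^{sX})</math> with <math>M_\pi(e^s - 1)</math>:
<syntaxhighlight lang="python">
# Sketch: Monte Carlo check that M_X(s) = M_pi(e^s - 1) for an assumed gamma mixing
# density with shape r and rate b; parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
r, b, s = 3.0, 2.0, 0.5                     # e^s - 1 must stay below b for M_pi to exist
lam = rng.gamma(shape=r, scale=1.0 / b, size=1_000_000)
x = rng.poisson(lam)

mgf_mc = np.exp(s * x).mean()                         # Monte Carlo estimate of E(e^{sX})
mgf_formula = (1.0 - (np.exp(s) - 1.0) / b) ** (-r)   # M_pi(e^s - 1) for the gamma density
print(mgf_mc, mgf_formula)                            # approximately equal
</syntaxhighlight>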
Examples
{{Math theorem|Compounding a Poisson distribution with rate parameter distributed according to a gamma distribution yields a negative binomial distribution.}}
{{Math proof|Let <math>\pi(\lambda) = \frac{(\frac{p}{1-p})^r}{\Gamma(r)} \lambda^{r-1} e^{-\frac{p}{1-p}\lambda}</math> be the density of a <math>\Gamma\left(r, \frac{p}{1-p}\right)</math> distributed random variable. Then
:<math>\begin{align}
\operatorname{P}(X=k)&= \frac{1}{k!} \int_0^\infty \lambda^k e^{-\lambda} \frac{(\frac{p}{1-p})^r}{\Gamma(r)} \lambda^{r-1} e^{-\frac{p}{1-p}\lambda} \, d \lambda \\
& = \frac{p^r(1-p)^{-r}}{\Gamma(r) k!} \int_0^\infty \lambda^{k+r-1} e^{-\lambda \frac{1}{1-p}} \, d \lambda \\
& = \frac{p^r(1-p)^{-r}}{\Gamma(r) k!} (1-p)^{k+r} \underbrace{\int_0^\infty \lambda^{k+r-1} e^{-\lambda} \, d \lambda}_{= \Gamma(r+k)} \\
& = \frac{\Gamma(r+k)}{\Gamma(r) k!} (1-p)^k p^r
\end{align}</math>
Therefore we get <math>X \sim \operatorname{NegBin}(r, p)</math>.}}
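This identity can also be checked numerically; the following sketch (illustrative parameter values, not part of the proof) compares the mixed pmf obtained by integration with SciPy's negative binomial pmf:
<syntaxhighlight lang="python">
# Sketch: numerically verify that the gamma-mixed Poisson pmf equals the NegBin(r, p) pmf;
# parameter values are illustrative only.
import numpy as np
from scipy import integrate, stats

r, p = 3.0, 0.4
rate = p / (1.0 - p)                                   # rate of the gamma mixing density
mixing_pdf = stats.gamma(a=r, scale=1.0 / rate).pdf

for k in range(5):
    integrand = lambda lam, k=k: stats.poisson.pmf(k, lam) * mixing_pdf(lam)
    mixed, _ = integrate.quad(integrand, 0, np.inf)
    print(k, mixed, stats.nbinom.pmf(k, r, p))         # last two columns agree
</syntaxhighlight>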
{{Math theorem|Compounding a Poisson distribution with rate parameter distributed according to an exponential distribution yields a geometric distribution.}}
{{Math proof|Let <math>\pi(\lambda) = \frac{1}{\beta} e^{-\frac{\lambda}{\beta}}</math> be the density of an <math>\operatorname{Exp}\left(\frac{1}{\beta}\right)</math> distributed random variable. Using integration by parts {{mvar|k}} times yields:
:<math>\begin{align}
\operatorname{P}(X=k)&=\frac{1}{k!}\int_0^\infty \lambda^k e^{-\lambda} \frac1\beta e^{-\frac \lambda\beta} \, d\lambda\\
&=\frac{1}{k!\beta}\int_0^\infty \lambda^k e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\, d \lambda\\
&=\frac{1}{k!\beta}\cdot k!\left(\frac{\beta}{1+\beta}\right)^k\int_0^\infty e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\, d \lambda\\
&=\left(\frac{\beta}{1+\beta}\right)^k\left(\frac{1}{1+\beta}\right)
\end{align}</math>
Therefore we get <math>X \sim \operatorname{Geo}\left(\frac{1}{1+\beta}\right)</math>.}}
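Analogously, this result can be checked numerically (an illustrative sketch; the value of <math>\beta</math> is an assumption):
<syntaxhighlight lang="python">
# Sketch: numerically verify that the exponential-mixed Poisson pmf equals the
# geometric pmf (beta/(1+beta))^k / (1+beta) on k = 0, 1, 2, ...; beta is illustrative.
import numpy as np
from math import factorial
from scipy import integrate

beta = 2.0
exp_pdf = lambda lam: np.exp(-lam / beta) / beta        # density of Exp with mean beta

for k in range(5):
    integrand = lambda lam, k=k: lam**k * np.exp(-lam) / factorial(k) * exp_pdf(lam)
    mixed, _ = integrate.quad(integrand, 0, np.inf)
    geometric = (beta / (1.0 + beta)) ** k / (1.0 + beta)
    print(k, mixed, geometric)                          # last two columns agree
</syntaxhighlight>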
Table of mixed Poisson distributions
class="wikitable"
!mixing distribution |
Dirac
|Poisson |
gamma, Erlang |
exponential |
inverse Gaussian |
Poisson
|Neyman |
generalized inverse Gaussian
|Poisson-generalized inverse Gaussian |
generalized gamma
|Poisson-generalized gamma |
generalized Pareto
|Poisson-generalized Pareto |
inverse-gamma
|Poisson-inverse gamma |
log-normal
|Poisson-log-normal |
Lomax
|Poisson–Lomax |
Pareto
|Poisson–Pareto |
Pearson’s family of distributions
|Poisson–Pearson family |
truncated normal
|Poisson-truncated normal |
uniform
|Poisson-uniform |
shifted gamma |
beta with specific parameter values
|Yule |
References
{{reflist}}
Further reading
- {{cite book |first=Jan |last=Grandell |title=Mixed Poisson Processes |publisher=Chapman & Hall |location=London |year=1997 |isbn=0-412-78700-8 }}
- {{cite book |first=Tom |last=Britton |title=Stochastic Epidemic Models with Inference |publisher=Springer |year=2019 |isbn= |doi=10.1007/978-3-030-30900-8 }}
{{Probability distributions}}