Information source (mathematics)

In mathematics, an information source is a sequence of random variables taking values in a finite alphabet Γ and having a stationary distribution.
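
A minimal sketch of one such source, assuming a hypothetical two-state Markov chain over the alphabet {0, 1} (the transition matrix, stationary distribution, and the helper name sample_source below are illustrative assumptions, not taken from the article or its reference). Starting the chain from its stationary distribution makes the resulting sequence X_0, X_1, … stationary.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state source over the alphabet {0, 1}: a Markov chain with
# transition matrix P, started from its stationary distribution pi so that the
# resulting sequence of random variables is stationary.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi = np.array([0.8, 0.2])          # satisfies pi @ P == pi for this P

def sample_source(n):
    """Draw X_0, ..., X_{n-1} from the stationary Markov source."""
    x = np.empty(n, dtype=int)
    x[0] = rng.choice(2, p=pi)
    for k in range(1, n):
        x[k] = rng.choice(2, p=P[x[k - 1]])
    return x

print(sample_source(20))
</syntaxhighlight>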

The uncertainty, or entropy rate, of an information source is defined as

:H\{\mathbf{X}\} = \lim_{n\to\infty} H(X_n | X_0, X_1, \dots, X_{n-1})

where

: X_0, X_1, \dots, X_n

is the sequence of random variables defining the information source, and

:H(X_n | X_0, X_1, \dots, X_{n-1})

is the conditional entropy of X_n given the preceding variables in the sequence. Equivalently, one has

:H\{\mathbf{X}\} = \lim_{n\to\infty} \frac{H(X_0, X_1, \dots, X_{n-1}, X_n)}{n+1}.
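
A hedged numerical check of this equivalent form, reusing the same hypothetical two-state stationary Markov chain as above (its transition matrix and stationary distribution are illustrative assumptions). The block entropies H(X_0, …, X_n)/(n+1) are computed from exact word probabilities and compared with the standard closed-form entropy rate of a stationary Markov chain, Σ_i π_i H(P_i,·); this is a sketch, not a method given in the cited reference.

<syntaxhighlight lang="python">
import numpy as np

# Same hypothetical two-state stationary Markov source as in the sketch above.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
pi = np.array([0.8, 0.2])          # stationary: pi @ P == pi

def entropy(p):
    """Shannon entropy in bits of a probability vector, skipping zero entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def block_entropy_rate(n):
    """H(X_0, ..., X_n) / (n + 1) computed from exact word probabilities."""
    total = 0.0
    for word in np.ndindex(*([2] * (n + 1))):
        p = pi[word[0]]
        for a, b in zip(word, word[1:]):
            p *= P[a, b]
        if p > 0:
            total -= p * np.log2(p)
    return total / (n + 1)

# Closed-form entropy rate of a stationary Markov chain: sum_i pi_i * H(P[i, :]).
closed_form = sum(pi[i] * entropy(P[i]) for i in range(2))

for n in (1, 2, 5, 10):
    print(f"n = {n:2d}:  H(X_0..X_n)/(n+1) = {block_entropy_rate(n):.4f} bits")
print(f"closed-form entropy rate        = {closed_form:.4f} bits")
</syntaxhighlight>

As n grows, the block-entropy estimates decrease toward the closed-form value, illustrating the limit in the definition.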

References

  • Robert B. Ash, Information Theory, (1965) Dover Publications. {{ISBN|0-486-66521-6}}

zh-yue:資訊源

Category:Information theory

Category:Stochastic processes

{{statistics-stub}}