Coupling from the past

{{Short description|Method of sampling from a Markov chain}}

Among Markov chain Monte Carlo (MCMC) algorithms, '''coupling from the past''' is a method for sampling from the stationary distribution of a Markov chain. Unlike many MCMC algorithms, coupling from the past in principle yields a perfect sample from the stationary distribution. It was invented by James Propp and David Wilson in 1996.

== The basic idea ==

Consider a finite state irreducible aperiodic Markov chain M with state space S and (unique) stationary distribution \pi (\pi is a probability vector). Suppose that we come up with a probability distribution \mu on the set of maps f:S\to S with the property that for every fixed s\in S, its image f(s) is distributed according to the transition probability of M from state s. An example of such a probability distribution is the one where f(s) is independent of f(s') whenever s\ne s', but it is often worthwhile to consider other distributions. Now let f_j for j\in\mathbb Z be independent samples from \mu.
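
As a minimal sketch of the independent-coordinates choice of \mu mentioned above, the following Python fragment draws one random map f:S\to S for a small hypothetical chain; the transition matrix P and the name sample_map are illustrative and not taken from the literature.
<syntaxhighlight lang="python">
import random

# Hypothetical 3-state chain: P[s] is the transition distribution out of state s.
P = [
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
]

def sample_map(rng):
    """Draw one map f: S -> S from mu, choosing f(s) ~ P(s, .) independently for each s."""
    return [rng.choices(range(len(P)), weights=P[s])[0] for s in range(len(P))]

rng = random.Random(0)
f = sample_map(rng)   # f is a list; f[s] is the image of state s under this random map
</syntaxhighlight>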

Suppose that x is chosen randomly according to \pi and is independent of the sequence f_j. (We do not worry for now about where this x comes from.) Then f_{-1}(x) is also distributed according to \pi, because \pi is M-stationary and, by our assumption on \mu, f_{-1}(x) is distributed as one step of M started from x. Define

:F_j:= f_{-1}\circ f_{-2}\circ\cdots\circ f_{-j}.

Then it follows by induction that F_j(x) is also distributed according to \pi for every j\in\mathbb{N}. However, it may happen that for some n\in\mathbb{N} the image of the map F_n is a single element of S.

In other words, F_n(x)=F_n(y) for every y\in S, so the value of F_n(x) does not depend on x. Therefore, we do not need to have access to x in order to compute F_n(x). The algorithm then involves finding some n\in \mathbb N such that F_n(S) is a singleton, and outputting the element of that singleton. Designing a good distribution \mu for which finding such an n and computing F_n is not too costly is not always obvious, but this has been accomplished successfully in several important instances.{{Cite web|url=http://www.dbwilson.com/exact/|title=Web Site for Perfectly Random Sampling with Markov Chains}}
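
The following Python sketch puts the basic idea together for a small finite chain, again assuming the hypothetical independent-coordinates distribution \mu from the sketch above; here n is simply increased one step at a time until the image of F_n is a singleton, and previously sampled maps are kept rather than resampled (the names P, sample_map and cftp are illustrative, not taken from the source).
<syntaxhighlight lang="python">
import random

P = [                      # same hypothetical 3-state chain as above
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
]
STATES = list(range(len(P)))

def sample_map(rng):
    """One map f: S -> S drawn from mu, with independent coordinates."""
    return [rng.choices(STATES, weights=P[s])[0] for s in STATES]

def cftp(rng):
    """Return one exact sample from the stationary distribution of P."""
    maps = []                                   # maps[j] holds f_{-(j+1)}; never resampled
    while True:
        maps.append(sample_map(rng))            # reach one step further into the past
        values = list(STATES)                   # track F_n(s) for every starting state s
        for f in reversed(maps):                # apply f_{-n} first, f_{-1} last
            values = [f[v] for v in values]
        if len(set(values)) == 1:               # F_n(S) is a singleton
            return values[0]

rng = random.Random(0)
print([cftp(rng) for _ in range(5)])            # five independent exact samples
</syntaxhighlight>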

== The monotone case ==

There is a special class of Markov chains in which there are particularly good choices for \mu and a tool for determining whether |F_n(S)|=1. (Here |\cdot| denotes cardinality.) Suppose that S is a partially ordered set with order \le, which has a unique minimal element s_0 and a unique maximal element s_1; that is, every s\in S satisfies s_0\le s\le s_1. Also, suppose that \mu may be chosen to be supported on the set of monotone maps f:S\to S. Then each F_n is monotone, so F_n(s_0)\le F_n(s)\le F_n(s_1) for every s\in S, and therefore |F_n(S)|=1 if and only if F_n(s_0)=F_n(s_1). Thus, checking this condition becomes rather easy. The algorithm can proceed by choosing n:=n_0 for some constant n_0, sampling the maps f_{-1},\dots,f_{-n}, and outputting F_n(s_0) if F_n(s_0)=F_n(s_1). If F_n(s_0)\ne F_n(s_1), the algorithm proceeds by doubling n and repeating as necessary until an output is obtained. (The algorithm does not resample the maps f_{-j} which were already sampled; it reuses the previously sampled maps when needed.)
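
As an illustration of the monotone case, the following Python sketch runs the doubling scheme for a lazy random walk on \{0,1,\dots,N\}, where a single uniform random number drives each map; this makes each map monotone, so only the two chains started from s_0=0 and s_1=N need to be tracked. The chain, N, update and monotone_cftp are illustrative choices, not taken from the source; for this particular walk the stationary distribution is uniform on \{0,\dots,N\}, which makes the output easy to sanity-check.
<syntaxhighlight lang="python">
import random

N = 10                                  # state space {0, 1, ..., N}; 0 is minimal, N is maximal

def update(s, u):
    """Monotone map driven by a single uniform u: one step of a lazy reflecting walk."""
    if u < 1.0 / 3.0:
        return min(s + 1, N)
    if u > 2.0 / 3.0:
        return max(s - 1, 0)
    return s

def monotone_cftp(rng):
    """Exact sample via monotone coupling from the past with the doubling schedule."""
    seeds = []                          # seeds[j] drives the map f_{-(j+1)}; never resampled
    n = 1
    while True:
        while len(seeds) < n:           # keep old seeds, generate only the newly needed ones
            seeds.append(rng.random())
        low, high = 0, N                # start the two bounding chains at s_0 and s_1
        for u in reversed(seeds[:n]):   # apply f_{-n}, ..., f_{-1} to both chains
            low, high = update(low, u), update(high, u)
        if low == high:                 # F_n(s_0) = F_n(s_1), hence F_n(S) is a singleton
            return low
        n *= 2                          # otherwise restart twice as far in the past

rng = random.Random(0)
print([monotone_cftp(rng) for _ in range(10)])   # exact samples, here uniform on {0, ..., N}
</syntaxhighlight>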

== References ==

{{Reflist}}

* {{cite conference |mode=cs2 |last1=Propp |first1=James Gary |last2=Wilson |first2=David Bruce |book-title=Proceedings of the Seventh International Conference on Random Structures and Algorithms (Atlanta, GA, 1995) |mr=1611693 |year=1996 |chapter=Exact sampling with coupled Markov chains and applications to statistical mechanics |pages=223–252}}
* {{Citation | last1=Propp | first1=James | last2=Wilson | first2=David | s2cid=2781385 | title=Microsurveys in discrete probability (Princeton, NJ, 1997) | publisher=American Mathematical Society | location=Providence, R.I. | series=DIMACS Ser. Discrete Math. Theoret. Comput. Sci. | mr=1630414 | year=1998 | volume=41 | chapter=Coupling from the past: a user's guide | pages=181–192 | doi=10.1090/dimacs/041/09 | isbn=9780821808276 }}

[[Category:Monte Carlo methods]]
[[Category:Markov chain Monte Carlo]]