Reversible Markov Chains

Here’s a pretty idea. A Markov chain is one of the simplest forms of dependence among random variables: an infinite sequence of dependent random variables in which the probability distribution of the next variable depends only on the value of the current one. If you reverse the sequence, you get another Markov chain, the reversed chain. Some Markov chains, called reversible Markov chains, have the property that reversing them gives back the same chain. Markov chains represent processes that have no history, in that the future is determined solely by the present, not the past. In a reversible Markov chain, not only is there no history, but time has no direction.
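To make this concrete, here is the usual formulation, assuming the chain has transition probabilities $P_{ij}$ and a stationary distribution $\pi$. The time-reversed chain has transition probabilities

$$\hat{P}_{ij} = \frac{\pi_j \, P_{ji}}{\pi_i},$$

and the chain is reversible precisely when $\hat{P} = P$, that is, when the detailed balance equations

$$\pi_i \, P_{ij} = \pi_j \, P_{ji}$$

hold for every pair of states $i$ and $j$.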

Here is a draft of a book by Aldous and Fill on the theory of reversible Markov chains.
