
## 3.3.4 Markov Chains (Biased Random Walk)

A Markov sequence of discrete random variables is called a *Markov chain*.

We generalize the discussion to ``state vectors'' $\vec{x}_i$. The conditional probability

$$ p_{ij} \equiv p(\vec{x}_j \,|\, \vec{x}_i) $$

is called the *transition probability* between the states $\vec{x}_i$ and $\vec{x}_j$.
Let $N$ be the total number of possible states. The $(N \times N)$-matrix

$$ \mathbf{P} \equiv \{ p_{ij} \} $$

and the $N$-vector

$$ \vec{p} \equiv \{ p_i \} $$

consisting of the individual probabilities $p_i \equiv p(\vec{x}_i)$ determine the statistical properties of the Markov chain uniquely.
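The role of $\mathbf{P}$ can be illustrated by a short simulation: given a transition matrix, each step of the chain is a draw from the row belonging to the current state. A minimal sketch (the 3-state matrix below is an assumed example, not from the text):

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed illustrative 3-state transition matrix; each row sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

state = 0
counts = np.zeros(3)
for _ in range(100_000):
    # Jump according to row `state` of P: P[state, j] = p_ij
    state = rng.choice(3, p=P[state])
    counts[state] += 1

print(counts / counts.sum())   # empirical state probabilities p_i
```

For long runs the empirical frequencies approach the stationary probabilities $p_i$ belonging to $\mathbf{P}$.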

A Markov chain is *reversible* if

$$ p_i \, p_{ij} \,=\, p_j \, p_{ji} \quad \textrm{for all } i,j $$

- Meaning? In equilibrium the probability flux from state $\vec{x}_i$ to $\vec{x}_j$ equals the reverse flux from $\vec{x}_j$ to $\vec{x}_i$; this is the *detailed balance* condition.

The elements $p_{ij}$ of the matrix $\mathbf{P}$ are not uniquely defined
by the reversibility condition:
for a given distribution density $\vec{p}$ there are
many reversible transition matrices.
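The non-uniqueness can be checked numerically. The sketch below (target distribution $\vec{p}$ and the Barker-type construction are assumptions for illustration) builds one reversible matrix and a second, "lazy" variant; both satisfy $p_i\,p_{ij} = p_j\,p_{ji}$ yet they differ:

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])         # assumed target distribution
N = len(p)
alpha = np.full((N, N), 1.0 / N)      # symmetric trial matrix

# Barker-type choice: p_ij = alpha_ij * p_j / (p_i + p_j) for i != j
P1 = np.zeros((N, N))
for i in range(N):
    for j in range(N):
        if i != j:
            P1[i, j] = alpha[i, j] * p[j] / (p[i] + p[j])
    P1[i, i] = 1.0 - P1[i].sum()      # remainder: stay in state i

# A second reversible matrix: stay put half the time
P2 = 0.5 * (P1 + np.eye(N))

for P in (P1, P2):
    flux = p[:, None] * P             # flux[i, j] = p_i * p_ij
    print(np.allclose(flux, flux.T))  # True: detailed balance holds
```

Both matrices leave $\vec{p}$ invariant, which shows that detailed balance alone does not fix $\mathbf{P}$.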
``Asymmetrical rule'' (N. Metropolis): choose a symmetric trial matrix $\alpha_{ij} = \alpha_{ji}$ and put, for $i \neq j$,

$$ p_{ij} = \begin{cases} \alpha_{ij} & \textrm{if } p_j \ge p_i \\ \alpha_{ij} \, p_j / p_i & \textrm{if } p_j < p_i \end{cases} $$

with $p_{ii} = 1 - \sum_{j \neq i} p_{ij}$; one easily verifies that this choice fulfills the detailed balance condition.
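The asymmetrical rule is usually applied without ever storing $\mathbf{P}$: propose a trial state symmetrically and accept it with probability $\min(1, p_j/p_i)$. A minimal sketch (the 3-state target $\vec{p}$ is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
p = np.array([0.5, 0.3, 0.2])      # assumed target distribution
N = len(p)

state = 0
counts = np.zeros(N)
for _ in range(200_000):
    j = rng.integers(N)             # symmetric trial: alpha_ij = 1/N
    if rng.random() < min(1.0, p[j] / p[state]):
        state = j                   # accept the move; otherwise stay
    counts[state] += 1

print(counts / counts.sum())        # approaches the target p
```

Because the rule satisfies detailed balance with respect to $\vec{p}$, the visit frequencies converge to the target probabilities without any explicit construction of the transition matrix.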

*Franz J. Vesely, Oct 2005*

See also: F. J. Vesely, *Computational Physics - An Introduction*, Kluwer-Plenum 2001