

3.3.4 Markov Chains (Biased Random Walk)

A Markov sequence of discrete states $x_{\alpha}$ is called a Markov chain.

We generalize the discussion to ``state vectors'' $\{\mathbf{x}_{\alpha},\ \alpha=1,\dots,M\}$. The conditional probability
\begin{displaymath}
p_{\alpha \beta} \equiv {\cal P}\,\{\,\mathbf{x}(n)=\mathbf{x}_{\beta}\;\vert\;\mathbf{x}(n-1)=\mathbf{x}_{\alpha}\,\}
\end{displaymath}
is called the transition probability between the states $\alpha$ and $\beta$.

Let $M$ be the total number of possible states. The $M \times M$ matrix $\mathbf{P} \equiv \{ p_{\alpha \beta}\}$ and the $M$-vector $\mathbf{p}$ consisting of the individual probabilities $p_{\alpha} \equiv {\cal P}\{\mathbf{x}=\mathbf{x}_{\alpha}\}$ determine the statistical properties of the Markov chain uniquely.
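As a concrete illustration, a chain with a given transition matrix $\mathbf{P}$ can be simulated by repeatedly drawing the next state from the row of $\mathbf{P}$ belonging to the current state. This is a minimal sketch; the two-state matrix below is an invented example, not taken from the text:

```python
import random

# Invented two-state example: row alpha of P gives the distribution
# of the next state, given that the current state is alpha.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(alpha, P):
    """Draw the next state beta with probability P[alpha][beta]."""
    r = random.random()
    acc = 0.0
    for beta, prob in enumerate(P[alpha]):
        acc += prob
        if r < acc:
            return beta
    return len(P[alpha]) - 1  # guard against floating-point rounding

def simulate(P, n_steps, alpha0=0, seed=1):
    """Run the chain and return the empirical occupation frequencies."""
    random.seed(seed)
    counts = [0] * len(P)
    alpha = alpha0
    for _ in range(n_steps):
        alpha = step(alpha, P)
        counts[alpha] += 1
    return [c / n_steps for c in counts]

# For this P the stationary probabilities are (5/6, 1/6); the
# empirical frequencies approach these values for long runs.
freq = simulate(P, 100_000)
```

The row-wise sampling in `step` is the discrete analogue of drawing a random step in the walk: only $\mathbf{P}$ and the current state enter, as required by the Markov property.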

A Markov chain is reversible if
\begin{displaymath}
p_{\alpha}\, p_{\alpha \beta} = p_{\beta}\, p_{\beta \alpha} \qquad (3.1)
\end{displaymath}

- Meaning: in the stationary state the probability flux from state $\alpha$ to state $\beta$ equals the reverse flux from $\beta$ to $\alpha$ (``detailed balance'').
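Condition (3.1) is easy to verify numerically for a given pair $(\mathbf{p}, \mathbf{P})$. A small sketch, with the distribution and matrix invented for illustration:

```python
def is_reversible(p, P, tol=1e-12):
    """Check detailed balance p[a]*P[a][b] == p[b]*P[b][a] for all pairs."""
    M = len(p)
    return all(abs(p[a] * P[a][b] - p[b] * P[b][a]) <= tol
               for a in range(M) for b in range(M))

# Invented example: p is the stationary distribution of P, and the
# pair satisfies (3.1): (5/6)*0.1 == (1/6)*0.5 == 1/12.
p = [5/6, 1/6]
P = [[0.9, 0.1],
     [0.5, 0.5]]
```

Note that reversibility is a property of the pair $(\mathbf{p}, \mathbf{P})$: the same matrix fails the test against a different distribution.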

The $M^{2}$ elements of the matrix $\mathbf{P}$ are not uniquely determined by the $M(M-1)/2$ reversibility conditions. $\Longrightarrow$ For a given distribution density $\mathbf{p}$ there are many reversible transition matrices. $\Longrightarrow$ ``Asymmetric rule'' (N. Metropolis):

\fbox{
\begin{minipage}{600pt}
{\bf N. Metropolis' asymmetric rule:}
\mbox{}\\*[1ex]
Choose a symmetric trial matrix $\{t_{\alpha \beta}\}$, $t_{\alpha \beta}=t_{\beta \alpha}$, and put, for $\beta \neq \alpha$,
\begin{displaymath}
p_{\alpha \beta} = \left\{ \begin{array}{ll}
t_{\alpha \beta} & \mbox{if } p_{\beta} \geq p_{\alpha} \\
t_{\alpha \beta}\,(p_{\beta}/p_{\alpha}) & \mbox{if } p_{\beta} < p_{\alpha}
\end{array} \right.
\qquad
p_{\alpha \alpha} = 1 - \sum_{\beta \neq \alpha} p_{\alpha \beta}
\end{displaymath}
\mbox{}\\*[1ex]
$\Longrightarrow$ $p_{\alpha \beta}$ is reversible! \\
\end{minipage}}
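The asymmetric rule can be sketched in code. From the target probabilities $\mathbf{p}$ and a symmetric trial matrix $t$, it builds a transition matrix satisfying the reversibility condition (3.1); the three-state target distribution below is an invented example:

```python
def metropolis_matrix(p, t):
    """Build a transition matrix P from target probabilities p and a
    symmetric trial matrix t, using Metropolis' asymmetric rule."""
    M = len(p)
    P = [[0.0] * M for _ in range(M)]
    for a in range(M):
        for b in range(M):
            if b != a:
                # take the trial step with probability min(1, p[b]/p[a]):
                # always if p[b] >= p[a], else with probability p[b]/p[a]
                P[a][b] = t[a][b] * min(1.0, p[b] / p[a])
        # remaining probability: stay in state a
        P[a][a] = 1.0 - sum(P[a])
    return P

# Invented three-state target distribution and symmetric trial matrix
p = [0.5, 0.3, 0.2]
t = [[0.0, 0.5, 0.5],
     [0.5, 0.0, 0.5],
     [0.5, 0.5, 0.0]]
P = metropolis_matrix(p, t)
```

Each row of the resulting $\mathbf{P}$ sums to one, and every pair of states satisfies $p_{\alpha}\, p_{\alpha \beta} = p_{\beta}\, p_{\beta \alpha}$, which is exactly the reversibility claimed in the box.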



Franz J. Vesely Oct 2005
See also:
"Computational Physics - An Introduction," Kluwer-Plenum 2001