

3.3.1 Basics

So far: random numbers, preferably without serial correlations, $\langle x_{n}\,x_{n+k} \rangle = 0$ for $k \neq 0$.
Now: sequences of random numbers with given serial correlations.

Let $\{x(t)\}$ be an ensemble of functions of time $t$. Then
\begin{displaymath}
P_{1}(x;t) \equiv {\cal P} \, \{x(t) \leq x\}
\;\;\;\; {\rm and} \;\;\;\;
p_{1}(x;t) \equiv \frac{dP_{1}(x;t)}{dx}
\end{displaymath}

are the probability distribution and the respective density.

EXAMPLE: Let $x_{0}(t)$ be a deterministic function of time, and assume that the quantity $x(t)$ at any time $t$ is Gaussian distributed about the value $x_{0}(t)$:

\begin{displaymath}
p_{1}(x;t)=\frac{1}{\sqrt{2\pi \sigma^{2}}}\,
e^{-\left[x-x_{0}(t)\right]^{2}/2\sigma^{2}}
\end{displaymath}
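
A minimal numerical sketch of this example (the choices $x_{0}(t)=\sin t$ and $\sigma = 0.2$ are hypothetical, made only for illustration): at each sampling time a value is drawn from a Gaussian centered at $x_{0}(t)$.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(42)

sigma = 0.2                       # assumed width of the Gaussian
t = np.linspace(0.0, 10.0, 200)   # discrete sampling times t_k
x0 = np.sin(t)                    # hypothetical deterministic x0(t)

# One realization of the ensemble: x(t_k) ~ N(x0(t_k), sigma^2)
x = x0 + rng.normal(0.0, sigma, size=t.shape)
\end{verbatim}

Each run of this sketch yields one member $x(t)$ of the ensemble; the density $p_{1}(x;t)$ above describes the spread of many such runs at fixed $t$.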


A random process is called a random sequence if the variable $t$ may assume only discrete values $\{t_{k}\,; \; k=0,1,\dots \}$. In this case one often writes $x(k)$ for $x(t_{k})$.

The foregoing definitions may be generalized in the following manner:
\begin{eqnarray*}
P_{2}(x_{1},x_{2};t_{1},t_{2}) & \equiv & {\cal P} \{ x(t_{1})\leq x_{1},\,
x(t_{2})\leq x_{2} \} \\
& \vdots & \\
P_{n}(x_{1},\dots,x_{n};t_{1},\dots,t_{n}) & \equiv & {\cal P}
\{ x(t_{1})\leq x_{1},\dots,x(t_{n})\leq x_{n} \}
\end{eqnarray*}

Thus $P_{2}$ is the compound probability for the events $x(t_{1})\leq x_{1}$ and $x(t_{2})\leq x_{2}$. These higher-order distribution functions and the corresponding densities
\begin{displaymath}
p_{n}(x_{1},\dots,x_{n};t_{1},\dots,t_{n}) =
\frac{\partial^{n}P_{n}(x_{1},\dots,x_{n};t_{1},\dots,t_{n})}
{\partial x_{1}\,\dots\,\partial x_{n}}
\end{displaymath}

describe the random process in ever finer statistical detail.

Stationarity: A random process is stationary if
\begin{displaymath}
P_{n}(x_{1},\dots,x_{n};t_{1},\dots,t_{n}) =
P_{n}(x_{1},\dots,x_{n};t_{1}+t,\dots,t_{n}+t)
\;\;\;\; {\rm for\;all}\;\; n,\,t\,.
\end{displaymath}

This means that the origin of time is of no importance:
\begin{displaymath}
p_{1}(x;t)=p_{1}(x)
\;\;\;\; {\rm and} \;\;\;\;
p_{2}(x_{1},x_{2};t_{1},t_{2})=p_{2}(x_{1},x_{2};t_{2}-t_{1})\,.
\end{displaymath}



Autocorrelation:
\begin{displaymath}
\langle x(0)\,x(\tau) \rangle \equiv \int_{a}^{b}\!\!\int_{a}^{b}
x_{1}\,x_{2}\, p_{2}(x_{1},x_{2};\tau)\,dx_{1}\,dx_{2}\,.
\end{displaymath}

For $\tau \rightarrow 0$ the autocorrelation function (acf) approaches the variance $\langle x^{2} \rangle $ (the process is assumed to have zero mean). For finite $\tau$ it tells us how rapidly a particular value of $x(t)$ will be ``forgotten''.
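
For a stationary sequence the acf can also be estimated from a single long realization by averaging over the time origin. A minimal sketch (assuming a zero-mean sample sequence already stored in an array; the function name is mine):

\begin{verbatim}
import numpy as np

def acf_estimate(x, kmax):
    """Estimate <x(n) x(n+k)> for k = 0..kmax from one realization,
    assuming stationarity and zero mean."""
    n = len(x)
    return np.array([np.mean(x[:n - k] * x[k:]) for k in range(kmax + 1)])
\end{verbatim}

The entry for $k=0$ is just the sample estimate of the variance $\langle x^{2} \rangle $.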

Gaussian process: The random variables $x(t_{1}), \dots , x(t_{n})$ obey a multivariate Gaussian distribution. The covariance matrix elements are $\langle x(0)\,x(t_{j}-t_{i}) \rangle $, i.e. the values of the autocorrelation function at the respective time displacements. For $n=2$:
\begin{displaymath}
p_{2}(x_{1},x_{2};\tau) = \frac{1}{\sqrt{(2\pi)^{2} S_{2}(\tau)}}
\, e^{-\frac{1}{2}\,Q}
\end{displaymath}

with
\begin{displaymath}
Q \equiv \frac{\langle x^{2} \rangle x_{1}^{2}
- 2\,\langle x(0)\,x(\tau) \rangle\, x_{1}x_{2}
+ \langle x^{2} \rangle x_{2}^{2}}{S_{2}(\tau)}
\end{displaymath}

and
\begin{displaymath}
S_{2}(\tau) \equiv \vert {\bf S}_{2}(\tau) \vert
= \langle x^{2} \rangle^{2} - \langle x(0)\,x(\tau) \rangle^{2}\,,
\end{displaymath}
the determinant of the covariance matrix ${\bf S}_{2}(\tau)$.
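
A direct transcription of this bivariate density into code (the function and parameter names are mine; var stands for $\langle x^{2} \rangle $ and c_tau for the acf value $\langle x(0)\,x(\tau) \rangle $):

\begin{verbatim}
import numpy as np

def p2(x1, x2, var, c_tau):
    """Bivariate Gaussian density p_2(x1, x2; tau).
    var   = <x^2>          (variance)
    c_tau = <x(0) x(tau)>  (acf at lag tau); requires |c_tau| < var."""
    s2 = var**2 - c_tau**2                                # S_2(tau)
    q = (var * x1**2 - 2.0 * c_tau * x1 * x2 + var * x2**2) / s2
    return np.exp(-0.5 * q) / np.sqrt((2.0 * np.pi)**2 * s2)
\end{verbatim}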



Markov Process: A stationary random sequence $\{x_{n}; n=0, 1 \dots\,\}$ has the Markov property if its ``memory'' goes back only one time step:
\begin{displaymath}
p(x_{n} \vert x_{n-1},\dots,x_{1}) = p(x_{n} \vert x_{n-1})\,,
\end{displaymath}

where the conditional density
\begin{displaymath}
p(x_{n} \vert x_{n-1}) = \frac{p_{2}(x_{n-1},x_{n})}{p_{1}(x_{n-1})}
\end{displaymath}

is the density of $x_{n}$ under the condition that $x(n-1)=x_{n-1}$.
Thus all statistical properties of the process are contained in $p_{2}(x_{n-1},x_{n})$.

An even shorter memory would mean that successive elements of the sequence were not correlated at all.

Gaussian Markov processes: To describe such a process uniquely, not even $p_{2}$ is needed explicitly: once the autocorrelation function $\langle x(n)\, x(n+l) \rangle $ is known, $p_{2}$ and consequently all statistical properties of the process follow.
Note: The acf of a stationary Gaussian Markov process is always an exponential:
\begin{displaymath}
\langle x(0)\,x(\tau) \rangle = \langle x^{2} \rangle \, e^{-\beta \tau}
\end{displaymath}

or, for a sequence with time step $\Delta t$,

\begin{displaymath}
\langle x(n)\,x(n+k) \rangle = \langle x^{2} \rangle \,
e^{-\beta \Delta t \, k}\,.
\end{displaymath}

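A short check, combining the bivariate density given above with this exponential acf (writing $\sigma^{2} \equiv \langle x^{2} \rangle$ and $\rho \equiv e^{-\beta \Delta t}$ for brevity), shows what the one-step conditional density looks like:

\begin{displaymath}
p(x_{n} \vert x_{n-1}) = \frac{p_{2}(x_{n-1},x_{n})}{p_{1}(x_{n-1})}
= \frac{1}{\sqrt{2\pi \sigma^{2}(1-\rho^{2})}}\,
e^{-\left[x_{n}-\rho\, x_{n-1}\right]^{2}/2\sigma^{2}(1-\rho^{2})}\,,
\end{displaymath}

i.e. a Gaussian centered at $\rho\, x_{n-1}$ with reduced variance $\sigma^{2}(1-\rho^{2})$. This observation is the key to the generation recipe of the next section.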


How to produce a Markov sequence? $\Longrightarrow$


Franz J. Vesely Oct 2005
See also:
"Computational Physics - An Introduction," Kluwer-Plenum 2001