Next: 3.2.6 Homogeneous distributions in Up: 3.2 Other Distributions Previous: 3.2.4 Rejection Method


3.2.5 Multivariate Gaussian Distribution


\begin{displaymath}
\fbox{$\displaystyle
p(x_{1}, \dots, x_{n}) = \frac{1}{\sqrt{(2 \pi)^{n}\, S}}
\, e^{-\frac{1}{2} \sum_{i} \sum_{j} g_{ij}\, x_{i}\, x_{j}}
$}
\end{displaymath}

or
\begin{displaymath}
p(\mbox{$\bf x$}) = \frac{1}{\sqrt{(2 \pi)^{n}\, S}}
\, e^{-\frac{1}{2}\, \mbox{$\bf x$}^{T} \mbox{${\bf G}$}\, \mbox{$\bf x$}}
\equiv \frac{1}{\sqrt{(2 \pi)^{n}\, S}}
\, e^{-\frac{1}{2}\, Q}
\end{displaymath}

with the covariance matrix of the $x_{i}$
\begin{displaymath}
\mbox{${\bf S}$} \equiv \mbox{${\bf G}$}^{\,-1} \equiv
\left( \begin{array}{ccc}
\langle x_{1}^{2} \rangle & \langle x_{1}\,x_{2} \rangle & \dots \\
\langle x_{2}\,x_{1} \rangle & \langle x_{2}^{2} \rangle & \dots \\
& & \ddots \\
\end{array} \right)
\end{displaymath}

$S \equiv \vert \mbox{${\bf S}$}\vert$ is the determinant of this matrix. $\mbox{${\bf S}$}$ and $\mbox{${\bf G}$}$ are symmetric, their eigenvalues are called $\sigma_{i}^{2}$ and $\gamma_{i}$ (sorry!).



EXAMPLE: Assume that two Gaussian variates have the variances $s_{11} \equiv \langle x_{1}^{2} \rangle =3$, $s_{22} \equiv \langle x_{2}^{2} \rangle =4$, and the covariance $s_{12} \equiv \langle x_{1}\,x_{2} \rangle =2$:

\begin{displaymath}
\mbox{${\bf S}$}=\left( \begin{array}{cc} 3 & 2 \\ 2 & 4 \end{array} \right),
\qquad
\mbox{${\bf G}$}=\mbox{${\bf S}$}^{\,-1}=\left( \begin{array}{cc}
\frac{1}{2} & -\frac{1}{4} \\
-\frac{1}{4} & \frac{3}{8}
\end{array} \right)
\end{displaymath}

The quadratic form $Q$ in the exponent is then
\begin{displaymath}
Q=\frac{1}{2}\, x_{1}^{2} - \frac{1}{2}\,x_{1}\,x_{2} +
\frac{3}{8}\,x_{2}^{2}\, ,
\end{displaymath}
and the lines of equal density (that is, of equal $Q$) are ellipses which are inclined with respect to the $x_{1,2}\,$ coordinate axes:
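As a quick numerical cross-check of these coefficients (a sketch assuming numpy, not part of the original text), inverting the covariance matrix reproduces the $g_{ij}$ of the quadratic form:

```python
import numpy as np

# Covariance matrix of the example
S = np.array([[3.0, 2.0],
              [2.0, 4.0]])

# G = S^{-1} holds the coefficients g_ij of the quadratic form Q
G = np.linalg.inv(S)   # [[1/2, -1/4], [-1/4, 3/8]]

# Q = sum_ij g_ij x_i x_j; the two cross terms g_12 and g_21 combine to -1/2
x = np.array([1.0, 1.0])
Q = x @ G @ x          # 1/2 - 1/2 + 3/8 = 3/8 at the point x = (1, 1)
print(G)
print(Q)
```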
\begin{figure}\includegraphics[height=180pt]{figures/f3mvg1.ps}
\end{figure}
Rotate the axes of the ellipsoids $Q=\mathrm{const}$ to coincide with the coordinate axes: $\Longrightarrow$ cross correlations vanish!



\fbox{
\begin{minipage}{600pt}
{\bf Principal axis transformation:}
\begin{itemize}
\item Determine the eigenvalues $\sigma_{i}^{2}$ and the orthonormal eigenvectors $\mbox{$\bf s$}_{i}$ of $\mbox{${\bf S}$}$.
\item The matrix $\mbox{${\bf T}$}$ whose columns are the eigenvectors diagonalizes $\mbox{${\bf S}$}$:
$\mbox{${\bf T}$}^{T} \mbox{${\bf S}$}\, \mbox{${\bf T}$} = \mbox{diag}(\sigma_{i}^{2})$.
\item In the rotated frame the variates are uncorrelated Gaussians with variances $\sigma_{i}^{2}$
$\Longrightarrow$\ $\mbox{${\bf S}$}^{\,-1}$\ need
never be computed!
\end{itemize}
\end{minipage}}
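Numerically, the principal axis transformation is just a symmetric eigenproblem; a sketch with numpy (an illustration, not part of the original text), using the example covariance matrix:

```python
import numpy as np

S = np.array([[3.0, 2.0],
              [2.0, 4.0]])

# Eigenvalues sigma_i^2 and orthonormal eigenvectors (the columns of T)
# of the symmetric matrix S:
sigma2, T = np.linalg.eigh(S)

# In the rotated frame the covariance matrix is diagonal, so the cross
# correlations vanish and S^{-1} is indeed never needed:
D = T.T @ S @ T
print(sigma2)           # the variances along the principal axes
print(np.round(D, 10))  # diagonal matrix diag(sigma_1^2, sigma_2^2)
```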



Having found $\mbox{${\bf T}$}$, we arrive at the following prescription for the production of correlated Gaussian variables: $\Longrightarrow$

\fbox{ \begin{minipage}{600pt}
{\bf Multivariate Gaussian distribution:} \\ [6pt]
\begin{itemize}
\item Draw $n$ uncorrelated Gaussian variates $y_{i}$ with variances $\sigma_{i}^{2}$.
\item Transform to the original frame: $\mbox{$\bf x$} = \mbox{${\bf T}$}\, \mbox{$\bf y$}$.
\end{itemize}
The components of $\mbox{$\bf x$}$\ are then random numbers obeying the
desired distribution.
\end{minipage}}
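In code, this prescription might be sketched as follows (an illustrative implementation assuming numpy; the function name is mine, not from the text):

```python
import numpy as np

def correlated_gauss(S, n_samples, seed=0):
    """Draw n_samples vectors with covariance matrix S via the
    principal axis transformation."""
    rng = np.random.default_rng(seed)
    # Eigenvalues sigma_i^2 and eigenvectors (columns of T) of S:
    sigma2, T = np.linalg.eigh(S)
    # Uncorrelated Gaussian variates y_i with variances sigma_i^2:
    y = rng.standard_normal((n_samples, len(sigma2))) * np.sqrt(sigma2)
    # Rotate back to the original frame, x = T y (applied row-wise):
    return y @ T.T

S = np.array([[3.0, 2.0], [2.0, 4.0]])
x = correlated_gauss(S, 100_000)
print(np.cov(x, rowvar=False))   # should approach S
```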



Let's try it out: $\Longrightarrow$



EXAMPLE: Once more, let

\begin{displaymath}
\mbox{${\bf S}$}=\left( \begin{array}{cc} 3 & 2 \\ 2 & 4 \end{array} \right),
\qquad
\mbox{${\bf G}$}=\mbox{${\bf S}$}^{\,-1}=\left( \begin{array}{cc}
\frac{1}{2} & -\frac{1}{4} \\
-\frac{1}{4} & \frac{3}{8}
\end{array} \right)
\end{displaymath}

Principal axis transformation: The eigenvalues of ${\bf S}$ are $\sigma_{1,2}^{2}= (7\pm\sqrt{17})/2 = 5.562,\; 1.438$, and the corresponding eigenvectors are

\begin{displaymath}
\mbox{$\bf s$}_{1}=\left( \begin{array}{r} 0.615 \\ 0.788 \end{array} \right),
\quad
\mbox{$\bf s$}_{2}=\left( \begin{array}{r} 0.788 \\ -0.615 \end{array} \right),
\qquad \mbox{so that} \quad
\mbox{${\bf T}$}=\left( \begin{array}{rr}
0.615 & 0.788 \\
0.788 & -0.615
\end{array} \right)
\end{displaymath}



Generator: To produce pairs $(x_{1},x_{2})$ of Gaussian random numbers with the given covariance matrix:
\begin{itemize}
\item Draw two uncorrelated Gaussian variates $y_{1}$ and $y_{2}$ with variances $\sigma_{1}^{2}=5.562$ and $\sigma_{2}^{2}=1.438$.
\item Transform to $\mbox{$\bf x$} = \mbox{${\bf T}$}\, \mbox{$\bf y$}$, i.e.
$x_{1} = 0.615\, y_{1} + 0.788\, y_{2}$ and $x_{2} = 0.788\, y_{1} - 0.615\, y_{2}$.
\end{itemize}

EXERCISE: Write a program that generates a sequence of bivariate Gaussian random numbers with the statistical properties as assumed in the foregoing example. Determine $\langle x_{1}^{2}\rangle$, $\langle x_{2}^{2}\rangle$, and $\langle x_{1}x_{2}\rangle$ to see if they indeed approach the given values of $3$, $4$, and $2$.
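A minimal sketch of such a program (assuming numpy; the constants are the rounded eigenvalues and eigenvector components from the example above):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Step 1: uncorrelated Gaussians with the eigenvalue variances
y1 = rng.standard_normal(N) * np.sqrt(5.562)   # sigma_1^2 = 5.562
y2 = rng.standard_normal(N) * np.sqrt(1.438)   # sigma_2^2 = 1.438

# Step 2: rotate back with T (columns = eigenvectors of S)
x1 = 0.615 * y1 + 0.788 * y2
x2 = 0.788 * y1 - 0.615 * y2

# The sample moments should approach <x1^2>=3, <x2^2>=4, <x1 x2>=2
print(np.mean(x1 * x1), np.mean(x2 * x2), np.mean(x1 * x2))
```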



Franz J. Vesely Oct 2005
See also:
"Computational Physics - An Introduction," Kluwer-Plenum 2001