

3.4.1 Simulated Annealing

Consider a Metropolis walk through the space of ``states'' $\mbox{$\bf x$}_{\alpha}$ with
\begin{displaymath}
p_{\alpha} = A \, e^{\textstyle -\beta U(\mbox{$\bf x$}_{\alpha})}
\end{displaymath}

where $U(x_{1},\dots,x_{M})$ is a ``cost function'' to be minimized, and $\beta$ is a tunable parameter (a reciprocal ``temperature'').

- Low $\beta$ $\Longrightarrow$ small variation of $p_{\alpha}$; states with higher $U(\mbox{$\bf x$})$ remain accessible

- High $\beta$ $\Longrightarrow$ $\mbox{$\bf x$}$ will tend to move ``downhill''

\fbox{
\begin{minipage}{600pt}
{\bf Simulated Annealing:} \\ [1ex]
Draw a starting vector $\mbox{$\bf x$}^{0}$ and perform a Metropolis walk at small $\beta$ (i.e. at high temperature). Slowly increase $\beta$ (``cool'' the system) while continuing the walk. The configuration in which the walker is eventually trapped very probably (not with
certainty!) will be the global minimum.
\end{minipage}}
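A minimal Python sketch of this procedure, assuming a simple one-dimensional cost function and a geometric cooling schedule (the cost function, step size, and schedule parameters are illustrative choices, not prescribed above):

\begin{verbatim}
import math
import random

def anneal(U, x0, beta0=0.1, beta_max=50.0, cool=1.05,
           steps_per_beta=200, step=0.5):
    """Metropolis walk with p ~ exp(-beta*U), slowly raising beta (cooling)."""
    x, u = x0, U(x0)
    beta = beta0
    while beta < beta_max:
        for _ in range(steps_per_beta):
            x_new = x + random.uniform(-step, step)   # trial move
            u_new = U(x_new)
            # Metropolis rule: accept downhill moves always,
            # uphill moves with probability exp(-beta * dU)
            if u_new <= u or random.random() < math.exp(-beta * (u_new - u)):
                x, u = x_new, u_new
        beta *= cool                                   # lower the "temperature"
    return x, u

# Example: tilted double well; global minimum near x ~ +1, local one near x ~ -1
U = lambda x: (x**2 - 1.0)**2 + 0.3 * (x - 2.0)**2
x_min, u_min = anneal(U, x0=-3.0)
print(x_min, u_min)
\end{verbatim}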



EXERCISE: Create (fake!) a table of ``measured values with errors'' according to
\begin{displaymath}
y_{i}=f(x_{i};c_{1},\dots,c_{6})+\xi_{i}\,,\;\;\; i=1,\dots,20
\end{displaymath} (3.2)

with $\xi_{i}$ coming from a Gauss distribution with suitable variance, and with the function $f$ defined by
\begin{displaymath}
f(x;\mbox{$\bf c$})\equiv c_{1}e^{\textstyle - c_{2}(x-c_{3})^{2}}
+c_{4}e^{\textstyle - c_{5}(x-c_{6})^{2}}
\end{displaymath} (3.3)

( $c_{1} \dots c_{6}$ being a set of arbitrary coefficients).

Using these data, try to reconstruct the parameters $c_{1} \dots c_{6}$ by fitting the theoretical function $f$ to the table points $(x_{i},y_{i})$. The cost function is

\begin{displaymath}
U(\mbox{$\bf c$})\equiv \sum_{i}\left[ y_{i}-f(x_{i};\mbox{$\bf c$})\right]^{2}
\end{displaymath} (3.4)

Choose an initial vector $\mbox{$\bf c$}^{0}$ and perform an MC random walk through $\mbox{$\bf c$}$-space, slowly lowering the temperature.
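A possible realization of this exercise in Python; the ``true'' coefficients, the sampling points $x_{i}$, the noise variance, the step size, and the cooling schedule are all arbitrary choices made only for illustration:

\begin{verbatim}
import math
import random

random.seed(1)

# "True" coefficients used to fake the measured data (arbitrary choice)
c_true = [1.0, 2.0, 3.0, 0.8, 1.5, 6.0]

def f(x, c):
    """Double Gaussian of Eq. (3.3)."""
    return (c[0] * math.exp(-c[1] * (x - c[2])**2)
            + c[3] * math.exp(-c[4] * (x - c[5])**2))

# Fake table of "measured values with errors", Eq. (3.2), i = 1..20
x_tab = [0.5 * i for i in range(1, 21)]
y_tab = [f(x, c_true) + random.gauss(0.0, 0.05) for x in x_tab]

def U(c):
    """Least-squares cost function of Eq. (3.4)."""
    return sum((y - f(x, c))**2 for x, y in zip(x_tab, y_tab))

# Simulated annealing in c-space
c = [0.5, 1.0, 2.0, 0.5, 1.0, 5.0]                  # initial guess c^0
u = U(c)
beta = 1.0
while beta < 1.0e6:
    for _ in range(500):
        k = random.randrange(6)                     # vary one coefficient at a time
        c_new = c[:]
        c_new[k] += random.uniform(-0.1, 0.1)
        u_new = U(c_new)
        if u_new <= u or random.random() < math.exp(-beta * (u_new - u)):
            c, u = c_new, u_new
    beta *= 1.5                                     # slowly lower the temperature

print("reconstructed c:", c)
print("true c:        ", c_true)
\end{verbatim}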



Franz J. Vesely Oct 2005
See also:
"Computational Physics - An Introduction," Kluwer-Plenum 2001