\documentclass[a4paper]{article}
\usepackage{amsmath,amsfonts,amsthm,amssymb,graphicx}
\newcommand{\bvec}[1]{\boldsymbol{#1}} % vectors in bold instead of with an arrow on top
\title{Simulated annealing for the travelling salesman problem}
\date{November 4, 2009}
\author{Jannis Teunissen \and Florian Speelman}
\begin{document}
\maketitle
\section*{Introduction}
The travelling salesman problem is a well-known problem: given a list of cities and the distances between them, find the shortest closed tour that visits every city exactly once. Solving this problem exactly has proven to be very hard; all known exact algorithms require at least exponential time. We therefore study a heuristic method, simulated annealing, to find a good approximation of the solution.
\section*{Simulated annealing}
In 1953, Metropolis et al.~published an article~\cite{metropolis} describing an algorithm that is now known as the Metropolis--Hastings algorithm. It can be used to obtain a sequence of random samples from a probability distribution. The sequence can then be used to approximate the distribution, which is useful for distributions that are hard to sample from directly. In 1983, Kirkpatrick et al.~\cite{kirk} proposed an adaptation of the Metropolis algorithm called `simulated annealing' that can be used to find a good approximation to the global minimum of some function over a large search space. In the case of a discrete search space (as with the travelling salesman problem), the method works as follows:
\begin{enumerate}
\item Pick an initial state $S_i$ and `temperature' $T_i$.
\item Generate some neighbouring state $s$, and accept it as the next state with probability $$\min\left(1,\exp\left[\frac{f(S_i)-f(s)}{T_i}\right]\right),$$ where $f$ is the function to be minimized.
\item Lower the temperature according to a cooling scheme.
\item Keep track of the best state encountered and go back to step 2.
\end{enumerate}
More information about simulated annealing can be found in, for example, \cite{ross} and \cite{nr}. Note that the word `temperature' is used only because of the analogy with statistical mechanics; it does not imply that $T$ is measured in kelvin.
\section*{Implementation}
We tried to find the shortest path between a little over one hundred cities spread around the world. The input of our program is a text file stating the number of cities, followed by a table of the distances between them in kilometres. Since the units of the problem are not important for our results, and the word `temperature' can already be confusing, we leave units out and just state numbers whenever possible.

Our implementation of simulated annealing is written in \emph{C}. The most important function is \texttt{anneal()}, which returns the shortest path found. Its initial state is a random sequence $c_i$ of the numbers $1,\dots,N$, where $N$ is the number of cities. The total travel distance is then given by $d(c_N,c_1) + \sum_{n=1}^{N-1}d(c_n,c_{n+1})$, where $d(i,j)$ is the distance between cities $i$ and $j$.
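The total travel distance can be computed as follows (a sketch using $0$-based indices instead of the $1$ to $N$ of the text; \texttt{tour\_length} is an illustrative name):

```c
#include <stddef.h>

/* Total travel distance of a closed tour: the closing edge d(c_N, c_1)
   plus the sum of d(c_n, c_{n+1}) for n = 1..N-1. `dist` is an n-by-n
   distance matrix stored row-major; `tour` is a permutation of 0..n-1. */
double tour_length(const double *dist, const int *tour, int n)
{
    double total = dist[tour[n - 1] * n + tour[0]];  /* closing edge */
    for (int i = 0; i < n - 1; i++)
        total += dist[tour[i] * n + tour[i + 1]];
    return total;
}
```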

To generate a neighbouring state we implemented several rearrangement operations (see \cite{lin} for some discussion): rotate a part of the sequence, move a part of the sequence to another place and interchange two points in the sequence. Whenever a neighbouring state needs to be generated, one of these operations is applied to the current state.
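Two of these rearrangements can be sketched as follows (segment reversal is one possible reading of the rotate operation; the function names are illustrative, not those of our program):

```c
/* Interchange two cities in the tour. */
void swap_cities(int *tour, int i, int j)
{
    int tmp = tour[i];
    tour[i] = tour[j];
    tour[j] = tmp;
}

/* Reverse the segment tour[i..j], one way of rearranging part of the
   sequence; other variants cycle the segment instead of reversing it. */
void reverse_segment(int *tour, int i, int j)
{
    while (i < j)
        swap_cities(tour, i++, j--);
}
```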

Choosing a cooling schedule is quite difficult for simulated annealing; there exists no generally best method. We included two variations, both of which update the temperature after every $m$ steps. In the first, the temperature is lowered as $T_{i+m} = T_0(1 - (i+m)/k)^\alpha$, where $T_0$ is the starting temperature, $k$ is the desired number of steps and $\alpha$ is a positive number. In the second, the temperature is lowered by a constant factor: $T_{i+m} = c\cdot T_{i}$.

For typical experiments we call the \texttt{anneal()} function several times, storing the average result and standard deviation. We then repeat this, passing different parameters to \texttt{anneal()} to determine their effect on the path length found. The parameters we can adjust are:
\begin{itemize}
\item The starting temperature and end temperature.
\item The probabilities of generating a neighbouring state by moving part of the path, rotating a part or swapping two cities.
\item The cooling schedule used and how often the temperature should be updated. One of the schedules needs an additional constant (the $\alpha$ mentioned above).
\item The maximum number of steps after which the simulation stops. The cooling schedules are made in such a way that the temperature equals the end temperature when the simulation stops.
\end{itemize}
\section*{Results \& Discussion}
We try to analyse the behaviour of our implementation by altering one parameter at a time. In general this need not work well, but we try to set the other parameters to `reasonable' values so that we can still gain insight into the dependence on each parameter. Unless stated otherwise, every figure in this section was made using a maximum of $5\cdot 10^5$ steps, with the temperature updated every 1000 steps.
\subsection*{Rearrangement operations}
The three operations mentioned before are not equally useful in all situations. While some operations are more likely to give a better state, they are less likely to move out of a local minimum. We ran several tests and noticed that swapping two cities was not really useful in any case. Leaving that operation out, we varied the probabilities of a move and a rotation; see figures \ref{fig:prot1} and \ref{fig:prot2}. One sees that the rotation operation has a greater influence on performance, though rotations alone perform worse than a mixture. This is to be expected: a move operation is less likely to result in a better state (it generally moves quite far away from some minimum), but on the other hand, using only rotations makes it harder to move away from local minima. From now on we choose $P_\text{move} = 0.25$ and $P_\text{rotation} = 0.75$ as the probabilities.
\begin{figure}
\includegraphics[width=12cm]{prot.png}
\caption{
Mean path length as a function of the probability that a rotation
operation, instead of a move, is used for generating the next state. Every data point
is the mean of 100 simulations. Error bars show the square root of the sample variance divided by the number of simulations: an estimate of the standard deviation of the shown average. It is clear that values around zero or one are bad.
\label{fig:prot1}}
\end{figure}

\begin{figure}
\includegraphics[width=12cm]{prot2.png}
\caption{
Mean path length versus the probability that a rotation instead of a move is
used for generating the next state, as in figure \ref{fig:prot1}, but with shorter runs and thus
a faster cooling schedule. This makes the difference between using mostly rotations
and mostly moves much more pronounced.
\label{fig:prot2}}
\end{figure}

\subsection*{Cooling schedule}
Choosing a starting temperature is somewhat arbitrary: since we start out with a random initial state, a very high starting temperature will not have much influence on the results (as long as the number of steps at lower temperatures is kept constant). For this reason we initially set it rather high, at $10^5$ units of length, a few times larger than the average path-length difference between two neighbouring states. The end temperature is chosen to be $0.1$, because a value of zero would lead to problems with the cooling schemes.

First we look at the influence of the parameter $\alpha$ in the first cooling schedule, $T_{i+m} = T_0(1 - (i+m)/k)^\alpha$.
Figure \ref{fig:alpha} shows that its value should not be less than about five, and we think it should also not be higher than about 30. This means that the most important part of the annealing happens at lower temperatures. We choose $\alpha = 10$ from now on.

\begin{figure}
\includegraphics[width=12cm]{coolingCoef.png}
\caption{
This plot shows the results of simulated annealing runs for different values
of the cooling parameter.
Error bars are the estimated standard deviation of the mean path length. The
differences are not very large, but it is clear that the optimal value, when using
$5\cdot 10^5$ iterations, lies somewhere between 5 and 30.
The first few data points are very large and fall outside the plot range.
\label{fig:alpha}}
\end{figure}

It is interesting to look at the dependence on the starting temperature: we expect that a low starting temperature will give poor results, because the system is more likely to stay in local minima. We ran tests with both our cooling schedules; see figures \ref{fig:startt1} and \ref{fig:startt2}. The results are surprisingly similar, and show that a starting temperature of $10^4$ works fine.

Finally we compare results for our two cooling schedules while increasing the number of iterations; see figure \ref{fig:maxtotal}. The results are very similar, so either cooling schedule can be used for the travelling salesman problem. For comparison, the shortest path we ever found was $1.02782\cdot10^5$\,km.

\begin{figure}
\includegraphics[width=12cm]{starttemp.png}
\caption{
Mean path length versus starting temperature for the first cooling schedule. A starting temperature of $10^4$ seems reasonable.
\label{fig:startt1}}
\end{figure}

\begin{figure}
\includegraphics[width=12cm]{starttemp2.png}
\caption{
Mean path length versus starting temperature for the second cooling schedule; the results are surprisingly similar to those of figure \ref{fig:startt1}.
\label{fig:startt2}}
\end{figure}

\begin{figure}
\includegraphics[width=12cm]{maxtotal.png}
\caption{
Comparison of the mean path length for the two cooling methods, with a starting temperature of $10^4$ and $\alpha = 10$. No significant differences can be seen, so both methods are equally suitable for our problem.
\label{fig:maxtotal}}
\end{figure}

\begin{thebibliography}{9}
\bibitem{metropolis}
N.~Metropolis, A.~W.~Rosenbluth, M.~N.~Rosenbluth, A.~H.~Teller and E.~Teller,
    Equation of State Calculations by Fast Computing Machines.
\emph{Journal of Chemical Physics 21 (6), 1953}

\bibitem{kirk}
S.~Kirkpatrick, C.~D.~Gelatt and M.~P.~Vecchi,
    Optimization by Simulated Annealing.
\emph{Science, New Series, Vol. 220, No. 4598, 1983}

\bibitem{ross}
S.~M.~Ross,
  Simulation.
\emph{Elsevier Academic Press, fourth edition, 2006}

\bibitem{nr}
W.~H.~Press, S.~A.~Teukolsky, W.~T.~Vetterling and B.~P.~Flannery,
    Numerical Recipes: The Art of Scientific Computing.
\emph{Cambridge University Press, third edition, 2007}

\bibitem{lin}
S.~Lin,
    Computer solutions of the traveling salesman problem.
\emph{Bell System Technical Journal 44, 1965}

\end{thebibliography}
\end{document} 