[algorithm] Solving the TSP with a simulated annealing algorithm in MATLAB

Foreword

       Simulated annealing (SA) is one of the more common modern optimization algorithms, and it is frequently applied to the traveling salesman problem (TSP). Students in mathematical modeling contests lean on it heavily, sometimes forcing it in just for the sake of using it, to the judges' fatigue. The judges have stated plainly that far-fetched applications of the so-called "intelligent algorithms" (neural networks, simulated annealing, genetic algorithms, and so on) will not earn high marks (see: http://special.univs.cn/service/jianmo/sxjmyw/2018/1128/1187951_15.shtml ). Do not assume that fancy terminology will catch a senior judge's eye; the judges are professors, after all, and cannot be intimidated by a few buzzwords.

       That said, we are students: the fact that an algorithm can be misused is no reason not to learn it, and there is real value in the programming exercise itself. Even Einstein had to start from one plus one, so simulated annealing is still worth learning. Enough digression; on to the topic.

 

Algorithm framework

       The simulated annealing algorithm can be roughly divided into the following steps:

  1. Set the initial temperature, generate an initial solution, and set the number of solutions to generate at each temperature.

  2. Generate a new solution.

  3. Compute the cost-function difference.

  4. Apply the Metropolis criterion. (Do not be intimidated by the name; the principle is very simple.)

  5. Lower the temperature.

  6. Check whether the temperature has dropped below the given threshold. If yes, stop; otherwise, go back to step 2.
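The six steps can be sketched end to end before diving into the details. The following is a minimal, illustrative Python sketch (the full MATLAB program appears later in this post); the names `tsp_sa` and `tour_len` and the parameter defaults are my own illustrative choices, not the author's:

```python
import math
import random

def tsp_sa(coords, T0=1000.0, Tend=1e-4, q=0.9, chain_len=50):
    """Minimal simulated annealing for the TSP, following steps 1-6 above."""
    n = len(coords)

    def tour_len(tour):
        # total length of the closed tour
        return sum(math.hypot(coords[tour[i]][0] - coords[tour[(i + 1) % n]][0],
                              coords[tour[i]][1] - coords[tour[(i + 1) % n]][1])
                   for i in range(n))

    # step 1: initial temperature, random initial solution, chain length
    cur = list(range(n))
    random.shuffle(cur)
    cur_len = tour_len(cur)
    best, best_len = cur[:], cur_len
    T = T0
    while T > Tend:                       # step 6: stop below the end temperature
        for _ in range(chain_len):        # several moves per temperature
            new = cur[:]                  # step 2: new solution by a random swap
            i, j = random.sample(range(n), 2)
            new[i], new[j] = new[j], new[i]
            new_len = tour_len(new)
            df = new_len - cur_len        # step 3: cost-function difference
            # step 4: Metropolis criterion
            if df < 0 or math.exp(-df / T) >= random.random():
                cur, cur_len = new, new_len
                if cur_len < best_len:
                    best, best_len = cur[:], cur_len
        T *= q                            # step 5: cooling
    return best, best_len

# usage on made-up random cities
random.seed(0)
cities = [(random.random(), random.random()) for _ in range(15)]
route, dist = tsp_sa(cities, chain_len=20)
```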

       Each step is explained in detail below.

 

 Setting the initial temperature, generating the initial solution, and setting the number of solutions per temperature:

  The initial temperature, the cooling coefficient, and the termination temperature are closely related: together they determine the number of SA iterations. The specific relation is: initial temperature × (cooling coefficient)^(number of iterations) ≈ termination temperature, where the cooling coefficient is a constant in (0, 1). For small-scale problems, the initial temperature is typically set to 1000, the termination temperature to 1e-4, and the cooling coefficient to 0.9. These parameters directly affect the Metropolis criterion.
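Plugging in the typical values above, the implied iteration count can be computed directly; a quick check in Python:

```python
import math

T0, Tend, q = 1000.0, 1e-4, 0.9   # typical small-problem values from the text
# T0 * q**k ≈ Tend  =>  k ≈ log(Tend / T0) / log(q)
iterations = math.ceil(math.log(Tend / T0) / math.log(q))
print(iterations)                 # number of cooling steps implied by the formula
```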

       The initial solution is generally generated at random; because SA depends only weakly on the starting point, the choice of initial solution matters little.

    Setting the number of solutions generated at each temperature increases the sample size per temperature and improves the robustness of the algorithm. The number of solutions per temperature is called the Metropolis chain length.

 Generating a new solution

  There are many ways to generate a new solution. The simplest is to generate one at random; alternatively, a new solution can be derived from the current one by some rule. The benefit of random generation is that it makes the global optimum easier to find; the benefit of a rule-based move is that, when the approximate location of the optimum is known, convergence may be faster.
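As a small illustration of the simplest move, here is a random two-position swap in Python (the function name `new_solution` is my own choice):

```python
import random

def new_solution(tour):
    """Neighbor generated by swapping two randomly chosen positions."""
    i, j = random.sample(range(len(tour)), 2)   # two distinct positions
    neighbor = tour[:]                          # copy; keep the current solution intact
    neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
    return neighbor
```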

Computing the cost-function difference

  Computing the cost-function difference means taking the difference between the new solution's cost and the current solution's cost, giving the increment df.
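For the TSP the cost function is the total tour length, so df is simply the difference of two tour lengths. A Python sketch (the unit-square coordinates are made-up illustrative data):

```python
import math

def tour_length(tour, coords):
    """Total closed-tour length for the given visiting order."""
    n = len(tour)
    return sum(math.hypot(coords[tour[i]][0] - coords[tour[(i + 1) % n]][0],
                          coords[tour[i]][1] - coords[tour[(i + 1) % n]][1])
               for i in range(n))

# made-up data: four cities on the corners of the unit square
coords = [(0, 0), (0, 1), (1, 1), (1, 0)]
# df > 0 means the new tour is longer (worse) than the current one
df = tour_length([0, 2, 1, 3], coords) - tour_length([0, 1, 2, 3], coords)
```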

The Metropolis criterion

  The acceptance probability is as follows:

    P = 1,            if df < 0
    P = exp(-df / T), if df ≥ 0

  In other words, when the cost difference is less than 0 (i.e., the new solution has a smaller value than the current one), the new solution is always accepted and becomes the current solution; when the difference is not less than 0, the new solution is accepted with probability P. It can be seen that the higher the temperature, the higher the acceptance rate of inferior solutions, in other words, the stronger the global search ability. So the initial temperature should be set relatively high, to avoid falling into a local optimum.
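The criterion fits in a few lines. A Python sketch (the name `metropolis` is my own):

```python
import math
import random

def metropolis(df, T):
    """Accept an improving move always; a worsening move with probability exp(-df/T)."""
    if df < 0:
        return True
    return math.exp(-df / T) >= random.random()
```

At high temperature a worsening move of a given size is almost always accepted; at low temperature it is almost always rejected.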

Lowering the temperature

  The cooling method is simple: new temperature = current temperature × cooling coefficient. From this you can see that the number of iterations is related to both the initial temperature and the cooling coefficient; note, however, that the initial temperature also enters the Metropolis criterion, so if the goal is to increase the number of iterations, raising the cooling coefficient is the more appropriate choice. Finally, check whether the current temperature has fallen below the stop temperature. The above is my personal understanding of the simulated annealing flow; next, a program is designed to solve the TSP.
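The point about preferring a larger cooling coefficient over a larger initial temperature can be checked numerically: raising q from 0.90 to 0.98 multiplies the step count several times without altering the acceptance probability at any given temperature. A Python sketch (`cooling_steps` is my own helper):

```python
def cooling_steps(T0, Tend, q):
    """Number of geometric cooling steps until the temperature drops below Tend."""
    steps = 0
    T = T0
    while T > Tend:
        T *= q          # geometric cooling: new temperature = current * q
        steps += 1
    return steps

print(cooling_steps(1000.0, 1e-4, 0.90))   # coarser schedule
print(cooling_steps(1000.0, 1e-4, 0.98))   # finer schedule, many more iterations
```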

 

Programming

% simulated annealing algorithm
% for the TSP problem
clc, clear

T0 = 500;                  % initial temperature
Tend = 1e-4;               % termination temperature
L = 200;                   % chain length at each temperature
q = 0.98;                  % cooling coefficient
cityNum = 30;              % number of cities
city = rand(cityNum, 2);   % randomly generated city coordinates

%%%%%%%%%%%%%% main %%%%%%%%%%%%%%%%%%%%
D = Distance(city);
N = cityNum;
S1 = randperm(N);          % generate a random initial solution
% solve T0 * q^x = Tend for x to get the number of iterations
Time = ceil(double(solve([num2str(T0), '*(', num2str(q), ')^x=', num2str(Tend)])));
count = 0;                 % iteration counter
Obj = zeros(Time, 1);      % best path length of each generation
track = zeros(Time, N);    % best solution of each generation
% iterate
while T0 > Tend
    count = count + 1;
    temp = zeros(L, N + 1);
    for k = 1:L
        % generate L new solutions at this temperature
        S2 = newSolution(S1);
        [S1, R] = Metropolis(S1, S2, D, T0);
        temp(k, :) = [S1 R];
    end
    % record the best route of this iteration
    [d0, index] = min(temp(:, end));
    if count == 1 || d0 < Obj(count - 1)
        Obj(count) = d0;
    else
        Obj(count) = Obj(count - 1);
    end
    track(count, :) = temp(index, 1:end-1);
    T0 = q * T0;           % cooling
end
fprintf('iterations: %d\n', count);
fprintf('shortest path: %f\n', Obj(count));
% convergence plot
figure
plot(1:count, Obj(1:count));
xlabel('iteration number');
ylabel('distance');
title('optimization process');
DrawPath(track(count, :), city);


%%%%%%%%%%%%%%%%% functions, five in total %%%%%%%%%%%%%%%%%%%%%

% function: compute the distance matrix
% input: city coordinates   output: distance matrix
function D = Distance(a)
    r = size(a, 1);
    D = zeros(r, r);
    for i = 1:r
        for j = i+1:r
            D(i, j) = sqrt((a(i,1) - a(j,1))^2 + (a(i,2) - a(j,2))^2);
            D(j, i) = D(i, j);
        end
    end
end

% function: generate a new solution
% input: old solution   output: new solution
% strategy: randomly pick two positions and swap them
function S2 = newSolution(S1)
    N = length(S1);
    S2 = S1;
    a = round(rand(1, 2) * (N - 1) + 1);   % two random positions to swap
    temp = S2(a(1));
    S2(a(1)) = S2(a(2));
    S2(a(2)) = temp;
end

% function: Metropolis criterion
% input: old solution, new solution, distance matrix, current temperature
% output: accepted solution and its path length
function [S, R] = Metropolis(S1, S2, D, T)
    R1 = pathLength(D, S1);
    R2 = pathLength(D, S2);
    dT = R2 - R1;
    if dT < 0
        S = S2;
        R = R2;
    elseif exp(-dT / T) >= rand
        S = S2;
        R = R2;
    else
        S = S1;
        R = R1;
    end
end

% function: draw the route
% input: best route, city coordinates   output: route figure
function DrawPath(bestTrack, city)
    N = size(bestTrack, 2);
    cityArray = zeros(N, 2);
    for i = 1:N
        cityArray(i, 1) = city(bestTrack(i), 1);
        cityArray(i, 2) = city(bestTrack(i), 2);
    end
    figure;
    hold on
    plot(cityArray(:,1), cityArray(:,2), '-o', 'color', [0.5 0.5 0.5]);
    plot([cityArray(1,1), cityArray(end,1)], [cityArray(1,2), cityArray(end,2)], ...
        '-', 'color', [0.5 0.5 0.5]);   % close the tour
    hold off
    xlabel('x coordinate');
    ylabel('y coordinate');
    title('TSP route');
    box on
end

% function: compute the route length
% input: distance matrix, visiting order   output: path length
function len = pathLength(D, S)
    [~, c] = size(D);
    NIND = size(S, 1);
    len = zeros(NIND, 1);
    for i = 1:NIND
        p = [S(i, :) S(i, 1)];                   % close the tour
        i1 = p(1:end-1);
        i2 = p(2:end);
        len(i, 1) = sum(D((i1 - 1) * c + i2));   % linear indexing into D
    end
end

  A note on the function newSolution: my strategy is simply to pick two positions at random and swap them. You can certainly use other strategies to generate new solutions; I was too lazy to come up with a better move here.
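One common alternative move (not used in the program above; my own suggestion) is the 2-opt-style segment reversal, which tends to untangle crossing edges faster than a plain swap. A Python sketch (`two_opt_neighbor` is my own name):

```python
import random

def two_opt_neighbor(tour):
    """Alternative neighbor move: reverse a randomly chosen segment (2-opt style)."""
    i, j = sorted(random.sample(range(len(tour)), 2))   # segment endpoints, i < j
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
```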

 

Program results

[Figure: convergence curve of the shortest distance over iterations]

[Figure: the best route found]

   Finally, in my tests with the number of cities raised to 50, both the number of iterations and the initial temperature had to be increased to obtain a good solution. Of course, my way of generating new solutions is also fairly simple; this is another aspect where better ideas could yield better solutions.

 

 

References

[1] Yu Lei, Shi Feng, Wang Hui, et al., MATLAB Intelligent Algorithms: 30 Case Studies (2nd Edition), 2015.


Origin: www.cnblogs.com/wenyehousheng/p/11486598.html