Tip: the MATLAB code and experimental data are attached at the end of the article. Your suggestions after reading are very welcome. Thank you in advance!
Preface
The classical simulated annealing algorithm is an intelligent optimization algorithm with strong robustness, few parameters to tune, and high fault tolerance. When applied to the TSP, however, it is criticized for producing too many invalid generations and for the poor stability of its solutions across repeated runs. What follows is an improvement of the simulated annealing algorithm for solving the TSP.
Tip: the following is the main content of this article
1, The shortcomings of the classical simulated annealing algorithm in solving the TSP
1. Slow optimization speed
The algorithm usually starts from an initial temperature T (typically 100*n, where n is the number of cities). Its termination condition is that T, decreased through repeated iterations, falls below a small threshold. This is also what makes the algorithm time-consuming.
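To see where the time goes, count how many cooling steps a geometric schedule needs before T drops below the threshold. A minimal Python sketch (the 0.99 cooling factor and 0.001 threshold are the values used in the code later in this article; n = 32 matches the dsj32 instance):

```python
import math

def generations_until_cold(n, K=0.99, T_end=0.001):
    """Count cooling steps until T = 100*n decays below T_end under T <- K*T."""
    T = 100 * n
    gens = 0
    while T > T_end:
        T *= K
        gens += 1
    return gens

# Closed form: smallest g with 100*n * K**g <= T_end
closed = math.ceil(math.log((100 * 32) / 0.001) / math.log(1 / 0.99))
print(generations_until_cold(32), closed)  # both 1491
```

The ~1491 generations this predicts for n = 32 sits squarely inside the 1400 < gen < 1550 range reported in the experimental data below.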
2. Too many invalid generations
A so-called invalid generation is a generation whose result is essentially unchanged (by whatever tolerance you define). Taking the stopping condition T > 0.001 as an example, the classical simulated annealing algorithm can spend nearly a thousand invalid generations, which is a waste of time.
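The invalid generations are easy to explain: once T is small, the Metropolis acceptance probability exp(-delta/T) for any uphill move is effectively zero, so the late generations change almost nothing. A short Python illustration (the tour-length change delta = 1000 is an arbitrary example value, not taken from the experiments):

```python
import math

def accept_prob(delta, T):
    """Metropolis criterion: always accept improvements, else exp(-delta/T)."""
    return 1.0 if delta < 0 else math.exp(-delta / T)

# Early on (T = 100*32 = 3200) an uphill move of 1000 is often accepted...
print(accept_prob(1000, 3200))  # ~0.73
# ...but near the stopping threshold it is astronomically unlikely,
# so the state effectively freezes while the loop keeps running.
print(accept_prob(1000, 1.0))   # underflows to 0.0
```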
2, Solution
The remedy is to bring the algorithm to its end sooner: lengthen the Markov chain and change the rate at which the temperature decreases. Stepped cooling speeds up termination, greatly improving the speed of the algorithm while also improving the accuracy of the solution, so that the expectation, variance, and extrema over ten batches of runs become more stable.
1. Experimental code
The code for the first-generation improvement is as follows:
```matlab
%%% Solving the TSP example with the most preliminary improved simulated annealing %%%
% The following is the original example after modification
close all; clear; clc;
tic
C = read_location('dsj32.text');   %% changed
n = size(C,1);
L = 40*ceil(n*0.1);   %% the Markov-chain length adapts to the size of the input data
D = mat_distance(C,n);   %% changed: precompute distances to simplify the inner loop
% for gen = 1:10
%     result = zeros(1,100);   %% test loop
%     for y = 1:100
K = 0.99;
T = 100*n;
% clear len;
for i = 1:n
    city(i) = i;   %% changed
end
l = 1;
len(l) = func(city, D, n);   %% changed
% figure(1);
while T > 0.001
    for i = 1:L
        len1 = func(city, D, n);   %% changed
        p1 = floor(1+n*rand());
        p2 = floor(1+n*rand());
        while p1 == p2
            p1 = floor(1+n*rand());
            p2 = floor(1+n*rand());
        end
        tmp_city = city;
        tmp = tmp_city(p1);
        tmp_city(p1) = tmp_city(p2);
        tmp_city(p2) = tmp;
        len2 = func(tmp_city, D, n);   %% changed
        delta_e = len2 - len1;
        if delta_e < 0
            city = tmp_city;
        else
            if exp(-delta_e/T) > rand()
                city = tmp_city;
            end
        end
    end
    l = l+1;
    len(l) = func(city, D, n);   %% changed
    T = T*K;
    % for i = 1:n-1
    %     plot([C(city(i),1),C(city(i+1),1)], [C(city(i),2),C(city(i+1),2)], 'bo-');   %% changed
    %     hold on;
    % end
    % plot([C(city(n),1),C(city(1),1)], [C(city(n),2),C(city(1),2)], 'ro-');   %% changed
    % hold off;
    % pause(0.01);
end
% figure(2);
% plot(len)
% xlabel('Number of iterations')
% ylabel('Objective function value')
% title('Fitness evolution curve')
min(len)
toc
%         result(y) = min(len);   % record the experimental result
%     end   % end of one experiment batch
%     End_result = [mean(result), median(result), min(result), max(result), std(result)];
%     pFile = fopen('SA_tsp.txt','a+');
%     if pFile == -1   % fopen returns -1 on failure
%         sprintf('Open_files_error!');
%         return
%     end
%     fprintf(pFile,'%.0f %.0f %.0f %.0f %.4f \n', End_result(1), End_result(2), End_result(3), End_result(4), End_result(5));
%     fclose(pFile);
% end
```
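The listing relies on helper functions that are only in the attachment (read_location, mat_distance, and func, the tour-length objective). For readers without the attachment, here is a hedged Python equivalent of the two computational helpers; their names and behavior are inferred from how they are called, not copied from the attached files:

```python
import math

def mat_distance(C, n):
    """Precompute the n x n Euclidean distance matrix from city coordinates C."""
    return [[math.dist(C[i], C[j]) for j in range(n)] for i in range(n)]

def func(city, D, n):
    """Tour length of the closed route visiting cities in the order `city`."""
    total = sum(D[city[i]][city[i + 1]] for i in range(n - 1))
    return total + D[city[n - 1]][city[0]]  # edge back to the start

# Sanity check on a unit square: the perimeter tour has length 4.
C = [(0, 0), (0, 1), (1, 1), (1, 0)]
D = mat_distance(C, 4)
print(func([0, 1, 2, 3], D, 4))  # 4.0
```

(Python indexes from 0 where the MATLAB code indexes from 1; the structure is otherwise the same.)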
The experimental data for the first-generation improvement are as follows:
Single-point perturbation; no variable-speed annealing; no early-termination check
Number of experiments: 100 per group; number of location points: 32
Number of iterations: 1400 < gen < 1550; average time per run: 0.272 s
Mean      Median      Min      Max      Std. dev.
566173 561349 498613 664315 32937.0719
567024 559306 489836 673330 37308.1659
568564 564743 494002 663386 38425.2259
572630 566098 509389 691034 35376.3587
574995 570499 506533 680560 34439.1190
562617 560192 497545 659842 31493.1453
564005 557556 498730 662859 37184.1595
571313 567119 510284 669393 33314.2090
566368 562717 500297 684307 36246.0635
568557 564762 498730 692183 38245.2470
Single-point perturbation; no variable-speed annealing; no early-termination check
Number of experiments: 100 per group; number of location points: 64
Number of iterations: 1500 < gen < 1600; average time per run: 0.723 s
Mean      Median      Min      Max      Std. dev.
846473 846076 744589 980017 44072.5765
841996 842793 737311 952923 43199.4095
847630 841254 754021 981148 46654.6824
841312 838790 751072 946979 42931.0817
838554 834583 754667 990047 45172.0612
854117 850099 763757 951398 47274.6587
831714 828882 744542 941534 41155.0120
835390 835150 747665 985258 49525.0937
838086 834468 749835 944272 44002.4997
850357 840045 762102 982172 47294.8956
Single-point perturbation; no variable-speed annealing; no early-termination check
Number of experiments: 100 per group; number of location points: 128
Number of iterations: 1550 < gen < 1700; average time per run: 2.27 s
Mean      Median      Min      Max      Std. dev.
1225100 1225461 1100145 1356212 55245.5015
1225893 1218932 1103113 1445951 54925.9379
1214384 1204214 1075619 1403661 57908.2771
1232958 1234479 1117244 1353722 47298.9318
1237550 1238951 1064543 1366085 63151.2416
1219350 1216716 1077357 1375301 61219.1011
1226711 1225557 1081612 1431949 64763.5568
1234622 1224319 1076702 1414355 66330.6193
1229287 1226389 1135880 1374757 48683.7266
1218712 1209982 1124012 1407612 53705.8428
The code for the best-version improvement is as follows:
```matlab
close all; clear; clc;
tic
C = read_location('dsj32.text');
n = size(C,1);
L = 40*ceil(n*0.1);
D = mat_distance(C, n);
delay_idx = ceil(n*0.03);   % generations to wait between cooling-factor updates
Chaos_idx = ceil(n*0.4);    % generation at which cooling starts
% for gen = 1:10
%     result = zeros(1,100);
%     for y = 1:100
T = 100*n;
count = 0;
pan = 1;
K = 0.99;
l = 1;
Iter = 0;
% clear len;
record = ones(1,50);
for i = 1:n
    city(i) = i;
end
len(l) = func(city, D, n);
while pan == 1
    for i = 1:L
        len1 = func(city, D, n);
        p1 = floor(1+n*rand());
        p2 = floor(1+n*rand());
        while p1 == p2
            p1 = floor(1+n*rand());
            p2 = floor(1+n*rand());
        end
        tmp_city = city;
        tmp = tmp_city(p1);
        tmp_city(p1) = tmp_city(p2);
        tmp_city(p2) = tmp;
        len2 = func(tmp_city, D, n);
        delta_e = len2 - len1;
        if delta_e < 0
            city = tmp_city;
        else
            if exp(-delta_e/T) > rand()
                city = tmp_city;
            end
        end
    end
    l = l+1;
    len(l) = func(city, D, n);
    %%%%%% Variable-speed cooling %%%%%%
    if l >= Chaos_idx
        if count == delay_idx
            if K > 0.9
                K = 1 - abs(log(K));   % starting from 0.99, the first 179 values stay above 0.9; without the clamp, values 196-197 straddle zero
            else
                K = 0.9;   % safety clamp to stop K from decreasing without bound
            end
            count = 0;
        end
        T = T*K;
        count = count + 1;
    end
    %%%%%% Early-termination check %%%%%%
    Iter = Iter + 1;
    record(Iter) = len(l);
    if mod(l,50) == 0
        Iter = 0;
    end
    if l > 100
        if ( record(randi([31,40])) - record(randi([41,50])) == 0 || ...
             record(randi([1,10]))  - record(randi([11,20])) == 0 ) && ...
             record(randi([1,10]))  - record(randi([41,50])) == 0
            pan = 0;
        end
    end
end   % end of one run
% figure(1);
% plot(len)
% xlabel('Number of iterations')
% ylabel('Objective function value')
% title('Fitness evolution curve')
min(len)
%         result(y) = min(len);   % record the experimental result
%     end   % end of experiment
%     End_result = [mean(result), median(result), min(result), max(result), std(result)];
%     pFile = fopen('result_tsp_plus_2_32.txt','a+');
%     if pFile == -1   % fopen returns -1 on failure
%         sprintf('Open_files_error!');
%         return
%     end
%     fprintf(pFile,'%.0f %.0f %.0f %.0f %.4f \n', End_result(1), End_result(2), End_result(3), End_result(4), End_result(5));
%     fclose(pFile);
% end
toc
```
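The unusual update K = 1 - abs(log(K)) deserves a closer look: starting from 0.99, each application nudges K slightly lower, slowly at first and then faster, which is what the code comment about the first 179 values staying above 0.9 refers to; the clamp at 0.9 then keeps the schedule from collapsing. A Python sketch of just that recurrence (the step count of 250 is arbitrary, chosen to run past the clamp):

```python
import math

def cooling_factor_trace(K=0.99, floor=0.9, steps=250):
    """Iterate K <- 1 - |log K| as in the MATLAB code, clamping at `floor`."""
    trace = []
    for _ in range(steps):
        K = 1 - abs(math.log(K)) if K > floor else floor
        trace.append(K)
    return trace

trace = cooling_factor_trace()
print(sum(k > 0.9 for k in trace))  # number of updates before K reaches the clamp
```

The early-termination check, by contrast, simply samples entries from the last 50 recorded tour lengths; if randomly chosen early and late entries coincide, the search is deemed frozen and the run stops.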
The experimental data for the best-version improvement are as follows:
Single-point perturbation; variable-speed annealing (delay index: ceil(n*0.03)); with early-termination check
Number of experiments: 100 per group; number of location points: 32
Number of iterations: 150 < gen < 220; average time per run: 0.027 s
Mean      Median      Min      Max      Std. dev.
566165 565087 500297 692471 35380.0116
565677 560941 485409 663355 36670.1092
562519 559887 485225 671126 33646.0139
564830 562887 487063 655802 34553.1247
566538 558890 506121 654428 34557.2271
565394 559291 500911 685008 34606.4036
563719 566444 491807 678816 33781.5247
570993 565151 505156 676283 35252.6242
569010 561412 493222 667139 37338.7443
563925 560576 486350 662886 36849.2120
Single-point perturbation; variable-speed annealing (delay index: ceil(n*0.03)); with early-termination check
Number of experiments: 100 per group; number of location points: 64
Number of iterations: 220 < gen < 310; average time per run: 0.122 s
Mean      Median      Min      Max      Std. dev.
835042 830176 738481 918978 41547.5318
828589 822509 743191 939677 40363.5212
833490 833704 732472 932675 42346.8312
829592 831291 735157 942589 40438.3012
832698 829725 730337 956147 42701.8277
828210 825905 746766 941774 41221.3060
834143 831956 733450 955474 43492.0436
824923 824682 740975 924024 38341.2791
828110 828581 718824 943491 38189.2520
831634 828052 729814 946426 40390.8248
Single-point perturbation; variable-speed annealing (delay index: ceil(n*0.03)); with early-termination check
Number of experiments: 100 per group; number of location points: 128
Number of iterations: 330 < gen < 420; average time per run: 0.602 s
Mean      Median      Min      Max      Std. dev.
1216339 1218170 1116461 1357596 52514.7934
1221005 1221660 1066876 1347514 53025.7216
1215320 1215239 1099742 1376208 53300.8834
1219961 1216238 1083560 1358916 54055.5805
1227963 1228090 1098800 1367395 54136.2007
1223414 1219345 1101338 1383174 62595.5185
1223857 1220346 1089884 1387819 65265.6193
1212248 1210289 1073372 1336598 56157.3665
1218565 1219338 1063221 1317497 49642.5716
1218159 1209786 1100458 1440181 68872.1991
Summary
With these improvements, for the same number of location points, the algorithm not only keeps the expected value of the solution stable but also improves the accuracy of the solution, while cutting the run time sharply (e.g. from 0.272 s to 0.027 s per run at 32 points, per the tables above).