Dual Annealing Optimization With Python
[Note: I am a great fan of Jason Brownlee, PhD's articles, so I do some translation and learning practice in my spare time. This is the practice record of that work, and I hope it helps people in need!]
Dual annealing is a stochastic global optimization algorithm. It is an implementation of the generalized simulated annealing algorithm, an extension of simulated annealing. It is also paired with a local search algorithm that is automatically executed at the end of the simulated annealing procedure. This combination of effective global and local search procedures provides a powerful algorithm for challenging nonlinear optimization problems.
In this tutorial, you will discover the dual annealing global optimization algorithm. After completing this tutorial, you will know:
- Dual annealing optimization is a global optimization algorithm that is a modified version of simulated annealing and also makes use of a local search algorithm.
- How to use the dual annealing optimization algorithm API in Python.
- An example of using dual annealing to solve a global optimization problem with multiple optima.
Tutorial overview
This tutorial is divided into three parts; they are:

1. What is dual annealing
2. Dual annealing API
3. Dual annealing example
What is dual annealing
Dual annealing is a global optimization algorithm. As such, it is designed for objective functions with a nonlinear response surface. It is a stochastic optimization algorithm, meaning that it makes use of randomness during the search, and each run of the search may find a different solution.
Dual annealing is based on the simulated annealing optimization algorithm.
Simulated annealing is a stochastic hill climbing algorithm in which a candidate solution is modified in a random way, and the modified solution is accepted to replace the current candidate solution probabilistically. This means that it is possible for a worse solution to replace the current candidate solution. The probability of this kind of replacement is high at the beginning of the search and decreases with each iteration, controlled by the "temperature" hyperparameter.
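To make that acceptance rule concrete, below is a minimal sketch of simulated annealing on a toy one-dimensional objective; the objective, step size, and cooling schedule are illustrative assumptions, not the internals of SciPy's implementation.

# minimal sketch of simulated annealing's probabilistic acceptance (illustrative only)
from numpy.random import rand, randn
from numpy import exp

def objective(x):
    return x ** 2.0  # toy one-dimensional objective (assumption for this sketch)

best = 10.0 * randn()  # random starting point
best_eval = objective(best)
curr, curr_eval = best, best_eval
for i in range(1000):
    temperature = 10.0 / (i + 1)      # temperature decreases with each iteration
    candidate = curr + 0.1 * randn()  # random modification of the current solution
    candidate_eval = objective(candidate)
    # a worse solution may still be accepted, with a probability that shrinks as temperature drops
    if candidate_eval < curr_eval or rand() < exp(-(candidate_eval - curr_eval) / temperature):
        curr, curr_eval = candidate, candidate_eval
    if curr_eval < best_eval:
        best, best_eval = curr, curr_eval
print('Best: f(%.5f) = %.5f' % (best, best_eval))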
Dual annealing is an implementation of the generalized simulated annealing (GSA) algorithm described in the 1997 paper "Generalized Simulated Annealing Algorithm and Its Application to the Thomson Model." It combines the annealing schedule (the rate at which the temperature decreases over algorithm iterations) from "fast simulated annealing" (FSA) with the probabilistic acceptance of an alternative statistical procedure, "Tsallis statistics," named after its author. Experimental results suggest that this generalized simulated annealing algorithm performs better than the classical (CSA) and fast (FSA) versions it was compared against.
In addition to these modifications of simulated annealing, a local search algorithm can be applied to the solution found by the simulated annealing search. This is desirable because global search algorithms are often good at locating the basin (an area of the search space) that contains the optimal solution, but are often poor at finding the best solution within that basin. Local search algorithms are good at finding the optimum of a basin.
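As a rough illustration of this division of labor, the sketch below polishes a candidate solution with SciPy's general-purpose minimize() local search; the toy objective and starting point are assumptions for the example, not part of dual annealing itself.

# sketch: polishing a candidate found by a global search with a local search
from scipy.optimize import minimize

def objective(x):
    return x[0] ** 2.0 + x[1] ** 2.0  # toy objective with its optimum at [0, 0] (assumption)

candidate = [0.5, -0.3]  # pretend this point was located by a global search
result = minimize(objective, candidate, method='L-BFGS-B')
print('Polished solution: f(%s) = %.5f' % (result.x, result.fun))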
Pairing a local search with the simulated annealing procedure ensures that the search gets the most out of the candidate solutions it locates. Now that we are familiar with the dual annealing algorithm at a high level, let's look at the API for dual annealing in Python.
Dual annealing API
The dual annealing global optimization algorithm is available in Python via the dual_annealing() SciPy function. The function takes the name of the objective function and the bounds of each input variable as minimum arguments for the search.
# perform the dual annealing search
result = dual_annealing(objective, bounds)
There are a number of additional hyperparameters for the search that have default values, although you can configure them to customize the search.

The "maxiter" argument specifies the total number of iterations of the algorithm (not the total number of function evaluations) and defaults to 1,000 iterations. The "maxfun" argument can be specified to limit the total number of function evaluations and defaults to 10 million.

The initial temperature of the search is specified by the "initial_temp" argument, which defaults to 5,230. The annealing process will be restarted once the temperature reaches a value equal to or less than (initial_temp * restart_temp_ratio). The ratio defaults to a very small number, 2e-05 (i.e. 0.00002), so the default trigger for re-annealing is a temperature of (5230 * 0.00002), or 0.1046.

The algorithm also provides control over hyperparameters specific to the generalized simulated annealing on which it is based. This includes how far jumps can be made during the search via the "visit" argument, which defaults to 2.62 (values less than 3 are preferred), and the "accept" argument that controls the likelihood of accepting a new solution, which defaults to -5.0.

The minimize() function is called for the local search with default hyperparameters. The local search can be configured by providing a dictionary of hyperparameter names and values to the "local_search_options" argument.
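As a sketch, the call below sets these hyperparameters explicitly, using their documented default values so the behavior is unchanged; note that recent SciPy releases renamed "local_search_options" to "minimizer_kwargs", so check the documentation for your installed version.

# sketch: configuring the dual annealing hyperparameters (values shown are the defaults)
result = dual_annealing(objective, bounds,
                        maxiter=1000,              # total algorithm iterations
                        maxfun=1e7,                # cap on total function evaluations
                        initial_temp=5230.0,       # starting temperature
                        restart_temp_ratio=2e-05,  # re-annealing trigger ratio
                        visit=2.62,                # controls how far jumps can be made
                        accept=-5.0)               # controls acceptance probability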
The local search component of the search can be disabled by setting the "no_local_search" argument to True. The result of the search is an OptimizeResult object whose properties can be accessed like a dictionary. The success (or failure) of the search can be accessed via the "success" or "message" keys. The total number of function evaluations can be accessed via "nfev", and the best input found by the search via the "x" key.
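For example, assuming an objective function and bounds are already defined, a run with the local search disabled and the result fields inspected might look like this:

# sketch: disable the local search and inspect the OptimizeResult
result = dual_annealing(objective, bounds, no_local_search=True)
print('Success : %s' % result['success'])   # whether the search completed successfully
print('Message : %s' % result['message'])   # human-readable status of the search
print('Evaluations : %d' % result['nfev'])  # total number of function evaluations
print('Best input : %s' % result['x'])      # best input found by the search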
Now that we are familiar with the dual annealing API in Python, let's look at some working examples.
Dual annealing example
In this section, we will look at an example of using the dual annealing algorithm on a multimodal objective function. The Ackley function is an example of a multimodal objective function; it has a single global optimum and multiple local optima in which a local search might get stuck.
As such, a global optimization technique is required. The Ackley function is a two-dimensional objective function with a global optimum at [0,0], which evaluates to 0.0. The example below implements Ackley and creates a three-dimensional surface plot showing the global optimum and the multiple local optima.
# ackley multimodal function
from numpy import arange
from numpy import exp
from numpy import sqrt
from numpy import cos
from numpy import e
from numpy import pi
from numpy import meshgrid
from matplotlib import pyplot

# objective function
def objective(x, y):
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

# define range for input
r_min, r_max = -5.0, 5.0
# sample input range uniformly at 0.1 increments
xaxis = arange(r_min, r_max, 0.1)
yaxis = arange(r_min, r_max, 0.1)
# create a mesh from the axis
x, y = meshgrid(xaxis, yaxis)
# compute targets
results = objective(x, y)
# create a surface plot with the jet color scheme
figure = pyplot.figure()
# note: figure.gca(projection='3d') was removed in recent matplotlib; add_subplot is the current idiom
axis = figure.add_subplot(projection='3d')
axis.plot_surface(x, y, results, cmap='jet')
# show the plot
pyplot.show()
Running the example creates a surface plot of the Ackley function showing the large number of local optima.
We can apply the dual annealing algorithm to the Ackley objective function. First, we can define the bounds of the search space as the limits of the function in each dimension.
# define the bounds on the search
bounds = [[r_min, r_max], [r_min, r_max]]
We can then apply the search by specifying the name of the objective function and the bounds of the search. In this case, we will use the default hyperparameters.
# perform the dual annealing search
result = dual_annealing(objective, bounds)
When the search is complete, it reports the status of the search and the number of function evaluations performed, as well as the best result found and its evaluation.
# summarize the result
print('Status : %s' % result['message'])
print('Total Evaluations: %d' % result['nfev'])
# evaluate solution
solution = result['x']
evaluation = objective(solution)
print('Solution: f(%s) = %.5f' % (solution, evaluation))
Tying this together, the complete example of applying dual annealing to the Ackley objective function is listed below.
# dual annealing global optimization for the ackley multimodal objective function
from scipy.optimize import dual_annealing
from numpy import exp
from numpy import sqrt
from numpy import cos
from numpy import e
from numpy import pi

# objective function
def objective(v):
    x, y = v
    return -20.0 * exp(-0.2 * sqrt(0.5 * (x**2 + y**2))) - exp(0.5 * (cos(2 * pi * x) + cos(2 * pi * y))) + e + 20

# define range for input
r_min, r_max = -5.0, 5.0
# define the bounds on the search
bounds = [[r_min, r_max], [r_min, r_max]]
# perform the dual annealing search
result = dual_annealing(objective, bounds)
# summarize the result
print('Status : %s' % result['message'])
print('Total Evaluations: %d' % result['nfev'])
# evaluate solution
solution = result['x']
evaluation = objective(solution)
print('Solution: f(%s) = %.5f' % (solution, evaluation))
Running the example performs the optimization and then reports the results.
Note: Your results may vary given the stochastic nature of the algorithm or evaluation procedure, or differences in numerical precision. Consider running the example a few times and comparing the average outcome.
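If you need repeatable runs while experimenting, dual_annealing() accepts a "seed" argument (in the SciPy versions current when this article was written) that fixes the pseudorandom number generator, for example:

# fix the pseudorandom seed so repeated runs produce the same result
result = dual_annealing(objective, bounds, seed=1)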
In this case, we can see that the algorithm located the optimum with inputs very close to zero and an objective function evaluation that is practically zero.
We can see that a total of 4136 function evaluations were performed.
Status : ['Maximum number of iteration reached']
Total Evaluations: 4136
Solution: f([-2.26474440e-09 -8.28465933e-09]) = 0.00000