Optimization algorithm notes

My personal understanding of optimization algorithms: there is a system F, and we want to find the best parameter X so that F(X) runs in the best possible state.

First of all, there are many possible values of this X. We could try each one in the system, but that is unnecessary. It is like a man looking for a wife: he wants to find the wife X most suitable for him, so that the family F runs better. This is an optimization process, and the set of all women in the world is the search domain. He cannot try marrying every woman in the world; an exhaustive search would certainly find the global optimum, but it is not necessary. Instead, he carries a scale in his heart for what kind of woman suits him. This is the evaluation function E, for example E = 0.5*goodness + 0.5*beauty, which lets him quickly eliminate unsuitable parameters X, such as X = [dirty, vicious]. Note that E and F are fundamentally different: as a system, F exists objectively; as an evaluation function, E is subjective.
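The evaluation function from the paragraph above can be made concrete. A minimal sketch, assuming hypothetical trait scores in [0, 1] and the equal weights 0.5/0.5 given in the text:

```python
def evaluate(goodness, beauty, w_good=0.5, w_beauty=0.5):
    """Subjective evaluation E = 0.5*goodness + 0.5*beauty from the note above."""
    return w_good * goodness + w_beauty * beauty

# A candidate scoring low on both traits is eliminated quickly,
# without ever running the (expensive) objective system F.
good_candidate = evaluate(0.9, 0.7)
bad_candidate = evaluate(0.1, 0.2)
```

The point of E is that it is cheap and subjective: it filters candidates before the objective system F is ever consulted.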

Genetic algorithm

(1) Initialize M candidate solutions X

(2) Evaluate these Xs, sort them, and select the top N as the candidate set

(3) Crossover: exchange genes between Xs in the candidate set to generate new Xs

(4) Mutation: randomly change genes of Xs in the candidate set to generate new Xs

(5) Return to (2). Each round is one generation; when the termination condition of N generations is reached, output the best X
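The five steps above can be sketched in a few lines. This is an illustrative minimisation example, not a full GA: real-valued genes, a hypothetical sphere objective `sum(x**2)`, uniform crossover, and Gaussian mutation are all assumptions for the sketch.

```python
import random

def genetic_search(evaluate, dim=2, pop_size=20, keep=10,
                   generations=50, sigma=0.3, seed=0):
    """Minimal real-valued GA: select top `keep`, crossover, mutate."""
    rng = random.Random(seed)
    # (1) initialize M candidate solutions X
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # (2) evaluate, sort, keep the best N as the candidate set
        pop.sort(key=evaluate)
        parents = pop[:keep]
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            # (3) crossover: exchange genes between two parents
            child = [a[i] if rng.random() < 0.5 else b[i] for i in range(dim)]
            # (4) mutation: randomly perturb one gene
            j = rng.randrange(dim)
            child[j] += rng.gauss(0, sigma)
            children.append(child)
        pop = parents + children
    # (5) after the last generation, output the best X
    return min(pop, key=evaluate)

best = genetic_search(lambda x: sum(v * v for v in x))
```

Keeping the parents unchanged each generation (elitism) guarantees the best X found never gets worse.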

 

Simulated annealing algorithm

(1) Initialize an X

(2) Add a random perturbation to X to generate X'

(3) Evaluate whether X' is better than X, i.e. compute ΔE = E(X') - E(X). If X' is better, replace X with X'; otherwise replace it only with a certain probability, accepting when

    exp(-ΔE/T) > Random(0,1)

where T is the current temperature.

(4) If the iteration limit at this temperature is reached and no better X' has been found, output X; otherwise lower T and continue iterating
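A minimal sketch of the procedure above, written for minimisation (so "better" means lower E, and a worse X' is accepted when exp(-ΔE/T) beats a uniform random draw). The sphere objective, Gaussian perturbation, and geometric cooling schedule are assumptions for illustration:

```python
import math
import random

def simulated_annealing(evaluate, x0, t0=1.0, cooling=0.95,
                        iters_per_temp=50, t_min=1e-3, seed=0):
    """Minimal simulated annealing for minimisation."""
    rng = random.Random(seed)
    x, e = x0, evaluate(x0)
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            # (2) random perturbation of X gives X'
            x_new = [v + rng.gauss(0, 0.5) for v in x]
            e_new = evaluate(x_new)
            delta = e_new - e  # deltaE = E(X') - E(X)
            # (3) accept if better; otherwise accept with prob exp(-deltaE/T)
            if delta < 0 or math.exp(-delta / t) > rng.random():
                x, e = x_new, e_new
        # (4) lower the temperature T and keep iterating
        t *= cooling
    return x

best = simulated_annealing(lambda x: sum(v * v for v in x), [3.0, -4.0])
```

At high T almost any move is accepted (broad exploration); as T shrinks, the acceptance probability for worse moves vanishes and the search settles into a local optimum.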

 

Particle swarm optimization algorithm

(1) Initialize M candidate solutions X

(2) Evaluate these Xs, recording each particle's personal best pBest and the swarm's global best gBest

(3) Update each X's velocity by V = w*V + c1*Random(0,1)*(pBest - X) + c2*Random(0,1)*(gBest - X), then move the particle: X = X + V

(4) Repeat (2)-(3) until the iteration limit; eventually all Xs tend toward the same point
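The velocity formula above drives this sketch. Again a minimisation example with an assumed sphere objective; the coefficients w=0.7, c1=c2=1.5 are common illustrative choices, not values from the text:

```python
import random

def pso(evaluate, dim=2, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization for minimisation."""
    rng = random.Random(seed)
    # (1) initialize M particles X with zero velocity
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vs = [[0.0] * dim for _ in range(swarm)]
    # (2) personal bests pBest and the global best gBest
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=evaluate)[:]
    for _ in range(iters):
        for i, x in enumerate(xs):
            for d in range(dim):
                # (3) V = w*V + c1*r1*(pBest - X) + c2*r2*(gBest - X)
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - x[d])
                            + c2 * rng.random() * (gbest[d] - x[d]))
                x[d] += vs[i][d]  # X = X + V
            if evaluate(x) < evaluate(pbest[i]):
                pbest[i] = x[:]
        gbest = min(pbest, key=evaluate)[:]
    # (4) the swarm contracts toward gBest
    return gbest

best = pso(lambda x: sum(v * v for v in x))
```

The pBest term pulls each particle back to its own best memory, while the gBest term pulls the whole swarm together, which is why all Xs eventually converge to the same point.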

 

Differential evolution algorithm

(1) Initialize M candidate solutions X

(2) For each X, select three other Xs and mutate them with the formula X' = X1 + f*(X2 - X3)

(3) For each pair X and X', randomly cross their components to generate a new Xnew

(4) Evaluate X and Xnew, and replace X with Xnew if Xnew is better

(5) When the iteration limit is reached, output the best X
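The five steps above correspond to the classic DE/rand/1/bin scheme. A minimisation sketch under the same assumptions as before (sphere objective, real-valued vectors); f and the crossover rate cr are illustrative values:

```python
import random

def differential_evolution(evaluate, dim=2, pop_size=20, f=0.8, cr=0.9,
                           generations=100, seed=0):
    """Minimal DE/rand/1/bin for minimisation."""
    rng = random.Random(seed)
    # (1) initialize M candidate solutions X
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # (2) pick three other Xs and mutate: X' = X1 + f*(X2 - X3)
            i1, i2, i3 = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[i1][d] + f * (pop[i2][d] - pop[i3][d])
                      for d in range(dim)]
            # (3) randomly cross X and X' component-wise to build Xnew
            j_rand = rng.randrange(dim)  # guarantee at least one mutant gene
            trial = [mutant[d] if (rng.random() < cr or d == j_rand)
                     else pop[i][d] for d in range(dim)]
            # (4) greedy selection: keep Xnew only if it is better than X
            if evaluate(trial) < evaluate(pop[i]):
                pop[i] = trial
    # (5) after the last generation, output the best X
    return min(pop, key=evaluate)

best = differential_evolution(lambda x: sum(v * v for v in x))
```

Because the mutation step size f*(X2 - X3) is built from the population's own spread, DE automatically takes large steps early and small steps as the population converges.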

 

 

(There are too many algorithms to cover here; I will sort out the rest later. Feel free to leave a comment for discussion, and please correct me if there are any mistakes.)

Origin blog.csdn.net/XLcaoyi/article/details/107915110