On the "European" (Luck-Based) Algorithm: Simulated Annealing

When I first heard the term "annealing", I thought it sounded really (pretentiously) cool (A__CDEFG ...)

Then, after actually learning annealing, I found:

Annealing is not just cool-sounding, it's also extremely versatile

Even more universal than the almighty D(epth) F(irst) S(earch)

It's simply a godsend for cheesing partial credit

Brief introduction

For a computer algorithm, it surprisingly has a physics entry on Baidu!

[Baidu Baike entry: annealing]

At the time I was completely baffled: you call this a computer algorithm, so what does it have to do with metal smelting?

Then I looked at the algorithm entry...

[Baidu Baike entry: simulated annealing algorithm]

And got even more confused...

To make it easier to understand (or to confuse you further), I've copied the Baidu definition over:


Simulated Annealing Algorithm (SAA): According to the Metropolis criterion, the probability that particles reach equilibrium at temperature T is e^(-ΔE/(kT)), where E is the internal energy at temperature T, ΔE is its change, and k is Boltzmann's constant. Modeling a combinatorial optimization problem as the annealing of a solid, the internal energy E is mapped to the objective function value f and the temperature T becomes the control parameter t, which yields the simulated annealing algorithm for combinatorial optimization: starting from an initial solution i and an initial value of the control parameter t, repeat the iteration "generate a new solution → compute the difference in the objective function → accept or discard" on the current solution while gradually decaying the value of t; the current solution when the algorithm terminates is the obtained approximate optimal solution. This is a heuristic random search process based on the Monte Carlo iterative method. The annealing process is controlled by the Cooling Schedule, which includes the initial value t of the control parameter and its decay factor Δt, the number of iterations L at each value of t, and the stopping condition S.
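
Pulling the one actual formula out of that wall of text (rewritten here as a display equation just for readability), the Metropolis criterion boils down to an acceptance probability that depends on the energy change ΔE and the temperature:

$$
P(\text{accept}) =
\begin{cases}
1, & \Delta E < 0, \\
e^{-\Delta E/(kT)}, & \Delta E \ge 0.
\end{cases}
$$

In the algorithm, ΔE becomes the change in the objective function and kT is replaced by the current control temperature t; that is the only piece of physics you actually need.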

(If you're lazy, feel free to skip it; it's mostly here to pad the word count anyway)


To be honest, after reading it (?), it's actually fairly clear. In short:

Keep randomly generating new solutions and compare how far each new solution is from the current best answer; if the new solution is better than the current best, generate the following solutions in the neighborhood of that new solution. Through a huge number of iterations, the final answer is gradually guided toward the true optimal solution.

Starting to get a feel for it.

Put even more simply, it's this:

Casually guess an answer; if the guess feels good, keep guessing around it

As long as you guess enough times, you can always guess your way to the correct answer

...

(Actually this still doesn't seem to have much to do with metal annealing. If you're curious, go read the 2018 Luogu Daily article "On the Metaphysics Algorithm: Annealing".)

Implementation process

Since it's called annealing, there should at least be some fire involved, right?

Ok...

Sorry

There's really no fire here (although if you open up your computer case, you will find the fire sitting on your CPU)

Inside a computer, the annealing process mainly uses three parameters: the initial temperature T, the amount delta by which the temperature changes after each step, and the end temperature t. Here delta is a number slightly smaller than 1.
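
Just to show how the three parameters fit together, here is a minimal sketch in C++ (the variable names are my own, and the loop body that would actually do the guessing is left out):

#include <cstdio>

int main() {
    double T = 3000;            // initial temperature: how widely the first guesses spread
    const double delta = 0.996; // change factor, slightly smaller than 1
    const double t_end = 1e-15; // end temperature: the precision of the last guesses

    int steps = 0;
    while (T > t_end) { // one random guess would happen here per iteration
        T *= delta;     // the temperature shrinks a tiny bit every time
        ++steps;
    }
    printf("total random steps in one run: %d\n", steps);
    return 0;
}

Running it with these values prints a step count in the ten-thousand range, and that is where all the randomness comes from.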

Once the roles of these three variables are explained, you should basically know how to use annealing, and understanding the principle will follow.

Initial temperature T:

It mainly determines the range of our random guesses. The larger T is, the more widely our random answers are spread. Its main job is to guarantee that the algorithm doesn't miss a possible correct solution just because the data is skewed (too large or too small).

As long as the data is within range, T treats every candidate equally.

End temperature t:

It determines the final precision of the algorithm. The smaller t is, the smaller the range of the final random answers. Its main job is to prevent a huge amount of meaningless randomization caused by an overly large random range, and to guarantee that our final answer is essentially identical to the standard answer (the remaining difference is either ignored by the test data or beyond the required output precision). For a concrete example, t is usually taken as 1e-15 (10 to the power of -15). At that scale, even if our answer still differs from the standard answer, no problem is going to demand output at that precision, right?

The change amount delta (I can't type the actual math symbol...):

T says to randomize over a wide range, while t says to confine the randomness to a tiny area. So which one do we actually listen to?!

This is where delta comes in. Its job is to lower the temperature a little bit (really just a little bit) after each random step, so the randomization gradually shifts from the breadth that T pursues to the fine detail that t emphasizes. That way the annealing gets both T's breadth, so no candidate answer is missed, and t's precision.
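
Put as a small formula of my own (it just restates the "new temperature = old temperature × delta" rule, which is the t *= cold line in the code later): after k random steps the temperature is

$$
T_k = T \cdot \delta^{k},
$$

so one annealing run stops once T_k drops below t, i.e. after roughly ln(t/T)/ln(delta) steps. With T in the thousands, t around 1e-15 and delta just below 1, that works out to somewhere on the order of 10^4 to 10^5 steps per run.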

Some of you might be completely lost at this point. You want breadth when enumerating randomly, yet later you narrow the range. How do you know the narrowed range is the right one? Couldn't you end up guessing for ages around a wrong answer?

The answer to this question is in fact the essence of the annealing algorithm:

For each randomly generated answer: if it is better than the current optimal solution, accept it as the new current optimal solution; if not, still accept it with a probability tied to the current temperature.
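
Written as a formula, with Δf for "new objective value minus current value" and t for the current temperature (this is just the Metropolis rule from the quoted definition with kT replaced by the control temperature t, and it is exactly the test the code below performs):

$$
P(\text{accept the new solution}) =
\begin{cases}
1, & \Delta f < 0, \\
e^{-\Delta f / t}, & \Delta f \ge 0.
\end{cases}
$$

At high temperature the exponent is close to 0, so almost everything gets accepted; as t shrinks, accepting a worse solution becomes rarer and rarer.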

Understanding

Anyone seeing this for the first time will probably ask:

Q: When a random step fails to produce a new current best solution, why not just move on to the next round of randomization? Why accept it with a probability related to the current temperature?

Ah... actually this can't really be explained... just kidding, it's easy to explain. As we all know, the current best does not necessarily represent the global optimum. So when we randomly generate a new solution, even if the current solution is the best so far, we still cannot guarantee it is globally optimal. Moreover, each round of enumeration is tied to a range determined by t, so we don't enumerate every possible solution right away; instead we enumerate within a range that fluctuates by roughly t around the current best. The true optimal solution may lie quite far from where the current best sits, and a single large random jump might not reach it. So we need some intermediate solutions as a transition. That way we can eventually randomize our way to the global optimum instead of being blinded by the current best (or by problem setters who maliciously craft adversarial data).

So now it's time to describe the complete algorithm:

Use the rand function to randomly perturb the current local optimum (the size of the perturbation is positively correlated with the current temperature) to generate a new solution. For each new solution: if it is better than the current one, unconditionally accept it as the current optimal solution; otherwise accept it with a probability (again positively correlated with the current temperature). After each random step, the current temperature drops a little.
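
To make that paragraph concrete, here is a bare-bones one-dimensional sketch (the objective f and all the names are placeholders of mine, not part of any particular problem; the actual problem code appears later in the Example section):

#include <cstdio>
#include <cstdlib>
#include <cmath>

// toy objective: smaller is better (its minimum happens to sit at x = 2)
double f(double x) { return (x - 2) * (x - 2) + 1; }

int main() {
    srand(201821307);
    double T = 3000, t_end = 1e-15, delta = 0.996;

    double ansx = 0, anse = f(ansx);                 // current solution and its value
    while (T > t_end) {
        // perturb the current solution; the jump size shrinks together with the temperature
        double nowx = ansx + ((rand() * 2.0) - RAND_MAX) / RAND_MAX * T;
        double nowe = f(nowx);
        double de = nowe - anse;
        if (de < 0) {                                // better: accept unconditionally
            ansx = nowx; anse = nowe;
        } else if (exp(-de / T) * RAND_MAX > rand()) {
            ansx = nowx; anse = nowe;                // worse: accept with a temperature-related probability
        }
        T *= delta;                                  // cool down a little after every step
    }
    printf("x = %.6f, f(x) = %.6f\n", ansx, anse);
    return 0;
}

The P1337 code further down follows exactly this structure, just with a point (x, y) instead of a single x and a physics-flavoured energy() as the objective.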

If you read the previous parts carefully, you should have understood it by now (or given up in despair).

It very clearly reflects the computer's style (simple and crude: get quality through sheer speed)

Of course, some people will still want to ask:

Q: Since the probability of accepting a not-so-optimal solution is close to 1 at high temperatures, basically everything gets accepted. How do we end up with the optimal solution in the end?

Actually, I had the same question when I was first learning annealing. The answer to this question is precisely why the annealing algorithm can, in theory, obtain the optimal solution.

Whenever a better solution appears, we accept it unconditionally without rolling the probability; therefore, the better a solution is, the larger its overall probability of being kept (even if it is only 0.00001% larger than a sub-optimal solution's). Through millions of random operations we keep amplifying the gap between these probabilities (the amplification is exponential), until eventually the probability that the optimal solution survives is close to 1.

So how many random operations are we talking about, exactly?

The number differs from problem to problem, but for most problems the order of magnitude is similar (the program runs for about 1000 ms).

Take P2210 Haywire as an example (honestly, I didn't know the number beforehand either). I added a countt variable to my code and incremented it once per random operation. What was its final value?

Roughly this much:

(The 17 is the problem's own answer output; just ignore it)

A full 2,357,060!!!

It's worth mentioning the parameters I used for this problem:

Initial temperature: 3000

End temperature: 1e-17 (10 to the power of -17)

Temperature change factor after each random step: 0.996 (new temperature = old temperature × change factor)

Annealing rounds: 19 (i.e., the cooling process using the first three parameters is called repeatedly, about 20 times)

Actually, don't be too surprised: the annealing parameters are basically the same for every problem (the numbers may differ a little)

Example

(It's still easier to explain with an example.) P1337 Balance Point

It's a pure geometry-and-physics problem; people who don't know the physics can't even get started on it...

The reason simulated annealing is worth considering for this problem is that it has only one final solution, and verifying a candidate solution is cheap (O(n) per check is perfectly acceptable); moreover, deriving the answer directly with floating-point formulas could lose precision and ultimately affect the answer.
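
Concretely, the objective function here is the total "energy" of the system: for a candidate hole position (x, y), the energy() function in the code below computes

$$
E(x, y) = \sum_{i=1}^{n} w_i \sqrt{(x - x_i)^2 + (y - y_i)^2},
$$

and the point that minimizes E is the equilibrium position we are asked for, so the annealing only has to minimize this sum.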

#include<iostream>
#include<stdlib.h>
#include<cmath>
#include<cstdio>
#define cold 0.996 // cooling factor
#define temperature 3000 // initial temperature
#define time 4 // number of annealing rounds (a pure physics problem like this doesn't need many)
#define INF 999999
using namespace std ;
struct spot{
    int x ;
    int y ;
    int weight ;
};
spot map[1001] ;
int n ;
double t ;
double tmpx , tmpy , tmpw ;
double ansx , ansy , anse ;
//double nowx , nowy , nowe ;
double energy ( double x , double y ){ // energy of the current point: the smaller, the more stable
    double e = 0 ;
    for( int i = 1 ; i <= n ; i ++ ){
        tmpx = x -map[i].x ;
        tmpy = y - map[i].y ;
        tmpw = map[i].weight ;
        e += sqrt( tmpx * tmpx + tmpy * tmpy ) * tmpw ;
    }
    return e ;
}
void colddown(){
    t = temperature ;
    while( t > 1e-15 ){ // the simulated annealing loop
//      cout << "debug" << " " << t << endl ;
        double nowx = ansx + ( ( rand() * 2.0 ) - RAND_MAX ) * t ; // random displacement of the point ( 2.0 avoids int overflow when RAND_MAX is large )
        double nowy = ansy + ( ( rand() * 2.0 ) - RAND_MAX ) * t ;
        double nowe = energy( nowx , nowy ) ;
        double de = nowe - anse ;
//      cout << nowx << " " << nowy << endl ;
        if( de < 0 ){ // if the new solution is better, accept it directly
            ansx = nowx ;
            ansy = nowy ;
            anse = nowe ;
//          cout << anse << endl ;
        }
        else{
            if( exp( -de / t ) * RAND_MAX > rand() ){ // otherwise accept it with a probability
                ansx = nowx ;
                ansy = nowy ;
                anse = nowe ; // keep the stored energy in sync with the accepted point
            }
        }
        t *= cold ;
    }
}
void solve(){
    for( int i = 1 ; i <= time ; i ++ ){
        colddown() ; // run it a few times to make the result more reliable
    }
}
int main () {
    srand( 201821307 ) ; // a good random seed matters a lot
    cin >> n ;
    for( int i = 1 ; i <= n ; i ++ ){
        cin >> map[i].x >> map[i].y >> map[i].weight ;
        ansx += map[i].x ;
        ansy += map[i].y ;
    }
    ansx /= n ;
    ansy /= n ;
    anse = energy( ansx , ansy ) ;
    solve() ;
    printf( "%.3lf  %.3lf" , ansx , ansy) ;
    return 0 ;
}

(A few leftover comments from when I was debugging the program)

There isn't really much to say about this problem; if you understood the principles above, the code isn't hard to follow.

(Even this much filler came out to 200-plus lines)

Scope

Although the annealing algorithm looks like a brute-force cure-all, in fact, just like DFS, it has its own range of suitable problems:

  1. Problems whose answer is simple and doesn't require outputting too much data

  2. Problems whose answer can be verified in a relatively short time (NP-class problems are worth a try)

  3. Problems where different solutions are clearly related to one another, so that one solution can be turned into another by a random operation (see the sketch after this list)

  4. Problems whose solution space has only one to a few peaks, behaving roughly like a unimodal function overall

  5. You yourself being lucky enough (the unlucky just end up crying over WA)
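
As a tiny illustration of point 3 (a toy example of my own, not tied to any specific problem): if a solution is a permutation, a "random operation" can be as simple as swapping two positions, which turns one valid solution into a nearby valid solution.

#include <algorithm>
#include <cstdio>
#include <cstdlib>
#include <vector>

// turn one permutation into a "neighbouring" one by swapping two random positions
void random_neighbour(std::vector<int> &perm) {
    int i = rand() % (int)perm.size();
    int j = rand() % (int)perm.size();
    std::swap(perm[i], perm[j]);
}

int main() {
    srand(201821307);
    std::vector<int> perm = {0, 1, 2, 3, 4};
    random_neighbour(perm); // one random move away from the original solution
    for (int v : perm) printf("%d ", v);
    printf("\n");
    return 0;
}

Annealing then just keeps applying moves like this and accepting or rejecting them with the same temperature-related rule as before.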

One final note: your algorithm may be perfectly fine and yet you still fail to AC the problem. This is inherent to annealing, which is, after all, a randomized algorithm. If the three annealing parameters or the random seed you use aren't good enough, getting WA is perfectly normal.

(For example, on this problem I changed no code at all, only the annealing parameters, and the runs ended up with different scores)

Finally, I wish you all a seed that gets AC in one go. May your luck turn European.

Source: www.cnblogs.com/CHNmuxii/p/12232477.html