Optimization algorithms: a summary of common classifications

Before doing feature selection, which I implement with a search algorithm based on swarm intelligence, I read several papers on swarm intelligence optimization algorithms. This post is a summary of that reading.

  There are all kinds of optimization problems in life and work; for example, a company or an individual may ask, "at a given cost, how do I maximize profit?" Optimization methods form the mathematical discipline that studies how, under given constraints, to choose certain factors (quantities) so that one or more performance indicators reach their optimum.

  In engineering design, an optimization problem generally means selecting a set of parameters (variables) that satisfies a series of constraints so that the design index (objective) reaches its optimal value. An optimization problem can therefore usually be expressed as a mathematical programming problem. In engineering design optimization, the engineering problem is first expressed in this form as a mathematical problem and then solved with an optimization method; building this mathematical model is the core of optimization design.


  
  Optimization problems fall into two categories: function optimization and combinatorial optimization. The objects of function optimization are continuous variables, while the objects of combinatorial optimization are discrete states in the solution space. Typical combinatorial optimization problems include the Traveling Salesman Problem (TSP), scheduling problems (such as Flow-shop and Job-shop), the 0-1 knapsack problem, the bin packing problem, the graph coloring problem, and the clustering problem.

Optimization algorithms

Based on my own understanding of optimization, using an optimization algorithm to solve a practical problem can be divided into the following two steps:

  • Build a mathematical model: design a feasible encoding scheme (the variables), the constraints, and the objective function.
  • Search for the optimal value: search among the feasible solutions (under the constraints) for the optimal one; search methods include exhaustive search, heuristic search, and random search.

An optimization problem has three elements: decision variables (Decision Variables), constraints (Constraints), and an objective function (Objective Function). An optimization algorithm is essentially a search process or rule: based on a certain idea and mechanism, it obtains a solution that meets the user's requirements through a specific procedure or set of rules.
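To make these three elements concrete, here is a minimal sketch in Python that states the 0-1 knapsack problem mentioned above in terms of decision variables, a constraint, and an objective function. The item weights, values, and capacity are made-up example data, not from the original post.

```python
# Minimal illustration of the three elements, using the 0-1 knapsack problem.
# The weights, values and capacity below are made-up example data.
weights = [2, 3, 4, 5]      # weight of each item
values  = [3, 4, 5, 6]      # value of each item
capacity = 8                # knapsack capacity

# Decision variables: x[i] in {0, 1} -- take item i or not.
def feasible(x):
    """Constraint: total weight must not exceed the capacity."""
    return sum(w * xi for w, xi in zip(weights, x)) <= capacity

def objective(x):
    """Objective function: total value of the chosen items (to be maximized)."""
    return sum(v * xi for v, xi in zip(values, x))

# Example: evaluate one candidate solution.
x = [1, 0, 1, 0]
print(feasible(x), objective(x))   # True 8
```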

The algorithms associated with optimization problems can be classified as follows:

[Figure: classification of optimization algorithms]

Exact algorithms (absolute optimal solution)

Exact algorithms include the traditional operations-research methods such as linear programming, dynamic programming, integer programming, and branch and bound. Their computational complexity is generally high, so they are only suitable for small-scale problems and are often impractical in engineering.

Heuristic algorithms (approximate solutions)

  A heuristic is a method based on rules of thumb that people discover while solving problems. Its characteristic is that, when solving a problem, it draws on past experience and chooses methods that have already proven effective, rather than systematically working out the steps to find the answer.

Neighborhood search algorithms start from an arbitrary solution and repeatedly search its neighborhood to replace the current solution, thereby improving it. According to their search behavior, they can be divided into local search methods and guided search methods.

  • Local neighborhood search (also known as hill climbing). It uses a greedy strategy to improve the current solution within its neighborhood: the hill-climbing method accepts any state better than the current solution as the next current solution, while the steepest-ascent method accepts the best solution in the neighborhood of the current solution as the next current solution, and so on.

  • Guided search methods. They use guiding rules to steer the exploration of good solutions over the entire solution space, e.g. SA, GA, EP, ES, and TS.

Individual heuristics (search for a relative optimum)

Features: each run produces the same result. The search starts from a single solution and looks for the optimum, so it easily falls into a local optimum.

Hill-climbing algorithm

Algorithm idea: start from the current node and compare its value with those of its neighbors. If the current node has the largest value, return it as the maximum (i.e., the highest point); otherwise replace the current node with the best neighbor, thereby climbing toward the top of the hill.

In essence, it finds the largest value in the vicinity of the initial value.
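A minimal sketch of this idea in Python, assuming a generic `objective` function and a `neighbors` function that enumerates a solution's neighbors (both are placeholders chosen for illustration, not from the original post):

```python
def hill_climbing(start, objective, neighbors):
    """Greedy hill climbing: keep moving to the best neighbor
    until no neighbor improves on the current solution."""
    current = start
    while True:
        best = max(neighbors(current), key=objective, default=current)
        if objective(best) <= objective(current):
            return current          # no better neighbor: local optimum reached
        current = best              # climb to the better neighbor

# Example: maximize f(x) = -(x - 3)^2 over the integers, starting from 0.
f = lambda x: -(x - 3) ** 2
step = lambda x: [x - 1, x + 1]     # neighborhood: adjacent integers
print(hill_climbing(0, f, step))    # -> 3
```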

  • Advantages

    • Easy to understand, easy to implement, and very general;
    • Strong local exploitation ability and fast convergence.
  • Disadvantages

    • Weak global exploration ability: it can only find a local optimum;
    • The search result depends entirely on the initial solution and the neighborhood mapping.

Tabu search (Tabu Search, TS)

Basic idea: tabu search improves on hill climbing by marking the local optimal solutions or solution processes that have already been visited and avoiding them in subsequent iterations. The drawback of local search is that it concentrates on one region and its neighborhood, which gives the search tunnel vision. To find the global optimum, tabu search deliberately avoids (though does not completely exclude) the local optima it has already found, so that a larger part of the search space is explored; a code sketch follows the list below.

  • Features:

    • Avoids cycling during the search
    • The "advance only, never retreat" principle, implemented through the tabu list
    • A local optimum is not used as the stopping criterion
    • The rule of selecting the best solution in the neighborhood simulates the human memory function
  • Tabu list: prevents cycles from appearing in the search

    • Records the points or move directions of the previous several steps and forbids returning to them
    • The list is updated dynamically
    • The length of the list is called the tabu size (Tabu-Size)
  • Main elements of the tabu list (two elements)

    • Tabu object: the changed elements that the tabu list forbids
    • Tabu length: the number of steps for which an object remains tabu
  • Tabu object (three variants)

    • The changed state, or the state itself, is used as the tabu object
    • The changed components of the state are used as the tabu object
    • The change in the objective value is used as the tabu object (similar to contour lines)
  • Tabu length: it may be a fixed constant (T = c), or it may be dynamic, varying within an interval according to some rule or formula.

    • If the tabu length is too short, the search cannot escape a cycle once it falls into a local optimum;
    • If it is too long, all candidate solutions may become tabu, which increases computation time and may even make the computation impossible to continue.
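A compact sketch of the tabu search loop described above, in Python. The `objective` and `neighbors` functions and the fixed tabu size are illustrative assumptions; a real implementation would also add an aspiration criterion.

```python
from collections import deque

def tabu_search(start, objective, neighbors, tabu_size=10, max_iters=100):
    """Tabu search: move to the best non-tabu neighbor each iteration,
    keep recently visited solutions in a fixed-length tabu list,
    and remember the best solution seen overall."""
    current = best = start
    tabu = deque([start], maxlen=tabu_size)    # tabu list: recently visited states
    for _ in range(max_iters):
        candidates = [n for n in neighbors(current) if n not in tabu]
        if not candidates:
            break                               # all neighbors are tabu
        current = max(candidates, key=objective)
        tabu.append(current)                    # dynamically update the tabu list
        if objective(current) > objective(best):
            best = current
    return best
```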

References:

  1. Tabu Search Algorithm (Tabu Search)
  2. Tabu Search Detailed

Greedy algorithm

Starting from an initial solution, the algorithm approaches the given goal step by step, trying to obtain a better solution as quickly as possible. When it reaches a step at which it can no longer move forward, the algorithm stops.

Basically, you sort the candidates first and then judge them one by one from the front, keeping those that qualify and removing those that do not.
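A small sketch of this pattern in Python, reusing the 0-1 knapsack data from the earlier example: sort the items by value density, scan from the front, and keep every item that still fits (the data and the density rule are illustrative assumptions).

```python
def greedy_knapsack(weights, values, capacity):
    """Greedy heuristic: sort items by value/weight ratio, then scan from
    the best item down, taking every item that still fits the capacity."""
    order = sorted(range(len(weights)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    chosen, total_weight = [], 0
    for i in order:
        if total_weight + weights[i] <= capacity:   # keep items that qualify
            chosen.append(i)
            total_weight += weights[i]
    return chosen

print(greedy_knapsack([2, 3, 4, 5], [3, 4, 5, 6], 8))   # -> [0, 1]
```

On this data the greedy choice returns items [0, 1] with total value 7, while the best feasible subset is [1, 3] with value 10, which shows that the greedy result is obtained quickly but is only approximately optimal.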

Simulated annealing (Simulated Annealing, SA)

Simulated annealing extends local search: at each step it randomly generates a new state in the neighborhood of the current one and then accepts it with a probability that depends on the energy difference and the current temperature, so that worse neighboring states can also be accepted with some probability. This acceptance rule is what gives the algorithm its global optimization ability, which has been verified both theoretically and in practical applications. Although SA's optimization ability is not in doubt, it relies on a strict annealing schedule: a sufficiently high initial temperature, a sufficiently slow cooling rate, a sufficient number of perturbations, and many iterations at each temperature.

The rabbit analogy: the rabbit is drunk, so it jumps around randomly for a long time; during this time it may climb higher or step onto lower ground. But as it sobers up, it jumps toward the highest point it has stepped on. This is simulated annealing.

In effect, it starts from an initial value, performs random updates, records the value after each update, and finally takes the best value found in the whole history.
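A minimal sketch of simulated annealing in Python for maximizing a generic `objective` with a user-supplied `neighbor` function; the geometric cooling schedule and the parameter values are illustrative assumptions, not from the original post.

```python
import math
import random

def simulated_annealing(start, objective, neighbor,
                        t0=1.0, cooling=0.99, iters=10_000):
    """Simulated annealing: accept worse neighbors with a probability
    that shrinks as the temperature decreases; remember the best state."""
    current = best = start
    t = t0
    for _ in range(iters):
        candidate = neighbor(current)
        delta = objective(candidate) - objective(current)
        # Always accept improvements; accept worse moves with prob exp(delta / t).
        if delta > 0 or random.random() < math.exp(delta / t):
            current = candidate
        if objective(current) > objective(best):
            best = current                      # record the best value seen so far
        t *= cooling                            # geometric cooling schedule
    return best
```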

Reference: simulated annealing algorithm

Swarm intelligence (global optimum)

Categories:

  • Particle swarm optimization (PSO)
  • Ant colony algorithm (ACO)
  • Artificial bee colony algorithm (ABC)
  • Artificial fish swarm algorithm (AFSA)
  • Shuffled frog leaping algorithm (SFLA)
  • Fireworks algorithm (FWA)
  • Bacterial foraging optimization (BFO)
  • Firefly algorithm (FA)

Features:

  • Global optimization
  • Each run may produce a different solution
  • Long running time
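As one example from this family, here is a bare-bones particle swarm optimization (PSO) sketch in Python, written for minimization; the inertia and acceleration coefficients, the bounds, and the sphere test function are illustrative assumptions, not from the original post.

```python
import random

def pso(objective, dim, n_particles=30, iters=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Particle swarm optimization (minimization): each particle moves under
    the pull of its own best position and the swarm's best position."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                        # personal best positions
    gbest = min(pbest, key=objective)[:]               # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

# Example: minimize the sphere function f(x) = sum(x_i^2); the result is near [0, 0].
print(pso(lambda x: sum(v * v for v in x), dim=2))
```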

Intelligent computing includes:

  • Evolutionary computation (EC), such as genetic algorithms.
  • Fuzzy Logic
  • Swarm Intelligence (SI) algorithm
  • Artificial immune system (AIS)
  • Artificial Neural Network (ANN)

