2021-01-28 Particle swarm optimization: Python implementation and the Matlab particleswarm function


Two days ago, I shared the principle of the particle swarm optimization (PSO) algorithm along with a from-scratch Matlab implementation. This article covers a Python implementation of PSO and Matlab's built-in particleswarm function.

Refer to the previous article: Particle Swarm Optimization Algorithm (PSO)


Taking the Rastrigin function as the objective, we look for its minimum over x1, x2 ∈ [-5, 5]. This function is notoriously deceptive for algorithms such as simulated annealing and evolutionary computation: it has many local minima and local maxima, so an algorithm can easily get stuck in a local optimum and miss the global solution. As shown in the figure below, the function has a single global minimum of 0 at (0, 0).
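Written out explicitly (matching both the Python and Matlab code below), the objective is:

```latex
f(x_1, x_2) = 20 + x_1^2 + x_2^2 - 10\left(\cos 2\pi x_1 + \cos 2\pi x_2\right)
```

The quadratic terms give the overall bowl shape, while the cosine terms superimpose a grid of local minima near integer coordinates; the global minimum is f(0, 0) = 0.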

(figures: plots of the Rastrigin function)

Python code implementation

```python
import numpy as np
import matplotlib.pyplot as plt

# Objective function: Rastrigin function
def ras(x):
    y = 20 + x[0] ** 2 + x[1] ** 2 - 10 * (np.cos(2 * np.pi * x[0]) + np.cos(2 * np.pi * x[1]))
    return y

# Parameter initialization
w = 1.0
c1 = 1.49445
c2 = 1.49445
maxgen = 200   # number of generations
sizepop = 20   # swarm size

# Bounds on particle velocity and position
Vmax = 1
Vmin = -1
popmax = 5
popmin = -5

# Generate initial particles and velocities
pop = 5 * np.random.uniform(-1, 1, (2, sizepop))
v = np.random.uniform(-1, 1, (2, sizepop))
fitness = ras(pop)             # evaluate fitness
i = np.argmin(fitness)         # index of the best individual
gbest = pop.copy()             # personal best positions
zbest = pop[:, i].copy()       # global best position
fitnessgbest = fitness.copy()  # personal best fitness values
fitnesszbest = fitness[i]      # global best fitness value

# Iterative optimization
t = 0
record = np.zeros(maxgen)
while t < maxgen:
    # Velocity update
    v = w * v + c1 * np.random.random() * (gbest - pop) + c2 * np.random.random() * (zbest.reshape(2, 1) - pop)
    v[v > Vmax] = Vmax  # clamp velocity
    v[v < Vmin] = Vmin
    # Position update
    pop = pop + 0.5 * v
    pop[pop > popmax] = popmax  # clamp position
    pop[pop < popmin] = popmin
    '''
    # Adaptive mutation
    p = np.random.random()              # random number in [0, 1)
    if p > 0.8:                         # mutate with probability 0.2
        k = np.random.randint(0, 2)     # pick a random integer in [0, 2)
        pop[:, k] = np.random.random()  # mutate at the selected position
    '''
    # Evaluate fitness
    fitness = ras(pop)
    # Update personal bests
    index = fitness < fitnessgbest
    fitnessgbest[index] = fitness[index]
    gbest[:, index] = pop[:, index]
    # Update global best
    j = np.argmin(fitness)
    if fitness[j] < fitnesszbest:
        zbest = pop[:, j].copy()
        fitnesszbest = fitness[j]
    record[t] = fitnesszbest  # record the global best fitness over time
    t = t + 1

# Result analysis
print(zbest)
plt.plot(record, 'b-')
plt.xlabel('generation')
plt.ylabel('fitness')
plt.title('fitness curve')
plt.show()
```
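For reference, the velocity and position updates implemented in the loop above correspond to the standard PSO equations (note the extra 0.5 factor this code applies to the position step):

```latex
v \leftarrow w\,v + c_1 r_1\,(p_{\mathrm{best}} - x) + c_2 r_2\,(g_{\mathrm{best}} - x),
\qquad x \leftarrow x + 0.5\,v
```

where r1 and r2 are random numbers drawn uniformly from [0, 1), p_best is a particle's personal best position, and g_best is the swarm's best position. One caveat: this implementation draws a single r1 and r2 per generation, shared by all particles and dimensions, whereas PSO is usually stated with independent random numbers per particle and per dimension.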

The result is

[0.99699579 0.00148844]

(figure: fitness curve)

The solution found is clearly not the global minimum: the algorithm is trapped in the local minimum near (1, 0).
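A quick sanity check (re-defining ras here so the snippet is self-contained) shows why: the point found lies in the local basin around (1, 0), where the Rastrigin function equals 1, rather than at the global minimum of 0 at (0, 0).

```python
import numpy as np

def ras(x):
    # Rastrigin function, same definition as in the PSO code above
    return 20 + x[0] ** 2 + x[1] ** 2 - 10 * (np.cos(2 * np.pi * x[0]) + np.cos(2 * np.pi * x[1]))

print(ras(np.array([0.0, 0.0])))                # global minimum: 0.0
print(ras(np.array([1.0, 0.0])))                # nearest local minimum: 1.0
print(ras(np.array([0.99699579, 0.00148844])))  # the point PSO returned: close to 1.0
```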

Uncomment the adaptive-mutation block (remove the surrounding triple-quote marks) and run again. The result below shows that the algorithm now converges to the global optimum.

[0.00022989 0.00014612]

 

(figure: fitness curve with adaptive mutation)

 

Matlab ships with its own particle swarm optimization function, particleswarm, which can also be used here. The code for this example is as follows:

 

```matlab
y = @(x) 20 + x(1).^2 + x(2).^2 - 10*(cos(2*pi*x(1)) + cos(2*pi*x(2)));
rng default
options = optimoptions('particleswarm','SwarmSize',200,'HybridFcn',@fmincon, ...
    'MaxIterations',200,'Display','iter');
lb = [-5 -5];     % lower bounds of the variables
ub = [5 5];       % upper bounds of the variables
[x,fval,exitflag,output] = particleswarm(y,length(lb),lb,ub,options);
```

The result is as follows

(figure: particleswarm iteration output)

For details on particleswarm, see the Matlab documentation:

https://www.mathworks.com/help/gads/particleswarm.html


Origin blog.csdn.net/qingfengxd1/article/details/113355493