The multi-strategy enhanced parrot optimization algorithm (MEPO) integrates several optimization techniques and improvements to enhance global search capabilities and convergence speed. Below is an overview of each improvement strategy:
- Population Initialization Using Cat Mapping + Reverse Strategy:
- Cat Mapping Initialization: Uses the cat (Arnold) map, a chaotic map with good ergodicity and uniformity, to initialize the population, enhancing diversity and initial exploration capability.
- Reverse Strategy: Introduces an opposition-based (reverse) learning strategy during initialization, generating the mirror point of each individual to further increase population diversity and reduce the risk of getting trapped in local optima.
- Adaptive Weight Switching Factor:
- Weight Switching Factor: A factor that adjusts the weights of different strategies or parameters during the algorithm's execution. The adaptive weight switching factor adjusts dynamically based on the algorithm's performance and problem characteristics to balance global search and local convergence capabilities.
- Hybrid Cauchy and Gaussian Mutation:
- Cauchy Mutation: The Cauchy distribution is often used in optimization algorithms to increase population diversity, with its heavy tails aiding in broader exploration of the search space.
- Gaussian Mutation: The Gaussian distribution is more suitable for fine-tuning and local search, helping the algorithm converge quickly to a local optimum.
- Hybrid Mutation Strategy: Combining Cauchy and Gaussian mutation balances the need for global exploration and local search, improving the algorithm's adaptability and convergence speed in complex optimization problems.
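The heavy-tail contrast between the two distributions can be checked empirically. The following Python sketch (illustrative only, not part of the MEPO implementation) draws samples from a standard Cauchy (via the inverse-CDF transform) and a standard Gaussian, then compares how often each produces a jump larger than 3 units:

```python
import random
import math

random.seed(0)

def cauchy_sample():
    # Standard Cauchy via inverse CDF: tan(pi * (u - 0.5))
    u = random.random()
    return math.tan(math.pi * (u - 0.5))

n = 100_000
cauchy = [abs(cauchy_sample()) for _ in range(n)]
gauss = [abs(random.gauss(0.0, 1.0)) for _ in range(n)]

# Fraction of samples beyond 3 units from the mean: the Cauchy's
# heavy tails produce far more large jumps than the Gaussian
# (roughly 20% vs. 0.3%), which is why it suits global exploration
# while the Gaussian suits local fine-tuning.
frac_cauchy = sum(x > 3 for x in cauchy) / n
frac_gauss = sum(x > 3 for x in gauss) / n
print(frac_cauchy > frac_gauss)  # prints True
```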
Overall, the multi-strategy enhanced parrot optimization algorithm improves exploration and convergence through these enhancements, making it more suitable for complex, multimodal optimization problems.
One. Population Initialization Using Cat Mapping + Reverse Strategy
To maintain population diversity and distribute the initial individuals as uniformly as possible, this paper combines a chaotic initialization method with an opposition-based (reverse) learning strategy, proposing a chaotic reverse-learning initialization that helps accelerate the convergence of the algorithm. Hybrid algorithms that combine chaotic map sequences with traditional optimization algorithms have emerged and achieved relatively good results. However, most of these algorithms are based on the Logistic map, whose traversal is uneven and sensitive to the initial value: the mapped points cluster densely at the edges of the interval and sparsely in the middle, which directly degrades the ergodicity of the chaotic search. The cat map avoids these drawbacks.
The implementation is as follows:
- CatMap.m function
%% Cat (Arnold) map initialization for one individual
function Xout = catMap(dim)
    a = 1; b = 1;          % cat map control parameters
    x1 = zeros(1, dim);
    y1 = zeros(1, dim);
    x = rand(1, dim);      % random seed sequence in [0, 1]
    y = rand(1, dim);
    N = 1;                 % modulus keeps the orbit in [0, 1)
    for i = 1:dim
        % one step of the generalized cat map (area-preserving form)
        x1(i) = mod(x(i) + b.*y(i), N);              % x' = (x + b*y) mod 1
        y1(i) = mod(a*x(i) + (a*b + 1).*y(i), N);    % y' = (a*x + (a*b+1)*y) mod 1
    end
    % min-max rescale so the chaotic sequence spans the full [0, 1] range
    Xout = (x1 - min(x1))./(max(x1) - min(x1));
end
Reference: Xu Chenhua, Li Chengxian, Yu Xin, Huang Qingbao. Improved gray wolf optimization algorithm based on Cat chaos and Gaussian mutation [J]. Computer Engineering and Applications, 2017, 53(04): 1-9+50.
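To show how the cat map and the reverse strategy fit together, here is a minimal Python sketch of the combined initialization. It is illustrative only: the function names (`cat_map_population`, `opposition_init`) and the sphere objective are placeholders, not the paper's MATLAB code. The opposite of a point X is taken as lb + ub - X, and the fittest n individuals from the combined pool form the initial population:

```python
import numpy as np

rng = np.random.default_rng(42)

def cat_map_population(n, dim, a=1, b=1):
    """Generate n chaotic points in [0, 1)^dim by iterating the
    generalized (area-preserving) Arnold cat map from a random seed."""
    x = rng.random(dim)
    y = rng.random(dim)
    pop = np.empty((n, dim))
    for i in range(n):
        # x' = (x + b*y) mod 1,  y' = (a*x + (a*b+1)*y) mod 1
        x, y = (x + b * y) % 1.0, (a * x + (a * b + 1) * y) % 1.0
        pop[i] = x
    return pop

def opposition_init(n, dim, lb, ub, fobj):
    """Chaotic reverse-learning initialization: build a chaotic
    population, form its opposite (lb + ub - X), and keep the n
    fittest individuals from the combined pool."""
    base = lb + (ub - lb) * cat_map_population(n, dim)
    opposite = lb + ub - base
    pool = np.vstack([base, opposite])
    fitness = np.apply_along_axis(fobj, 1, pool)
    return pool[np.argsort(fitness)[:n]]

# Example: sphere function on [-5, 5]^2 (illustrative objective)
pop = opposition_init(20, 2, -5.0, 5.0, lambda x: np.sum(x**2))
print(pop.shape)  # prints (20, 2)
```

Evaluating both a point and its opposite doubles the chance that at least one of them lies near the optimum, which is what accelerates early convergence.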
Two. Adaptive Switching Factor
In the exchange (communication) behavior phase, the switching variable H is replaced by an adaptive formula whose upper bound decays linearly with the iteration count i:
H = rand(1)*((Max_iter-i)/Max_iter);
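As a quick check of the decay behavior, this Python sketch (illustrative; `i` is the current iteration and `Max_iter` the iteration budget, as in the MATLAB line above) evaluates the envelope of H at several points in the run:

```python
import random

random.seed(0)
Max_iter = 100

def adaptive_H(i, max_iter):
    # H = rand * (max_iter - i) / max_iter: the random factor keeps the
    # switching stochastic, while the linear ramp shrinks its upper
    # bound from ~1 early in the run (exploration) to 0 at the final
    # iteration (exploitation).
    return random.random() * ((max_iter - i) / max_iter)

# Upper bound of H over the run: decays linearly to zero.
bounds = [(Max_iter - i) / Max_iter for i in (1, 50, 100)]
print(bounds)  # prints [0.99, 0.5, 0.0]
```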
Three. Hybrid Cauchy and Gaussian Mutation
The mutation formulas are implemented in the code below; the Gaussian mutation can also be used on its own.
%% ★★Improvement 3: Hybrid Cauchy and Gaussian Mutation
for j = 1:N
    %% Cauchy mutation: a t location-scale distribution with nu = 1
    %% is the standard Cauchy distribution (heavy tails, large jumps)
    pd = makedist('tLocationScale', 'mu', 0, 'sigma', 1, 'nu', 1);
    kexi(j, :) = random(pd, 1, 1);
    %% Gaussian mutation: perturbation proportional to the current position
    gaosi(j, :) = X_new(j, :).*randn();
    %% Hybrid mutation: random weights w1, w2 blend the two perturbations
    w1 = rand();
    w2 = rand();
    temp = X_new(j, :) + w1 * gaosi(j, :) + w2 * kexi(j, :);
    %% Clamp the mutated position back into [lb, ub]
    Flag4ub = temp > ub;
    Flag4lb = temp < lb;
    temp = (temp.*(~(Flag4ub + Flag4lb))) + ub.*Flag4ub + lb.*Flag4lb;
    %% Greedy selection: keep the mutation only if it improves fitness
    if fobj(temp) < fobj(X_new(j, :))
        X_new(j, :) = temp;
    end
end