The evolutionary algorithm (EA) is an optimization method modeled on natural processes such as selection, gene recombination, and mutation.
Evolutionary algorithms are roughly divided into the following four types:

- Genetic algorithm
- Genetic programming
- Evolution strategy
- Evolutionary programming
This time, we will deal with the genetic algorithm (GA).
We use the Python library DEAP: https://deap.readthedocs.io/en/master/
We search for the minimum value of the Bird function shown below.
sample_GA.py
import numpy as np
def bird(x):
    x1 = x[0]
    x2 = x[1]
    t = np.sin(x1)*np.exp((1-np.cos(x2))**2) + np.cos(x2)*np.exp((1-np.sin(x1))**2) + (x1-x2)**2 
    return t,
#drawing
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
c1 = c2 = np.arange(-10, 10, 0.1)
xx, yy = np.meshgrid(c1, c2)
n = np.size(c1)  # number of grid points per axis (avoids shadowing the built-in len)
z = np.empty((n, n))
for i in range(n):
    for j in range(n):
        xin = np.array([xx[i, j], yy[i, j]])
        z[i, j], = bird(xin)
ax0 = plt.figure(figsize=(12, 12), dpi=80).add_subplot(111, projection='3d')
ax0.plot_surface(xx, yy, z, alpha=1)
plt.xlim(10, -10)            
plt.ylim(-10, 10)
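As a quick sanity check of the function definition (restated here so the snippet is self-contained): at the origin, sin(0) = 0 and cos(0) = 1, so the first and third terms vanish and the value reduces to exp(1). Note the trailing comma in the return statement, which produces a one-element tuple, the fitness format DEAP expects.

```python
import numpy as np

def bird(x):
    # Bird function; the trailing comma makes the return value
    # a one-element tuple, which is the fitness format DEAP expects
    x1, x2 = x[0], x[1]
    t = (np.sin(x1) * np.exp((1 - np.cos(x2)) ** 2)
         + np.cos(x2) * np.exp((1 - np.sin(x1)) ** 2)
         + (x1 - x2) ** 2)
    return t,

# at the origin the first and third terms vanish, leaving exp(1)
print(bird(np.array([0.0, 0.0]))[0])  # 2.718281828459045
```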

Import the required libraries.
Since DEAP uses the random library, the results change with each trial.
To avoid this, call random.seed(1).
sample_GA.py
import pandas as pd
import random
from deap import algorithms
from deap import base,creator,tools
# random.seed(1)  # uncomment for reproducible runs
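A minimal illustration of why seeding fixes the results (the seed value 1 is just an example):

```python
import random

random.seed(1)
first = [random.random() for _ in range(3)]
random.seed(1)
second = [random.random() for _ in range(3)]

# re-seeding replays exactly the same sequence of draws
print(first == second)  # True
```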
Create a function to generate the initial population. Here, the initial individuals are placed on a 10 × 10 grid over the range -10 to 10 in both x and y.
sample_GA.py
def initPopulation(pcls, ind_init):
    x = np.linspace(-10, 10, 10)
    y = np.linspace(-10, 10, 10)
    inix, iniy = np.meshgrid(x,y)
    contents = np.concatenate([inix.reshape([-1,1]), iniy.reshape([-1,1])], axis=1)
    pcls = [ind_init(c) for c in contents]
    return pcls
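A DEAP-free sketch of the same grid initializer (a plain `list` stands in for `creator.Individual` here) confirms that it yields 10 × 10 = 100 individuals, starting from the corner (-10, -10):

```python
import numpy as np

def init_grid(ind_init, n=10, lo=-10.0, hi=10.0):
    # place individuals on an n x n grid over [lo, hi] x [lo, hi]
    x = np.linspace(lo, hi, n)
    y = np.linspace(lo, hi, n)
    gx, gy = np.meshgrid(x, y)
    contents = np.concatenate([gx.reshape([-1, 1]), gy.reshape([-1, 1])], axis=1)
    return [ind_init(c) for c in contents]

pop = init_grid(list)
print(len(pop))  # 100
```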
Evaluation ("evaluate"), initial population ("population_guess"), crossover ("mate"), mutation ("mutate"), and selection ("select") methods are registered in the toolbox.
For example, `toolbox.register("mate", tools.cxBlend, alpha=0.2)` registers the function tools.cxBlend as the crossover method (alpha=0.2 is passed on as an argument of tools.cxBlend).
Here, blend crossover is used for crossover, Gaussian mutation for mutation, and tournament selection with a tournament size of 10. If you want to specify the initial population yourself, register it as "population_guess".
sample_GA.py
creator.create("FitnessMax", base.Fitness, weights=(-1.0,))  # negative weight: this is a minimization
creator.create("Individual", np.ndarray, fitness=creator.FitnessMax)
toolbox = base.Toolbox()
toolbox.register("evaluate", bird)
toolbox.register("population_guess", initPopulation, list, creator.Individual)
toolbox.register("mate", tools.cxBlend, alpha=0.2)
toolbox.register("mutate", tools.mutGaussian, mu=[0.0, 0.0], sigma=[2.0, 2.0], indpb=1)
toolbox.register("select", tools.selTournament, tournsize=10)
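Under the hood, `toolbox.register` essentially binds the extra arguments with `functools.partial` and stores the result under the alias. A rough sketch of the idea (not DEAP's actual implementation):

```python
from functools import partial

class MiniToolbox:
    # rough sketch of DEAP's Toolbox.register: bind the extra
    # arguments and expose the result under the given alias
    def register(self, alias, func, *args, **kwargs):
        setattr(self, alias, partial(func, *args, **kwargs))

tb = MiniToolbox()
tb.register("scale", lambda x, factor: x * factor, factor=0.2)
print(tb.scale(10))  # 2.0
```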
Once everything is set up this far, the rest runs with a few simple commands.
In the example below, the number of evaluated individuals and the average, standard deviation, minimum, and maximum fitness of each generation are recorded in the `logbook`. `hof` is an abbreviation for hall of fame; it keeps the specified number of best individuals found so far.
The arguments of algorithms.eaSimple are cxpb (the crossover probability), mutpb (the mutation probability), and ngen (the number of generations). The population size remains 10 × 10 = 100 individuals, inherited from the `pop` defined above. It is also possible to specify the initial population size and the population size at each generation change separately.
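The generational loop of algorithms.eaSimple can be sketched roughly as follows (greatly simplified and without DEAP; all names here are illustrative):

```python
import random

def ea_loop(pop, mate, mutate, select, cxpb, mutpb, ngen):
    # simplified generational loop: select -> crossover (prob cxpb)
    # -> mutation (prob mutpb), repeated for ngen generations
    for gen in range(ngen):
        offspring = select(pop)
        for i in range(0, len(offspring) - 1, 2):
            if random.random() < cxpb:
                offspring[i], offspring[i + 1] = mate(offspring[i], offspring[i + 1])
        offspring = [mutate(ind) if random.random() < mutpb else ind
                     for ind in offspring]
        pop = offspring
    return pop

# toy run: minimize (x - 3)^2 over scalars
random.seed(1)
fitness = lambda x: (x - 3) ** 2
pop = [random.uniform(-10, 10) for _ in range(20)]
pop = ea_loop(
    pop,
    mate=lambda a, b: ((a + b) / 2, (a + b) / 2),   # blend-style averaging
    mutate=lambda x: x + random.gauss(0, 0.5),       # Gaussian perturbation
    select=lambda p: sorted(p, key=fitness)[:len(p) // 2] * 2,  # keep best half, duplicated
    cxpb=0.9, mutpb=0.1, ngen=30,
)
print(len(pop))  # 20
```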
sample_GA.py
# Generate the initial population
pop = toolbox.population_guess()
#Setting
hof = tools.HallOfFame(5, similar=np.array_equal)
stats = tools.Statistics(lambda ind: ind.fitness.values)
stats.register("avg", np.mean)
stats.register("std", np.std)
stats.register("min", np.min)
stats.register("max", np.max)
#Run
pop, logbook= algorithms.eaSimple(pop, toolbox,
                        cxpb=0.9, mutpb=0.1, ngen=20,
                        stats=stats, halloffame=hof, verbose=True)
#Show top 3 individuals
print("hall of fame.1 : %s" % (hof[0]))
print("hall of fame.2 : %s" % (hof[1]))
print("hall of fame.3 : %s" % (hof[2]))
terminal
gen	nevals	avg    	std    	min     	max   
0  	100   	80.0665	99.3005	-83.8352	414.98
1  	74    	7.49092	62.2371	-103.255	301.724
2  	89    	11.499 	108.449	-105.693	665.594
3  	84    	-60.283	53.2721	-106.71 	161.645
4  	87    	-95.3896	32.299 	-106.723	40.9594
5  	75    	-99.3516	28.4967	-106.758	23.0967
6  	73    	-104.068	15.7185	-106.764	4.33984
7  	76    	-103.476	18.6955	-106.764	10.0824
8  	80    	-101.665	22.5172	-106.764	16.8155
9  	92    	-102.631	20.6472	-106.764	26.921 
10 	77    	-102.882	19.2791	-106.764	6.34586
11 	90    	-99.4555	30.4939	-106.764	56.3788
12 	89    	-100.566	27.1489	-106.764	30.2934
13 	79    	-100.978	25.2596	-106.764	51.745 
14 	78    	-98.4071	32.1796	-106.764	85.5625
15 	76    	-105.728	10.3096	-106.764	-3.14868
16 	89    	-95.2292	38.3427	-106.764	80.6272 
17 	91    	-102.44 	25.6436	-106.764	96.6545 
18 	80    	-105.432	11.2501	-106.764	4.33866 
19 	83    	-102.271	23.504 	-106.764	67.6912 
20 	79    	-103.856	16.5553	-106.764	-3.86946
hall of fame.1 : [4.70021671 3.1529326 ]
hall of fame.2 : [4.70021671 3.15293287]
hall of fame.3 : [4.70021671 3.15293358]
The run converged to around -106.765, close to the global optimum. Since the algorithm does not use gradient information of the individuals, a small error from the exact optimum remains.
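You can check the reported fitness by plugging the best hall-of-fame point from the run above back into the Bird function (restated here so the snippet is self-contained):

```python
import numpy as np

def bird(x):
    # Bird function, as defined earlier in the article
    x1, x2 = x[0], x[1]
    return (np.sin(x1) * np.exp((1 - np.cos(x2)) ** 2)
            + np.cos(x2) * np.exp((1 - np.sin(x1)) ** 2)
            + (x1 - x2) ** 2)

# best individual reported by the run above
val = bird([4.70021671, 3.1529326])
print(round(float(val), 3))  # roughly -106.764, matching the logbook minimum
```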
- Function optimization was performed with a genetic algorithm using DEAP.
- Because random numbers are used, convergence may differ from trial to trial.
- It may be well suited to multivariable optimization over discrete values.