Solve optimization problems in Python

Solve optimization problems in Python. If you're looking for a good sample problem, the [Nelder–Mead method](https://ja.wikipedia.org/wiki/%E3%83%8D%E3%83%AB%E3%83%80%E3%83%BC%E2%80%93%E3%83%9F%E3%83%BC%E3%83%89%E6%B3%95) example implemented with the [MATLAB function fminsearch](https://jp.mathworks.com/help/matlab/ref/fminsearch.html) looks like a good one.

MATLAB example

MATLAB offers dedicated optimization functions in the paid Optimization Toolbox, but `fminsearch` can be used with just MATLAB itself installed.

```matlab
a = sqrt(2);
banana = @(x)100*(x(2)-x(1)^2)^2+(a-x(1))^2;
[x,fval] = fminsearch(banana, [-1.2, 1], optimset('TolX',1e-8));
```

This is exactly the example described on the fminsearch page: it uses the Nelder–Mead method to find the X that minimizes the formula 100(x<sub>2</sub> − x<sub>1</sub><sup>2</sup>)<sup>2</sup> + (a − x<sub>1</sub>)<sup>2</sup>.

Ported to Python

Let's port the above banana function example to Python. To solve this optimization problem in Python, we rely on SciPy. Looking at the scipy.optimize documentation, you can see that various optimization algorithms are implemented there.

A simple port would look like this:

```python
from scipy.optimize import fmin
import math

banana = lambda X, a: 100*(X[1] - X[0]**2)**2 + (a - X[0])**2
a = math.sqrt(2)
arg = (a, )
fmin(banana, [-1, 1.2], args=arg)
```

(Figure: 20170128_001.png — `fmin` console output showing the optimization result)

Starting from the initial value (x<sub>1</sub>, x<sub>2</sub>) = (-1, 1.2), the optimization was performed to minimize the value of the above formula, and it shows that the result (x<sub>1</sub>, x<sub>2</sub>) = (1.41420186, 1.99996718) was obtained. This is close to the analytical minimum (a, a<sup>2</sup>) = (√2, 2).
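As a quick sanity check (my own addition, not part of the original example), the banana function is zero exactly when x<sub>1</sub> = a and x<sub>2</sub> = x<sub>1</sub><sup>2</sup>:

```python
import math

a = math.sqrt(2)
print(a, a**2)  # 1.4142135623730951 2.0000000000000004 -- matches the fmin result above
```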

The first thing to note here is the definition of the function with lambda. That said, it comes up in every Python tutorial, so it may not be worth dwelling on. What matters here is that the function takes "a list called X and a variable called a" as arguments: the variables adjusted by `fmin` are the two values passed in the list X, while a is passed in as a constant. Of course, it can also be defined in the usual way:

```python
def banana(X, a):
    return 100*(X[1] - X[0]**2)**2 + (a - X[0])**2
```

Now, the arguments of `fmin` are the objective function to be minimized, the initial values of the variables to be adjusted so as to minimize the function value, and a tuple of additional arguments that are passed to the objective function but are not themselves targets of the optimization by fmin. This time, among the arguments of the objective function banana, a is to be passed in as a constant, so it is handed to the `args` argument of `fmin` as a tuple. That is why `arg = (a, )` is used: when defining a tuple with a single element, it must end with a trailing `,`.
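To make the trailing-comma point concrete, here is a quick check in plain Python (nothing SciPy-specific):

```python
a = 1.414
print(type((a)))   # <class 'float'> -- parentheses alone do not make a tuple
print(type((a,)))  # <class 'tuple'> -- the trailing comma does, which is what args expects
```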

Option settings for optimization

Various options can be set for the optimization. An optimizer "tweaks the parameters until the value of the function converges", but there are various conditions for judging that it has "converged", and those conditions can be configured. See the documentation for details.

Now, let's set the maximum number of function evaluations to 400, the maximum number of iterations to 400, the tolerance on the function value to 1e-4, and the tolerance on X to 1e-4 (the MATLAB fminsearch default values), and specify them explicitly by calling `fmin` as follows.

```python
fmin(banana, [-1, 1.2], args=arg, xtol=1e-4, ftol=1e-4, maxiter=400, maxfun=400)
```

I want to visualize the optimization process

By the way, optimizing something like the banana function finishes in an instant, but there are times when you want to visualize the progress of the optimization calculation. In such a case, specify a callback function to retrieve the values at each iteration.

```python
count = 0
def cbf(Xi):
    global count
    count += 1
    print('%d, %f, %f, %f' % (count, Xi[0], Xi[1], banana(Xi, math.sqrt(2))))
```

I'm not sure if this is the best way to do it, but I counted up and printed the iteration number using a global variable.
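If you want to avoid the global variable, one alternative, shown here only as a sketch and not part of the original example, is a small callable class that keeps its own counter:

```python
class IterationLogger:
    """Callback for fmin that counts iterations and prints the current point."""
    def __init__(self, func, a):
        self.count = 0
        self.func = func
        self.a = a

    def __call__(self, Xi):
        self.count += 1
        print('%d, %f, %f, %f' % (self.count, Xi[0], Xi[1], self.func(Xi, self.a)))

# usage (assuming banana, a and arg from above):
# fmin(banana, [-1, 1.2], args=arg, callback=IterationLogger(banana, a))
```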

Instead of outputting the calculation result as text for each iteration, it can be expressed in a graph.

```python
from scipy.optimize import fmin
import math
import matplotlib.pyplot as plt

count = 0
plt.axis([0, 100, 0, 6.5])
plt.ion()

def cbf(Xi):
    global count
    count += 1
    f = banana(Xi, math.sqrt(2))
    print('%d, %f, %f, %f' % (count, Xi[0], Xi[1], f))
    plt.scatter(count, f)

banana = lambda X, a: 100*(X[1] - X[0]**2)**2 + (a - X[0])**2
a = math.sqrt(2)
arg = (a, )
fmin(banana, [-1, 1.2], args=arg, callback=cbf)
```

Now you can watch the points being plotted in real time on the graph with `matplotlib`. (Figure: 20170128_002.png — scatter plot of the function value at each iteration)

Extraction of optimization calculation results

As described above, the calculation by fmin gave the result (x<sub>1</sub>, x<sub>2</sub>) = (1.41420186, 1.99996718), but people often ask you to explain in a little more detail the process by which that result was obtained. Well, my boss does too (the rest is omitted)…



The `fmin` options for that are the `retall` and `full_output` arguments: if you set them to `True`, you can get various additional return values.

```python
[xopt, fopt, iter, funcalls, warnflag, allvecs] = fmin(banana, [-1, 1.2], args=arg, retall=True, full_output=True)
```

`xopt` is the optimized parameters, and `fopt` is the value of the minimized function at that point. `iter` and `funcalls` hold how many iterations and function evaluations were performed, and `warnflag` stores a flag indicating how the run ended (converged, or hit the maximum number of function evaluations or iterations). `allvecs` stores the values of the variables being optimized (x<sub>1</sub> and x<sub>2</sub> in the banana function example above) at each iteration. So, if you need a history of the variable adjustments for each iteration, you can graph it after optimizing with `fmin`, without processing it in the callback function.
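For example, a minimal sketch of plotting that history after the fact might look like this (my own illustration, not from the original article; it reuses the same banana function and starting point):

```python
import math
import matplotlib.pyplot as plt
from scipy.optimize import fmin

banana = lambda X, a: 100*(X[1] - X[0]**2)**2 + (a - X[0])**2
a = math.sqrt(2)

# run the optimization, keeping the full history via retall/full_output
xopt, fopt, n_iter, funcalls, warnflag, allvecs = fmin(
    banana, [-1, 1.2], args=(a,), retall=True, full_output=True)

# allvecs holds one parameter vector per iteration
plt.plot([v[0] for v in allvecs], label='x1')
plt.plot([v[1] for v in allvecs], label='x2')
plt.xlabel('iteration')
plt.legend()
plt.show()
```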

Today's summary

So, taking the example of the fminsearch function from MATLAB, I tried to do the same thing in Python.

```python
from scipy.optimize import fmin
import math
import matplotlib.pyplot as plt


def cbf(Xi):
    global count
    count += 1
    f = banana(Xi, math.sqrt(2))
    plt.scatter(count, f)
    plt.pause(0.05)


def banana(X, a):
    return 100*(X[1] - X[0]**2)**2 + (a - X[0])**2


def main():
    a = math.sqrt(2)
    arg = (a, )
    [xopt, fopt, iter, funcalls, warnflag, allvecs] = fmin(
        banana,
        [-1, 1.2],
        args=arg,
        callback=cbf,
        xtol=1e-4,
        ftol=1e-4,
        maxiter=400,
        maxfun=400,
        disp=True,
        retall=True,
        full_output=True)
    for item in allvecs:
        print('%f, %f' % (item[0], item[1]))

if __name__ == '__main__':
    count = 1
    plt.axis([0, 100, 0, 6.5])
    main()
```

Today's code
