Hyperopt is a framework for automatic hyperparameter optimization, used mainly for tuning the hyperparameters of machine learning models.
First, let's install the library. You can install it with pip install hyperopt.
This time, let's minimize the function x^2 + y^2 + z^2.
First, let's define the objective function.
#Set the objective function
def objective_hyperopt(args):
    x, y, z = args
    return x ** 2 + y ** 2 + z ** 2
Next, define the search space for the parameters to be optimized, then call fmin() to start the search. The argument max_evals sets the number of evaluations.
#Optimize with hyperopt
def hyperopt_exe():
    space = [
        hp.uniform('x', -100, 100),
        hp.uniform('y', -100, 100),
        hp.uniform('z', -100, 100)
    ]
    #An object for recording the state of the search
    trials = Trials()
    #Start the search
    best = fmin(objective_hyperopt, space, algo=tpe.suggest, max_evals=500, trials=trials)
If you want to know the final result, add the following.
#Output the result
print(best)
Let's retrieve the search history from the trials object. You can display the parameters and objective function value of each trial by adding the following:
#Examine the search process
for trial in trials.trials:
    vals = trial['misc']['vals']
    result = trial['result']['loss']
    print('vals:', vals, 'result:', result)
The full code for this experiment is as follows.
# -*- coding: utf-8 -*-
import hyperopt
from hyperopt import hp
from hyperopt import fmin
from hyperopt import tpe
from hyperopt import Trials
import matplotlib.pyplot as plt

#Set the objective function for hyperopt
def objective_hyperopt(args):
    x, y, z = args
    return x ** 2 + y ** 2 + z ** 2

#Optimize with hyperopt
def hyperopt_exe():
    #Search space settings
    space = [
        hp.uniform('x', -100, 100),
        hp.uniform('y', -100, 100),
        hp.uniform('z', -100, 100)
    ]
    #An object for recording the state of the search
    trials = Trials()
    #Start the search
    best = fmin(objective_hyperopt, space, algo=tpe.suggest, max_evals=500, trials=trials)
    #Output the result
    print(best)

    epoches = []
    values = []
    best = float('inf')
    #Examine the search process
    for n, trial in enumerate(trials.trials):
        if best > trial['result']['loss']:
            best = trial['result']['loss']
        epoches.append(n + 1)
        values.append(best)
        vals = trial['misc']['vals']
        result = trial['result']['loss']
        print('vals:', vals, 'result:', result)

    #Draw the graph
    plt.plot(epoches, values, color="red")
    plt.title("hyperopt")
    plt.xlabel("trial")
    plt.ylabel("value")
    plt.show()

if __name__ == '__main__':
    hyperopt_exe()
The resulting plot is shown below; the search converges toward the minimum at an early stage.