Recently I've been studying machine learning in JupyterLab. Along the way I tried to use Optuna to optimize LightGBM hyperparameters, found that the code coming up in Google searches was out of date, and decided to write up how I tracked the problem down.
`jupyter lab` is specified as the command when starting the Docker image above.

In short:

- I want to optimize LightGBM hyperparameters with Optuna
- The code that came up on Google doesn't work
- It complains that there is no `best_params` argument
- I go back to the official documentation
- I try the official sample → it works
- Checking past samples on GitHub, `best_params` exists as of 1.3 but not in 1.4 or later
- If you have a problem, read the official documentation
- If the behavior differs from information on the net, it's a good idea to check the version of the library
The first result that came up when I googled was "Hyperparameter automatic optimization by LightGBM Tuner extension". Below is an excerpt of the code from that page.
```python
booster = lgb.train(params, dtrain, valid_sets=dval,
                    verbose_eval=0,
                    best_params=best_params,
                    tuning_history=tuning_history)
```
When I tried the above, I got the following error.
```
TypeError: __init__() got an unexpected keyword argument 'best_params'
```

Hmm. It's telling me there is no keyword argument called `best_params`.
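For context, the old-style tuner expected you to hand it an empty dict and an empty list that it would then fill in. The sketch below is my reconstruction of that pre-v1.4 pattern; the `optuna.integration.lightgbm` import and the empty `best_params` / `tuning_history` containers are assumptions based on old samples, not part of the quoted article, and `params` / `dtrain` / `dval` are assumed to be defined as in the article.

```python
# Reconstruction of the OLD (up to v1.3) usage pattern that the googled
# article assumed. This no longer runs on later Optuna versions.
import optuna.integration.lightgbm as lgb  # tuner drop-in for plain lightgbm

best_params = {}     # the tuner used to write the chosen parameters into this dict
tuning_history = []  # ...and append each trial's result to this list

# params, dtrain, dval: defined as in the quoted article
booster = lgb.train(params, dtrain, valid_sets=dval,
                    verbose_eval=0,
                    best_params=best_params,      # rejected by newer versions
                    tuning_history=tuning_history)
```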
I wondered whether the argument mismatch meant a version difference, so I decided to check the official documentation: Optuna. There was a link to GitHub there, so I moved over to optuna/optuna.
I searched the repository for `lightgbm` (search results within the optuna/optuna repository).
Looking at the search results, examples/README.md seemed to point to the sample, so I checked it.
examples/lightgbm_tuner_simple.py
```python
# Some line breaks added to make it easier to compare with the quoted source.
model = lgb.train(
    params, dtrain, valid_sets=[dtrain, dval],
    verbose_eval=100,
    early_stopping_rounds=100
)
```
There is no `best_params` here, so this is what the current version looks like.
I tried the code above and it worked. Got it.
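For completeness, here is roughly what the sample looks like when filled out end to end. This is a sketch based on the official examples/lightgbm_tuner_simple.py of that era, not a verbatim copy; the dataset loading and train/validation split below are my additions for illustration.

```python
# Rough, self-contained version of the current-style usage (sketch, not the
# verbatim official sample). Requires optuna, lightgbm, scikit-learn.
import numpy as np
import optuna.integration.lightgbm as lgb  # tuner drop-in for plain lightgbm
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

data, target = load_breast_cancer(return_X_y=True)
train_x, val_x, train_y, val_y = train_test_split(data, target, test_size=0.25)

dtrain = lgb.Dataset(train_x, label=train_y)
dval = lgb.Dataset(val_x, label=val_y)

params = {
    "objective": "binary",
    "metric": "binary_logloss",
    "verbosity": -1,
    "boosting_type": "gbdt",
}

# Note: no best_params / tuning_history keyword arguments any more.
model = lgb.train(
    params, dtrain, valid_sets=[dtrain, dval],
    verbose_eval=100,
    early_stopping_rounds=100
)

prediction = np.rint(model.predict(val_x, num_iteration=model.best_iteration))
print("Accuracy:", accuracy_score(val_y, prediction))
```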
By the way, when I checked, `best_params` is there up to v1.3.0, and from [v1.4.0](https://github.com/optuna/optuna/blob/v1.4.0/examples/lightgbm_tuner_simple.py) onward it is gone.
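Incidentally, the tuned values can still be read back from the returned booster. As far as I can tell from the v1.4.0 sample, something like the following works; this is a small sketch that reuses `model` from the code above.

```python
# The tuner writes the values it chose into the booster's params, so there is
# no separate best_params dict to pass in any more.
best_params = model.params
print("Best params:")
for key, value in best_params.items():
    print("    {}: {}".format(key, value))
```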
Actually, I also looked at pages other than the ones above, but I've left out the ones that led nowhere (or rather, I've forgotten them). When digging into this kind of problem or error, sometimes you simply can't make progress no matter what you try. In that case I recommend sleeping on it or casually asking someone else, because a way forward often appears where you least expect it.
- If code you found on Google doesn't work, it is often due to a different environment or version.
- If the specification and samples are public, referring to the official documentation and GitHub is often the fastest route to a fix (although this time the situation was a little different).
- It is also important to read the error message and to search using the message itself.
- If the number of arguments or the arguments themselves differ, a version mismatch is likely.
- If you can't find a solution, talk to someone else or come back to it another day.
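As a concrete first step when a sample behaves differently from an article, checking the installed versions is cheap. A minimal check for the libraries used here:

```python
# Check which versions are actually installed before comparing against
# articles or samples found online.
import optuna
import lightgbm

print("optuna:", optuna.__version__)
print("lightgbm:", lightgbm.__version__)
```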