Now let's set up SVR, the regression version of SVM.

The main hyperparameters tuned are `gamma`, `C`, and `epsilon`. I also left `kernel` in as a comment so I don't forget the setting. It is possible to include `kernel` in the search, but as soon as the `poly` (polynomial) kernel comes up the computation slows down dramatically, so for now I think it is better to search with only `rbf` (the radial basis function kernel).
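The post does not show the imports or the data preparation, so here is a minimal sketch of the assumed setup: scikit-learn's `SVR` and `cross_val_score`, with standardized training features. The variable names (`X_train_std`, `y_train`) mirror the snippets below; the synthetic dataset is just a stand-in for the author's data.

```python
# Assumed setup (not shown in the original post): synthetic data,
# train/test split, and feature standardization for SVR.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Stand-in regression data; replace with your own dataset.
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# SVR is sensitive to feature scale, so standardize before fitting.
scaler = StandardScaler()
X_train_std = scaler.fit_transform(X_train)
X_test_std = scaler.transform(X_test)
```

Standardization matters here because the RBF kernel measures Euclidean distances, which a large-scale feature would otherwise dominate.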
Parameter setting
```python
def objective(trial):
    # kernel = trial.suggest_categorical('kernel', ['linear', 'rbf', 'poly', 'sigmoid'])
    gamma = trial.suggest_loguniform('gamma', 1e-5, 1e5)
    C = trial.suggest_loguniform('C', 1e-5, 1e5)
    epsilon = trial.suggest_loguniform('epsilon', 1e-5, 1e5)

    # If you want to search over the kernel as well, use the line below instead
    # regr = SVR(kernel=kernel, gamma=gamma, C=C, epsilon=epsilon)
    regr = SVR(kernel='rbf', gamma=gamma, C=C, epsilon=epsilon)

    score = cross_val_score(regr, X_train_std, y_train, cv=3, scoring="r2")
    r2_mean = score.mean()
    print(r2_mean)
    return r2_mean
```
Learning with Optuna
```python
# optuna study
study = optuna.create_study(direction='maximize')
study.optimize(objective, n_trials=100)

# Fit with the tuned hyperparameters
optimised_regr = SVR(kernel='rbf', gamma=study.best_params['gamma'],
                     C=study.best_params['C'], epsilon=study.best_params['epsilon'])
'''
# If you searched over the kernel as well, use this instead
optimised_regr = SVR(kernel=study.best_params['kernel'], gamma=study.best_params['gamma'],
                     C=study.best_params['C'], epsilon=study.best_params['epsilon'])
'''
optimised_regr.fit(X_train_std, y_train)
```
The result came out like this, and the model fit reasonably well.
