[PYTHON] SVM (multi-class classification)

What is SVM

Abbreviation for Support Vector Machine. A method that uses training data to learn a boundary separating multiple classes (model creation), and then estimates which class unknown data belongs to.

Overview of classification in SVM

A flat surface whose dimension is one lower than that of the original data space is called a hyperplane, and SVM searches for the optimal hyperplane (the separating surface).

For example, in the figure below, lines (H1 and H2) that separate the black and white circles are drawn. (The figure shows two-dimensional data in X1 and X2, so H1 and H2 are one-dimensional straight lines. With three-dimensional input data, we would instead look for a two-dimensional plane that separates the classes.)

(Figure: two classes of points in the X1-X2 plane with candidate separating lines H1, H2, and H3)

The optimal separating surface is found by searching for the one that maximizes the margin. The margin, drawn as the gray lines in the figure, is the perpendicular distance from the nearest point of each class to the separating surface.

For example, both H1 and H2 can be said to separate the black and white circles, but H2, with its larger margin, has the higher classification power. (H3 does not separate the classes at all, so it is completely useless.)
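A minimal sketch of this idea (the 2D points below are made up for illustration, not taken from the figure): with a linear kernel, scikit-learn's SVC exposes the learned normal vector w of the separating line, and the margin width equals 2 / ||w||.

{margin_sketch.py}


import numpy as np
from sklearn.svm import SVC

# Made-up toy data: two linearly separable classes in two dimensions
X = np.array([[1, 2], [2, 3], [3, 3], [6, 5], [7, 8], [8, 6]])
Y = np.array([0, 0, 0, 1, 1, 1])

model = SVC(kernel='linear', C=1e6)  # a very large C approximates a hard margin
model.fit(X, Y)

w = model.coef_[0]                   # normal vector of the separating line
print(2 / np.linalg.norm(w))         # margin width = 2 / ||w||
print(model.support_vectors_)        # the points that determine the margin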

Kernel trick (kernel method)

The example above separates neatly, but in most cases the data does not. The technique for finding a separating surface in such cases is called the kernel trick (kernel method).

For example, in the figure below, it is difficult to separate the red and blue circles with a straight line.

(Figure: red and blue circles that cannot be separated by a straight line)

Therefore, the samples are mapped into another space, called the feature space (the points are moved according to a fixed rule), so that they can be separated neatly, and the separating surface is searched for in that space.

(Figure: the same samples after mapping to the feature space, where they become linearly separable)

Once the maximum-margin separating surface has been found in this way, any unknown input is mapped into the feature space by the same rule, and its class is determined by which side of the separating surface it falls on.
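As a hedged sketch of the kernel trick in practice (the concentric-circle data below is generated with scikit-learn's make_circles, not the figure's data): a linear kernel fails on such data, while the RBF kernel implicitly maps the samples into a feature space where a separating surface exists.

{kernel_trick_sketch.py}


from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn import metrics

# Generated data: concentric circles that no straight line can separate
X, Y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, random_state=0)

# Compare a plain linear separating line with the RBF kernel
for kernel in ['linear', 'rbf']:
    model = SVC(kernel=kernel)
    model.fit(X_train, Y_train)
    predicted = model.predict(X_test)
    print(kernel, metrics.accuracy_score(Y_test, predicted))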

This video explains it very clearly. → Kernel trick reference video

Data preparation

We use the iris data, available here.

Data overview

{describe_iris.py}


import pandas as pd
from sklearn import datasets

data = datasets.load_iris()                                 # load the iris dataset
iris = pd.DataFrame(data.data, columns=data.feature_names)  # explanatory variables as a DataFrame
iris.head()                                                 # show the first five rows

(Figure: the first five rows of the iris data)

The data consists of the length and width of the sepals and petals.

Try

{svm.py}


import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
from sklearn import datasets
from sklearn.model_selection import train_test_split  # for train/test splitting (sklearn.cross_validation in older versions)

# Data preparation
iris = datasets.load_iris()    # load the data
X = iris.data                  # explanatory variables
Y = iris.target                # objective variable
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, random_state=0)  # random_state is the seed value

# Run SVM
from sklearn.svm import SVC  # SVM classifier
model = SVC()                # create an instance
model.fit(X_train, Y_train)  # train the SVM

# Run prediction
from sklearn import metrics       # for accuracy verification
predicted = model.predict(X_test) # predict classes for the test data
metrics.accuracy_score(Y_test, predicted)
> 0.97368421052631582

The accuracy is 97.4%, which is quite high.
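Accuracy alone does not show which of the three classes get confused with each other. As a small follow-up sketch (reusing the variables from svm.py above), the metrics module can print a per-class breakdown:

{report_sketch.py}


# Reuses metrics, Y_test, predicted, and iris from svm.py above
print(metrics.confusion_matrix(Y_test, predicted))            # class-by-class error counts
print(metrics.classification_report(Y_test, predicted,
                                    target_names=iris.target_names))  # precision/recall per class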
