Widrow-Hoff learning rule implemented in Python

About this post

Continuing from my previous post, "Implementing Perceptron Learning Rules in Python," this time I implemented the Widrow-Hoff learning rule, one of the classic pattern recognition methods, in Python without using a machine learning library. Since I am a beginner in both Python and machine learning, please point out anything that could be improved.

Theory of the Widrow-Hoff Learning Rule

The outline and formulas of the Widrow-Hoff learning rule are summarized in the following slides (starting from around the middle).

https://speakerdeck.com/kirikisinya/xin-zhe-renaiprmlmian-qiang-hui-at-ban-zang-men-number-3
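
To keep this post self-contained, here is the core of the rule as I understand it from the slides (this is the standard least-squares formulation). With augmented input vectors x_i whose first component is 1, teacher signals t_i, and weight vector w, the Widrow-Hoff (LMS) rule minimizes the squared error

E(w) = (1/2) Σ_i (w·x_i - t_i)²

by gradient descent, updating w with learning rate ρ:

w ← w - ρ Σ_i (w·x_i - t_i) x_i

widrow_hoff.py below performs this update one weight component at a time.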

Implementation

For training data that lie in one dimension, as shown in the figure below, and belong to one of two classes, we find a discriminant function for each class.

(Figure: one-dimensional training data belonging to two classes)
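
Concretely, the training data are the following six points (taken from the data array in main.py below):

x      class
1.0    1
0.5    1
-0.2   2
-0.4   1
-1.3   2
-2.0   2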

The actual code looks like this (notes on the implementation are included as inline comments):

main.py


# coding: UTF-8
# Implementation example of the 1-D Widrow-Hoff learning rule
import numpy as np
from widrow_hoff import get_wvec

if __name__ == '__main__':

    data = np.array([[1.0, 1], [0.5, 1], [-0.2, 2], [-0.4, 1], [-1.3, 2], [-2.0, 2]])  # Data set

    features = data[:, 0].reshape(data[:, 0].size, 1)  # Feature vectors
    labels = data[:, 1]  # Class labels (here c1=1, c2=2)
    wvec = np.array([0.2, 0.3])  # Initial weight vector
    xvecs = np.c_[np.ones(features.size), features]  # Augmented inputs: xvec[0] = 1

    # Class 1
    tvec1 = labels.copy()  # Teacher vector for class 1
    tvec1[labels == 1] = 1
    tvec1[labels == 2] = 0

    # Pass a copy so training does not mutate the shared initial weight vector
    wvec1 = get_wvec(xvecs, wvec.copy(), tvec1)
    print("wvec1 = %s" % wvec1)
    print("g1(x) = %f x + %f" % (wvec1[1], wvec1[0]))

    for xvec, label in zip(xvecs, labels):
        print("g1(%s) = %s (class:%s)" % (xvec[1], np.dot(wvec1, xvec), label))

    # Class 2
    tvec2 = labels.copy()  # Teacher vector for class 2
    tvec2[labels == 1] = 0
    tvec2[labels == 2] = 1

    wvec2 = get_wvec(xvecs, wvec.copy(), tvec2)
    print("wvec2 = %s" % wvec2)
    print("g2(x) = %f x + %f" % (wvec2[1], wvec2[0]))

    for xvec, label in zip(xvecs, labels):
        print("g2(%s) = %s (class:%s)" % (xvec[1], np.dot(wvec2, xvec), label))  # Use the trained wvec2 here

widrow_hoff.py



# coding: UTF-8
# Learning logic for the Widrow-Hoff learning rule

import numpy as np

# Update the weight coefficients (one pass of gradient descent, one component at a time)
def train(wvec, xvecs, tvec):
    rho = 0.2  # Learning rate
    for key in range(wvec.size):
        total = 0.0
        for xvec, t in zip(xvecs, tvec):
            wx = np.dot(wvec, xvec)
            total += (wx - t) * xvec[key]  # Partial derivative of the squared error w.r.t. wvec[key]
        wvec[key] = wvec[key] - rho * total
    return wvec

# Find the weight coefficients by iterating the update
def get_wvec(xvecs, wvec, tvec):
    loop = 100  # Number of training iterations
    for _ in range(loop):
        wvec = train(wvec, xvecs, tvec)
    return wvec
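
Since minimizing the squared error has a closed-form solution, one way to sanity-check the result (this check is my own addition, not part of the original program, and the file name check.py is hypothetical) is to compare the iteratively learned weights against NumPy's least-squares solver:

# check.py (hypothetical): compare the iterative result with the closed-form solution
import numpy as np
from widrow_hoff import get_wvec

data = np.array([[1.0, 1], [0.5, 1], [-0.2, 2], [-0.4, 1], [-1.3, 2], [-2.0, 2]])
xvecs = np.c_[np.ones(data.shape[0]), data[:, 0]]  # Augmented inputs
tvec1 = (data[:, 1] == 1).astype(float)  # Teacher vector for class 1

wvec_iter = get_wvec(xvecs, np.array([0.2, 0.3]), tvec1)  # Iterative Widrow-Hoff result
wvec_lsq = np.linalg.lstsq(xvecs, tvec1, rcond=None)[0]   # Closed-form least squares
print("iterative  :", wvec_iter)
print("closed form:", wvec_lsq)

If the learning rate is small enough for the iteration to converge, the two weight vectors should approximately agree.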

Running this produced the following results.

(Figure: screenshot of the execution results)

The discriminant functions obtained for each class are as follows.

g1(x) = 0.37x + 0.65 # Class 1 discriminant function
g2(x) = -0.37x + 0.35 # Class 2 discriminant function

With the Widrow-Hoff learning rule, a point is classified correctly when g1(x) > g2(x) for class 1 data and g1(x) < g2(x) for class 2 data (that is, when the data can be identified well).
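
As a quick check (my own arithmetic, using the coefficients above), the decision boundary lies where the two discriminant functions are equal:

0.37x + 0.65 = -0.37x + 0.35
0.74x = -0.30
x ≈ -0.41

With the unrounded weights the boundary comes out at x = -0.4, i.e. it falls right on the class 1 sample at x = -0.4.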


With this in mind, looking at the execution results:

- For data x = 1.0 (class 1): g1(1.0) > g2(1.0) => OK
- For data x = 0.5 (class 1): g1(0.5) > g2(0.5) => OK
- For data x = -0.2 (class 2): g1(-0.2) > g2(-0.2) => NG
- For data x = -0.4 (class 1): g1(-0.4) = g2(-0.4) => NG
- For data x = -1.3 (class 2): g1(-1.3) < g2(-1.3) => OK
- For data x = -2.0 (class 2): g1(-2.0) < g2(-2.0) => OK

So the data at x = -0.2 and x = -0.4, which lie near the boundary between class 1 and class 2, are misclassified, while the other points are classified correctly. This result is consistent with the intuition that discrimination is hardest near the class boundary, so the discriminant functions seem to have been learned well.

