[Python] Error backpropagation (backpropagation)

Third day

Error backpropagation (backpropagation)

I didn't have much time today, so I'll only get as far as the step just before actually implementing the neural network.

The logistic regression implemented on day 1 handles linearly separable problems well, but it cannot handle non-linear ones. In such cases a neural network can fit even non-linear relationships such as XOR.
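For reference, here is the XOR truth table; no single straight line in the (x1, x2) plane separates the outputs 0 from the outputs 1, which is why a purely linear model cannot fit it:

x1  x2 | y (XOR)
 0   0 | 0
 0   1 | 1
 1   0 | 1
 1   1 | 0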

There is no hard rule for how to handle the bias; in machine learning this is one of the things you simply tune. The bias can be adjusted during training or kept fixed, depending on the purpose.

Model

This time we will build a three-layer network: 2 inputs, one hidden layer with 2 units, and 1 output.

The picture is the same as for logistic regression; the network is essentially several logistic regressions combined.
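To make the naming in the code below concrete, here is a minimal sketch of the nine parameters this 2-2-1 network needs. The naming convention is w[layer][from][to], and the small random initial values are an assumption for illustration, not values taken from the course.

import random

# Layer 1 (input -> hidden), e.g. w121 = layer 1, from input 2, to hidden unit 1
w111, w121, b11 = (random.uniform(-1, 1) for _ in range(3))  # hidden unit 1
w112, w122, b12 = (random.uniform(-1, 1) for _ in range(3))  # hidden unit 2
# Layer 2 (hidden -> output)
w211, w221, b21 = (random.uniform(-1, 1) for _ in range(3))  # output unit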

Error function
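The error used throughout is the squared difference between the target y and the network output, exactly as it appears in the code below:

E = (y - Output)^2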

Error derivation
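Written out with the chain rule for one weight in each layer (the factor Sig(z)*(1 - Sig(z)) is the derivative of the sigmoid, which is why it keeps reappearing):

dE/dw211 = dE/dOutput * dOutput/dZ * dZ/dw211
         = 2*(Output - y) * Sig(Z)*(1 - Sig(Z)) * O1

dE/dw111 = 2*(Output - y) * Sig(Z)*(1 - Sig(Z)) * w211 * Sig(Z1)*(1 - Sig(Z1)) * x1

The remaining derivatives below follow the same pattern (hidden unit 2 goes through w221 and Z2).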


'''
input: x1,x2
Hidden layer: O1,O2
output: Output
error: Error
bias: b
weight: w[layer][from][to]  (e.g. w121 = layer 1, from input 2, to hidden unit 1)
E = (y - Output)^2
'''
#Forward Propagation
# Hidden layer pre-activations
Z1 = b11 + x1*w111 + x2*w121
Z2 = b12 + x1*w112 + x2*w122

# Hidden layer activations
O1 = Sig(Z1)
O2 = Sig(Z2)

# Output layer
Z = b21 + O1*w211 + O2*w221
Output = Sig(Z)

# Squared error
Error = (y - Output)**2
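The code above assumes a sigmoid activation Sig, which is why Sig(z)*(1 - Sig(z)) shows up in every gradient below; a minimal definition would be:

import math

def Sig(z):
    # Sigmoid: squashes any real number into the range (0, 1)
    return 1 / (1 + math.exp(-z))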

Differentiate the error and adjust the weights and biases

Weight and bias adjustment


#differential (gradients of Error = (y - Output)**2, via the chain rule)
dw111 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))*w211*Sig(Z1)*(1 - Sig(Z1))*x1
dw121 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))*w211*Sig(Z1)*(1 - Sig(Z1))*x2
db11 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))*w211*Sig(Z1)*(1 - Sig(Z1))
dw112 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))*w221*Sig(Z2)*(1 - Sig(Z2))*x1
dw122 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))*w221*Sig(Z2)*(1 - Sig(Z2))*x2
db12 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))*w221*Sig(Z2)*(1 - Sig(Z2))
dw211 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))*O1
dw221 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))*O2
db21 = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))

#Gradient Descent / Steepest Descent (alpha is the learning rate)
w111 -= alpha*dw111
w121 -= alpha*dw121
b11 -= alpha*db11
w112 -= alpha*dw112
w122 -= alpha*dw122
b12 -= alpha*db12
w211 -= alpha*dw211
w221 -= alpha*dw221
b21 -= alpha*db21
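Putting all the pieces together, here is a minimal runnable sketch that trains this 2-2-1 network on XOR with the update rules above. The learning rate, iteration count, and random initialization are assumptions chosen for illustration and are not taken from day3.py.

import math
import random

def Sig(z):
    return 1 / (1 + math.exp(-z))

# XOR training data: (x1, x2) -> y
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

# Parameter initialization (assumed range; another seed may be needed to converge)
random.seed(0)
w111, w121, b11 = (random.uniform(-1, 1) for _ in range(3))
w112, w122, b12 = (random.uniform(-1, 1) for _ in range(3))
w211, w221, b21 = (random.uniform(-1, 1) for _ in range(3))

alpha = 0.5  # learning rate (assumed)
for epoch in range(20000):
    for (x1, x2), y in data:
        # Forward propagation
        Z1 = b11 + x1*w111 + x2*w121
        Z2 = b12 + x1*w112 + x2*w122
        O1, O2 = Sig(Z1), Sig(Z2)
        Z = b21 + O1*w211 + O2*w221
        Output = Sig(Z)

        # Back propagation: gradients of Error = (y - Output)**2
        d_out = 2*(Output - y)*Output*(1 - Output)  # = 2*(Output - y)*Sig(Z)*(1 - Sig(Z))
        dw211, dw221, db21_g = d_out*O1, d_out*O2, d_out
        d_h1 = d_out*w211*O1*(1 - O1)
        d_h2 = d_out*w221*O2*(1 - O2)
        dw111, dw121, db11_g = d_h1*x1, d_h1*x2, d_h1
        dw112, dw122, db12_g = d_h2*x1, d_h2*x2, d_h2

        # Gradient descent update
        w111 -= alpha*dw111; w121 -= alpha*dw121; b11 -= alpha*db11_g
        w112 -= alpha*dw112; w122 -= alpha*dw122; b12 -= alpha*db12_g
        w211 -= alpha*dw211; w221 -= alpha*dw221; b21 -= alpha*db21_g

# Check what the network learned
for (x1, x2), y in data:
    O1 = Sig(b11 + x1*w111 + x2*w121)
    O2 = Sig(b12 + x1*w112 + x2*w122)
    print((x1, x2), round(Sig(b21 + O1*w211 + O2*w221), 3), "target:", y)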

Code

day3.py

The third day is over!

I was a little busy today, but I managed to get the minimum done! I hope I can keep catching up little by little like this. I'm indebted to Udemy. Thank you.
