Hello! I'm a complete beginner studying with O'Reilly's book "Deep Learning from Scratch". Since I'm putting in the effort anyway, I wanted to keep a record of what I learn, so I'm writing this article as output practice too. I apologize in advance for any mistakes; I'd be very grateful if you could point them out.
Environment: Windows 10, Python 3.7.3, Jupyter Notebook
A perceptron is an algorithm that receives multiple signals and outputs a single signal. This time, I'll implement a perceptron in a simple way.
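In formula form (this matches how I understand the book presents it), the two-input perceptron is:

```math
y =
\begin{cases}
0 & (w_1 x_1 + w_2 x_2 \leq \theta) \\
1 & (w_1 x_1 + w_2 x_2 > \theta)
\end{cases}
```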
The AND gate's truth table is shown below. x1 and x2 are the input signals, and y is the output signal. It outputs 1 only when both x1 and x2 are 1.
x1 | x2 | y |
---|---|---|
0 | 0 | 0 |
1 | 0 | 0 |
0 | 1 | 0 |
1 | 1 | 1 |
Here's the code, taken from the book. w1 and w2 are weights. tmp holds the sum of the inputs x1 and x2 each multiplied by their weights w1 and w2. theta (θ) is the threshold: the function returns 1 if tmp exceeds the threshold, and 0 otherwise.
```python
def AND(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.7
    tmp = x1*w1 + x2*w2
    if tmp <= theta:
        return 0
    elif tmp > theta:
        return 1
```
The output.
```python
AND(0, 0)  # 0 is output
AND(1, 0)  # 0 is output
AND(0, 1)  # 0 is output
AND(1, 1)  # 1 is output
```
I got the results I expected!
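Checking each call one by one in Jupyter works, but as a small extra (my own addition, not from the book), a loop like this prints the whole truth table at once:

```python
from itertools import product

# My own addition: print the full truth table for the AND gate above
for x1, x2 in product([0, 1], repeat=2):
    print(f"AND({x1}, {x2}) = {AND(x1, x2)}")
```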
The NAND gate's truth table is shown below. x1 and x2 are the input signals, and y is the output signal. It outputs 0 only when both x1 and x2 are 1.
x1 | x2 | y |
---|---|---|
0 | 0 | 1 |
1 | 0 | 1 |
0 | 1 | 1 |
1 | 1 | 0 |
Here's the code. It's the same as the AND gate except that w1, w2, and theta are the negatives of the AND gate's values.
```python
def NAND(x1, x2):
    w1, w2, theta = -0.5, -0.5, -0.7
    tmp = x1*w1 + x2*w2
    if tmp <= theta:
        return 0
    elif tmp > theta:
        return 1
```
The output.
```python
NAND(0, 0)  # 1 is output
NAND(1, 0)  # 1 is output
NAND(0, 1)  # 1 is output
NAND(1, 1)  # 0 is output
```
The result was as expected!
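As a side note (my own observation, not something I'm quoting from the book): negating all three parameters flips the comparison, so NAND always gives the opposite of AND. A quick check:

```python
# My own check: NAND should be the inverse of AND for every input pair
for x1 in (0, 1):
    for x2 in (0, 1):
        assert NAND(x1, x2) == 1 - AND(x1, x2)
print("NAND is the inverse of AND for all inputs")
```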
The OR gate's truth table is shown below. x1 and x2 are the input signals, and y is the output signal. It outputs 0 only when both x1 and x2 are 0.
x1 | x2 | y |
---|---|---|
0 | 0 | 0 |
1 | 0 | 1 |
0 | 1 | 1 |
1 | 1 | 1 |
I worked out the parameters with my "guess the parameters" strategy. Here's the final code. The basics are the same as the AND and NAND gates; only the parameters change again. (Below the output, I sketch a way to automate this guessing.)
```python
def OR(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.4
    tmp = x1*w1 + x2*w2
    if tmp <= theta:
        return 0
    elif tmp > theta:
        return 1
```
The output.
```python
OR(0, 0)  # 0 is output
OR(1, 0)  # 1 is output
OR(0, 1)  # 1 is output
OR(1, 1)  # 1 is output
```
I got the results I expected!
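Since I found these parameters by guessing, here's a small sketch (my own addition, not from the book) that automates the guesswork: it scans a grid of candidate thresholds and keeps the ones that reproduce the OR truth table with the weights fixed at 0.5, matching the examples above.

```python
# My own addition: brute-force search for thresholds that implement OR
# (weights fixed at w1 = w2 = 0.5, as in the code above)
OR_TABLE = {(0, 0): 0, (1, 0): 1, (0, 1): 1, (1, 1): 1}

def gate(x1, x2, w1, w2, theta):
    return 1 if x1*w1 + x2*w2 > theta else 0

candidates = [round(t * 0.1, 1) for t in range(-10, 11)]  # -1.0 to 1.0 in 0.1 steps
working = [theta for theta in candidates
           if all(gate(x1, x2, 0.5, 0.5, theta) == y
                  for (x1, x2), y in OR_TABLE.items())]
print(working)  # [0.0, 0.1, 0.2, 0.3, 0.4] -- my 0.4 is one of several that work
```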
By the way, here are the rejected parameters from my trial and error. Please don't use them as a reference.
```python
w1, w2, theta = 0.5, 0.5, -0.7
```

Everything became 1 (tmp is never negative here, so it always exceeds -0.7).

```python
w1, w2, theta = 0.5, 0.5, 0.5
```

It turned out to be an AND gate lol (only (1, 1) pushes tmp above 0.5).
I can still keep up around here. (Only barely; it sounds like it gets really tough after this...)
This time, I chose w1, w2, and theta (θ) by hand, but in machine learning, including deep learning, the computer determines these values automatically. With that in mind, next time I'd like to introduce weights and biases and rewrite the code from this article.
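As a preview (a minimal sketch of what I understand the bias form to look like; the next article will do this properly), moving θ to the left-hand side as a bias b = -θ turns the test into "tmp + b > 0":

```python
import numpy as np

# Sketch of the bias form: b = -theta, so the condition becomes tmp > 0
def AND_bias(x1, x2):
    x = np.array([x1, x2])
    w = np.array([0.5, 0.5])   # same weights as the AND gate above
    b = -0.7                   # bias = -theta
    tmp = np.sum(w * x) + b
    return 1 if tmp > 0 else 0
```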
Thank you for reading!
2020/6/8: Changed the title and added some commentary.