[PYTHON] I tried to implement Perceptron Part 1 [Deep Learning from scratch]

Introduction

Hello! I'm a complete beginner studying with O'Reilly's book "Deep Learning from scratch". To make the most of it, I want to keep a record of what I learn, so I'm writing this article as output practice as well. I apologize if there are any mistakes; it would be very helpful if you could point them out.

Environment

Windows 10, Python 3.7.3, Jupyter Notebook

What is a Perceptron?

A perceptron is an algorithm that receives multiple signals as input and outputs a single signal. This time, we will implement this perceptron in a simple way.
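Written as a formula (this is how the book expresses it), the perceptron computes a weighted sum of its inputs and compares it against a threshold θ:

y = 0 if w1*x1 + w2*x2 <= theta
y = 1 if w1*x1 + w2*x2 > theta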

Implementation of AND gate using threshold θ

The AND gate's truth table is shown below. x1 and x2 are the input signals, and y is the output signal. It outputs 1 only when both x1 and x2 are 1.

x1 x2 y
0 0 0
1 0 0
0 1 0
1 1 1

Here is the code from the book. w1 and w2 are weights. tmp holds the sum of the inputs x1 and x2, each multiplied by its weight w1 or w2. theta (θ) is the threshold: the function returns 1 if tmp exceeds the threshold and 0 otherwise.

def AND(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.7   # weights and threshold
    tmp = x1*w1 + x2*w2             # weighted sum of the inputs
    if tmp <= theta:
        return 0
    elif tmp > theta:
        return 1

The output.

AND(0,0) #0 is output
AND(1,0) #0 is output
AND(0,1) #0 is output
AND(1,1) #1 is output

I got the results I expected!
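As a small convenience of my own (this loop is not in the book), all four input combinations can be checked at once:

# check every row of the truth table in one loop
for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    print(f"AND({x1},{x2}) = {AND(x1, x2)}")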

Implementation of NAND gate using threshold θ

The NAND gate's truth table is shown below. x1 and x2 are the input signals, and y is the output signal. It outputs 0 only when both x1 and x2 are 1.

x1 x2 y
0 0 1
1 0 1
0 1 1
1 1 0

Here is the code. It is the same as the AND gate except that the signs of w1, w2, and theta are flipped to negative.

def NAND(x1, x2):
    w1, w2, theta = -0.5, -0.5, -0.7
    tmp = x1*w1 + x2*w2
    if tmp <= theta:
        return 0
    elif tmp > theta:
        return 1

The output.

NAND(0,0) #1 is output
NAND(1,0) #1 is output
NAND(0,1) #1 is output
NAND(1,1) #0 is output

The result was as expected!
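Since the NAND parameters are just the AND parameters with flipped signs, NAND should always output the opposite of AND. Here is a quick sanity check of my own (not from the book):

# NAND must be the negation of AND for every input combination
for x1, x2 in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    assert NAND(x1, x2) == 1 - AND(x1, x2)
print("NAND inverts AND for all inputs")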

Implementation of OR gate using threshold θ

The OR gate's truth table is shown below. x1 and x2 are the input signals, and y is the output signal. It outputs 0 only when both x1 and x2 are 0.

x1 x2 y
0 0 0
1 0 1
0 1 1
1 1 1

I worked out the parameters by trial and error; here is the final code. The basics are the same as the AND and NAND gates, only the parameters change.

def OR(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.4
    tmp = x1*w1 + x2*w2
    if tmp <= theta:
        return 0
    elif tmp > theta:
        return 1

The output.

OR(0,0) #0 is output
OR(1,0) #1 is output
OR(0,1) #1 is output
OR(1,1) #1 is output

I got the results I expected!
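To double-check all three gates against their truth tables in one go, here is a small test loop I added myself (not in the book):

inputs = [(0, 0), (1, 0), (0, 1), (1, 1)]
expected = {AND: [0, 0, 0, 1], NAND: [1, 1, 1, 0], OR: [0, 1, 1, 1]}
for gate, outputs in expected.items():
    # compare each gate's actual outputs with its truth table
    assert [gate(x1, x2) for x1, x2 in inputs] == outputs
print("all gates match their truth tables")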

By the way, here are the parameters that failed; please don't use them. With w1, w2, theta = 0.5, 0.5, -0.7, every output became 1 (even tmp = 0 exceeds a threshold of -0.7). With w1, w2, theta = 0.5, 0.5, 0.5, it behaved like an AND gate lol (OR(1, 0) gives tmp = 0.5, which does not exceed the threshold).

In conclusion

I can still keep up at this point. (Though I wonder if I'll survive; it gets very difficult after this...)

This time, a human (me) chose the values of w1, w2, and theta (θ) by hand, but in machine learning such as deep learning, the computer determines these values automatically. Next time, with that in mind, I would like to introduce weights and biases and rewrite what I wrote this time.
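As a rough preview of that rewrite (my own sketch of the bias form; the function name AND_bias and the use of NumPy are my choices, not the article's), the threshold moves to the other side of the inequality and becomes the bias b = -theta:

import numpy as np

def AND_bias(x1, x2):
    x = np.array([x1, x2])     # inputs
    w = np.array([0.5, 0.5])   # weights (same as before)
    b = -0.7                   # bias, corresponding to -theta
    tmp = np.sum(w * x) + b    # weighted sum plus bias
    return 1 if tmp > 0 else 0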

Thank you for reading!

2020/6/8: Changed the title and added commentary.
