[Python] Notes on what an amateur stumbled over in "Deep Learning from scratch": Chapter 2

Introduction

This is a memo of the things I stumbled over while studying Chapter 2 of "Deep Learning from scratch: The theory and implementation of deep learning learned with Python".

My environment is macOS Mojave + Anaconda 2019.10. For details, see Chapter 1 of this memo.

(To other chapters of this memo: Chapter 1 / Chapter 2 / Chapter 3 / Chapter 4 / Chapter 5 / [Chapter 6](https://qiita.com/segavvy/items/ca4ac4c9ee1a126bff41) / Chapter 7 / Chapter 8 / Summary)

Chapter 2 Perceptron

This chapter covers the perceptron.

2.1 What is a Perceptron?

I didn't have any particular stumbling blocks here, but I did wonder: why are they called neurons? I thought the section would be more interesting with a little more background, so let me supplement it as far as I understand it.

In 1943, the neurophysiologist and surgeon McCulloch and the logician and mathematician Pitts teamed up to propose a model that realizes the brain's nerve cells on a computer[^1]. The idea: if you investigate how the brain works and reproduce that mechanism on a computer, you should be able to build an AI that can think like a human. It's quite an ambitious approach.

In the human brain, many nerve cells are known to be intricately connected into a network. An individual nerve cell is called a neuron in English, and the perceptron (also called an artificial neuron or simple perceptron) is one model that tries to realize such a nerve cell on a computer. That is why the word "neuron" appears in descriptions of the perceptron.

A nerve cell receives electrical signals from other nerve cells, and when the total exceeds a certain amount, it briefly enters a firing (excited) state and passes the signal on to other nerve cells. The perceptron expresses exactly this mechanism: the input signals $x_1$ and $x_2$ represent inputs from other neurons, the weights $w_1$ and $w_2$ represent how easily the firing state is transmitted from those neurons, and the threshold $\theta$ represents how much signal this nerve cell needs before it fires.
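Written as a formula in the style the book uses, the output $y$ of the two-input perceptron fires (becomes 1) only when the weighted sum of the inputs exceeds the threshold:

$$
y =
\begin{cases}
0 & (w_1 x_1 + w_2 x_2 \leq \theta) \\
1 & (w_1 x_1 + w_2 x_2 > \theta)
\end{cases}
$$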

The neural network that appears in the next chapter is, in turn, an attempt to realize a network of many such nerve cells (neurons) on a computer, in pursuit of realizing the human brain.

The approach of imitating the human brain is interesting, isn't it? That said, since the mechanism of the human brain has not been fully clarified yet, today's algorithms do not faithfully reproduce it, and AI researchers do not seem particularly concerned with reproducing the brain faithfully either.

2.2 Simple logic circuit

You can't do much with just one neuron, but it is interesting that the three calculations AND, OR, and NAND can all be realized simply by changing the parameters $w_1$, $w_2$, and $\theta$, without changing the logic at all (see the sketch below).
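Here is a minimal sketch of that idea. The shared helper `gate` and the specific parameter values are my own illustrative choices; the book picks similar values for AND ($w_1 = w_2 = 0.5$, $\theta = 0.7$), and many other combinations work just as well:

```python
# A minimal sketch: one shared piece of logic, three gates.
# Only the parameters (w1, w2, theta) differ between AND, OR, and NAND.

def gate(x1, x2, w1, w2, theta):
    # Fire (output 1) only if the weighted sum of inputs exceeds theta.
    return 1 if x1 * w1 + x2 * w2 > theta else 0

def AND(x1, x2):
    return gate(x1, x2, 0.5, 0.5, 0.7)

def OR(x1, x2):
    return gate(x1, x2, 0.5, 0.5, 0.2)

def NAND(x1, x2):
    return gate(x1, x2, -0.5, -0.5, -0.7)

# Print the truth table for all three gates.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, "->", AND(x1, x2), OR(x1, x2), NAND(x1, x2))
```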

2.3 Implementation of Perceptron

The appearance of the bias $b$ is a little confusing at first, but since $b = -\theta$, it plays the same role as $\theta$ in the sense that it controls how easily the neuron fires; only the form of the expression is different, so let's proceed without worrying about it too much.
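Concretely, moving $\theta$ to the left side of the inequality as $b = -\theta$ gives the condition $w_1 x_1 + w_2 x_2 + b > 0$, which the book implements with NumPy. A sketch of the AND gate in bias form, with values matching the threshold version above:

```python
import numpy as np

def AND(x1, x2):
    x = np.array([x1, x2])
    w = np.array([0.5, 0.5])  # weights: how strongly each input counts
    b = -0.7                  # bias b = -theta: how easily the neuron fires
    # Fire when the weighted sum plus bias is positive: w.x + b > 0
    return 1 if np.sum(w * x) + b > 0 else 0
```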

2.4 Limits of Perceptron

As the book's figure shows, no matter how hard you try, no single straight line can separate the ◯ and △ points of XOR. This is the limit of a perceptron with one neuron.

As for the words "linear" and "nonlinear", it seems fine to proceed with the book's understanding: a region that can be divided by a straight line is linear, and one that cannot is nonlinear.

2.5 Multilayer Perceptron

Just as XOR became possible by stacking layers, it turns out that far more can be done with multiple layers (see the sketch below).
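Reusing the NAND, OR, and AND gates sketched in section 2.2 above, the book's two-layer construction of XOR looks like this:

```python
# XOR as a two-layer perceptron: the first layer computes NAND and OR,
# and the second layer ANDs their outputs. Requires the NAND, OR, and
# AND definitions from the earlier sketch.
def XOR(x1, x2):
    s1 = NAND(x1, x2)   # first layer
    s2 = OR(x1, x2)     # first layer
    return AND(s1, s2)  # second layer

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"XOR({x1}, {x2}) = {XOR(x1, x2)}")
```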

2.6 From NAND to computer

I couldn't quite grasp the claim that a whole computer can be built from NAND gates alone, but the book introduced here, "From NAND to Tetris", apparently works through exactly that. I can't spare the time right now, but I'd love to read it if I get the chance.

2.7 Summary

I didn't stumble much in Chapter 2 either.

That's all for this chapter. If there are any mistakes, I would be grateful if you could point them out. (To other chapters of this memo: Chapter 1 / Chapter 2 / Chapter 3 / Chapter 4 / Chapter 5 / [Chapter 6](https://qiita.com/segavvy/items/ca4ac4c9ee1a126bff41) / Chapter 7 / Chapter 8 / Summary)

[^1]: [Wikipedia: Formal Neuron](https://ja.wikipedia.org/wiki/%E5%BD%A2%E5%BC%8F%E3%83%8B%E3%83%A5%E3%83%BC%E3%83%AD%E3%83%B3)
