[PYTHON] [Deep Learning from scratch] Speeding up neural networks: explaining backpropagation

Introduction

This article is my attempt at an easy-to-understand write-up of **Deep Learning from scratch, Chapter 6: the error backpropagation method**. It helped me understand the topic myself, so I hope it reads comfortably, and I would be delighted if you could refer to it while studying the book.

What is backpropagation?

As I touched on briefly last time, backpropagation is the process of working through the computation graph from right to left, and it gives us the derivative with respect to each variable.

The gradient computation of the neural network implemented in the previous chapter used numerical differentiation, a method that estimates the derivative by adding a tiny value to a variable. Numerical differentiation is simple and easy to understand, but it has the drawback of long processing times. That is where **backpropagation** comes in: by implementing backpropagation in the neural network, the gradient can be obtained far faster and more efficiently than before.
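For comparison, numerical differentiation as the book describes it looks roughly like the following (numerical_diff follows the book's naming; the body is my own minimal sketch). Every parameter requires re-evaluating the function, which is why it becomes slow for large networks.

# A minimal sketch of numerical differentiation (central difference)
def numerical_diff(f, x, h=1e-4):
    # Evaluate f at two points around x and approximate the slope
    return (f(x + h) - f(x - h)) / (2 * h)

# Example: the derivative of y = x**2 at x = 3 is about 6
print(numerical_diff(lambda x: x ** 2, 3.0))  # roughly 6.0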

As a starting point, I would like to implement the forward propagation and back propagation of a simple computation graph in Python. (Figure: the computation graph of the apple and orange example below.)

a = 100       # Price of one apple
a_num = 3     # Number of apples ("as" is a reserved word in Python, so a_num is used)
o = 50        # Price of one orange
o_num = 4     # Number of oranges
t = 1.1       # Tax rate

# Forward propagation
az = a * a_num   # Total price of the apples
oz = o * o_num   # Total price of the oranges
z = az + oz      # Total price of the apples and oranges
y = z * t        # Total price including tax

# Back propagation
dy = 1  # Derivative of the forward-propagation output y

# z * t is multiplication, so swap the two inputs and multiply by the upstream derivative
dz = t * dy   # Derivative with respect to z: 1.1
dt = z * dy   # Derivative with respect to t: 500

# az + oz is addition, so the upstream derivative is passed through unchanged
daz = dz   # Derivative with respect to az: 1.1
doz = dz   # Derivative with respect to oz: 1.1

# o * o_num is multiplication, so swap the two inputs and multiply by the upstream derivative
do = o_num * doz     # Derivative with respect to o: 4.4
do_num = o * doz     # Derivative with respect to o_num: 55

# a * a_num is multiplication, so swap the two inputs and multiply by the upstream derivative
da = a_num * daz     # Derivative with respect to a: 3.3
da_num = a * daz     # Derivative with respect to a_num: 110

Since the backpropagation of multiplication and addition can be performed this easily, the same operations are used to implement backpropagation inside the neural network; a sketch of how they can be packaged as layers follows below.
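To make that concrete, here is a minimal sketch of multiplication and addition layers in the style the book uses (it calls them MulLayer and AddLayer; the class bodies here are my own sketch), followed by the apple and orange calculation above rewritten with them.

class MulLayer:
    def __init__(self):
        self.x = None
        self.y = None

    def forward(self, x, y):
        # Remember the inputs so backward() can swap them
        self.x = x
        self.y = y
        return x * y

    def backward(self, dout):
        # Multiplication node: swap the inputs and multiply by the upstream derivative
        return dout * self.y, dout * self.x

class AddLayer:
    def forward(self, x, y):
        return x + y

    def backward(self, dout):
        # Addition node: pass the upstream derivative through unchanged
        return dout, dout

# Forward pass: (100 * 3 + 50 * 4) * 1.1 = 550
mul_apple, mul_orange, add_fruit, mul_tax = MulLayer(), MulLayer(), AddLayer(), MulLayer()
az = mul_apple.forward(100, 3)
oz = mul_orange.forward(50, 4)
z = add_fruit.forward(az, oz)
y = mul_tax.forward(z, 1.1)

# Backward pass: reproduces the hand-calculated derivatives above
dz, dt = mul_tax.backward(1)
daz, doz = add_fruit.backward(dz)
da, da_num = mul_apple.backward(daz)
do, do_num = mul_orange.backward(doz)
print(y, dt, da, da_num, do, do_num)  # about 550, 500, 3.3, 110, 4.4, 55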
