[PYTHON] [Deep Learning from scratch] I tried to implement sigmoid layer and Relu layer.

Introduction

This article is my attempt at an easy-to-understand summary of **Deep Learning from scratch, Chapter 6: the error backpropagation method**. Even with my humanities background I was able to understand it, so I hope you can read it comfortably. I would also be delighted if it serves as a reference while you study this book.

Implementation of sigmoid layer

import numpy as np

class Sigmoid: # Layer implementation of the sigmoid function
    def __init__(self):
        self.out = None # The forward-pass output is needed for the backward pass of the sigmoid function

    def forward(self, x):
        out = 1 / (1 + np.exp(-x)) # Sigmoid function: 1 / (1 + e^(-x))
        self.out = out # Store the forward-pass output

        return out

    def backward(self, dout):
        dx = dout * self.out * (1 - self.out) # Backward pass of the sigmoid: upstream derivative * y * (1 - y)

        return dx

In the forward pass, the input is transformed according to the sigmoid formula 1 / (1 + exp(-x)) and the result is returned. Because that result is needed again in the backward pass, it is also saved in the instance variable.

In the backward pass, differentiating the formula directly would be fairly involved, so we use the simplified form of the derivative, y * (1 - y), where y is the sigmoid output. The derivative with respect to the layer's input is obtained by multiplying the upstream derivative by this expression, using the forward-pass output saved in the instance variable.
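
As a minimal usage sketch (the input values below are just an illustration), the layer can be exercised like this; the backward result is simply the forward output y multiplied by (1 - y):

import numpy as np

x = np.array([-1.0, 0.0, 2.0]) # Example inputs (illustrative values)
sigmoid = Sigmoid()

y = sigmoid.forward(x)     # 1 / (1 + exp(-x)) -> approx. [0.269, 0.5, 0.881]
dx = sigmoid.backward(1.0) # With an upstream derivative of 1, this equals y * (1 - y)

print(y)
print(dx)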

Relu layer implementation

class Relu: # Layer implementation of the ReLU function
    def __init__(self):
        self.mask = None # Boolean array: True where the input is 0 or below, False where it is greater than 0

    def forward(self, x):
        self.mask = (x <= 0)
        out = x.copy()
        out[self.mask] = 0 # Set input values of 0 or below to 0

        return out

    def backward(self, dout):
        dout[self.mask] = 0 # Where the forward output was 0, the backward gradient is also 0
        dx = dout # Elsewhere, pass the upstream derivative through unchanged

        return dx

In the forward pass, to avoid an explicit if statement, we first build a boolean mask that is True for every input value of 0 or below and False for every value above 0, and save it in the instance variable. This mask is used again later.

Next, we copy the array of input values and, using the saved boolean mask, set every element where the mask is True to 0. That array is then returned.

In the backward pass, the same mask saved during the forward pass is used to set the corresponding elements of the upstream derivative to 0. Where the forward output was 0, the backward gradient is also 0.

Everywhere else, the upstream derivative is passed through unchanged and returned.
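
As a minimal usage sketch (the input values below are just an illustration), the mask-based behaviour can be confirmed like this; the gradient is blocked exactly where the input was 0 or below:

import numpy as np

x = np.array([-2.0, 0.0, 3.0]) # Example inputs (illustrative values)
relu = Relu()

out = relu.forward(x)               # [0. 0. 3.] : values of 0 or below become 0
dx = relu.backward(np.ones_like(x)) # [0. 0. 1.] : gradient is zeroed where the input was <= 0

print(out)
print(dx)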
