[PYTHON] 3. Normal distribution with neural network!

Introduction

This is the third article in the series.

Last time, we confirmed that a neural network can be trained to output the mean and standard deviation of ten given numbers.

This time, we will feed the network normal-distribution data and see whether it can learn to output the three parameters that define each distribution.

Normal distribution

The normal distribution used here is determined by just three parameters: µ, σ, and k.

  1. µ is the x coordinate of the central axis (the peak) of the distribution
  2. σ is a measure of how wide the distribution is
  3. k is a scaling factor that stretches the curve vertically

The general formula for this scaled normal distribution is

y = k \times \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)

If you look at the equation, you can see the meaning of the three parameters (at least for k and µ). Let's make this expression available in Python.

3-001.py


import math
# Define the scaled normal distribution as a function of x and the three parameters.
f = lambda x, mu, sigma, k: k * math.exp(-(x - mu)**2 / (2 * sigma**2)) / math.sqrt(2 * math.pi * sigma**2)
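
As a quick sanity check (my own addition, not one of the numbered scripts): at x = µ the exponential is 1, so the peak height should be k/√(2πσ²); for µ = 3, σ = 1, k = 5 that is 5/√(2π) ≈ 1.995.

# Sanity check: at x = mu the curve peaks at k / sqrt(2*pi*sigma^2)
print(f(3, 3, 1, 5))               # 1.9947...
print(5 / math.sqrt(2 * math.pi))  # the same value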

Generation of training data

I decided to generate 50,000 samples as follows. The x coordinate and the three parameters are random values drawn from the ranges below.

  1. x coordinate from 0 to 10
  2. σ from 0.1 to 2
  3. µ from 3 to 7
  4. k from 0.5 to 10

First, let's divide the x interval from 0 to 10 into 100 points and create an ndarray.

3-002.py


import numpy as np
n = np.linspace(0, 10, 100)  # 100 evenly spaced x values from 0 to 10

Let's draw one example curve to get a feel for the data. There is no particular reason for these values, but I set µ = 3, σ = 1, and k = 5.

3-003.py


import matplotlib.pyplot as plt

# Evaluate f at every x in n for the example parameters.
exampleData = [f(x, 3, 1, 5) for x in n]

plt.title("Example of normal distribution")
plt.scatter(n, exampleData, label="µ=3, σ=1, k=5", marker='.', s=20, alpha=1)
plt.legend(fontsize=14)  # Show legend
plt.xlabel("x")
plt.ylabel("y")
plt.show()

These 100 numbers are what will be thrown into the NN. I used a scatter plot here so you can see that the input is discrete data.

Figure_3-3.png

Now let's generate the data actually used for training. Create two lists: one for the input data and one for the correct-answer (label) data.

3-004.py


p = []
y = []
for kkk in range(50000):
	mu1 = np.random.rand()*4 + 3     # Random value from 3 to 7
	si1 = np.random.rand()*1.9 + 0.1 # Random value from 0.1 to 2
	k1 = np.random.rand()*9.5 + 0.5  # Random value from 0.5 to 10
	y.append(mu1) # Record the correct answers
	y.append(si1)
	y.append(k1)
	for i in range(len(n)):
		p.append(f(n[i], mu1, si1, k1)) # Evaluate f at each x and store the result in p
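
As a side note (my own addition): the Python loop above is slow for 50,000 × 100 evaluations of f. A vectorized NumPy version produces the (50000, 100) input array and the (50000, 3) label array of the next step directly; the names mu_all, si_all, k_all, curves, and labels_fast are mine.

# Sample all 50,000 parameter triples at once
mu_all = np.random.rand(50000) * 4 + 3      # 3 to 7
si_all = np.random.rand(50000) * 1.9 + 0.1  # 0.1 to 2
k_all  = np.random.rand(50000) * 9.5 + 0.5  # 0.5 to 10

# Broadcasting (50000, 1) parameters against the (100,) grid n gives (50000, 100)
curves = (k_all[:, None]
          * np.exp(-(n[None, :] - mu_all[:, None])**2 / (2 * si_all[:, None]**2))
          / np.sqrt(2 * np.pi * si_all[:, None]**2))
labels_fast = np.stack([mu_all, si_all, k_all], axis=1)  # shape (50000, 3)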

Convert the lists to ndarrays, then reshape them into the shapes the NN expects.

3-005.py


# Convert to ndarrays and reshape: one 100-point curve per row, one (µ, σ, k) triple per row.
t = np.array(p)
t = t.reshape(50000, len(n))
label = np.array(y)
label = label.reshape(50000, 3)
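
A quick shape check (my addition) confirms the arrays have the intended sizes:

print(t.shape)      # (50000, 100)
print(label.shape)  # (50000, 3)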

Divide the data into the first 40,000 and the last 10,000 samples. The first part is for training, the second for evaluation.

3-006.py


# First 40,000 samples for training; last 10,000 for evaluation.
d_training_x = t[:40000,:]
d_training_y = label[:40000,:]
d_test_x = t[40000:,:]
d_test_y = label[40000:,:]
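
As an aside (my addition, assuming scikit-learn is installed), train_test_split can do the same job; shuffle=False reproduces the slicing above:

from sklearn.model_selection import train_test_split

# Same split as above: the last 10,000 samples become the evaluation set
d_training_x, d_test_x, d_training_y, d_test_y = train_test_split(
    t, label, test_size=10000, shuffle=False)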

NN design using Keras

I don't know whether the following is optimal, but I stacked five fully connected layers, gradually reducing the number of outputs so that three numbers come out at the end.

3-007.py


import keras

from keras.models import Sequential
from keras.layers import Dense
from keras.optimizers import Adam

batch_size = 128  # Train on 128 samples at a time
epochs = 100      # How many passes over the training data

model = Sequential()
model.add(Dense(100, activation='linear', input_shape=(len(n),)))
model.add(Dense(100, activation='tanh'))
model.add(Dense(40, activation='linear'))
model.add(Dense(20, activation='tanh'))
model.add(Dense(3, activation='linear'))
# Adam, a variant of stochastic gradient descent
optimizer = Adam(lr=0.001, beta_1=0.9, beta_2=0.999)
# Loss function: mean squared error
model.compile(loss='mean_squared_error', optimizer=optimizer)
model.summary()  # Check the shape of the NN
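
For reference, summary() should report 25,123 trainable parameters: each Dense layer has inputs × outputs weights plus one bias per output, so the first two layers have 100·100 + 100 = 10,100 each, then 100·40 + 40 = 4,040, 40·20 + 20 = 820, and 20·3 + 3 = 63.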

Training

Now training starts. The return value of fit() is stored in the variable history so that the learning progress can be graphed later.

3-008.py


# Learning
history = model.fit(d_training_x, d_training_y,
                    batch_size=batch_size,
                    epochs=epochs,
                    verbose=1,  # verbose=1 prints a progress log for each epoch
                    validation_data=(d_test_x, d_test_y))

Visualization of learning

Let's graph how the learning progressed.

3-009.py


# Draw the learning curves
import matplotlib.pyplot as plt
plt.plot(history.history['val_loss'], label="val_loss")
plt.plot(history.history['loss'], label="loss")
plt.legend()  # Show legend
plt.title("Can NN learn to calculate normal distribution?")
plt.xlabel("epoch")
plt.ylabel("Loss")
plt.show()

Figure_3-1.png

The vertical axis, Loss, is the mean squared error between the correct data and the NN's output. Two curves are plotted: loss, computed on the training data, and val_loss, computed on the evaluation data.

Evaluation of the NN

Let's feed the evaluation data to the trained NN.

3-010.py


# Give data to the trained NN
inp = d_test_x[:200,:]  # first 200 evaluation curves
out = d_test_y[:200,:]  # their true (µ, σ, k) values
pred = model.predict(inp, batch_size=1)
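
Before plotting, a quick numeric check (my addition): the mean absolute error of each predicted parameter, with columns in the order µ, σ, k.

# Mean absolute error per parameter (columns: µ, σ, k)
print(np.abs(pred - out).mean(axis=0))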

Let's graph the output.

3-011.py



plt.title("Can NN learn to calculate normal distribution?")
plt.scatter(out[:,0], pred[:,0], label="µ", marker='.', s=20, alpha=0.3)
plt.scatter(out[:,1], pred[:,1], label="σ", marker='.', s=20, color="green", alpha=0.3)
plt.scatter(out[:,2], pred[:,2], label="k", marker='.', s=20, color="red", alpha=0.3)
plt.legend(fontsize=14)  # Show legend
plt.xlabel("expected value")
plt.ylabel("prediction")
plt.show()

You can see that the correct values and the NN's outputs are quite close. In other words, the network learned properly.

Figure_3-2.png
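
As one more visual check (my addition), we can rebuild a curve from the predicted parameters and overlay it on the true curve; if training worked, the two should nearly coincide.

# Rebuild the first evaluation curve from the NN's predicted (µ, σ, k)
mu_p, si_p, k_p = pred[0]
rebuilt = [f(x, mu_p, si_p, k_p) for x in n]
plt.plot(n, inp[0], label="true curve")
plt.plot(n, rebuilt, "--", label="curve from predicted parameters")
plt.legend(fontsize=14)
plt.xlabel("x")
plt.ylabel("y")
plt.show()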

Summary

It's hard to see because the points overlap, but all three parameters are output fairly accurately. Done!

Series part 1: Preparation
Series part 2: Mean and standard deviation
Series part 3: Normal distribution
Series part 4: Circle
