[Python] I tried super easy linear separation with Chainer

As a practice of Chainer, I tried a simple linear separation problem.

Environment

Task

I want to learn a function that judges whether a person is obese from their height (cm), weight (kg), and chest circumference (cm). Obesity is defined here as a BMI (weight in kilograms divided by the square of height in meters) of 25 or more. Weight and height alone are therefore sufficient to decide whether someone is obese; the chest circumference is irrelevant. The question is whether the learner built here can judge obesity by focusing only on height and weight, without being misled by the chest circumference.
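
To make the labeling rule concrete, here is a minimal sketch (the function name is mine, not from the original post):

def is_obese(height_cm, weight_kg):
    # BMI is weight in kilograms divided by the square of height in meters
    bmi = weight_kg / (height_cm / 100.0) ** 2
    return bmi >= 25.0

# Example: 170 cm, 75 kg -> BMI = 75 / 1.70 ** 2 = 25.95... -> obese
print(is_obese(170.0, 75.0))  # True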

Data

I made dummy data in Excel. Height, weight, chest circumference, and an obesity flag are arranged on each line, separated by spaces. Height, weight, and chest circumference were each generated by adding normal random noise with an appropriate variance to the male average. The obesity flag was set to 1 if the BMI computed from the height and weight was 25 or more. I generated 1000 samples independently, 900 of which were used for training and 100 for evaluation.

Height       Weight       Chest circumference  Obesity flag
152.5110992  70.64096855  76.24909648          1
176.5483602  72.54812988  79.99468908          0
171.9815877  78.13768514  80.87788608          1
180.013773   77.60660479  79.71464192          0
171.9685041  81.20240554  84.93720091          1
186.3999693  77.03393024  82.25099179          0
175.1117213  81.23388203  86.89111757          1

(Figure: data.png, scatter plot of the generated data)

As you can see, the data is almost linearly separable.
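
The data was built in Excel; a hypothetical NumPy sketch that generates equivalent data could look like the following (the means and standard deviations are illustrative guesses, not values from the original post):

import numpy as np

rng = np.random.default_rng(0)
n = 1000  # 900 for training, 100 for evaluation

# Assumed male averages with appropriate variances (illustrative values)
height = rng.normal(171.0, 6.0, n)  # cm
weight = rng.normal(74.0, 5.0, n)   # kg
chest = rng.normal(84.0, 4.0, n)    # cm

# Flag is 1 when BMI (kg / m^2) is 25 or more
flag = (weight / (height / 100.0) ** 2 >= 25.0).astype(np.int32)

np.savetxt("dummy.txt", np.column_stack([height, weight, chest, flag]),
           fmt=["%.7f", "%.7f", "%.7f", "%d"])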

Learner

As Chainer practice, I built a multi-layer perceptron. It has a three-layer, 3-4-2 structure: three input dimensions, four hidden units, and two output dimensions. (Since this is a linear separation task, a single-layer perceptron would also do.) The other settings are as follows.

- Activation function: ReLU
- Optimization algorithm: Adam
- Error function: softmax cross entropy
- Dropout rate: 0.5
- Mini-batch size: 5
- Number of epochs: 100

from chainer import Chain, optimizers
import chainer.functions as F
import chainer.links as L

class MLP(Chain):
    def __init__(self):
        # 3-4-2 dimensional network
        super(MLP, self).__init__(
            l1=L.Linear(3, 4),
            l2=L.Linear(4, 2),
        )

    def forward(self, x, t, train):
        # Dropout (ratio 0.5 by default) is active only when train=True
        # (Chainer v1-style API)
        h1 = F.dropout(F.relu(self.l1(x)), train=train)
        y = self.l2(h1)
        # Return both the loss and the accuracy
        return F.softmax_cross_entropy(y, t), F.accuracy(y, t)

# Instantiation
model = MLP()
# Adam is used as the optimization algorithm
optimizer = optimizers.Adam()
optimizer.setup(model)

N = 900         # number of training samples
N_test = 100    # number of evaluation samples
n_epoch = 100   # number of epochs
batchsize = 5   # mini-batch size
# Omitted below
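
The omitted part is the usual training loop. A minimal sketch of what it might look like, assuming the Chainer v1-style API above and NumPy arrays x_train, y_train, x_test, y_test (float32 features, int32 labels; these names are mine, not from the original post):

import numpy as np
from chainer import Variable

for epoch in range(n_epoch):
    # Visit the training data in a random order each epoch
    perm = np.random.permutation(N)
    sum_loss = 0.0
    for i in range(0, N, batchsize):
        x = Variable(x_train[perm[i:i + batchsize]])
        t = Variable(y_train[perm[i:i + batchsize]])
        model.zerograds()  # reset gradients (Chainer v1 style)
        loss, acc = model.forward(x, t, train=True)
        loss.backward()
        optimizer.update()
        sum_loss += float(loss.data) * batchsize
    # Evaluate on the held-out data with dropout disabled
    loss, acc = model.forward(Variable(x_test), Variable(y_test), train=False)
    print(epoch, sum_loss / N, float(loss.data), float(acc.data))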

Result

I plotted the loss and the accuracy against the number of epochs.

(Figure: loss_acc.png, loss and accuracy per epoch)

- Left, blue: loss on the training data
- Left, green: accuracy on the training data
- Right, blue: loss on the evaluation data
- Right, green: accuracy on the evaluation data

Performance came out a little under 80%. Kind of underwhelming?

I also looked at the outputs the trained model produces for the evaluation data.

[Height  Weight  Chest circumference]  Estimated obesity flag  Correct obesity flag
[ 179.30055237   69.73477936   84.73832703] 0 0
[ 176.89619446   84.05502319   85.10128021] 1 1
[ 172.04129028   77.36618805   87.89541626] 1 1
[ 168.48660278   73.91072845   84.5171814 ] 1 1
[ 166.53656006   71.42696381   83.17546844] 0 1
[ 163.44270325   77.11021423   90.57539368] 1 1
[ 180.63993835   77.33372498   85.33548737] 0 0
[ 165.73175049   71.87976837   80.57328033] 0 1

The fourth row from the bottom and the last row, which are actually obese, were judged to be normal. Looking at weight alone, both values are on the low side; the model does not seem to have grasped the relationship between weight and height.
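
For reference, a hypothetical sketch of how this comparison can be printed (reusing the model and the x_test, y_test arrays assumed above; dropout is skipped by calling the layers directly):

import numpy as np
import chainer.functions as F
from chainer import Variable

# Forward pass without dropout, then take the most probable class
y = model.l2(F.relu(model.l1(Variable(x_test))))
pred = np.argmax(y.data, axis=1)
for features, p, t in zip(x_test, pred, y_test):
    print(features, p, t)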

Where I got stuck

- Falling into a local optimum: Sometimes training got stuck in a local optimum and stopped making progress. In those cases I reduced the mini-batch size, redrew the initial weights, and retried several times until it learned well.
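
A minimal sketch of that retry loop (the accuracy threshold and the train helper are my own illustration, not from the original post):

# Re-instantiate the model to redraw the initial weights, then retrain
# until evaluation accuracy clears an (illustrative) threshold.
eval_acc = 0.0
while eval_acc < 0.75:
    model = MLP()
    optimizer = optimizers.Adam()
    optimizer.setup(model)
    eval_acc = train(model, optimizer)  # hypothetical helper: runs the training loop, returns eval accuracy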

From now on

For this task, accuracy just under 80% is low. I want to build intuition by adjusting things such as the learning rate, mini-batch size, dropout rate, and data normalization.
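
As one example of such an adjustment, a minimal sketch of z-score normalization of the features (my own illustration; statistics come from the training split only and are reused for evaluation):

# Standardize each feature to zero mean and unit variance
mean = x_train.mean(axis=0)
std = x_train.std(axis=0)
x_train = (x_train - mean) / std
x_test = (x_test - mean) / std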
