[PYTHON] You will become an engineer in 100 days --Day 85 --Programming --About machine learning 10

Click here for the articles up to yesterday:

You will become an engineer in 100 days --Day 76 --Programming --About machine learning

You will become an engineer in 100 days-Day 70-Programming-About scraping

You will become an engineer in 100 days --Day 66 --Programming --About natural language processing

You will become an engineer in 100 days --Day 63 --Programming --Probability 1

You will become an engineer in 100 days --Day 59 --Programming --Algorithm

You will become an engineer in 100 days --Day 53 --Git --About Git

You will become an engineer in 100 days --Day 42 --Cloud --About cloud services

You will become an engineer in 100 days --Day 36 --Database --About the database

You will become an engineer in 100 days --Day 24 --Python --Basics of Python language 1

You will become an engineer in 100 days --Day 18 --Javascript --JavaScript basics 1

You will become an engineer in 100 days --Day 14 --CSS --CSS Basics 1

You will become an engineer in 100 days --Day 6 --HTML --HTML basics 1

This time we continue the machine learning series with an implementation of deep learning.

About deep learning

Please see the previous article for a brief explanation of deep learning.

Roughly speaking, deep learning is a neural network with two or more intermediate (hidden) layers.

There is also a neural network library in scikit-learn.

Code for classifying the iris dataset can be implemented as follows:

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Load the iris dataset
iris = load_iris()
X = iris.data
Y = iris.target

# Split into training and test data
x_train, x_test, y_train, y_test = train_test_split(X, Y, test_size=0.3, random_state=0)

# Train a multi-layer perceptron classifier
clf = MLPClassifier(solver="sgd", random_state=0, max_iter=10000)
clf.fit(x_train, y_train)

# Accuracy on the test data
print(clf.score(x_test, y_test))

0.9555555555555556
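By default, MLPClassifier uses a single hidden layer of 100 neurons. To make the network "deep" in the sense above (two or more intermediate layers), you can pass the hidden_layer_sizes parameter, for example:

clf = MLPClassifier(solver="sgd", random_state=0, max_iter=10000,
                    hidden_layer_sizes=(32, 32))  # two hidden layers of 32 neurons each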

In deep learning, the key point is how you assemble this neural network.

For that purpose, there are several dedicated libraries.

About deep learning libraries

The current mainstream deep learning libraries are:

TensorFlow: a library developed by Google
Keras: made to run on top of other deep learning libraries such as TensorFlow and Theano
PyTorch: many researchers implement and publish the code for their recent papers in PyTorch

The number of PyTorch examples has been increasing lately, but if you get a handle on how to use these three, you should be in good shape.
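As a reference, a minimal sketch of a similar model written in PyTorch might look like this (assuming torch is installed; the layer sizes mirror the Keras model built later in this article):

import torch
from torch import nn
from sklearn import datasets
from sklearn.model_selection import train_test_split

# Load the iris data and split it for training and testing
iris = datasets.load_iris()
x_train, x_test, y_train, y_test = train_test_split(iris.data, iris.target, test_size=0.2)

# Convert to tensors
x_train = torch.tensor(x_train, dtype=torch.float32)
y_train = torch.tensor(y_train, dtype=torch.long)

# 4 inputs -> 32 hidden neurons (ReLU) -> 3 outputs
model = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 3))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Simple training loop
for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x_train), y_train)
    loss.backward()
    optimizer.step()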

Implement deep learning with Keras

Let's implement deep learning using Keras. Keras has to be installed before it can be used, so install it together with TensorFlow.
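A typical way to install both (assuming pip is available) is:

pip install tensorflow keras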

Once installed, let's run it. First, prepare the data.

Separate the data into training and test sets.

import numpy as np
from sklearn.model_selection import train_test_split as split
from sklearn import datasets

# Load the iris dataset and split it 80/20 into training and test data
iris = datasets.load_iris()
x_train, x_test, y_train, y_test = split(iris.data, iris.target, train_size=0.8, test_size=0.2)

With this, the data for training and testing is ready.

Next, we will build the neural network. First, load the required libraries.

After creating the model, we add layers to it. This time, I will build a model whose intermediate layer has 32 neurons.

First, specify 32 neurons in the intermediate layer and 4 inputs in the input layer. For the intermediate layer, specify the ReLU function as the activation function.

Specify 3 neurons in the output layer and use the softmax function as its activation function.

from keras.models import Sequential
from keras.layers import Dense, Activation

# Create the model used for the neural network
model = Sequential()
# Intermediate layer: 32 neurons, ReLU activation, 4 inputs
model.add(Dense(32, activation='relu', input_dim=4))
# Output layer: 3 neurons, softmax activation
model.add(Dense(3, activation='softmax'))
model.compile(loss='sparse_categorical_crossentropy', optimizer='sgd', metrics=['accuracy'])

# Run the training
model.fit(x_train, y_train, epochs=100)
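If you also want to watch accuracy on held-out data while training, Keras's fit accepts a validation_split argument, for example:

model.fit(x_train, y_train, epochs=100, validation_split=0.2)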

That completes the training. How is the performance?

# Run the evaluation
score = model.evaluate(x_test, y_test, batch_size=1)
print(score[1])  # score[1] is the accuracy (score[0] is the loss)

30/30 [==============================] - 0s 886us/step
0.9666666388511658

Isn't it pretty good?

Let's make predictions. The index of the largest value is converted into the predicted class.

# Prediction
predicts = model.predict(x_test)
print(predicts)

# Take the index of the largest value as the predicted class
predict = [p.argmax() for p in predicts]

[[1.1208696e-01 6.7907917e-01 2.0883384e-01]
 [2.8818967e-03 2.8327599e-01 7.1384215e-01]
 [9.5001817e-01 4.9240895e-02 7.4097671e-04]
 [7.4494570e-03 5.3676081e-01 4.5578974e-01]
 [4.5553190e-03 4.4827569e-01 5.4716897e-01]
 [9.0425771e-01 9.3258828e-02 2.4834664e-03]
 [9.8394436e-01 1.5963007e-02 9.2610091e-05]
 [9.3106699e-01 6.7388512e-02 1.5446048e-03]
 [2.9033832e-03 3.8279051e-01 6.1430609e-01]
 [7.6781757e-02 7.3785144e-01 1.8536681e-01]
 [9.0473723e-01 9.2504598e-02 2.7581297e-03]
 [2.3145874e-03 2.9854658e-01 6.9913888e-01]
 [1.1571125e-03 2.1894032e-01 7.7990258e-01]
 [3.8370032e-02 5.8638370e-01 3.7524620e-01]
 [1.8353970e-03 3.2460487e-01 6.7355973e-01]
 [4.0023820e-03 3.6881861e-01 6.2717897e-01]
 [9.3579787e-01 6.3313372e-02 8.8873273e-04]
 [2.8993792e-03 3.9125395e-01 6.0584664e-01]
 [1.7156457e-03 3.7600714e-01 6.2227720e-01]
 [5.9143644e-02 7.4147564e-01 1.9938073e-01]
 [4.4851364e-03 4.1915748e-01 5.7635736e-01]
 [2.1494372e-01 6.4855641e-01 1.3649981e-01]
 [7.4421586e-03 4.6687931e-01 5.2567846e-01]
 [4.7624888e-04 3.0563667e-01 6.9388705e-01]
 [9.5614207e-01 4.3193795e-02 6.6417205e-04]
 [5.6969654e-02 7.1488911e-01 2.2814126e-01]
 [2.0755678e-03 3.4245396e-01 6.5547043e-01]
 [9.3328977e-01 6.5471925e-02 1.2382563e-03]
 [1.8808943e-03 2.6230952e-01 7.3580956e-01]
 [9.6379587e-04 2.6067016e-01 7.3836607e-01]]

Since the output is a probability for each category, the category with the largest value is taken as the prediction.
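Incidentally, the same conversion can be done in one step with the numpy imported earlier:

predict = np.argmax(predicts, axis=1)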

Let's compare it with the correct answer.

for y_pred,y_true in zip(predict,y_test):
    print(y_pred,y_true)

1 1
2 2
0 0
1 2
2 2
0 0
0 0
0 0
2 2
1 1
0 0
2 2
2 2
1 1
2 2
2 2
0 0
2 2
2 2
1 1
2 2
1 1
2 2
2 2
0 0
1 1
2 2
0 0
2 2
2 2

Except for one, it's very accurate.
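As a check, the accuracy can also be computed directly from these predicted indices with the numpy imported earlier; it should agree with the score reported by model.evaluate:

# Fraction of predictions that match the true labels
print(np.mean(np.array(predict) == y_test))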

Summary

Today we covered an implementation of deep learning. Deep learning is so deep that there is no end to learning it.

There is no doubt that it will continue to be at the center of IT trends, so there is nothing to lose by studying it.

Let's make sure we have this down.

15 days until you become an engineer

Author information

Otsu py's HP: http://www.otupy.net/

Youtube: https://www.youtube.com/channel/UCaT7xpeq8n1G_HcJKKSOXMw

Twitter: https://twitter.com/otupython
