[PYTHON] I tried to find the average of a sequence with TensorFlow

TensorFlow is often misunderstood as being just a machine learning library, so to deepen my understanding I wrote some code that simply finds the average of a sequence.

- Generate 100 uniform random integers from 0 to 100; the average should come out to about 50.
- Vary the learning rate (learning coefficient) and see how the estimate behaves.
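
A quick aside on why this setup finds the average at all: the value of m that minimizes the sum of squared errors is exactly the sample mean, so gradient descent on this loss should converge to it (in plain NumPy this is just `np.mean(x_train)`).

```math
\frac{d}{dm}\sum_{i=1}^{N}(x_i - m)^2 = -2\sum_{i=1}^{N}(x_i - m) = 0
\quad\Longrightarrow\quad
m = \frac{1}{N}\sum_{i=1}^{N} x_i
```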

Gradient descent

```python
import matplotlib.pyplot as plt
%matplotlib inline
import numpy as np
import tensorflow as tf

x_train = np.random.randint(0,100, size=100)

n_itr = 100

m = tf.Variable([30.0], dtype=tf.float32)  # parameter to estimate (the mean)
x = tf.placeholder(tf.float32)             # placeholder for the data fed at run time

loss = tf.reduce_sum(tf.square(x - m))     # loss: sum of squared errors

for lr in [0.009, 0.001, 0.0001]:

    optimizer = tf.train.GradientDescentOptimizer(lr)  # steepest descent method
    train = optimizer.minimize(loss)

    init = tf.global_variables_initializer()  # reset m to its initial value (30.0) for each learning rate
    sess = tf.Session()
    sess.run(init)

    est = []
    for i in range(n_itr):
        _, est_m = sess.run([train, m], {x: x_train})  # one batch gradient step; also fetch the current estimate
        est.append(est_m)

    est = np.array(est)
    plt.plot(est.reshape(n_itr), label="lr={}".format(lr))

plt.title("batch gradient decent")
plt.legend()
plt.show();

The estimate converges to the true average.

- If the learning rate is large, the estimate oscillates.
- It diverges when the learning rate is greater than 0.01.
- If the learning rate is small, convergence is slow.
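
A quick back-of-the-envelope check of the 0.01 threshold: with the sum-of-squared-errors loss over N = 100 samples, one gradient-descent step with learning rate η contracts the error in m by a factor of (1 - 2Nη), so the iteration converges only when η < 1/N = 0.01, which matches the observed divergence.

```math
m_{t+1} - \bar{x} = (1 - 2N\eta)\,(m_t - \bar{x}),
\qquad
|1 - 2N\eta| < 1 \iff 0 < \eta < \frac{1}{N} = 0.01
```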

(Figure: batch gradient descent, estimated mean vs. iteration for each learning rate)

RMSProp

Along with the optimizer, I changed only the range of learning rates.

```python
for lr in [5, 1, 0.1, 0.01]:

    optimizer = tf.train.RMSPropOptimizer(lr)
    train = optimizer.minimize(loss)

    init = tf.global_variables_initializer()
    sess = tf.Session()
    sess.run(init)

    est = []
    for i in range(n_itr):
        _, est_m = sess.run([train, m], {x:x_train})
        est.append(est_m)

    est = np.array(est)
    plt.plot(est.reshape(n_itr), label="lr={}".format(lr))

plt.title("batch RMS Prop")
plt.legend()
plt.show()
```

(Figure: batch RMSProp, estimated mean vs. iteration for each learning rate)

- If the learning rate is too large, the estimate keeps oscillating even after reaching the average.
- The workable learning rates are considerably higher than for gradient descent.
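
A rough intuition for why much larger learning rates work here: RMSProp divides the gradient by a running root mean square of recent gradients, so the effective step size depends mainly on the learning rate rather than on the gradient's scale. A minimal NumPy sketch of a simplified RMSProp update (not the exact TensorFlow implementation; the decay and eps values are just typical defaults):

```python
import numpy as np

def rmsprop_step(m, grad, ms_grad, lr, decay=0.9, eps=1e-10):
    """One simplified RMSProp step on a scalar parameter m."""
    ms_grad = decay * ms_grad + (1 - decay) * grad ** 2  # running mean of squared gradients
    m = m - lr * grad / (np.sqrt(ms_grad) + eps)         # gradient normalized by its RMS
    return m, ms_grad
```

Because the gradient is normalized to roughly unit size, each step moves m by an amount on the order of the learning rate, which is why values like 0.1 to 5 work here while they would blow up plain gradient descent.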

Adam

```python
for lr in [5, 1, 0.1, 0.01]:

    optimizer = tf.train.AdamOptimizer(lr)
    train = optimizer.minimize(loss)

    init = tf.global_variables_initializer()
    sess = tf.Session()
    sess.run(init)

    est = []
    for i in range(n_itr):
        _, est_m = sess.run([train, m], {x:x_train})
        est.append(est_m)

    est = np.array(est)
    plt.plot(est.reshape(n_itr), label="lr={}".format(lr))

plt.title("batch Adam")
plt.legend()
plt.show()
```

(Figure: batch Adam, estimated mean vs. iteration for each learning rate)

- The oscillation is gentle; it recalls the damped oscillation of a transient response.
- If the learning rate is high, the estimate overshoots.
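
The overshoot and the damped-oscillation look are consistent with Adam's momentum-like first-moment term, which keeps pushing the parameter in the recent gradient direction even after it passes the minimum. A minimal NumPy sketch of the textbook Adam update (not TensorFlow's exact code; beta1, beta2, and eps are the usual defaults):

```python
import numpy as np

def adam_step(m, grad, v, s, t, lr, beta1=0.9, beta2=0.999, eps=1e-8):
    """One textbook Adam step on a scalar parameter m (t starts at 1)."""
    v = beta1 * v + (1 - beta1) * grad          # first moment: momentum-like average of gradients
    s = beta2 * s + (1 - beta2) * grad ** 2     # second moment: average of squared gradients
    v_hat = v / (1 - beta1 ** t)                # bias corrections for the zero initialization
    s_hat = s / (1 - beta2 ** t)
    m = m - lr * v_hat / (np.sqrt(s_hat) + eps)
    return m, v, s
```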

AdaGrad

```python
for lr in [20, 10, 5, 1, 0.1, 0.01]:

    optimizer = tf.train.AdagradOptimizer(lr)
    train = optimizer.minimize(loss)

    init = tf.global_variables_initializer()
    sess = tf.Session()
    sess.run(init)

    est = []
    for i in range(n_itr):
        _, est_m = sess.run([train, m], {x:x_train})
        est.append(est_m)

    est = np.array(est)
    plt.plot(est.reshape(n_itr), label="lr={}".format(lr))

plt.title("batch AdaGrad")
plt.legend()
plt.show()
```

(Figure: batch AdaGrad, estimated mean vs. iteration for each learning rate)

AdaDelta

```python
for lr in [20000, 10000, 1000, 100, 10]:

    optimizer = tf.train.AdadeltaOptimizer(lr)
    train = optimizer.minimize(loss)

    init = tf.global_variables_initializer()
    sess = tf.Session()
    sess.run(init)

    est = []
    for i in range(n_itr):
        _, est_m = sess.run([train, m], {x:x_train})
        est.append(est_m)

    est = np.array(est)
    plt.plot(est.reshape(n_itr), label="lr={}".format(lr))

plt.title("batch AdaDelta")
plt.legend()
plt.show()
```

(Figure: batch AdaDelta, estimated mean vs. iteration for each learning rate)

- The behavior is almost the same as AdaGrad.
- The workable learning rates are quite high.
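
For intuition on both of these, here are minimal NumPy sketches of the simplified update rules (not TensorFlow's exact implementations; rho and eps are just typical defaults). AdaGrad divides by the square root of the accumulated squared gradients, which only grows, so steps shrink over time. AdaDelta additionally rescales by the RMS of past updates, which starts near zero, which is presumably why such huge learning-rate multipliers were needed above.

```python
import numpy as np

def adagrad_step(m, grad, accum, lr, eps=1e-8):
    """Simplified AdaGrad: the accumulator only grows, so steps shrink over time."""
    accum = accum + grad ** 2
    m = m - lr * grad / (np.sqrt(accum) + eps)
    return m, accum

def adadelta_step(m, grad, acc_grad, acc_delta, lr, rho=0.95, eps=1e-8):
    """Simplified AdaDelta: steps are scaled by RMS(past updates) / RMS(gradients)."""
    acc_grad = rho * acc_grad + (1 - rho) * grad ** 2
    delta = -np.sqrt(acc_delta + eps) / np.sqrt(acc_grad + eps) * grad
    acc_delta = rho * acc_delta + (1 - rho) * delta ** 2
    m = m + lr * delta   # the learning rate acts as an extra multiplier on a tiny step
    return m, acc_grad, acc_delta
```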
