TensorFlow: Running a model trained in Python on Android

Based on the TensorFlow tutorial (recognition of handwritten digit images), I exported the trained Deep Learning network data and built a handwriting recognition demo on Android.

(Screenshot: tensorflow_mnist_screen0.png)

Exporting the trained data

Based on the model from "MNIST For ML Beginners" in the TensorFlow tutorials, the first step is to write out the trained data in Python on a PC.

"MNIST For ML Beginners" https://www.tensorflow.org/versions/master/tutorials/mnist/beginners/index.html

Starting from this tutorial, I modified the script so that it exports the graph data:

https://github.com/miyosuda/TensorFlowAndroidMNIST/blob/master/trainer-script/beginner.py
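
For reference, the model in that tutorial is a single softmax-regression layer trained with cross-entropy. The following is only a minimal sketch of that training step, not the author's actual script (which is linked above); input_data is the MNIST loading helper distributed with the tutorial, and the hyperparameters are illustrative.

import input_data          # MNIST loading helper that accompanies the tutorial
import tensorflow as tf

# Load the MNIST dataset (downloaded on first run)
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Softmax regression: y = softmax(xW + b)
x = tf.placeholder("float", [None, 784])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.nn.softmax(tf.matmul(x, W) + b)

# Cross-entropy loss, minimized with plain gradient descent
y_ = tf.placeholder("float", [None, 10])
cross_entropy = -tf.reduce_sum(y_ * tf.log(y))
train_step = tf.train.GradientDescentOptimizer(0.01).minimize(cross_entropy)

sess = tf.Session()
sess.run(tf.initialize_all_variables())
for _ in range(1000):
    batch_xs, batch_ys = mnist.train.next_batch(100)
    sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})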

To export the network data, the graph structure and the tensor data (the learned weights) held in the Variables have to be written out together, but at the moment TensorFlow does not seem to be able to save the graph and the Variables in a single file.

So, after training, I evaluated the contents of the Variables and converted them to ndarrays:

# Store variable
_W = W.eval(sess)
_b = b.eval(sess)

Then I converted those ndarrays to Constants, used them in place of the Variables to rebuild the graph, and exported the graph together with the trained data:

# Regenerate the graph
g_2 = tf.Graph()
with g_2.as_default():
    # Name the input node "input"
    x_2 = tf.placeholder("float", [None, 784], name="input")

    # Replace the Variables with Constants
    W_2 = tf.constant(_W, name="constant_W")
    b_2 = tf.constant(_b, name="constant_b")

    # Name the output node "output"
    y_2 = tf.nn.softmax(tf.matmul(x_2, W_2) + b_2, name="output")

    sess_2 = tf.Session()
    init_2 = tf.initialize_all_variables()
    sess_2.run(init_2)

    # Export the graph
    graph_def = g_2.as_graph_def()
    tf.train.write_graph(graph_def, './tmp/beginner-export',
                         'beginner-graph.pb', as_text=False)

So that they can be looked up from the Android side, the input and output nodes are named "input" and "output", respectively.

It took only a few seconds to train and export this model.
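
As a quick sanity check (this is not part of the original scripts, just a sketch that reuses the path from the export above), the exported .pb can be reloaded on the PC and run by referring only to those "input" and "output" names, exactly as the Android side will do:

import numpy as np
import tensorflow as tf

# Load the exported GraphDef
graph_def = tf.GraphDef()
with open('./tmp/beginner-export/beginner-graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

graph = tf.Graph()
with graph.as_default():
    tf.import_graph_def(graph_def, name='')

with tf.Session(graph=graph) as sess:
    # Feed one dummy 28x28 image, flattened to 784 floats
    pixels = np.zeros((1, 784), dtype=np.float32)
    output = sess.run('output:0', feed_dict={'input:0': pixels})
    print(output)  # 10 softmax probabilities, one per digit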

Android side

The Android demo bundled with TensorFlow could only be built inside a Bazel environment, so I set up an environment in which the app can be built with just Android Studio and the NDK.

I checked in the static library files (.a) that are produced when the TensorFlow Android sample is built with Bazel, so that the project can be built with the NDK alone.

https://github.com/miyosuda/TensorFlowAndroidMNIST/tree/master/jni-build/jni

Android.mk looks like this.

Makefile


LOCAL_PATH := $(call my-dir)

include $(CLEAR_VARS)

TENSORFLOW_CFLAGS	  := -frtti \
  -fstack-protector-strong \
  -fpic \
  -ffunction-sections \
  -funwind-tables \
  -no-canonical-prefixes \
  '-march=armv7-a' \
  '-mfpu=vfpv3-d16' \
  '-mfloat-abi=softfp' \
  '-std=c++11' '-mfpu=neon' -O2 \

TENSORFLOW_SRC_FILES := ./tensorflow_jni.cc \
	./jni_utils.cc \

LOCAL_MODULE    := tensorflow_mnist
LOCAL_ARM_MODE  := arm
LOCAL_SRC_FILES := $(TENSORFLOW_SRC_FILES)
LOCAL_CFLAGS    := $(TENSORFLOW_CFLAGS)

LOCAL_LDLIBS    := \
	-Wl,-whole-archive \
	$(LOCAL_PATH)/libs/$(TARGET_ARCH_ABI)/libandroid_tensorflow_lib.a \
	$(LOCAL_PATH)/libs/$(TARGET_ARCH_ABI)/libre2.a \
	$(LOCAL_PATH)/libs/$(TARGET_ARCH_ABI)/libprotos_all_cc.a \
	$(LOCAL_PATH)/libs/$(TARGET_ARCH_ABI)/libprotobuf.a \
	$(LOCAL_PATH)/libs/$(TARGET_ARCH_ABI)/libprotobuf_lite.a \
	-Wl,-no-whole-archive \
	$(NDK_ROOT)/sources/cxx-stl/gnu-libstdc++/4.9/libs/$(TARGET_ARCH_ABI)/libgnustl_static.a \
	$(NDK_ROOT)/sources/cxx-stl/gnu-libstdc++/4.9/libs/$(TARGET_ARCH_ABI)/libsupc++.a \
	-llog -landroid -lm -ljnigraphics -pthread -no-canonical-prefixes '-march=armv7-a' -Wl,--fix-cortex-a8 -Wl,-S \

LOCAL_C_INCLUDES += $(LOCAL_PATH)/include $(LOCAL_PATH)/genfiles $(LOCAL_PATH)/include/third_party/eigen3

NDK_MODULE_PATH := $(call my-dir)

include $(BUILD_SHARED_LIBRARY)

If the compiler and linker options are not set as above, the build passes but the trained data (Protocol Buffers data) is not read correctly. The -Wl,-whole-archive section in particular seems to be needed so that the linker does not drop the statically registered TensorFlow and protobuf symbols.

On the Java side, prepare the 28x28 handwritten pixel data, pass it to the C++ side through JNI, and feed it into the graph constructed from the exported graph data.

https://github.com/miyosuda/TensorFlowAndroidMNIST/blob/master/jni-build/jni/tensorflow_jni.cc

(Screenshot: tensorflow_mnist_screen0.png)

The handwritten digits were recognized without any problems.

Replacing the trained data

The model above only reaches a recognition rate of about 91%, so I replaced it with the Deep Learning model (recognition rate 99.2%) from TensorFlow's "Deep MNIST for Experts".

"Deep MNIST for Experts" https://www.tensorflow.org/versions/master/tutorials/mnist/pros/index.html

Script for exporting the trained data: https://github.com/miyosuda/TensorFlowAndroidMNIST/blob/master/trainer-script/expert.py

When the dropout node was included in the exported graph, an error occurred for some reason when running on the Android side. Since dropout is only needed during training anyway, I removed it from the graph when exporting, as sketched below.
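
Concretely, since the export script rebuilds the graph with Constants anyway, the fully connected layer can be wired straight into the readout layer instead of going through tf.nn.dropout. The snippet below only illustrates that idea; the names _W_fc2, _b_fc2 and h_fc1_2 are placeholders for the evaluated weights and the rebuilt hidden layer, following the tutorial's naming.

# Training graph (tutorial): h_fc1 -> dropout -> readout
#   h_fc1_drop = tf.nn.dropout(h_fc1, keep_prob)
#   y_conv = tf.nn.softmax(tf.matmul(h_fc1_drop, W_fc2) + b_fc2)

# Export graph: skip dropout and connect h_fc1 directly to the readout layer
W_fc2_2 = tf.constant(_W_fc2, name="constant_W_fc2")  # evaluated weights (illustrative names)
b_fc2_2 = tf.constant(_b_fc2, name="constant_b_fc2")
y_conv_2 = tf.nn.softmax(tf.matmul(h_fc1_2, W_fc2_2) + b_fc2_2, name="output")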

Training took about an hour in my environment (a MacBook Pro).

Since the input and output node names are the same, the Android-side code runs unchanged after the export; only the trained data file has to be replaced.

When I tried handwriting recognition, I was able to confirm that it recognized numbers from 0 to 9 fairly accurately.

Source

The full source code for everything above is here: https://github.com/miyosuda/TensorFlowAndroidMNIST
