The title sounds like a dissertation, but the gist is that the TensorFlow page says it can run on Android / iOS, so I gave it a try. I was able to build it on both Android and iOS, so I'd like to write up the procedure.
Android
First, the Android version. https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android Basically, follow the instructions there. Once built, you get TF Classify, which uses Google's Inception model, and TF Detect, which is based on "Scalable Object Detection using Deep Neural Networks". On top of that, it seems TF Stylize, which performs style transfer, has also been implemented since... so I tried rebuilding in the following environment.
First, clone the latest version. TensorFlow seems to be at r0.12 at the moment.
git clone --recurse-submodules https://github.com/tensorflow/tensorflow.git
Next, install the build tool Bazel.
brew install bazel
You need the NDK to compile the C/C++ code.
As described in
https://developer.android.com/ndk/guides/index.html
, search for "sdk" from Preferences and open the SDK Tools tab. Checking the NDK box there and applying installs it. The SDK itself can be installed from the neighboring SDK Platforms tab; I installed everything from Android 5.0 and above.
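Incidentally, it's worth confirming where the SDK and NDK actually landed, since you'll need those paths for the WORKSPACE edit below. A quick check, assuming Android Studio's default install location:
$ ls ~/Library/Android/sdk/
$ ls ~/Library/Android/sdk/ndk-bundle/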
Then edit the tensorflow/WORKSPACE file, correcting the commented-out section to match your own setup.
Uncomment and update the paths in these entries to build the Android demo.
-#android_sdk_repository(
-#    name = "androidsdk",
-#    api_level = 23,
-#    build_tools_version = "23.0.1",
-#    # Replace with path to Android SDK on your system
-#    path = "<PATH_TO_SDK>",
-#)
-#
-#android_ndk_repository(
-#    name="androidndk",
-#    path="<PATH_TO_NDK>",
-#    api_level=21)
+android_sdk_repository(
+    name = "androidsdk",
+    api_level = 25,
+    build_tools_version = "25.0.2",
+    # Replace with path to Android SDK on your system
+    path = "/Users/user-name/Library/Android/sdk/",
+)
+
+android_ndk_repository(
+    name="androidndk",
+    path="/Users/user-name/Library/Android/sdk/ndk-bundle/",
+    api_level=21)
Next, download the trained TensorFlow models and put them in the right place. As the commands below show, the models go in the assets directory. The classification graph tensorflow_inception_graph.pb was 53.9 MB, and the detection graph multibox_model.pb was 18.6 MB.
$ curl -L https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip -o /tmp/inception5h.zip
$ curl -L https://storage.googleapis.com/download.tensorflow.org/models/mobile_multibox_v1.zip -o /tmp/mobile_multibox_v1.zip
$ unzip /tmp/inception5h.zip -d tensorflow/examples/android/assets/
$ unzip /tmp/mobile_multibox_v1.zip -d tensorflow/examples/android/assets/
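If the unzip went as expected, the assets directory should now hold the two graphs plus their label and prior files, roughly like this (file names taken from the two archives; check your own output):
$ ls tensorflow/examples/android/assets/
imagenet_comp_graph_label_strings.txt  multibox_location_priors.txt
multibox_model.pb                      tensorflow_inception_graph.pb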
Now you're ready to build, so run bazel build in the root directory (where WORKSPACE is located).
$ bazel build -c opt //tensorflow/examples/android:tensorflow_demo
It takes about 20 minutes, so pull yourself an espresso while you wait. If the build passes, install it on an actual device and try it out.
$ adb install -r bazel-bin/tensorflow/examples/android/tensorflow_demo.apk
If you get the following error, just update as the message tells you and the install will then go through.
$ adb install -r bazel-bin/tensorflow/examples/android/tensorflow_demo.apk
It appears you do not have 'Android SDK Platform-tools' installed.
Use the 'android' tool to install them:
android update sdk --no-ui --filter 'platform-tools'
Last week I had the impression it ran quite smoothly, but after rebuilding it seemed a little heavier... maybe it's just in my head. The Scalable Object Detection app (TF Detect) gets installed as well. The style transfer was impressively done: there were sample paintings at the bottom of the screen, with a UI for choosing how strongly each one is applied as a style. The resolution can be selected from 32 up to 1024; from around 192 it no longer runs in real time, but it could still generate images.
Looking at the source code commits, I was surprised to see that StylizeActivity.java and TensorFlowYoloDetector.java had been added in just one week.
iOS
Next is iOS. Basically, follow the TensorFlow iOS Examples at
https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/ios_examples
. Xcode 7.3 or later was required, and since I had updated iOS on my device, I updated Xcode too and was able to build in the following environment.
First, build TensorFlow according to the iOS section of https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/makefile.
xcode-select --install
brew install automake
brew install libtool
Install these. It seems you can then build everything in one shot with the following shell script.
tensorflow/contrib/makefile/build_all_ios.sh
This generates tensorflow/contrib/makefile/gen/lib/libtensorflow-core.a, which you can then link into an Xcode project (I'll note the relevant build settings after the step-by-step commands below). This also takes about 20 minutes, so go brew some coffee. If you want to run the steps one by one instead of using this script, download the dependencies first.
tensorflow/contrib/makefile/download_dependencies.sh
Next, compile protobuf for iOS.
tensorflow/contrib/makefile/compile_ios_protobuf.sh
And make targeting iOS.
make -f tensorflow/contrib/makefile/Makefile \
TARGET=IOS \
IOS_ARCH=ARM64
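As for linking libtensorflow-core.a into your own Xcode project, the build settings are described in the ios_examples README. The listing below is reproduced from memory of the r0.12-era docs, so treat the exact paths as assumptions and double-check the README in your checkout:
Other Linker Flags:
  -force_load tensorflow/contrib/makefile/gen/lib/libtensorflow-core.a
  tensorflow/contrib/makefile/gen/protobuf_ios/lib/libprotobuf.a
  tensorflow/contrib/makefile/gen/protobuf_ios/lib/libprotobuf-lite.a
Header Search Paths:
  the tensorflow root, contrib/makefile/downloads, downloads/eigen,
  downloads/protobuf/src, and contrib/makefile/gen/proto
The -force_load is apparently needed so the linker doesn't strip the global constructors that register TensorFlow's ops.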
At the root of the TensorFlow directory, run the following, or download and unzip the Inception v1 model from the link.
mkdir -p ~/graphs
curl -o ~/graphs/inception5h.zip \
https://storage.googleapis.com/download.tensorflow.org/models/inception5h.zip \
&& unzip ~/graphs/inception5h.zip -d ~/graphs/inception5h
cp ~/graphs/inception5h/* tensorflow/contrib/ios_examples/benchmark/data/
cp ~/graphs/inception5h/* tensorflow/contrib/ios_examples/camera/data/
cp ~/graphs/inception5h/* tensorflow/contrib/ios_examples/simple/data/
Looking at this, you can see there are three sample apps: benchmark, camera, and simple. The one that recognizes images is, of course, camera. The Inception model needs to be placed under each app's data directory. Once that's done, all that's left is to open the camera_example.xcodeproj file in that directory and run it.
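For example, from the repository root:
$ open tensorflow/contrib/ios_examples/camera/camera_example.xcodeproj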
Unlike the Android version, there is a control at the bottom of the screen to stop image capture. It runs smoothly on my iPhone SE. According to the article at http://qiita.com/shu223/items/ce190ea6669b2636a8a7 there was a delay of a few seconds on an iPhone 6, so either the model has gotten smaller since then, or perhaps it now supports the GPU acceleration mentioned in that article, the BNNS added to Accelerate.framework in iOS 10 (though the issue isn't closed yet, so probably not...). Conversely, if it's this fast without that implemented, it seems the fps could be pushed even higher on an iPhone 6s or later.
So, I built the samples that run trained TensorFlow models on mobile. They run quite smoothly on both Android and iOS, and it makes me wonder how heavy a model they can handle.