This is the article for day 23 of the FUJITSU Advent Calendar 2019.

Isn't Java the quintessential object-oriented language? Historically, C++ and Smalltalk seem to have come first, but C++, designed with compatibility with C in mind, does not enforce object-oriented implementation, and as for Smalltalk, I didn't even know it existed... :sweat_smile: I remember a book I once read describing Java as the language that embodies "object orientation: a development philosophy that makes it relatively easy to implement programs with reusability in mind", and I don't think that's wrong. As you can see from "Transition of popular programming languages, 1965-2019 (a Python perspective)", it is still a widely used language in software development. In fact, it reportedly still holds an overwhelming share in in-house software development!? (That's the popular theory; there is no statistical data to back it up. :sweat_smile:) In recent years, however, with the rise of languages such as Python, the power map may yet be redrawn... :scream:
The introduction has become very long, but the point of this article is to try Deep Learning in Java.
A friend who loves competitive programming more than three meals a day told me he used to program in Java but recently switched to Rust, since he wants to try whatever language is trending. As the transition of popular languages above shows, Java seems to be getting pushed aside by other languages... Some people even call it an old language, so it takes a lot of flak... To prove that we object-oriented kids can still fight on active duty, I figured I had to at least pull off the trendy Deep Learning of recent days. As the title says, though, this time is just a trial run. :dancer_tone1: :dancer: :dancer_tone2: :dancer_tone3: :dancer_tone4: :dancer_tone5:
The Last Java Samurai[1]
:yum: By the way, I come from a non-CS background, so I got frustrated trying to study Java as a student; it was only after joining the company that I learned how good Java is.
Python holds an overwhelming share in the world of machine learning, including deep learning, and most OSS machine-learning libraries are implemented in Python as well. Still, OSS for training and inference with deep-learning models in Java has existed for some time[^4j]. This time, however, I will try the OSS "Deep Java Library (hereinafter DJL)[^djl]", newly released on December 3, 2019.
The article has gotten long, but here is the main subject. At present, the only information you can rely on is the official site and a reference article. :scream: :ghost:
In the reference article, the environment was built on AWS using Gradle. This time I will instead run everything in a container in my local environment, using Docker + Jupyter Notebook. Jupyter Notebook has the image of a visualization tool for Python, but it now supports multiple languages and has become more and more convenient.
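For reference, if you go the Gradle route from the reference article instead, the dependency declaration looks roughly like this. This is only a sketch: the artifact coordinates follow the initial DJL release, and the version numbers here are my assumptions, so check the official site for the current ones.

```groovy
dependencies {
    // DJL core API, model zoo, and built-in datasets
    // (versions are assumptions; verify against the official site before use)
    implementation "ai.djl:api:0.2.0"
    implementation "ai.djl:model-zoo:0.2.0"
    implementation "ai.djl:basicdataset:0.2.0"

    // The MXNet engine backing DJL, with an auto-detected native library
    runtimeOnly "ai.djl.mxnet:mxnet-engine:0.2.0"
    runtimeOnly "ai.djl.mxnet:mxnet-native-auto:1.6.0"
}
```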
- macOS Mojave 10.14.6 (CPU: Core i5-8210Y 1.6 GHz, DRAM: 16 GB)[^nogpu]
- Docker 19.03.5
Clone the repository from GitHub.

```sh
git clone https://github.com/awslabs/djl.git
```
Move into the `jupyter` directory under the cloned `djl` directory and start the container:

```sh
cd djl/jupyter/
docker container run -itd -p 127.0.0.1:8888:8888 -v $PWD:/home/jupyter deepjavalibrary/jupyter
```

You can edit the Dockerfile and build your own image, but this time I started with the defaults.
```
$ docker container ls
CONTAINER ID   IMAGE                     COMMAND                  CREATED         STATUS         PORTS                      NAMES
03791a7b1641   deepjavalibrary/jupyter   "jupyter notebook --…"   4 minutes ago   Up 4 minutes   127.0.0.1:8888->8888/tcp   suspicious_brattain
```
If you access http://localhost:8888 and the following page is displayed, the setup was successful.
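If the notebook asks for a login token instead, it is usually printed in the container's startup log. A quick way to find it (the container name below is the auto-generated one from my run above; substitute your own):

```sh
# Show the container's startup log, which includes the tokenized login URL
docker container logs suspicious_brattain
```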
Looking closely, the tutorials are already provided as individual notebooks. So much for my boast that I'd do everything in Java; I gave in to the convenience of Jupyter Notebook... :sob:
Start `train_your_first_model.ipynb`. This is a handwriting-recognition (MNIST) sample using an MLP. To run the training, simply press Shift + Enter to execute the cells in order. It's very easy. At first I was surprised that I didn't need to write a main function or a class.
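For the curious, the same training flow can also be written as ordinary Java outside the notebook. The sketch below is my reconstruction from the DJL API at the time of writing, not the notebook's exact code; the class names, the `Mlp` constructor, and the `EasyTrain` helper are assumptions that may differ between DJL versions.

```java
import ai.djl.Model;
import ai.djl.basicdataset.Mnist;
import ai.djl.basicmodelzoo.basic.Mlp;
import ai.djl.ndarray.types.Shape;
import ai.djl.training.DefaultTrainingConfig;
import ai.djl.training.EasyTrain;
import ai.djl.training.Trainer;
import ai.djl.training.evaluator.Accuracy;
import ai.djl.training.loss.Loss;

public class TrainMnist {
    public static void main(String[] args) throws Exception {
        // MNIST training data, shuffled, with a batch size of 32
        Mnist mnist = Mnist.builder().setSampling(32, true).build();
        mnist.prepare();

        try (Model model = Model.newInstance("mlp")) {
            // 784 inputs (28x28 pixels), 10 output classes, two hidden layers
            model.setBlock(new Mlp(28 * 28, 10, new int[] {128, 64}));

            DefaultTrainingConfig config =
                    new DefaultTrainingConfig(Loss.softmaxCrossEntropyLoss())
                            .addEvaluator(new Accuracy());

            try (Trainer trainer = model.newTrainer(config)) {
                trainer.initialize(new Shape(1, 28 * 28));
                // Train for two epochs, as the notebook does
                EasyTrain.fit(trainer, 2, mnist, null);
            }
        }
    }
}
```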
The following message was output when saving the model. The accuracy is about 97%, but this is just the figure against the validation data used during training; accuracy against separate test data was not measured this time... :stuck_out_tongue_closed_eyes:
```
Model (
    Name: mlp
    Model location: /home/jupyter/build/mlp
    Data Type: float32
    Accuracy: 0.96991664
    Epoch: 2
)
```
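Saving itself is essentially a one-liner in DJL. A minimal sketch (the epoch property and the exact on-disk layout are my assumptions inferred from the output above, not verified against this specific DJL version):

```java
import ai.djl.Model;
import java.nio.file.Paths;

public class SaveModel {
    // Persist the trained parameters so they can be reloaded for inference later
    public static void saveTrained(Model model) throws Exception {
        model.setProperty("Epoch", "2");           // metadata shown in the output above
        model.save(Paths.get("build/mlp"), "mlp"); // writes the parameter file under build/mlp
    }
}
```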
Next, I'll run an object-recognition sample. Start `object_detection_with_model_zoo.ipynb`. The reference article above already tried the Model Zoo, so I considered trying a different sample, but I don't yet understand the library's detailed usage well enough. Since a GPU environment is practically indispensable for training, this time I tried object recognition (inference) with a pre-trained model ([SSD: Single Shot MultiBox Detector](https://qiita.com/YutoHagiwara/items/4b66442ff1e09936f1d0)).
Try object recognition with the image below. This is a picture of a dog, a bicycle, and a car.
The execution result is shown below. You can correctly recognize objects (dogs, bicycles, cars).
The inference results (numerical values) are shown below. You can see that each object is recognized with high accuracy.
```
[
    class: "car", probability: 0.99991, bounds: [x=0.612, y=0.137, width=0.293, height=0.160]
    class: "bicycle", probability: 0.95365, bounds: [x=0.162, y=0.207, width=0.594, height=0.590]
    class: "dog", probability: 0.93471, bounds: [x=0.168, y=0.350, width=0.274, height=0.594]
]
```
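Note that the `bounds` values are normalized to the image size (0 to 1). If you want to draw the boxes yourself, you have to scale them to pixels. A minimal sketch in plain Java (the 500x375 image size is a hypothetical example, not the actual dimensions of the sample image):

```java
import java.util.Arrays;

public class BoundsDemo {
    // Convert a normalized [x, y, width, height] box to integer pixel coordinates
    static int[] toPixels(double x, double y, double w, double h, int imgW, int imgH) {
        return new int[] {
            (int) Math.round(x * imgW),
            (int) Math.round(y * imgH),
            (int) Math.round(w * imgW),
            (int) Math.round(h * imgH)
        };
    }

    public static void main(String[] args) {
        // The "car" detection above, scaled to a hypothetical 500x375 image
        System.out.println(Arrays.toString(toPixels(0.612, 0.137, 0.293, 0.160, 500, 375)));
    }
}
```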
I tried DJL, which lets you run Deep Learning in Java. It seems to be designed with execution on AWS in mind, but it worked without problems in Docker on my local machine. This time was just a trial, but I would like to keep investigating and try something a little more advanced in the future. Finally, the remaining issues are described below.
That's it for this trial of running Deep Learning by an object-oriented kid. I will do my best to keep posting articles next year.
[^4j]: Deeplearning4j. As a Japanese book, "[Deep Learning Java Programming: Theory and Implementation of Deep Learning (by Yusuke Sugomori; Impress; 2016)](https://www.amazon.co.jp/Deep-Learning-Java85-impress/dp/4844381288)" has been published.

[^djl]: It seems to have been developed for use on AWS. ([Reference information](https://aws.amazon.com/jp/about-aws/whats-new/2019/12/introducing-deep-java-library-develop-and-deploy-machine-learning-models-in-java/))
[^nogpu]: Since the goal this time is just to try DJL, I run it on the CPU rather than in a GPU + CUDA environment. This verification is therefore limited to training a multilayer perceptron and running inference with pre-trained deep-learning models. I will try running on a GPU at a later date.
[1]: https://blog.heroku.com/samurai-duke-and-the-legend-of-openjdk