Call the Microsoft Emotion API by sending image data directly from Java.

What is the Emotion API?

The Emotion API is one of the Microsoft Azure Cognitive Services; it detects emotions from people's faces in images such as photos. See the official page for details. As of June 24, 2017, it can be used for free within certain limits.

Why Java?

No particular reason. Microsoft does provide an official sample implementation in Java, but it only covers sending a URL that points to an image, so I implemented the variant that sends the image data directly.

Environment

Java: 1.8.0_131
Emotion API: 1.0

Procedure

Subscription registration

Register a subscription on the official site. Press Create from the trial tab and a 32-digit key will be issued, so save it for later.
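As an aside, rather than hard-coding the key in source, you could load it from an environment variable. A minimal sketch, assuming an environment variable named EMOTION_API_KEY (the name is my own choice, not something the API requires):


// Read the subscription key from an environment variable instead of
// hard-coding it. EMOTION_API_KEY is an arbitrary name chosen here.
String key = System.getenv("EMOTION_API_KEY");
if (key == null || key.isEmpty()) {
    throw new IllegalStateException("EMOTION_API_KEY is not set");
}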

Add dependency to pom.xml

Add the following:

pom.xml


<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpclient</artifactId>
  <version>4.5.3</version>
</dependency>

Creating an EmotionApiClient

It is almost the same as the official sample source, implemented as follows. The method takes a byte array of the image file as its argument and returns the result JSON to the caller.

EmotionApiClient.java


import java.net.URI;

import org.apache.http.HttpEntity;
import org.apache.http.HttpResponse;
import org.apache.http.client.HttpClient;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.client.utils.URIBuilder;
import org.apache.http.entity.ByteArrayEntity;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.util.EntityUtils;

public class EmotionApiClient {

    private final String KEY = "32-digit key issued above";

    String postApi(byte[] image) {
        HttpClient httpClient = HttpClientBuilder.create().build();

        try {
            URIBuilder uriBuilder = new URIBuilder("https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize");
            URI uri = uriBuilder.build();
            HttpPost request = new HttpPost(uri);

            // Send the raw image bytes, with the subscription key in the header.
            request.setHeader("Content-Type", "application/octet-stream");
            request.setHeader("Ocp-Apim-Subscription-Key", KEY);

            ByteArrayEntity reqEntity = new ByteArrayEntity(image);
            request.setEntity(reqEntity);

            HttpResponse response = httpClient.execute(request);

            // You should also check the response status code here.
            HttpEntity entity = response.getEntity();
            if (entity != null) {
                return EntityUtils.toString(entity);
            }
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }

        return "";
    }
}
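For reference, a minimal caller might look like the following. This is just a sketch of my own (the class name EmotionApiClientSample and the path face.jpg are examples): it reads an image file into a byte array with java.nio.file.Files and prints the JSON returned by postApi.


import java.nio.file.Files;
import java.nio.file.Paths;

public class EmotionApiClientSample {

    public static void main(String[] args) throws Exception {
        // Path to a local photo containing a face (example path).
        byte[] image = Files.readAllBytes(Paths.get("face.jpg"));

        EmotionApiClient client = new EmotionApiClient();
        String json = client.postApi(image);

        // On success, the API returns a JSON array with one element per
        // detected face; print it as-is.
        System.out.println(json);
    }
}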

In closing

It was easier to implement than I had imagined. Next, I would like to see how well it can recognize emotions in bright and dark places.
