[PYTHON] Facial expression recognition using Pepper's API

By using the **ALFaceCharacteristics API** introduced in *Age recognition using Pepper's API*, you can recognize facial expressions such as smiles. Here, we will try two approaches: the smile detection provided by the API, and other facial expression recognition methods.

Note that **the ALFaceCharacteristics API cannot be tested on a virtual robot; an actual Pepper unit is required.** Please experiment with an actual Pepper at Aldebaran Atelier Akihabara. (Reservation URL: http://pepper.doorkeeper.jp/events)

How to get the project file

The project file for this tutorial is available on GitHub at https://github.com/Atelier-Akihabara/pepper-face-characteristics-example. There are several ways to get code from GitHub; one of the easiest is to download the archive via the Download ZIP link. There are other methods as well, such as using git-related tools, so choose whichever suits your situation.

Judging smiles

By using the `ALFaceCharacteristics` API, you can judge the degree of smile on a recognized face from image features such as how narrowed the eyes are and whether the corners of the mouth are raised. You can use this value to determine whether the person Pepper recognizes is smiling.
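As a minimal sketch of this idea, the snippet below reads each tracked person's smile value from ALMemory using the NAOqi Python SDK. It assumes the memory key `PeoplePerception/Person/<ID>/SmileProperties` holds a list beginning with the smile degree (0.0 to 1.0), and `<PEPPER_IP>` is a placeholder for your robot's address; the pure helper `is_smiling` just compares the degree against a threshold.

```python
# -*- coding: utf-8 -*-

def is_smiling(smile_props, threshold=0.7):
    """Return True if the smile degree (first element) meets the threshold.

    smile_props is assumed to look like [degree, confidence], as reported
    under PeoplePerception/Person/<ID>/SmileProperties.
    """
    if not smile_props:
        return False
    return smile_props[0] >= threshold

def print_smiles(ip="<PEPPER_IP>", port=9559):
    """Query smile values for all currently tracked people (requires a real Pepper)."""
    from naoqi import ALProxy  # pynaoqi SDK; only needed on the robot side
    memory = ALProxy("ALMemory", ip, port)
    face_char = ALProxy("ALFaceCharacteristics", ip, port)
    for pid in memory.getData("PeoplePerception/PeopleList"):
        face_char.analyzeFaceCharacteristics(pid)  # refresh the analysis for this person
        props = memory.getData("PeoplePerception/Person/%d/SmileProperties" % pid)
        print("person %d: %s smiling=%s" % (pid, props, is_smiling(props)))
```

This is a sketch under the assumptions above, not a drop-in implementation; `print_smiles` will only work when connected to an actual Pepper.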

Using the memory event FaceCharacteristics/PersonSmiling

The easiest way to detect a smile is to use the FaceCharacteristics/PersonSmiling event. First, let's check what the documentation says.

Check the document

More information about this event can be found in the Choregraphe documentation under NAOqi Developer guide > NAOqi Framework > NAOqi API > NAOqi PeoplePerception > ALFaceCharacteristics > ALFaceCharacteristics API. Clicking the FaceCharacteristics/PersonSmiling link shows the following explanation.

Event: "FaceCharacteristics/PersonSmiling" callback(std::string eventName, int id, std::string subscriberIdentifier)

Raised when a person has a smile degree above the current threshold (default = 0.7). Parameters:

  • eventName (std::string) – “FaceCharacteristics/PersonSmiling”
  • id – ID of the person as defined by the ALPeoplePerception API.
  • subscriberIdentifier (std::string) –

In other words, this event is raised when there is a person whose smile degree (expressed as a value from 0.0 to 1.0) exceeds the threshold (default: 0.7).
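Since the documentation refers to a "current threshold", the threshold can be adjusted. A hedged sketch, assuming `ALFaceCharacteristics` exposes `setSmilingThreshold`/`getSmilingThreshold` as in the NAOqi API reference (`<PEPPER_IP>` is a placeholder):

```python
# -*- coding: utf-8 -*-

def clamp_threshold(value):
    """Clamp a smile threshold into the valid 0.0-1.0 range."""
    return min(max(value, 0.0), 1.0)

def set_smile_threshold(value, ip="<PEPPER_IP>", port=9559):
    """Change the PersonSmiling threshold on a real Pepper and return the new value."""
    from naoqi import ALProxy  # requires the pynaoqi SDK and a real robot
    face_char = ALProxy("ALFaceCharacteristics", ip, port)
    face_char.setSmilingThreshold(clamp_threshold(value))
    return face_char.getSmilingThreshold()
```

Lowering the threshold makes the event fire on fainter smiles, at the cost of more false positives.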

The value passed with this event is `id`. This is the person identifier we have already seen in [ALPeoplePerception: Getting the list of people](http://qiita.com/Atelier-Akihabara/items/4162192129f366da1240#alpeopleperception-%E4%BA%BA%E3%81%AE%E4%B8%80%E8%A6%A7%E3%81%AE%E5%8F%96%E5%BE%97). In other words, using the value obtained from FaceCharacteristics/PersonSmiling as a key, you can retrieve additional information about the person detected as smiling, such as `PeoplePerception/Person/<ID>/AgeProperties`.
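The flow above can be sketched as an event subscriber. This follows the common NAOqi `ALModule` callback pattern; the module name `SmileWatcher` and the `on_person_smiling` callback are illustrative choices, and the `ALMemory` key format for per-person properties is assumed from the PeoplePerception documentation. The `person_key` helper just builds that key.

```python
# -*- coding: utf-8 -*-
try:
    from naoqi import ALModule, ALProxy  # pynaoqi SDK, available on the robot side
except ImportError:
    ALModule = object  # allow the key helper below to be used without the SDK

def person_key(person_id, prop):
    """Build the ALMemory key for a tracked person's property."""
    return "PeoplePerception/Person/%d/%s" % (person_id, prop)

class SmileWatcher(ALModule):
    """Subscribe to PersonSmiling and look up extra info about the smiling person."""

    def __init__(self, name, ip, port=9559):
        ALModule.__init__(self, name)
        self.memory = ALProxy("ALMemory", ip, port)
        # The callback receives (eventName, id, subscriberIdentifier),
        # matching the signature quoted from the documentation above.
        self.memory.subscribeToEvent("FaceCharacteristics/PersonSmiling",
                                     name, "on_person_smiling")

    def on_person_smiling(self, event_name, person_id, subscriber):
        age = self.memory.getData(person_key(person_id, "AgeProperties"))
        print("Person %d is smiling; age estimate: %s" % (person_id, age))
```

Running this requires an `ALBroker` connected to a real Pepper, so treat it as a structural sketch rather than a complete program.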

Checking memory event behavior with the Memory Watcher

Before creating the application, let's check the event's behavior with the Memory Watcher. First, register the event you want to watch in the Memory Watcher.

  1. Connect Choregraphe to the actual Pepper machine

  2. From the Memory Watcher panel, double-click **