[PYTHON] I tried the Pepper NAOqi OS 2.5.5 SLAM feature

Introduction

I tried the new SLAM feature included in NAOqi OS 2.5.5, which is scheduled for release on April 4, 2017, so I will summarize it here.

Why SLAM

I spent a month in Silicon Valley in February 2017. In the United States, robots that move are commonplace: Amazon's warehouse robots, security robots in shopping centers, and so on. Robots that appeal to the emotions are arguably a uniquely Japanese culture, but I felt that getting Pepper moving would greatly expand the situations it can be used in, so I tried out the SLAM feature.

Warehouse robot (YouTube) / Security robot (YouTube)

What is the SLAM function?

SLAM (Simultaneous Localization and Mapping) means performing self-localization and environmental mapping at the same time. Roughly speaking, Pepper searches for the places it can move through and builds a map of them. It can then move in a specified direction within the mapped, movable area. This lets Pepper roam a particular area and perform more dynamic tasks, such as walking around and explaining exhibits. You could say it is a feature that brings out Pepper's potential.
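To make this concrete, here is a minimal sketch of the exploration workflow using the ALNavigation API from the NAOqi 2.5 Python SDK. The robot IP, the 2.0 m radius, and the target pose are placeholder values for illustration, not settings from this verification:

```python
# Minimal sketch: explore a radius around Pepper, save the map, then
# relocalize and navigate within it (NAOqi 2.5 ALNavigation API).
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"  # placeholder: your Pepper's IP address

navigation = ALProxy("ALNavigation", PEPPER_IP, 9559)

# Let Pepper wander and map everything within a 2.0 m radius (SLAM).
# Blocks until exploration finishes; returns an error code.
navigation.explore(2.0)

# Persist the learned map; returns the path of the saved .explo file.
map_path = navigation.saveExploration()
print("Map saved to %s" % map_path)

# Reuse the map: load it, tell Pepper where it currently is, start
# localization, then move to a pose [x, y, theta] in the map frame.
navigation.loadExploration(map_path)
navigation.relocalizeInPosition([0.0, 0.0, 0.0])
navigation.startLocalization()
navigation.navigateToInMap([1.0, 0.0, 0.0])
navigation.stopLocalization()
```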

Sensors used

I verified which sensors Pepper actually uses for mapping during SLAM exploration.

Pepper carries a variety of sensors. This time I focused on the sensors listed below, which could plausibly be used for map creation, and covered them one by one (a sketch for reading their raw values follows the table).

| Part | Sensor | Number |
|------------------|-----------------|--------|
| Forehead / mouth | RGB camera | 2 |
| Eyes | 3D sensor | 1 |
| Legs | Laser sensor | 6 |
| Legs | Sonar sensor | 2 |
| Legs | Infrared sensor | 2 |
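If you want to check what a covered sensor is still reporting, you can read its raw value from ALMemory. A minimal sketch, assuming the ALMemory key names documented for Pepper (the IP is a placeholder; adjust the keys if your firmware differs):

```python
# Sketch: read the leg sensors' raw values from ALMemory to see which
# ones still report data while covered.
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"  # placeholder

memory = ALProxy("ALMemory", PEPPER_IP, 9559)

# Front/back sonar distances, in meters.
for key in ("Device/SubDeviceList/Platform/Front/Sonar/Sensor/Value",
            "Device/SubDeviceList/Platform/Back/Sonar/Sensor/Value"):
    print("%s = %s" % (key, memory.getData(key)))

# One segment of the front horizontal laser (the documentation lists
# segments Seg01 and up for each laser).
laser_key = ("Device/SubDeviceList/Platform/LaserSensor/"
             "Front/Horizontal/Seg01/X/Sensor/Value")
print("%s = %s" % (laser_key, memory.getData(laser_key)))
```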

Result

The sensors Pepper used for SLAM mapping turned out to be the sonar sensors and the laser sensors. Incidentally, if you wake Pepper up with both of these sensors covered, it says something like "Oh, my legs aren't working well and I can't move" and asks to be restarted, so I began each verification by covering the sensors only after Pepper had started up successfully. With the sonar sensors covered, Pepper perceived its surroundings as full of obstacles and would not move a step, so in that state the map was created using only the laser sensors.
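To inspect the map Pepper built, you can pull the occupancy grid with ALNavigation's getMetricalMap and save it as an image. A minimal sketch, assuming the return layout described in the 2.5 documentation (the IP, the reshape order, and the output filename are assumptions for illustration):

```python
# Sketch: fetch the occupancy map built during exploration and save it
# as a grayscale PNG (NAOqi 2.5 ALNavigation.getMetricalMap).
import numpy as np
from PIL import Image
from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"  # placeholder

navigation = ALProxy("ALNavigation", PEPPER_IP, 9559)

# Documented layout: [resolution (m/pixel), width, height,
#                     [origin_x, origin_y], pixel values]
mpp, width, height, origin, data = navigation.getMetricalMap()

# Occupancy values are in [0, 1]; scale to 0-255 for an 8-bit image.
# The width-then-height reshape follows the documentation's example.
grid = 255.0 * np.array(data).reshape(width, height)
Image.fromarray(grid.astype(np.uint8)).save("pepper_map.png")
print("Saved %dx%d map at %.3f m/pixel" % (width, height, mpp))
```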


For more information on SLAM, see the SoftBank Robotics documentation: NAOqi APIs, ALNavigation.

In the next article, I will write about the actual implementation: Try the Pepper NAOqi OS 2.5.5 SLAM feature ~ Part 2
