[PYTHON] Try the Pepper NAOqi OS 2.5.5 SLAM feature ~ Part 2

__ Search for movable range and save data __

First, search the range where Pepper can move.

explore(float radius)

Parameters: radius – Maximum search range, in meters.

Return value: 0 on success; on error, a value from the error code table is returned.

When this method is executed, Pepper will start searching for the range that it can move within the range of the maximum value specified by the parameter. While exploring, Pepper wanders around. When the search is complete, the data obtained from the search is saved on disk.
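Since explore() reports failure through its return code rather than an exception, it is convenient to wrap the call. A minimal sketch, where the helper name `explore_or_raise` and the error handling are this article's own additions (the `navigation` argument is assumed to be an `ALProxy("ALNavigation")` instance):

```python
def explore_or_raise(navigation, radius):
    """Run ALNavigation.explore and raise on a non-zero error code.

    `navigation` is assumed to be an ALProxy("ALNavigation") instance;
    the helper name and the RuntimeError are this sketch's own choices.
    """
    error_code = navigation.explore(radius)
    if error_code != 0:
        raise RuntimeError(
            "explore(%s) failed with error code %s" % (radius, error_code))
    return error_code
```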


saveExploration()

Return value: Returns the path of the map data file (.explo) created by the search.

The data saved by this method will be used later when creating the map. The path of the map data file is returned as the return value, but this sample does not process the saved file.
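Although this sample does not use the returned path, it can be kept so that a later run does not have to explore again: the ALNavigation API also documents loadExploration(path) in NAOqi 2.5 for reloading a saved .explo file (verify on your robot's version). A sketch under that assumption, with hypothetical helper names:

```python
def save_map(navigation):
    """Save the current exploration; returns the .explo file path."""
    return navigation.saveExploration()

def restore_map(navigation, explo_path):
    """Reload a previously saved exploration file.

    Assumes ALNavigation.loadExploration(path) -> bool is available,
    as documented for NAOqi 2.5; check your NAOqi version.
    """
    return navigation.loadExploration(explo_path)
```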

__ Move to specified location __

After Pepper has learned the range it can move in, you can send it to a specified location within that range. The target is given as parameters: distances in the X and Y directions plus an angle Theta. Theta is accepted as a parameter but has no effect at this time, so if you want to control Pepper's orientation, use ALMotion.moveTo instead.

startLocalization()

Return value: None

stopLocalization()

Return value: None

navigateToInMap(std::vector&lt;float&gt; target)

Parameters: target – The destination in the map frame, in [x, y, theta] format. Example: navigateToInMap([0., 0., 0.])

Return value: 0 on success; on error, a value from the error code table is returned.

To move Pepper, use navigateToInMap(), but this method must be wrapped between the startLocalization and stopLocalization calls above. In other words, use them together as follows.

Example of use:

    startLocalization()
    navigateToInMap([0., 0., 0.])
    stopLocalization()
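Because stopLocalization should run even when navigation fails, this pattern can be wrapped in try/finally. A minimal sketch (the function name `go_to_in_map` is this article's own; `navigation` is assumed to be an `ALProxy("ALNavigation")` instance):

```python
def go_to_in_map(navigation, target):
    """Navigate to [x, y, theta] in the map frame, always stopping localization.

    Returns the navigateToInMap error code (0 on success). The try/finally
    guarantees stopLocalization runs even if navigation raises.
    """
    navigation.startLocalization()
    try:
        return navigation.navigateToInMap(target)
    finally:
        navigation.stopLocalization()
```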

__ Get map data of saved movable range and create map __

Finally, acquire the exploration data created in [Search for movable range and save data](http://qiita.com/Nariki1998/private/2403066865e59e982239) above and create a map. The SLAM API added in NAOqi 2.5.5 is used to acquire the exploration data, while the existing API is used to create the map image and display it on the tablet.

getMetricalMap()

Return value: Returns the currently saved exploration data as map data, as an array in the format [mpp, width, height, [originOffsetX, originOffsetY], [pxVal, …]].

mpp: Map resolution, in meters per pixel.

width, height: The size of the image, in pixels.

originOffsetX, originOffsetY: Offset of the map's pixel (0, 0).

pxVal: A buffer of pixel values between 0 and 100 (this is what the API documentation says, but the specific meaning of the values is not explained there).

__ Overall code __




    # Runs inside a Choregraphe box (self refers to the box instance)
    import os
    import numpy
    import Image  # PIL (the old Image module shipped with NAOqi)
    from naoqi import ALProxy

    self.framemanager = ALProxy("ALFrameManager")
    self.navigation_service = ALProxy("ALNavigation")

    # Explore a range of 10 meters
    radius = 10.0
    error_code = self.navigation_service.explore(radius)
    if error_code != 0:
        print "Exploration failed."
        return
    # Save the explored map data to disk
    path = self.navigation_service.saveExploration()
    self.logger.info("Exploration saved at path: \"" + path + "\"")
    # Prepare to localize on the saved map
    self.navigation_service.startLocalization()
    # Return to the starting position
    self.navigation_service.navigateToInMap([0., 0., 0.])
    # Movement completed
    self.navigation_service.stopLocalization()
    # Get the explored map data to display the map on the tablet
    result_map = self.navigation_service.getMetricalMap()

    # After that, create a map image file using the existing API
    path = os.path.join(self.framemanager.getBehaviorPath(self.behaviorId), "../html/img/")
    writepath = path + "map.jpg"
    map_width = result_map[1]
    map_height = result_map[2]
    img = numpy.array(result_map[4]).reshape(map_width, map_height)
    img = (100 - img) * 2.55  # from 0..100 to 255..0
    img = numpy.array(img, numpy.uint8)
    Image.frombuffer('L', (map_width, map_height), img, 'raw', 'L', 0, 1).save(writepath, "JPEG")

After this, the created map (map.jpg) is displayed using the Show Image box.

__ Summary __

When I tried the SLAM feature, localizing, identifying the movable range, and creating a map all worked as expected, but moving to a specified position with navigateToInMap failed several times. With the official release of NAOqi OS 2.5.5 on April 4, I hope the instability that was particularly noticeable in the beta version will be improved.

Acknowledgments

The staff at Atelier Akihabara were a great help during these seven days of SLAM verification. Because Pepper moves around on its own, this was a more dynamic test than anything we had developed before, but thanks to the atelier staff's strong support, such as securing space for the tests and arranging a Pepper compatible with 2.5.5, the verification went smoothly. Thank you very much!
