Let's Make Robots!

Mapping

In a reply to this post: http://letsmakerobots.com/node/31641 I described a software problem I ran into on one of my robots. I am re-posting it separately here, with some added info:

Here is a real case. My robot fits in a 20x20 cm square, and all the measurements the IR and US sensors return are in cm. I can generate a list of coordinates for the obstacles detected in a scan. Because of EEPROM space constraints, I store a map with a cell size similar to my robot's size. Each cell is represented by a byte in a 2D array. We can use some tricks to pack 2 cells into a byte, perhaps even 4 cells if necessary.

To have an accurate map, we can encode different cell information in the value of that byte. For instance, if a cell is empty but has a wall on the North side, we can store the value 200 in the byte; 201 for an East wall, 202 for a South wall, and so on. A cell that is completely blocked can be 255. Say we have a couch that covers 6 cells: for the cells on the edge, which the robot can measure, we can encode the value 250, but for the cells closer to the wall we can write 255. Makes sense? Similarly, we can write 100 for a cell that contains the leg of a table or a chair. The robot might go between the legs, but say the chair occupies 2 cells. Instead of blocking access, the robot can decode the 100 value and approach with caution: measure the distance between the legs, center itself, pass with care, then update its new position on the map. In the case of chairs, it is also possible that a human moved them around and they are actually occupying other cells; the robot can then mark their new position on the map and still proceed with care. Why go between the chair or table legs at all? Say we want to play ball fetch: the ball will roll there for sure, so the robot should be able to follow. So, storing the map is easy: a 2D array, plus rules on what to do based on the encoded values. Not so easy, but doable.
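
To make the scheme concrete, here is a minimal sketch of how the encoded map could look in Arduino C++. The constant names, map dimensions, and the canEnter() helper are my own illustration, not a fixed design:

    // Hypothetical encoding: one byte per 20x20 cm cell.
    const byte CELL_FREE    = 0;    // nothing detected
    const byte CELL_LEG     = 100;  // table/chair leg: approach with caution
    const byte CELL_WALL_N  = 200;  // empty cell, wall on the North edge
    const byte CELL_WALL_E  = 201;
    const byte CELL_WALL_S  = 202;
    const byte CELL_WALL_W  = 203;
    const byte CELL_EDGE    = 250;  // measurable edge of furniture (couch)
    const byte CELL_BLOCKED = 255;  // completely occupied

    const byte MAP_W = 25;          // 25 cells x 20 cm = 5 m
    const byte MAP_H = 20;          // 20 cells x 20 cm = 4 m
    byte roomMap[MAP_H][MAP_W];     // 500 bytes

    // Example rule driven by the encoded value:
    bool canEnter(byte x, byte y) {
      byte c = roomMap[y][x];
      if (c == CELL_BLOCKED || c == CELL_EDGE) return false;
      return true;  // CELL_LEG still enters, but via a slower routine
    }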

Now here is my problem. As I said at the beginning, the sensors return distances in cm, but my map cells are 20x20 cm. How do I compare the sensor readings with the map? How do I decide how to update the map, or how to correct the robot's position on it? I use IR and US sensors on a panning head, plus a compass and encoders to measure the traveled distance. The robot travels from the center of one cell to the center of the next and makes 90 degree turns, except in the "proceed with care" situations. Say we eliminate that special case for now to keep things simple. How do I use the sensor readings to verify the position on the map? This is where I need help.
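
For reference, the purely geometric part of the comparison (turning a cm reading into a map cell, given a pose assumed to be correct) could look like the sketch below. The coordinate convention (x East, y North, heading 0 = North) and all names are assumptions for illustration, and it reuses roomMap and the cell codes from the earlier sketch:

    const float CELL_CM = 20.0;

    struct Pose { byte cx, cy; int headingDeg; };  // cell indices + heading

    // Convert a range reading (cm) at a given pan angle (relative to the
    // robot's heading) into the map cell the beam endpoint falls in.
    // Bounds checks omitted for brevity.
    void rangeToCell(const Pose &p, int panDeg, float rangeCm,
                     byte &hx, byte &hy) {
      float wx = p.cx * CELL_CM + CELL_CM / 2;   // robot center, in cm
      float wy = p.cy * CELL_CM + CELL_CM / 2;
      float a  = (p.headingDeg + panDeg) * DEG_TO_RAD;
      float px = wx + rangeCm * sin(a);          // beam endpoint, in cm
      float py = wy + rangeCm * cos(a);
      hx = (byte)(px / CELL_CM);                 // back to cell indices
      hy = (byte)(py / CELL_CM);
    }

    // Crude agreement check: is anything recorded at that cell?
    bool readingMatchesMap(const Pose &p, int panDeg, float rangeCm) {
      byte hx, hy;
      rangeToCell(p, panDeg, rangeCm, hx, hy);
      return roomMap[hy][hx] != CELL_FREE;
    }

This only checks a single reading against a known pose; deciding what to do when many readings disagree with the map, i.e. correcting the pose, is the hard part described above.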

Added stuff:

I do not want a hardware fix for this problem. I could do that easily, as I am a hardware guy: beacons, RFID, or barcodes all make localization easier. But then the robot is constrained to that environment. For now I want the robot to work with a given map; the next step will be to build the map as it goes. I know it's easy to just drop in a BeagleBoard or something similar and do PC programming, but I still think it may be possible to have a microcontroller do this, even if mapping is the only thing it does. I have no problem using a few microcontrollers in my robot; actually, that's already the case. So assume I am using one Arduino just for the mapping process (if a 328 is too small memory-wise, I'll use a Mega). The actual map is stored in the EEPROM; if that is not big enough, I can use a uSD shield.
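
Persisting the 2D array in EEPROM is straightforward since the Arduino EEPROM is byte-addressable anyway. A minimal sketch, assuming the roomMap/MAP_W/MAP_H names from above and the standard EEPROM library:

    #include <EEPROM.h>

    // 500 bytes fits in a 328's 1 KB EEPROM; a Mega has 4 KB.
    void saveMap() {
      for (int y = 0; y < MAP_H; y++)
        for (int x = 0; x < MAP_W; x++)
          EEPROM.update(y * MAP_W + x, roomMap[y][x]);  // skips unchanged bytes to limit wear
    }

    void loadMap() {
      for (int y = 0; y < MAP_H; y++)
        for (int x = 0; x < MAP_W; x++)
          roomMap[y][x] = EEPROM.read(y * MAP_W + x);
    }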

Why do I think the robot should have a map of the environment? Because fetching a ball is not the end of the story; that is just one task the robot can do. Other tasks will require the robot to pick up stuff from the floor and place it in the correct bins. Think of it as a mini butler robot, microcontroller based. Many uses can be developed once we have a working system. Also, the map will be accompanied by a list of places the robot will need to go, for instance the charging station, the bins for toys and socks, or the kid's chair, to bring back his toys after he throws them on the floor.
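
That list of places can live next to the map as a small table; the names and cell coordinates below are made up for illustration:

    struct Place { const char *name; byte cx, cy; };

    const Place places[] = {
      { "charger",    0,  0 },
      { "toy bin",   12,  3 },
      { "sock bin",  12,  4 },
      { "kid chair",  6, 10 },
    };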

Again, what I need is an algorithm that lets me compare external sensor readings with the stored map. Whether the map is pre-built or built by the robot is not important at this moment.

Perhaps I'm looking at this the wrong way; if there are better approaches, please share your ideas.

Comments

IR and US data is way too sparse to recognize objects. The way to do what you're asking is to use the Iterative Closest Point (ICP) algorithm. There are a bunch of different versions of it (http://www.cs.princeton.edu/~smr/papers/fasticp/fasticp_paper.pdf).

That paper talks about 3D point clouds, but the problem is even simpler in 2D. If you are using a pre-given map, ICP can be used as well. Say you have a map built out of lines that represent the inside of your house, and you know roughly where the robot is starting off: you can use a point-to-line ICP algorithm to estimate changes in location.

If you don't have a pre-given map, you can still use ICP to help with location estimation, as well as to build a map. But with IR and US data being as sparse and noisy as it is, I don't think the results will be all that great.

ICP is a popular algorithm, so I wouldn't be surprised if someone has posted a microcontroller-based implementation of it somewhere. If not, it shouldn't be too hard; I've done it in MATLAB before and it's mainly just a few matrix manipulations.
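
For what it's worth, here is a rough sketch of one point-to-point 2D ICP iteration in plain C/C++ (not the point-to-line variant, and untested on a microcontroller; the closed-form rotation via atan2 of the cross-covariance sums is the standard 2D result):

    #include <math.h>

    struct Pt { float x, y; };

    // Brute-force nearest neighbor; fine for the small scans an MCU handles.
    int nearest(const Pt &p, const Pt ref[], int nRef) {
      int best = 0; float bestD = 1e30;
      for (int i = 0; i < nRef; i++) {
        float dx = ref[i].x - p.x, dy = ref[i].y - p.y;
        float d = dx * dx + dy * dy;
        if (d < bestD) { bestD = d; best = i; }
      }
      return best;
    }

    // One ICP iteration: estimate the rotation/translation that moves the
    // scan onto the reference points, then apply it to the scan in place.
    void icpStep(Pt scan[], int nScan, const Pt ref[], int nRef,
                 float &theta, float &tx, float &ty) {
      // 1. Match each scan point to its nearest map point; centroids.
      float px = 0, py = 0, qx = 0, qy = 0;
      for (int i = 0; i < nScan; i++) {
        int j = nearest(scan[i], ref, nRef);
        px += scan[i].x; py += scan[i].y;
        qx += ref[j].x;  qy += ref[j].y;
      }
      px /= nScan; py /= nScan; qx /= nScan; qy /= nScan;

      // 2. Closed-form 2D rotation from the cross-covariance sums.
      float sDot = 0, sCross = 0;
      for (int i = 0; i < nScan; i++) {
        int j = nearest(scan[i], ref, nRef);  // re-matched to skip a match array
        float ax = scan[i].x - px, ay = scan[i].y - py;
        float bx = ref[j].x - qx,  by = ref[j].y - qy;
        sDot   += ax * bx + ay * by;
        sCross += ax * by - ay * bx;
      }
      theta = atan2(sCross, sDot);
      float c = cos(theta), s = sin(theta);
      tx = qx - (c * px - s * py);
      ty = qy - (s * px + c * py);

      // 3. Apply, so the next iteration refines the alignment.
      for (int i = 0; i < nScan; i++) {
        float nx = c * scan[i].x - s * scan[i].y + tx;
        float ny = s * scan[i].x + c * scan[i].y + ty;
        scan[i].x = nx; scan[i].y = ny;
      }
    }

Run icpStep a handful of times (or until theta/tx/ty get tiny) and compose the per-step transforms to get the total pose correction.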

Here's another paper if you're interested: http://web.cs.dal.ca/~eem/cvWeb/pubs/matching.pdf

With all that said, if you're doing ICP and you have compass and encoder information, you can feed that information, together with the ICP results, into a Kalman filter to get more accurate estimates. I don't know too much about this though.
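
As a hint of what that fusion could look like, here is a minimal scalar Kalman filter for the heading alone (prediction from encoders, corrections from compass and ICP). The variances are made-up tuning values and angle wrap-around is ignored, so treat it strictly as a sketch:

    struct Kalman1D {
      float x;  // heading estimate (rad)
      float p;  // estimate variance
      void predict(float dTheta, float q) { x += dTheta; p += q; }
      void update(float z, float r) {
        float k = p / (p + r);   // Kalman gain
        x += k * (z - x);
        p *= (1 - k);
      }
    };

    // e.g. per movement step:
    //   kf.predict(encoderTurnRad, 0.01);         // odometry grows uncertainty
    //   kf.update(compassRad, 0.05);              // compass fix
    //   kf.update(prevHeading + icpTheta, 0.02);  // ICP rotation estimate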

Hope that helps, 

-Aaron