Let's Make Robots!

Wavefront Algorithm Mapping

Hi, LMRians. I've been reading up on wavefront algorithm navigation, and I understand bits and pieces of it, but not everything. I'm trying to learn it so I can use it in my next robot: Project 4L-FRED (Alfred), a butler robot. The robot is supposed to navigate around the house from the dining area to the living room and serve drinks to guests. It will have pre-recorded messages, like greetings and asking guests what drinks they would prefer. The guest then presses a button to state his/her choice, and Alfred travels back to the kitchen, tells whoever's in the kitchen what drinks were requested, and returns carrying the tray of drinks.

Anyway, the procedure for how Alfred will do all that is a bit fuzzy for now, since I need to get navigation nailed down first. What I understand of wavefront navigation is that the robot carries a pre-programmed map marking the impassable areas, like walls, sofas, etc. The robot uses encoders to track its position on the map based on the movement of its motors, and a compass sensor helps it align itself so that it does not gradually rotate off course.

If I were to implement adaptive mapping, I would need one or two IR rangefinders on the bot so it can track changes to the pre-programmed map, like toys or cereal boxes lying around on the floor. Basically, the robot adapts to the changing environment so that it can still reach the target location. But for Project 4L-FRED, I don't think I'll implement adaptive mapping just yet; maybe at a later time. For now, I just want it to scan for obstacles and gradually slow to a halt as it approaches one. Alfred will then say something like, "Excuse me, you are blocking my path. Please allow me to pass," or something along those lines. If the object still does not move after three warnings, Alfred will sound an alarm and call for assistance to move it out of the way.
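To make the encoder-plus-compass idea concrete, here is a rough sketch (in plain C++, so it can be tested off the robot) of how wheel-encoder ticks and a compass heading could be turned into a cell position on the map. All the names and constants here are my own placeholders, not anything from the tutorial — `TICKS_PER_MM` and `CELL_MM` would have to match Alfred's actual wheels and map scale:

```cpp
#include <cmath>

// Assumed hardware constants -- replace with values measured on the real robot.
const float TICKS_PER_MM = 2.0f;   // encoder ticks per millimetre of travel
const float CELL_MM      = 300.0f; // size of one wavefront map cell, in mm

struct Pose {
    float x_mm, y_mm;      // position on the map, in millimetres
    float heading_rad;     // heading, in radians
};

// Advance the pose given the tick counts from the two wheel encoders since
// the last update, plus the compass heading. Using the compass directly for
// heading is what keeps the robot from slowly rotating off course.
void updatePose(Pose &p, long leftTicks, long rightTicks, float compass_rad) {
    float dist_mm = ((leftTicks + rightTicks) / 2.0f) / TICKS_PER_MM;
    p.heading_rad = compass_rad;            // trust the compass, not dead reckoning
    p.x_mm += dist_mm * cos(p.heading_rad);
    p.y_mm += dist_mm * sin(p.heading_rad);
}

// Convert the millimetre pose into the grid cell the wavefront map uses.
int cellX(const Pose &p) { return (int)(p.x_mm / CELL_MM); }
int cellY(const Pose &p) { return (int)(p.y_mm / CELL_MM); }
```

Calling `updatePose` every time the encoders are sampled, then reading `cellX`/`cellY`, tells the robot which cell of the wavefront grid it currently occupies.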

Finally, we can get to the questions.

1. Is there anything missing from my understanding of the Wavefront Algorithm Navigation?

2. Could there be better ways of doing this, preferably without beacons or black tape around the house? 

3. Can anyone help me figure out how to write the Arduino code for wavefront algorithm mapping? I have no clue how to start. I can probably learn how to use the compass readings to adjust the robot's angle, but other than that, I'm stumped.

Thanks for any and all help given by you guys. :D

Reference: Wavefront Algorithm tutorial (what I've read) — http://www.societyofrobots.com/programming_wavefront.shtml


At first, I didn't understand how the "wave" spreads out. So I pulled out a piece of paper, drew a 7x7 matrix, marked a few of the cells as impassable (walls), then marked one corner as the robot's start position and another as the target location. Initially I thought the wave spreads around the entire cell, meaning the diagonal cells as well. That didn't look right, so I referred to the diagrams and found out what I was doing wrong: the wave only spreads to the cells on the left, right, top, and bottom, not to the diagonally adjacent cells. I did a few of these matrices to test it out, having a bit of fun. It was like one of those brain puzzles, lol. :)

Then a thought occurred to me: what if the ATmega328P-PU can't handle the size of the actual matrix? I read on the Society of Robots forums that the ATmega328 may not be able to process very large matrices with its 2 KB of RAM. Any ideas on this, or should I switch to another chip? I have an assortment of Atmel chips at home, like the 644P and many others, all of them through-hole chips. Thanks for the link and the help! :D
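The paper exercise above translates fairly directly into code. Below is a minimal sketch of the propagation step in plain C++ (the cell encoding and names are my own assumptions, not the tutorial's): walls and the goal are pre-marked, and repeated sweeps relax each free cell to its lowest orthogonal neighbour plus one, so the wave grows through the four side neighbours only, never the diagonals. On the RAM question: at one byte per cell, a 7x7 grid is 49 bytes, and even a 20x20 house map is only 400 bytes, which should fit in the ATmega328's 2 KB with room left over for the stack.

```cpp
#include <cstdint>

// Assumed cell encoding (mine, not the tutorial's):
//   0 = free / not yet reached, 1 = wall, 2 = goal.
// After propagation, each free cell holds (distance to goal + 2).
const int W = 7, H = 7;

void propagate(uint8_t grid[H][W]) {
    bool changed = true;
    while (changed) {                    // sweep until the values settle
        changed = false;
        for (int y = 0; y < H; ++y) {
            for (int x = 0; x < W; ++x) {
                if (grid[y][x] == 1 || grid[y][x] == 2) continue; // wall/goal fixed
                // Only the four orthogonal neighbours -- no diagonals.
                const int dx[4] = {1, -1, 0, 0};
                const int dy[4] = {0, 0, 1, -1};
                uint8_t best = 0;        // 0 means "no reached neighbour yet"
                for (int i = 0; i < 4; ++i) {
                    int nx = x + dx[i], ny = y + dy[i];
                    if (nx < 0 || nx >= W || ny < 0 || ny >= H) continue;
                    uint8_t v = grid[ny][nx];
                    if (v >= 2 && (best == 0 || v < best)) best = v;
                }
                if (best != 0 && (grid[y][x] == 0 || best + 1 < grid[y][x])) {
                    grid[y][x] = best + 1;   // take the shortest wave value + 1
                    changed = true;
                }
            }
        }
    }
}
```

Once the grid is filled, navigation is just: from the robot's current cell, step to the lowest-valued neighbour, repeat until you reach the goal value of 2. Re-running full sweeps to a fixpoint is slower than a proper queue-based breadth-first search, but it needs no extra RAM for a queue, which seemed like the right trade-off for a 2 KB chip.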