Let's Make Robots!

Robot position with quadrature encoder

I'm building a robot for environment mapping. It's a simple "start here"-class robot: two wheels, two DC motors, and a range sensor mounted on a servo. I plan to hook it up to my PC, first by USB and later wirelessly somehow.

I want it to drive around and send the range sensor readings to the PC, which in turn will build a map of the environment. The hard part will be knowing the robot's position relative to its earlier position(s). I know this would be easy with a GPS or an accelerometer, but I want a more low-tech/DIY solution, so here is my idea:

What if I attached a quadrature encoder (a.k.a. rotary encoder) to each motor shaft? Knowing the size of the wheels, it should be easy to calculate how far the robot has moved in any direction at a given time.
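
For a two-wheeled (differential-drive) robot, this kind of encoder-based dead reckoning boils down to a few lines of math. Here's a minimal sketch in Python; the ticks per revolution, wheel diameter, and wheel base are made-up values, so substitute your own hardware's figures:

```python
import math

# Hypothetical hardware parameters -- adjust for your robot.
TICKS_PER_REV = 48          # encoder counts per full wheel revolution
WHEEL_DIAMETER_CM = 6.5     # wheel diameter in cm
WHEEL_BASE_CM = 12.0        # distance between the two wheels in cm

def ticks_to_cm(ticks):
    """Convert encoder ticks to the distance travelled by one wheel."""
    return ticks / TICKS_PER_REV * math.pi * WHEEL_DIAMETER_CM

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Dead-reckoning pose update for a differential-drive robot.

    theta is the heading in radians; returns the new (x, y, theta)."""
    d_left = ticks_to_cm(left_ticks)
    d_right = ticks_to_cm(right_ticks)
    d_center = (d_left + d_right) / 2.0          # distance of robot center
    d_theta = (d_right - d_left) / WHEEL_BASE_CM # change in heading
    # Advance along the average heading during the move.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

When both wheels report the same tick count the robot drives straight; a tick difference turns it. The usual caveat applies: wheel slip makes the estimate drift over time, so the error grows the longer you drive.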

Has anyone tried something similar? Has anyone ever hooked up a quadrature encoder to an MCU? Do you think it'll work? Do you foresee any problems?

EDIT: I forgot to mention that I already found instructions on how to hook it up to an Arduino, so that's not the issue. What I'm looking for is practical advice, ideas, and experiences from using one...

So far I've been playing around with points in a Cartesian plane. Simple: whenever my Sharp sensor (mounted on a servo) detects an object, I calculate a point (x, y) relative to the sensor based on the measured distance and the angle of the servo, and store the point in a large array. Of course this has been easy because the sensor itself is static. Once it starts moving around, it'll be much harder.
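
The polar-to-Cartesian conversion I'm describing looks roughly like this (a sketch; the function name and pose parameters are just for illustration — the pose stays at zero while the sensor is static, and later comes from the odometry):

```python
import math

def reading_to_point(distance_cm, servo_deg,
                     robot_x=0.0, robot_y=0.0, robot_heading_deg=0.0):
    """Convert a range reading at a given servo angle into a world (x, y)
    point. The robot pose defaults to the origin for a static sensor."""
    angle = math.radians(robot_heading_deg + servo_deg)
    return (robot_x + distance_cm * math.cos(angle),
            robot_y + distance_cm * math.sin(angle))
```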

I also considered using a simple matrix (grid) as jklug80 described, but I think I'll stick with the method described above: storing and displaying all the points. My PC will be the actual brains, and the Arduino will only be sending sensor readings to and receiving commands from the PC, so I'm not limited to the scarce memory of my ATmega328. Even so, eventually I'll have to start thinking about saving the data to files. For this I think I'll use a grid, keeping only the nearby grid sections in memory at any given time. As the robot moves forward, the sections behind it will be saved to files and the sections in front of it will be loaded into memory.
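
That save/load scheme could be sketched like this. Everything here — the section size, the keep radius, the file naming, and the pickle format — is an assumption for illustration:

```python
import math
import os
import pickle

SECTION_SIZE_CM = 100.0   # hypothetical side length of one grid section
KEEP_RADIUS = 1           # keep sections within 1 section of the robot

def section_key(x, y):
    """Map a world coordinate to the (column, row) of its grid section."""
    return (math.floor(x / SECTION_SIZE_CM), math.floor(y / SECTION_SIZE_CM))

class SectionedMap:
    """Point map split into grid sections, swapped to disk when far away."""

    def __init__(self, directory="map_sections"):
        self.directory = directory
        self.loaded = {}  # (sx, sy) -> list of (x, y) points in memory
        os.makedirs(directory, exist_ok=True)

    def _path(self, key):
        return os.path.join(self.directory, "sec_%d_%d.pkl" % key)

    def add_point(self, x, y):
        self.loaded.setdefault(section_key(x, y), []).append((x, y))

    def update_robot_position(self, rx, ry):
        """Save sections far from the robot to disk; load nearby ones."""
        cx, cy = section_key(rx, ry)
        for key in list(self.loaded):
            if max(abs(key[0] - cx), abs(key[1] - cy)) > KEEP_RADIUS:
                with open(self._path(key), "wb") as f:
                    pickle.dump(self.loaded.pop(key), f)
        for sx in range(cx - KEEP_RADIUS, cx + KEEP_RADIUS + 1):
            for sy in range(cy - KEEP_RADIUS, cy + KEEP_RADIUS + 1):
                key = (sx, sy)
                if key not in self.loaded and os.path.exists(self._path(key)):
                    with open(self._path(key), "rb") as f:
                        self.loaded[key] = pickle.load(f)
```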

EDIT (thought I'd write a bit more about my thoughts):

Furthermore, I would draw lines between ALL points that are closer together than the diameter of the robot (represented as a circle), then apply a simple navigation rule: the robot cannot cross lines!
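
A sketch of that rule, assuming a made-up robot diameter and a standard cross-product test for segment intersection:

```python
import math
from itertools import combinations

ROBOT_DIAMETER_CM = 15.0   # hypothetical robot footprint

def build_walls(points):
    """Connect every pair of detected points closer together than the
    robot's diameter: it can't fit between them, so treat them as a wall."""
    return [(a, b) for a, b in combinations(points, 2)
            if math.dist(a, b) < ROBOT_DIAMETER_CM]

def _ccw(a, b, c):
    """Cross product sign: which side of line a-b the point c lies on."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, p3, p4):
    """True if segment p1-p2 properly intersects segment p3-p4."""
    d1, d2 = _ccw(p3, p4, p1), _ccw(p3, p4, p2)
    d3, d4 = _ccw(p1, p2, p3), _ccw(p1, p2, p4)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def move_allowed(start, goal, walls):
    """The navigation rule: the robot may not cross any wall line."""
    return not any(segments_cross(start, goal, a, b) for a, b in walls)
```

Note that connecting ALL close pairs is O(n^2) in the number of points, which is another argument for only keeping the nearby grid sections in play when checking moves.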

I also thought about attaching a probability/uncertainty variable to each point: how certain is it that there is something there? This would be based on, for instance, the distance from which it was detected (the sensor is more precise at close range), the time since it was detected (the object may have moved in the meantime), and the number of times it was detected (how likely it is to be a static vs. a moving object).
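
One way those three factors could be combined into a single score — the weighting scheme, maximum range, and half-life below are invented tuning constants, not anything tested:

```python
import math
import time

def point_confidence(detect_distance_cm, detect_time, hit_count, now=None,
                     max_range_cm=80.0, half_life_s=60.0):
    """Heuristic 0..1 confidence that an obstacle point is really there."""
    now = time.time() if now is None else now
    # Sharp IR sensors are more precise at close range.
    distance_factor = max(0.0, 1.0 - detect_distance_cm / max_range_cm)
    # Old detections decay: the object may have moved in the meantime.
    age_factor = 0.5 ** ((now - detect_time) / half_life_s)
    # Repeated detections suggest a static rather than moving object.
    count_factor = 1.0 - 1.0 / (1.0 + hit_count)
    return distance_factor * age_factor * count_factor
```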

That should give a better idea of my initial thoughts on the matter...


Thanks a lot to both of you for providing a lot of interesting input :)

http://letsmakerobots.com/node/2558 check out the videos, mapping in progress!!