Let's Make Robots!


Eventually autonomous with dead reckoning, IMU, avoidance, self charging, etc
Attachment: Odometry.txt (1.77 KB)


Bringing it all together… I’ve been wanting a nice little platform to bring things together, something I will want to keep coding instead of building, coding, and then putting on the shelf. I have played with encoders on my ProtoBox, LCD output on an unpublished “Wanderer” RC truck, dual and triple sonar tracking on the bench, IMU and compass values while balancing the ProtoBox, IR input on the last few projects, and a few things in between.

With this project I want to put them all together and work on some real code for navigation, avoidance, X/Y targeting, location and environment awareness and some “mood matrix” code. Before it’s over I’ll likely add something goofy to give it some personality but for now it’s an “all business” platform.

This time I’ll start the post now and update as progress continues… or does NOT continue.

Update 02/03/2014 - I2C Between Brains - Dead Reckoning Square(ish) Dance - See Below

Update 01/28/2014 - Encoders Working - See Below

Lexanne Guts
Goals and Objectives

A small platform for developing full navigation, environment sensing, avoidance, telemetry, mood expression, and more.

Current Goal List

High resolution encoders (Pololu) for dead reckoning and speed control (4 pins)

Serial motor control for pin use reduction and better controls (2 pins)

Dual sonar for targeting / tracking an object if possible (2 pins)

IMU/Compass for heading management and tilt detection, etc (I2C)

Full front bumper for touch sensors (2 pins)

PIR for motion / human sensing (1 pin)

LCD display for status viewing (TX Used)

Temperature, humidity and pressure information for mood adjustments (1 pin)

Real time clock for date/time awareness (I2C)

Sound input for general sound level (1 pin)

Sound output for noise, info, mood, etc (1 pin)

RGB Mood LED (3 pins)

LDR Light Sensing - Left and Right (2 pins)

Battery voltage monitoring using the internal 1.1V reference / external divider (1 pin)

Omni wheel caster for testing

Less hot glue :-) - Love it but trying to tone it down on this one
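A note on the battery-monitoring item above: the idea behind the internal 1.1V reference is to read the pack through a resistor divider while the ADC uses the stable bandgap reference instead of the (sagging) supply rail. The math is simple; here is a host-testable sketch of it, with divider values that are illustrative placeholders rather than anything measured from this build:

```cpp
// Convert a 10-bit ADC reading (taken against the internal 1.1V
// bandgap reference) back into battery volts through a resistor
// divider.  R_TOP / R_BOTTOM are assumed example values; pick them so
// the divided voltage stays under 1.1V at full charge.
const float VREF = 1.1f;          // internal bandgap reference, volts
const float R_TOP = 100000.0f;    // divider top resistor, ohms (assumed)
const float R_BOTTOM = 22000.0f;  // divider bottom resistor, ohms (assumed)

float adcToBatteryVolts(int adcReading)
{
    float vPin = (adcReading / 1023.0f) * VREF;       // volts at the ADC pin
    return vPin * (R_TOP + R_BOTTOM) / R_BOTTOM;      // undo the divider
}
```

With these example resistors a full-scale reading maps to about 6.1V, which comfortably covers a single 18650 cell.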


Future / Optional

Bluetooth or wireless telemetry feed (use existing TX/RX for connection)

Rear bumper for backing-into issues (1 pin)

IR input for mode management (1 pin)

Manual or self docking for recharging with front charger plates

Mood matrix and expression through color output based on environment, activity, etc

Simple feed back to Home Automation system via X10 buttons for status (2-4 pins)

WiiCam for charging station tracking, follow the light, etc. I2C


Arduino Pro Mini Clone Pin Use

So as you can see from above, at this point every pin on a single little Pro Mini clone is already spoken for, and then some. If I want to do more I’m going to have to add another Pro Mini or move to a different main CPU. I may have to move to an additional CPU anyhow depending on performance and load.

With that in mind I decided to try a dual-CPU setup, having one Pro Mini take care of the core stuff: reading the encoders, reading the IMU, calculating the X, Y, and heading information, and handling little things like the RGB LED outputs. I call this CPU the Brain Stem CPU since it takes care of the core calculations and the like. Not exactly the same as the biological version, but kind of similar.

The main CPU would then handle higher functions such as motor control, reading and acting on sensors, handling the mood matrix information, talking to the LCD, etc. Basically it would be the “higher brain” or Main Brain of the setup.

Pro Mini Brain Stem

* = PWM Output option

Tx TX to the Main Brain
Rx RX from the Main Brain


4 Left Encoder 1 input - These are quadrature encoders

5* Left Encoder 2

6* Right Encoder 1

7 Right Encoder 2


9* RGB LED Output - Red - Hope to mix these via PWM for color control

10* RGB LED Output - Green

11* RGB LED Output - Blue


13 OnBoard LED output





A3 Battery Voltage/Charging Monitor

A4 I2C Network - IMU

A5 I2C Network

A6 DHT-11 Temp/Humid/Pressure


Pro Mini Main Brain

Tx Telemetry / LCD Output

Rx Serial Command options input if wireless is used

2 Serial motor TX

3* Serial motor RX - apparently required for Pololu library?

4 Serial Brain Stem TX - Send commands or status requests

5* Serial Brain Stem RX - Receive data




9* Rear Sonar Input (Single Pin)

10* Left Sonar Input (Single Pin)

11* Right Sonar Input (Single Pin)

12 Speaker Output

13 OnBoard LED output / PIR Input


A0 LDR Left

A1 LDR Right

A2 Microphone Input


A4 I2C Network - RTC / WiiCam / Other

A5 I2C Network

A6 Left bump input

A7 Right bump input


The Chassis

Lexanne Top Down

I could have found a chassis to print on my printer, or better yet designed one to be printed, but honestly I still like cutting things out by hand. (That’s my excuse for my poor 3D design skills.) With that in mind, this build uses some old Lexan I had around for the chassis and some core pieces like the bumper.

The key requirement was that the chassis be able to support the Pololu micro metal gear motors, large Pololu wheels, and the encoders… basically a flat surface wide enough to mount them. That was simple. :-) I figured I’d use either acrylic or Lexan as I had scrap pieces of both big enough, but the Lexan won out as it was lighter, strong enough, and easier to cut up. Plus it’s really hard to chip or break, which may be helpful during the drilling, cutting, etc. that will occur. Also, I’ve just always liked the clear-chassis look of acrylic or Lexan bodies. I may cover some of it up with a 3D printed cover, but it will look good “naked” at least.

So what better name than Lexanne?

I also already knew I wanted a chassis that would allow a full-face, full-height front bumper, and I wanted to try an omni wheel as the caster like OddBot did at the start of the Service Droid build. The goal, as he mentioned, is to hopefully reduce the sway that occurs when turning with a standard caster wheel. Originally I had the caster at the far back of a basic chassis design, but I remembered the discussion on David Anderson’s SR04 bot about keeping your caster within the turning circle of your wheels to avoid even more turning distortion.

With that in mind, and with the wheelbase width limited by the width of the scrap Lexan, I cut out a rough shape with a long tail on it. Too long, really, so I trimmed it down to end just behind the omni wheel caster. With a roughly 130mm wheelbase, the rear caster is centered 65mm behind the wheel centers, putting it inside the turning circle.

I guess what I really ended up with is something very similar in size and design to the BoeBot, but oh well; it looks like I wanted it to and I think it will work out.

Layout and Fitting

Lexanne Mocked UP

As most everyone knows, getting it all to fit, work, make sense, and look good is a constant battle on a small platform. This is no exception: not a lot of space and a lot of things I want to include. There is always the option to double-deck it, but I’d rather not if I don’t have to; it usually makes it a pain to get to the lower deck. To gain some space I will be “double siding” the chassis, putting some things like the battery, RTC, etc. on the bottom of the board where the motors are already mounted. That should help some.


I know I want the LCD in the back so I can see it during operation, and the full-face bumper up front. I also have the CPU board and the motor controller board (which has a lot of wasted space on it) to include, along with the front bumper switches, battery, IMU, DHT-11, RTC, PIR, and on and on. The added Pro Mini for the encoder and brain-stem functions has to go somewhere as well.

Currently I plan on putting the RTC, DHT-11 and the brain stem CPU on the bottom of the chassis with the main CPU, motor driver, PIR, sound and LCD mounted up top. The sonars will be flush mounted in the front bumper and the IMU mounted on a small post or something to get it up above the CPU and motor controller board for a hopefully cleaner signal.

I had planned on mounting the PIR on a small servo to raise it while stopped and lower it out of the way when rolling, like David Anderson’s SR04. With the secondary CPU I can now do that; it’s just a matter of figuring out the mounting.

Phase 1: Power, Motors, and Basic Control

Obviously the first order of business is to get the power hooked up and switched, and to get both CPUs and the motor driver board mounted and wired. For power I have a couple of UltraFire 18650 4000mAh li-ion batteries that can be removed and charged/swapped as needed. The CPU board has a Pololu 5V step-up/down regulator that will hopefully be enough for all the peripherals hanging around.

I am using a pair of Pololu micro metal gear motors that are really pretty fast, so speed control and encoders are likely going to be required. The motor controller is a Pololu qik 2s9v1 that I’ve used on the ProtoBox balancer; although expensive, it’s a nice little controller for small motors.

The encoders are Pololu’s quadrature encoders for their micro metal gear motors, used with their library. I’ve never used them, or quadrature encoders at all, so it will be interesting to see how well this works. I do have code for calculating the current X, Y, and heading that will hopefully be used for tracking location. The goal is to use the dead reckoning information and then meld it with the IMU angles, compass, etc. for a more solid positioning solution. This is all supposed to occur on the brain stem CPU, with the X, Y, and heading data sent to the main CPU for processing and decision making.

Phase 2: Sonar and Bumpers

Once I’m happy with the motivation, dead reckoning, and positioning (assuming that ever actually happens), I’ll move to the avoidance parts: mounting the front bumper, switches, and sonar sensors and getting that level of subsumption working. At this point Lexanne should be able to take a target X, Y position, make her way there using the navigation information she has, and use the sonar and bump sensors to avoid and go around objects in the way.
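At its core, the subsumption layering described here is just priority arbitration: each behavior proposes a motor command (or abstains), and the highest-priority proposal wins. A minimal host-testable sketch of that arbitration, with behavior names of my choosing rather than anything from the actual code:

```cpp
// One proposed motor command per behavior.  "active" means the
// behavior wants control this cycle; inactive behaviors abstain.
struct MotorCmd { int left; int right; bool active; };

// Priority order: bumper escape > sonar avoid > navigate-to-target.
MotorCmd arbitrate(MotorCmd bumper, MotorCmd sonar, MotorCmd navigate)
{
    if (bumper.active) return bumper;   // highest priority: we hit something
    if (sonar.active)  return sonar;    // next: something is close ahead
    return navigate;                    // default: head for the target X,Y
}
```

The "looped out" problem described below is exactly what this structure is meant to prevent: navigation only ever runs when nothing above it is claiming the motors.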

Honestly, if I can get to this level I will be quite happy. I had the ProtoBox doing dead reckoning and object avoidance but never got the code clean enough to do both at once. Avoidance would subsume navigation, but then navigation would take back over and get really looped out trying to both avoid and navigate.

I will have the LCD hooked up before this, as I plan on watching it during operation to understand what is really going on in Lexanne’s brain. Telemetry may be better for this, but I will wait and see if that is really required at that time.

Phase 3: RTC, Temperature, PIR, LDRs, RGB LEDs, Audio

If I get to this point I’ll add the real time clock, DHT-11, LDRs, and PIR for sensing the environment. That allows another layer of code for doing things depending on the time, temperature, presence of others, and/or light levels. This is where the RGB LED(s) could also come into play, showing basic status by color depending on the inputs read. I have a small microphone board I plan to use as well to get a general sound level of the environment. It won’t be directional, just general noise levels, but it may have some interesting uses if it works properly.

Phase 4: Mood Matrix, Battery Monitoring, Self Charging

The last phase for this platform would be to implement the battery monitor and work out a self-charging solution. My first idea is to use the WiiCam I have, mount it low in the bumper, add two contacts on the bumper for a charger base, and simply drive toward the light via the WiiCam until the bumpers are engaged and a charging voltage level is measured somehow.

If I get to this point the robot could really be considered “autonomous” and not just run for a while and die. High goals for me; if I get bored or burned out, the project may go back on the shelf for a while.

Updates to Follow

Lexanne’s Butt

I’ll post updates as they occur and are worth mentioning, if anyone is interested. Not sure how fast this one will progress. Wish me luck: good luck for a win, or bad luck to laugh at my failures.




Update 01/28/2014 - Encoders

Goal 1: Getting the Encoders working


The first goal for Lexanne is actually NOT to move. My first goal is to get the wheel encoders working so she knows if she is trying to move or not. I say “trying” to move because even if a wheel is moving, the platform may not be. Checking wheel motion against actual platform movement using the IMU is a yet-higher goal, WAY down the list of things to do.


Pololu Encoders


The encoders are not nearly as straightforward to implement as I thought. For one, the library was rather hard to find, although after searching I realized I already HAD it installed; there are just too many libraries to display in the limited Arduino IDE. Most Pololu encoders seem to use their little reflective sensors, and the micro motor version is no exception. These are quadrature encoders, which is much better than my ProtoBox setup where I used an IR sensor and then used code to decide whether to count wheel ticks up or down depending on which way the current direction should be going. With that setup, any roll past the stop command loses ticks, and moving backwards and forwards quickly risks losing counts. With quadrature this is all handled by the library, and all I do is take the tick count and apply it to the coordinate update math.
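What makes quadrature "much better" is that the two channels encode direction: the order in which A and B change tells you which way the wheel turned, so reversals and roll-past never corrupt the count. The Pololu library does something roughly equivalent to the classic lookup-table decoder in its interrupt handler; here is a host-testable sketch of that decoding (sign convention depends on wiring):

```cpp
// Classic quadrature decoding via a 16-entry lookup table: the
// previous and current (A,B) pin states form a 4-bit index, and the
// table gives the tick delta for that transition.  0 means either no
// change or an invalid double-step (both pins changed at once).
int quadStep(unsigned char prevState, int a, int b)
{
    static const signed char table[16] = {
         0, -1, +1,  0,
        +1,  0,  0, -1,
        -1,  0,  0, +1,
         0, +1, -1,  0 };
    unsigned char state = (unsigned char)((a << 1) | b);
    return table[(prevState << 2) | state];
}
```

In an actual sketch this would run in a pin-change interrupt, accumulating `ticks += quadStep(prev, A, B);` and saving the new state each time.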


Now that I have counts I can implement my old code for calculating X, Y, and heading values. It’s not really MY code, as I’ve found bits and pieces around the web to use. Most of the concept is very well written up by David P. Anderson; you just have to change it a bit to use it on an Arduino. http://www.geology.smu.edu/~dpa-www/robo/Encoder/imu_odo/ is a very good read for understanding dead reckoning and more.


This is my basic odometry code; I’m sure there’s room for improvement as I’m no code pro.


The odometry code is attached in the files area.
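For anyone who doesn't want to open the attachment, the core of that math is the standard differential-drive update: convert each wheel's tick delta to distance, average them for forward travel, difference them over the wheelbase for rotation, then project onto X and Y. A host-testable sketch of that update using the calibration numbers mentioned in this post (70.87 ticks/cm, 13.1cm wheelbase); variable names are mine, not from the attached file:

```cpp
#include <cmath>

const double TICKS_PER_CM = 70.87;   // encoder ticks per cm of wheel travel
const double WHEELBASE_CM = 13.1;    // distance between the wheel centers

double curX = 0, curY = 0, curTheta = 0;   // pose: cm, cm, radians

// Update the pose from each wheel's tick delta since the last call.
void odometryUpdate(long dLeftTicks, long dRightTicks)
{
    double dLeft   = dLeftTicks  / TICKS_PER_CM;   // cm moved, left wheel
    double dRight  = dRightTicks / TICKS_PER_CM;   // cm moved, right wheel
    double dCenter = (dLeft + dRight) / 2.0;       // forward travel of center
    curTheta += (dRight - dLeft) / WHEELBASE_CM;   // heading change, radians
    curX += dCenter * cos(curTheta);               // project onto the plane
    curY += dCenter * sin(curTheta);
}
```

The decimal-placement bug described below shows why the constants matter: with TICKS_PER_CM off by 10x, every distance (and therefore every heading change) is scaled wrong.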


Getting the encoders working wasn’t too bad; getting them working well was another story. Without motor control yet, it was a little difficult to see if I was getting good counts. With the motors hooked up I could tell it to move xx ticks forward and see what, if anything, was missed. At first I had a decimal placement error, with my ticks per cm at 7.087 instead of 70.87 (the wheelbase was at the proper 13.1cm), and was totally confused as to why the heading values were bad. Once that was fixed it all worked out great.

Goal 2: Making it Move on its Own

So I thought I’d hook up the motors temporarily to the left brain, the core-level-function CPU, and drive while reading encoders. Nope… library conflict right off the bat. It appears the Pololu QIK library does not work with Pololu’s own wheel encoder library. From what I can gather (which isn’t much), the QIK driver uses the SoftwareSerial library to talk to/from the controller. Unfortunately the Pololu wheel encoder library apparently uses the same timer or interrupt and creates a “vector_3” conflict.

My first thought was “Oh well, I will be controlling the motors on the right brain side anyhow,” but then I remembered I was still planning on serial between the brains. Now I have to move to I2C for the brain-to-brain communication. I guess that will be better anyhow.

So the next goal is to build another brain board and get the two Pro Minis talking to each other over I2C. Then the main brain can run the QIK driver as planned and the left brain can read encoders, calculate targeting, etc. Otherwise I’d need to do motor control differently, such as over I2C, which still could happen.

Hopefully motor-vation soon...


Left Brain Right Brain

02/03/2014 - Left Brain - Right Brain - Pain

Sooo, since I decided I wanted to split out the duties of reading encoders, calculating position, moving around, mood matrix stuff, etc., I decided to try the dual-brain idea. As Basile pointed out, it’s a copycat of one of Dan M’s builds, shown here: http://letsmakerobots.com/node/25953 - having a Left Brain and a Right Brain. Dan M has a fun-to-read explanation of his solution using true left/right brain analogies.


For Lexanne I’ve moved from serial communication to I2C, since I can’t use the SoftwareSerial and QIK motor controller libraries at the same time on the “Left Brain” without digging into and modifying the libraries’ timer use. Additionally, it just makes sense to use a standard bus I can hang multiple devices off of. Eventually I’ll have a real time clock (RTC), IMU, OLED display, and maybe more I2C devices, so it makes sense to have the brains talk to each other that way as well. With I2C, any CPU on the bus can use any of the I2C devices, which may come in handy later on.


However, I wanted to keep the information exchange really simple, and I needed it to be bidirectional. The Right Brain needs current X, Y, Heading, TargetBearing, TargetDistance, etc. from the Left Brain’s encoders, but I also need to push the desired target X, Y, etc. back to the Left Brain for calculations. Having read about every I2C tutorial I could find, I was still extremely confused about how to effectively send AND receive data from slaves.


I kept reading that you have to have a Master and a Slave, and that the Master can send data to the Slave but the Slave can’t really initiate a transfer to the Master. At least that’s what I kept seeing everywhere. I had already reviewed Ro-Bot-X’s MiniEric code and it seemed so simple, but I just couldn’t grasp what was going on. Finally I found this site (http://digitalcave.ca/resources/avr/arduino-i2c.jsp) that explains the “Multi Master” option: any slave can initiate a communication to another device, effectively turning into a Master. I now believe that is what the MiniEric code is doing, but I just couldn’t understand it at the time.


So with that in mind, I finally just created a simple receiveEvent() subroutine on each CPU that processes any inbound data from the other CPU. I use the same event name on both, and the first byte of the array being sent is the “command,” i.e. what variables to populate or what to do.


The basic subroutine is below. I’m sure it’s not optimized and someone could turn it into three lines of code, but it’s working for me. Basically nothing is polled; the sender just sends the data on a regular basis, keeping the other device updated. So far it works great to get the current X, Y, heading, target bearing, and distance information from the Left Brain to the Right, as well as to send target X and Y values to the Left Brain from the Right.


//** I2C Communications subroutines
//** Right Brain

void receiveEvent(int howMany)
{
  byte x = 0;
  while (Wire.available())
  {
    rcv_packet[x] = Wire.read();
    x = x + 1;
  }

  //** Packet format
  //  0  = what data do we need to update
  // 1/2 = int value 1
  // 3/4 = int value 2
  // 5/6 = int value 3
  // etc

  switch (rcv_packet[0])
  {
    case 0:
      break;

    case 1: // Update from Left Brain encoders
      CurX = double((rcv_packet[1] << 8) | rcv_packet[2]);
      CurY = double((rcv_packet[3] << 8) | rcv_packet[4]);
      CurThetaDegrees = double((rcv_packet[5] << 8) | rcv_packet[6]);
      TargetDistance = double((rcv_packet[7] << 8) | rcv_packet[8]);
      HeadingError = double((rcv_packet[9] << 8) | rcv_packet[10]);
      TargetBearing = double((rcv_packet[11] << 8) | rcv_packet[12]);
      break;
  }
}




To send data to another CPU I am just using standalone routines for now. That makes it easier for my tiny brain to understand what I’m doing. Below is the sample code for the Right Brain to send new X, Y target values to the Left Brain. It just opens the channel, sends the “command” byte, and then sends the data, splitting the values out into bytes.


void sendNewTarget(int newX, int newY)
{
   Wire.beginTransmission(LEFT_BRAIN_ADDR); // Left Brain's I2C address
                                            // (constant name is a placeholder)
   Wire.write(2);                           // command byte: new target X/Y
   Wire.write((int(newX) >> 8));            // X high byte
   Wire.write(newX & 0xFF);                 // X low byte
   Wire.write((int(newY) >> 8));            // Y high byte
   Wire.write(newY & 0xFF);                 // Y low byte
   Wire.endTransmission();
}

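The only fiddly part of this scheme is splitting ints into bytes on one side and reassembling them on the other; sign survives as long as the receiver merges back into a 16-bit signed type before any widening. A host-testable round trip of that packing, using int16_t to stand in for the Pro Mini's 16-bit int (function names are mine):

```cpp
#include <stdint.h>

// Split a 16-bit value into high/low bytes, as the Wire.write()
// calls above do...
void packInt(int16_t value, uint8_t &hi, uint8_t &lo)
{
    hi = (uint8_t)(value >> 8);
    lo = (uint8_t)(value & 0xFF);
}

// ...and merge them back on the receiving side.  Casting the merged
// word to int16_t keeps negative targets negative.
int16_t unpackInt(uint8_t hi, uint8_t lo)
{
    return (int16_t)((hi << 8) | lo);
}
```

On the AVR, where `int` is already 16 bits, the `(hi << 8) | lo` expressions in receiveEvent() get this right automatically; on a 32-bit CPU the explicit int16_t cast would be required.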

Square Dancing… well, not really square but close


See the Video...

Having FINALLY gotten TargetBearing and TargetDistance information from the encoders over to the motor-controlling brain, I can actually try some dead reckoning driving. The first test is basically a “Square Dance”: starting at 0,0, moving to 0,100, then 100,100, then 100,0, and back to 0,0. The encoders give me cm resolution, which is fine for this purpose. The values are actually floats in the Left Brain to keep things accurate, but I convert them to ints when sending them over.
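For each leg of the square, turning the current pose and the target X, Y into a bearing, distance, and heading error is just atan2 and Pythagoras. A host-testable sketch of those calculations (function names are mine, not from the attached code):

```cpp
#include <cmath>

const double RAD_TO_DEGREES = 180.0 / 3.14159265358979;

// Bearing (degrees, -180..180) from the current position to a target.
double targetBearingDeg(double curX, double curY, double tgtX, double tgtY)
{
    return atan2(tgtY - curY, tgtX - curX) * RAD_TO_DEGREES;
}

// Straight-line distance (cm) to the target.
double targetDistanceCm(double curX, double curY, double tgtX, double tgtY)
{
    return hypot(tgtX - curX, tgtY - curY);
}

// Heading error wrapped into -180..180 so the robot turns the short way.
double headingErrorDeg(double curHeadingDeg, double bearingDeg)
{
    double err = bearingDeg - curHeadingDeg;
    while (err > 180.0)  err -= 360.0;
    while (err < -180.0) err += 360.0;
    return err;
}
```

The wrap in headingErrorDeg matters for the square dance: without it, a robot facing 170 degrees with a target bearing of -170 degrees would spin 340 degrees the long way around instead of 20 the short way.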


Sounds so simple, and it does work… kind of. I have some tuning to do for wheel sizes, wheelbase, etc., but she does try to drive in a square. I expected some errors but am actually surprised by the amount. My ProtoBox DR setup does a bit better using reflective wheel encoders, so I need to get some telemetry and see what is going on inside Lexanne’s head.


What’s really odd is that on flat smooth floors the drift is in one direction, but on carpet it’s in the other, so figuring out a compromise may be tricky. Note that I am not using the IMU or a compass for heading management yet; that is the next step to keep bearing errors to a minimum.
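When the compass does come into play, one common approach is a complementary filter: trust the dead-reckoned heading over short intervals (smooth, but drifts) and pull it slowly toward the compass (noisy, but drift-free). This is just a sketch of one possible blend, not what Lexanne will necessarily use:

```cpp
// Blend the dead-reckoned heading with a compass heading.  ALPHA near
// 1.0 means mostly trust odometry; the small compass term corrects
// long-term drift.  0.98 is a typical starting point, not a tuned value.
double fuseHeading(double odoHeadingDeg, double compassHeadingDeg,
                   double alpha = 0.98)
{
    // Wrap the correction so 359 -> 1 degrees pulls +2, not -358.
    double err = compassHeadingDeg - odoHeadingDeg;
    while (err > 180.0)  err -= 360.0;
    while (err < -180.0) err += 360.0;
    return odoHeadingDeg + (1.0 - alpha) * err;
}
```

Run every update cycle, the heading converges on the compass over a few dozen iterations while staying immune to single-reading compass noise.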


So until next time, off to the “Kind Of” Square Dancing routine…


Very ambitious project.  I like the name!

I too have similar goals.  I would like a bot to be smart enough so I can show it where I want to put something, then have it turn and go find something in the room, pick it up and put it where I want it within a few inches.  It sounds so simple, but it is a pretty complex project.

I too have played with dead reckoning and come to the conclusion (perhaps wrong) that dead reckoning alone is not going to cut it.  It will keep you in the general ballpark, but you need something else to periodically readjust your position and heading based on landmarks.  I also have been continually frustrated by the memory limits (or lack of knowledge on my part) of the Arduino Uno.  I am thinking minimum Mega, RasPi or some other bigger processor for my next stab at this.

I have been reading up on machine vision, exploring different libraries etc mostly around OpenCV.  Wow, there is a lot to learn.  

It seems a shame that there is a discrete set of issues that have to be dealt with, yet the solutions are developed in silos in a very piecemeal way. Many times we are separated by different processors, different memory footprints and the limitations inherent in them, different languages, etc., but this is a global problem almost every builder has to deal with.  It seems collaborating on some of the pieces would be in everyone's interest.  Two or more minds are always better than one...  





You are absolutely correct, dead reckoning is not dead on by any means. And unless you correct your heading from a corrected compass you get farther and farther from being anywhere close to the calculated location. It's a start for me at least and she will run on a flat smooth floor to start out with. Good learning experience as well.

Not sure that my Left/Right brain solution using two ProMinis is going to be much better but it's what I have and worth playing with. Similar to what I wanted to do for decades now of having a micro controller take care of the details and a PC be the top level. Just never really gave it a go.

Pick and place is a huge challenge in my opinion. You're right, using OpenCV or some other high-level vision system is the long-term solution, but I'm nowhere near there at this time and have only used some prepackaged face tracking in RobotSee on BoxHead for vision.

I'm not sure I'll have much to add for a higher-level project, but I'm documenting and sharing this one along the way. Mostly to keep me motivated to work on it, as every time I start a "serious" platform I get tired of it and build something silly instead. :-)

Again, thanks for the comment!


Looks like this will be fun. I'm interested in the serial motor control..... keep us posted! :)

Here's a link to the motor controller, Roxanna. Pricey for its size but super easy to use. http://www.pololu.com/product/1110

They have a whole lot of selection here: http://www.pololu.com/category/10/brushed-dc-motor-controllers

I normally just use a $2.00 H-bridge chip but thought I'd use this one since I bought it on the cheap when it was on sale.

"Fun"... hmm we shall see how "Fun" it ends up being. :-)




You, my friend, have some big ambitions but I like it!  I have similar plans for my robot. I am looking forward to the updates.


Good job on the write-up, great attention to detail.

Good luck and happy coding!

Well, I see that this is a big project and we are still going to be seeing updates next year :-)

I also had something like that in mind with my Stray, but it's been pending for a long time... I hope yours makes more progress!!!

He's just waiting for some robot love :-)