Let's Make Robots!

Bottoo! Platform to test ranging sensors and algorithms

Evaluates combinations of ranging sensors - Efficiently identifies and tags landmarks (yeah.. right...)

Update: 03/14/2014


I've finally mounted the awesome Open Source Laser Range Finder (OSLRF01) onto Bottwo with a panning servo to provide mapping functionality.


I was previously attempting to do this with the Sonar, but was stymied by limited range and cone size.


The narrow beam and greater sensing distance of the LIDAR will give me the ability to accurately map out a room in near real time, and then use the other sensors for closer proximity measurements.


How much is too much? 


Yeah, I heard that! Yes, I've got four Sharp IR sensors: two front and back for collision detection, two left and right for following walls at a specific distance.

I've now got two more short-range IR sensors, front and back, facing the floor... so we don't fall down the stairs again.



The purpose of keeping the front and rear panning MaxSonar is simply to fill the near-field void that the LIDAR does not cover.  Because of the distance between its optics, the OSLRF cannot see closer than half a meter.  Also, like IR sensors, a laser is not fantastic at identifying thin objects like chair legs.  So I use the sonar to sweep the near field for collision avoidance as well.
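The hand-off between the two sensors can be sketched roughly like this (a minimal sketch: the threshold is the half-meter blind zone mentioned above, and the function and names are illustrative, not Bottwo's actual code):

```python
# Illustrative sensor hand-off: trust the LIDAR's narrow beam beyond its
# near-field blind zone, fall back to the wide-cone sonar inside it.
LIDAR_MIN_M = 0.5   # OSLRF01 can't see closer than ~0.5 m

def fuse_range(lidar_m, sonar_m):
    """Return (source, distance) for the reading we should act on.
    Either input may be None if that sensor has no echo this cycle."""
    if lidar_m is not None and lidar_m >= LIDAR_MIN_M:
        return ('lidar', lidar_m)
    if sonar_m is not None:
        return ('sonar', sonar_m)
    return ('none', None)
```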


I hope to have video up soon of this in action.



Update: 03/01/2014

I accidentally left Bottwo "online" last night...  Usually, he is offline for charging or during my development, but there are small windows of time where others have logged in to drive him around the house remotely.  Last night was apparently one of those nights...

Someone drove him to the edge of my basement stairs, and tested his "slinky" function.

Luckily, I had not yet mounted the scanning LIDAR on him, and the only things that broke were the camera/sonar pan/tilt, and the sealed gel battery ripped off its cable...  (and a dent in the hardwood floor... don't tell my wife!)

Tonight, after repairs, I will be adding a front and rear Sharp IR GP2Y0D810Z0F floor sensor...   When my "Path Planning" algorithm is completed, I should be able to disallow motion into such areas.


UPDATE:  Videos added.

This is my second bot.  I just started it a couple days ago, and expect it will be a few more before I submit a video.  This one has low speed (50rpm)  high torque motors.  

My first one ran too fast for the wheel edge encoders, and if you drove him slow, he would stall under the weight.

Bottoo (bot-two) will be equipped with:

  • The I2C Sonar Pod that I'm working on.
  • A set of standard Sharp 2Y0A02 IR sensors on front/rear/left/right (I may also put two more at 45 degrees, front-right/front-left).
  • A pulsed line laser with webcam for parallax ranging, and ....
  • I just bought a Kinect 

The purpose of this bot is to develop and refine routines for identifying landmarks (walls, doors, furniture) to allow for better interpretation of ranging data.


Update: 14/01/04

Got power supplies, Raspberry Pi, and Arduino UNO up and running, with simple sketches to tune the wheel encoders.


Update: 14/01/07

So, this is the old-school laser-printed encoder wheel and QRD1114 that I'm using on Robbie... I will admit to wasting more time on this little POS circuit than any other piece of this build.

So, I treated myself to a commercial set of encoders from Solarbotics.  As well as the typical quadrature encoder functions, they have a PWM-modulated CLK pin and a direction pin. (They also have a serial out with distance/velocity, but...)

I had to enlarge the hole by a few thou to get it over the hub of my new wheels.  Not what Solarbotics intended, I'm sure...

Fine print warned me against using it on anything other than their GM 2/3/8/9 gear motors...  My skull's too thick for that to register though...

And yes!!! That is hot glue holding it all together.  Once I get the alignment validated... then we'll put in the screws!

Update: 14/01/10

Telemetry control board - 1/2 completed...

Update: 14/01/16

Apologies for the slow progress on this.  Three kids under 7 means little time to myself or my projects.  :)

I'm all wired up now, and working on my code.  If I were to admit to having any skills whatsoever in coding, I'd have to say PHP is my comfort zone.  However, I2C capabilities on the Raspberry Pi are pretty much nonexistent in PHP.

I found this https://github.com/tbrianjones/raspberry-pi-i2c-bus/blob/master/peripherals/i2c_bus.php  as a good start.

I'm expanding upon this, using the Adafruit python I2C bus code as a template.

I need to read/manage:

  • HMC6352 compass module
  • ADXL345 3 axis accelerometer
  • BMP085 barometer and thermometer (also provides altitude via algorithm)
  • Arduino UNO motor driver / wheel encoders
  • Arduino Mini Sonar Pod and IR proximity 
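For the I2C reads, the code on the Pi can stay thin. Here's a hedged sketch for the HMC6352, using the smbus2 Python package (the 0x21 address, 'A' command, and tenths-of-a-degree format come from the HMC6352 datasheet; the rest is illustrative):

```python
# Sketch of reading the HMC6352 compass over I2C (datasheet values).
HMC6352_ADDR = 0x21
CMD_GET_HEADING = 0x41  # ASCII 'A' triggers a heading measurement

def decode_heading(msb, lsb):
    """The HMC6352 returns the heading in tenths of a degree, MSB first."""
    return ((msb << 8) | lsb) / 10.0

# On the Pi itself (hardware required):
# import time
# from smbus2 import SMBus
# with SMBus(1) as bus:
#     bus.write_byte(HMC6352_ADDR, CMD_GET_HEADING)
#     time.sleep(0.006)  # datasheet: ~6 ms conversion time
#     msb, lsb = bus.read_i2c_block_data(HMC6352_ADDR, 0x00, 2)
#     print(decode_heading(msb, lsb))
```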

I would love to hear from anyone who has had any experience in PHP on the Pi....

Update 2: 14/01/16

It's been a rather productive, yet expensive day.  I somehow shorted out and destroyed my 18 V lithium-ion motor battery. Awesome!

So, I'm improvising with 8 AA NiMH cells rated at 2100 mAh... we'll see how that does for now.

Here's a picture of its first "un-tethered" voyage....

... and yes... it hit the stack of DVDs.  Apparently I was scanning right over the top of them.

Video to come soon.  (Is this the part where I admit to my lack of skill at making/editing videos?)

Update: 14/01/26

I've replaced the dead 18v Lithium Ion battery with a standard 12v gel cell.  Easier to charge, weighs a bit more, but... whatever...

I get bored easily, and have too many little things that I jump around between.  Lately I've been working on various routines for "self preservation".  Nothing extraordinary, just typical things like: if the battery gets below a certain point, come back to base to charge.  The latest one was regarding wifi connectivity on the Raspberry Pi.  The routine would evaluate the wifi connection with the web server (commands coming in / telemetry going out) and, if it hadn't connected in a while, or the wifi signal was too low (small USB dongle inside the chassis... bad idea...), the rover would seek out a stronger signal.  Sounds great in theory.


So I went downstairs this morning to find the rover huddled in my living room directly below the wifi router... battery dead as a doornail.  Upon reading the logs, it appears that I accidentally connected the routine that sends him back to the charging station on low battery with the new one that attempts to correct wifi issues.  Battery got low, so he looked for a stronger signal!  Makes sense to me...
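For anyone wiring up similar "self preservation" routines, the fix boils down to a strict priority order, so "go charge" always wins over "seek a stronger signal". A minimal sketch (thresholds and names are illustrative, not the actual code on Bottwo):

```python
# Illustrative behavior arbiter: evaluate conditions in priority order.
BATTERY_LOW_V = 11.2   # assumed low threshold for a 12 V gel cell
WIFI_WEAK_DBM = -75    # assumed weak-signal floor

def choose_behavior(battery_v, wifi_dbm):
    # Self-preservation first: a dying battery overrides everything,
    # including the wifi-seeking routine that caused the dead-robot morning.
    if battery_v < BATTERY_LOW_V:
        return 'return_to_charger'
    if wifi_dbm < WIFI_WEAK_DBM:
        return 'seek_stronger_signal'
    return 'continue_mission'
```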


btw.... I said something like "Awwwww... it looks like he was trying to get a better signal..." in hearing distance of the wife...  She just looked at me, and said   "He?..."


Update: 14/01/27

Let's call this update "I'm no mechanical engineer!"

If you look at my pictures, this is a two wheel differential drive, with a trailing caster wheel.....  The caster is small, providing a slant to the chassis, which I kind of liked the look of, so didn't think to or bother to correct.

I've been wondering why turning has been "lurchy", as well as transitions from forward to reverse....

As it turns out, I should have just looked underneath during such a "transition".  The offset caster, because its pivot plane is on an angle, has to actually lift or drop the chassis as it rotates... including the rear-of-center-mounted gel cell battery....

Here is the "slope" of the chassis moving forward...

Here is the "slope" of the chassis moving in reverse....

I simply raise this issue to help others that may come across it.    Tonight, I will either be adding a spacer to lift the chassis, or preferably installing a larger ball caster.


Update: 14/02/18

Just some new pictures... 

Profile (Ain't he cute?)

Head on... notice the laser line level and Raspberry Pi cam front and center for future ranging...


And this is the glue that ties the Pi to all of the sensory input....




This robot is great! You seem to have the same kind of goals as I do (I suppose most people here would want the autonomous, "goto location", and manual drive modes, really). I've got my Pi, and various other parts. Just trying to work out what to use as a frame.

I look forward to seeing how you go with it!  

I'm very interested in where you are going with this, as it's an area I intend to spend more time on when life/kids and work allow me to!

I spend a lot of time watching insects, which are essentially pretty dumb but seem to get around OK, even with very little brain power, and they don't get stuck in corners either :-)

I've been thinking about having two levels of navigation: one being low-level, basic object avoidance, similar to an insect. Your sonars and IR sensors seem ideal for that. Then a higher-level intelligence that can perform the localization and mapping/search path navigation, which the Laser Range Finder seems ideal for.

So you can use the laser to make a plan and head for your goal, but the sonars can override to get around obstacles, and then when the danger is passed, the higher-level cortex makes a new plan.

Love to see where you go with this.


I feel like I had really good luck with sonars and the "Force Field Algorithm".  I'll donate some Arduino code if it's anything you're interested in.  Once I smoothed out the sonar data a bit, it gave pretty good obstacle avoidance while moving around and looking elsewhere.  At a few bucks per sonar, it's hard to beat.

I think around 12 are needed to be effective indoors, cover 360 degrees, and handle wall bouncing.  My bot only covered 270-ish, so it would turn back towards a wall (if its goal was behind the wall) after it had turned directly away from it, as the force field was blind in the back.  This can be solved in software with a short-term memory of what is behind, or some other techniques... still wish I had 12 though.  Despite the blind spot, it would work its way around a wall while attempting to reach its goal.
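The core of a force-field style avoider fits in a few lines. Here's a rough sketch of the general potential-field idea (not necessarily the exact formulation from that code; the gains and ranges are made-up numbers): each sonar reading pushes away, the goal pulls in, and the summed vector gives the steering heading.

```python
import math

REPULSE_GAIN = 1.0
ATTRACT_GAIN = 0.5
MAX_RANGE_M = 3.0  # readings beyond this contribute no repulsion

def steering_heading(sonar, goal_bearing_rad):
    """sonar: list of (bearing_rad, distance_m) pairs, bot-relative.
    Returns the heading of the summed attract/repulse vector."""
    # Constant-magnitude pull toward the goal.
    fx = ATTRACT_GAIN * math.cos(goal_bearing_rad)
    fy = ATTRACT_GAIN * math.sin(goal_bearing_rad)
    for bearing, dist in sonar:
        if dist <= 0 or dist >= MAX_RANGE_M:
            continue
        # Repulsion grows as obstacles get closer, pointed away from them.
        mag = REPULSE_GAIN * (1.0 / dist - 1.0 / MAX_RANGE_M)
        fx -= mag * math.cos(bearing)
        fy -= mag * math.sin(bearing)
    return math.atan2(fy, fx)
```

With no obstacles in range, the heading is simply the goal bearing; a close obstacle dead ahead flips the vector and turns the bot away.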

Did you do anything with the laser line level yet?

Nice work, I really hope you stay interested in this project, I think we all can learn a lot from it.  I think I'll hold off on LIDAR until I see what happens with yours.



I read about the Force Field Algorithm in a University of Waterloo paper from a couple of years ago.

Would be VERY much interested in seeing how you implemented it.   It will be a bit before I get back to the laser line level.  I'm re-doing my code around command and sensor processing in Python: moving a lot of my "git-er-dun" style inline coding to appropriate classes, and threading where I can.  I've removed the MySQL command-queueing nonsense that I had between the webserver and the bot, and replaced it with a websocket client/server.  Much more responsive (but you all knew that!)

I'm still tee-ing the commands to a MySQL table for logging and potential replay, but that may even go away in the future...

I'm also trying to understand how to set up a publish and subscribe system to support multiple "bots" as they come online.  Or more appropriately... to support multiple sensors as they get added to a bot.
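A publish/subscribe setup can start as simply as an in-process broker before graduating to anything networked. A bare-bones sketch of the idea (topic names and data shapes are made up for illustration):

```python
from collections import defaultdict

class Broker:
    """Minimal pub/sub: sensors publish on named topics, and any number
    of consumers subscribe to those topics as they come online."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, message):
        # Deliver to every subscriber; topics with no subscribers are a no-op.
        for cb in self._subs[topic]:
            cb(message)

# usage: a new sonar pod starts publishing, a logger subscribes
broker = Broker()
readings = []
broker.subscribe('sonar/front', readings.append)
broker.publish('sonar/front', {'cm': 87})
```

The same topic naming carries over cleanly if this later moves onto the websocket link, with the broker living server-side.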

By the looks of your videos, I've got a couple years of catchup to do.  Hopefully I can lean on you from time to time to guide me in the right direction.





I sent you a bunch of code through the "message" feature, which I have never used.  I assume it's going to send you an email.  Please confirm if you got it.

This was some of the first robotics code and first arduino code I ever wrote, so I hope it doesn't suck. 



There might be another possibility for a good obstacle detection/mapping sensor that would be quick and not need to pan to get detailed data.  I haven't tried it but I think it would be doable.

1)  Put a $10 laser (that shoots a line pattern or a cross pattern) low on the bot shooting out horizontally forward and level to the horizon.  Maybe an inch or two off the ground.  This line will spread out to give you 30-45 degrees of coverage at the same time.  

2)  The laser line will take various shapes depending on what it hits and at what angle.  For example, if it approaches a wall head on, a level line will be produced on the wall.  If it approaches at an angle, a line sloping up on one side will result.  If a narrow obstacle is in the path, a short line will be seen on the obstacle, with the rest of the line disjointed on whatever is behind the obstacle.

3)  Use a camera mounted higher up on the bot and OpenCV to look at the line and filter for the intense red/pink of the laser.  I have tried this and it works.

4)  Evaluate the shape / slope / number of segments / position of the line to estimate (haven't tried, but it seems like basic geometry and a little stats):

1.  Distance

2.  Obstacles

3.  Angle of Attack (when near walls)

5)  The strengths would be being able to evaluate an entire wide field of view in a single frame, with great granular detail, without panning.  The weakness is this would not cover the vertical dimension.  Perhaps two or three lasers could be used to cover various heights, but this would start beaming people in the eyes.

Looks like you might have all the pieces on your bot and the skills to take a whack at it.  Hope there is not some flaw in my thinking; I tried firing a laser and looking at the patterns quite a bit.  It seems doable.
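If the laser beam is level and the camera sits a known height above it, the distance estimate in step 4 is just similar triangles: the farther the target, the closer the line sits to the image center. A rough sketch (the camera numbers here are assumptions for illustration, not measured values):

```python
# Laser-line triangulation by similar triangles.
CAM_LASER_OFFSET_M = 0.10   # vertical baseline between camera and laser
FOCAL_PX = 500.0            # assumed focal length in pixels
CENTER_ROW = 240            # optical center row for a 480-row image

def row_to_distance(row):
    """Map the pixel row where the red line appears to a distance.
    A target at distance D puts the line FOCAL_PX * offset / D pixels
    below the image center, so D = offset * FOCAL_PX / dy."""
    dy = row - CENTER_ROW
    if dy <= 0:
        return None  # at or above center: beyond range for this geometry
    return CAM_LASER_OFFSET_M * FOCAL_PX / dy
```

Applied per column across the frame, this gives the whole wide-field range profile from a single image, which is the appeal of the approach.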


If you look at the picture below, you will notice the Raspberry Pi cam (5 megapixel?), and about 6 cm below you will see three wires feeding the laser (power / ground / pulse).  A cm below that, you will see the slot for the laser exit.  This is a dismantled Black and Decker laser line level.


It works quite well, and definitely requires more attention; however, unless you dedicate a processor akin to the Raspberry Pi to this function, you end up with a "run-stop-look-run-stop-look" method of travel.

My goal is to take measurements while traveling.  Even with my panning sensors, I send a time stamp and a frame stamp with the distance samples, which can then be aligned with wheel encoder position to get point-in-time measurements while traveling.  The Arduino can handle this data mapping readily.
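The alignment itself can be as simple as a nearest-timestamp lookup into the encoder log. A sketch (the data shapes are assumptions, not the actual Arduino protocol):

```python
import bisect

def nearest_pose(encoder_log, t):
    """encoder_log: list of (timestamp, pose) sorted by timestamp.
    Returns the pose recorded closest in time to a range sample at t,
    so measurements taken while moving land at the right position."""
    times = [ts for ts, _ in encoder_log]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return encoder_log[0][1]
    if i == len(times):
        return encoder_log[-1][1]
    before, after = encoder_log[i - 1], encoder_log[i]
    return before[1] if t - before[0] <= after[0] - t else after[1]
```

If the encoder updates fast enough, nearest-neighbor is plenty; interpolating between the bracketing poses is the obvious refinement.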

I frequently get told I've got too many processors onboard... nope... I'm good with it. 


LIDAR Lite can measure down to zero.. Just sayin'. :) 

I really like this bot because I have a thing for sensors.  Are you planning some comparative testing of the sensors or just seeing how many you can fit on one bot? Didn't see any trusty SR04s on there..  Not good enough for ya? 

The LIDAR is good for mapping and rangefinding, but not great at detecting small obstacles locally (even the really expensive ones).  The sonar, with its much wider cone, can tell you something is "somewhere in this area at this distance", which is good for collision avoidance.

I'd like to ultimately have a data feed that combines the two data types: in-field wide angle, and out-field narrow angle.

I had the SR04s initially on my first bot http://letsmakerobots.com/node/39052 but soon traded them out for the MaxSonar.  Was hoping that the narrower cone of the MaxSonar would help me with rangefinding, but it became apparent that I could reduce my pan increment to 5 degrees and still see no noticeable difference in the output... so...

After I do some more work on the Mapping and Localization piece, then I'm going to look at comparing the various sensors systematically. 

And yes... LOL... will be picking up a LIDAR Lite to add to the arsenal.    I've also got a Kinect sitting on the shelf waiting.  The problem with it is, it's bigger than my bot!


and to that I answer...

time to make a bigger bot!