Let's Make Robots!


Maps 2D surroundings, working toward fully autonomous navigation

R1 is a work in progress. On board is an Arduino-compatible microcontroller programmed in C++. It links to a PC via XBee wireless; the PC runs code in Python. I'm investigating SLAM algorithms and want to implement a fully autonomous exploration mode in which it maps the floor area.

The wheels are driven by DC motors with rotation encoders for feedback. PID is implemented for speed control.
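The speed-control loop described above could be sketched roughly like this (a minimal illustration in Python; the gains, names, and PWM range are made up, not R1's actual firmware, which is C++ on the microcontroller):

```python
class PID:
    """Simple PID speed controller (illustrative gains, not R1's actual values)."""

    def __init__(self, kp, ki, kd, out_min=-255, out_max=255):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        # Error between target speed and encoder-measured speed
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to the motor driver's PWM range
        return max(self.out_min, min(self.out_max, out))

# Example tick: encoder says 15 ticks/s, we want 20 ticks/s, 20 ms loop period
pid = PID(kp=8.0, ki=2.0, kd=0.5)
pwm = pid.update(setpoint=20.0, measured=15.0, dt=0.02)
```

On an Arduino the same loop would run at a fixed period, feeding the clamped output straight into `analogWrite`.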

The sensors are on two servo-rotated platforms. They can scan more than 180 degrees in front of the robot, with infrared for short-range and ultrasonic for longer-range distance finding.
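Since the sensors sweep on servos, each reading is just an angle plus a range, and turning that into a 2D map point is a polar-to-Cartesian conversion. A sketch (function and parameter names are my own, not the actual mapping code):

```python
import math

def scan_point(robot_x, robot_y, robot_heading_deg, servo_angle_deg, range_cm):
    """Convert one servo-sweep range reading into a 2D map point.

    servo_angle_deg is measured relative to the robot's heading
    (0 = straight ahead, positive = to the robot's left).
    """
    theta = math.radians(robot_heading_deg + servo_angle_deg)
    return (robot_x + range_cm * math.cos(theta),
            robot_y + range_cm * math.sin(theta))

# Robot at the origin facing +x, sensor pointing 90 degrees left,
# object detected 50 cm away -> roughly (0, 50)
x, y = scan_point(0.0, 0.0, 0.0, 90.0, 50.0)
```

Sweeping the servo through its arc and plotting these points is essentially how a floor map builds up, one scan at a time.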

Find out more at http://ralphsrobots.com or http://www.facebook.com/ralphsrobots

- Ralph


*** Update: September 22, 2012 ***

I made video #5 today. It is embedded (to the right and below). It's an interesting one because it shows what's happening on the laptop side of the laptop-robot link. I take you through a mapping session, and mapping is what this robot is all about.

Since there are only 5 video slots, I'll just put links for videos #2 - 4 here:




Thank You!







It is always good to have computation power, even if it is in a mothership. Great job. Welcome to LMR.

A few days ago, this article showed that there is an asymmetry in the IR sensors. So if you turn one of these two sensors around so that you get this setup, then you get better sensing results.

Thanks Nils. I have actually noticed asymmetry in their performance. I'll try flipping the one.

I'm glad to find this site - it looks quite active.

Regarding the mothership: I figure if I wanted to spend big bucks, I'd make a big robot and put a laptop in it. So there's no cheating in what I'm doing, just saving money.

Of course, it wouldn't be too hard to make room for a Raspberry Pi!

I am somewhat surprised you didn't mention sensor fusion or having enough sensors that things can fail gracefully.

Ultrasonics have problems with soft, fuzzy objects and with narrow, round objects. IR has problems with black/dark objects (I'm not so sure about narrow objects). If one or both of your distance sensors fail to see something, you can still rely on your bump switches to catch said hidden object.

You mentioned encoders, but not whether they are single or quadrature. Add current sensing and you can know if your motors have stalled.
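The stall-detection idea suggested here could be as simple as flagging the combination of "motor driven, heavy current, no encoder pulses." A sketch (the threshold and names are hypothetical, just to show the logic):

```python
STALL_CURRENT_A = 1.5  # hypothetical threshold; depends on the actual motor

def is_stalled(commanded_speed, measured_current_a, encoder_pulses_this_tick):
    """Flag a stall: motor is being driven and drawing heavy current,
    but the encoder shows the wheel is not turning."""
    return (commanded_speed != 0
            and measured_current_a > STALL_CURRENT_A
            and encoder_pulses_this_tick == 0)

stalled = is_stalled(100, 2.0, 0)   # driven hard, high current, no pulses
turning = is_stalled(100, 0.8, 3)   # normal current, wheel turning
```

A real implementation would want to debounce this over a few control ticks so a single noisy current reading doesn't trip it.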

All around, nice bot. I hope we will get to see more of the work that is offloaded to the laptop in the future.

Welcome to LMR.

Thank you, Birdmun, for your interest and your positive feedback.

The encoders are single encoders and there are only 10 pulses per revolution. Still, much better than nothing.

Besides the items you mentioned, it is difficult to get a good fix on direction with the ultrasonics (IR doesn't have this limitation). The ultrasonics also give many false positive "hits", i.e. they'll tell me an object is close when there is actually nothing there (maybe sound sometimes bounces back off the floor?). When I get more than one ultrasonic and IR hit in the same direction, that gets lots of weight. Also, the IR sensors are great at finding walls.
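The "agreement gets lots of weight" idea might look something like this on the Python side (a sketch with invented weights and names, not the real processing code):

```python
def hit_confidence(ir_hits, us_hits):
    """Score an obstacle detection for one scan direction.

    IR is weighted above ultrasonic because it gives a better fix on
    direction; when both sensor types agree, the score is boosted.
    (Weights here are invented for illustration.)
    """
    score = 2.0 * ir_hits + 1.0 * us_hits
    if ir_hits > 0 and us_hits > 0:
        score *= 2.0  # both sensor types agree in this direction
    return score

# One agreeing IR + ultrasonic pair outweighs two ultrasonic-only hits,
# which helps suppress the ultrasonic false-positive "ghosts"
strong = hit_confidence(ir_hits=1, us_hits=1)
weak = hit_confidence(ir_hits=0, us_hits=2)
```

The point of the boost is exactly the false-positive problem above: a lone ultrasonic echo shouldn't paint an obstacle on the map by itself.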

I'll have to do an article on how I process the sensor data.

There are a couple more videos on http://ralphsrobots.com - maybe I'll copy them here too - there's lots of activity on this site.

Good to meet you! I'll check out your projects tomorrow - time for sleep!

There must be a bug in your program, or maybe a bad sensor. I use the Parallax PING sensors, and they work flawlessly. The sensor should not get a reflection from the floor with the setup you have.

I'll have to play around with them & see if I can make the "ghosts" go away!

Thanks Damo!

Welcome Ralph, nice little robot. Most of the robots here run their code directly in the µC, so I am watching how your system performs :-) Double/triple sensors, good idea, will do that for my "Stray" too...

Good to meet you!

Next robot I make will definitely have sensors that can get a 360° view.

This video: http://ralphsrobots.com/2012/09/01/video-4-r1-system-software-architecture/ explains a bit about how/what goes on the uC versus the "mothership". Hopefully, I'll be putting up more info soon.


Just saw the YouTube videos. Well, that's something we do not see often here (except from CtC): people introducing their robots personally :-) Well done!!!