Let's Make Robots!

Robbie-Robot (have to change that name!) is intended to be a mostly autonomous rover using a Raspberry Pi to process sensor data from the Arduinos and commands from a web console, and to send instructions back to the Arduinos for roaming. Very *very* much in its infancy, this project hopes to use tinySLAM with limited-capability IR and sonar sensors. To deal with the limited range and sensitivity of these sensors, we are cheating by providing a pre-existing image map of its surroundings (a monochrome bitmap of the floorplan).
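A minimal sketch of the floorplan "cheat": loading a monochrome bitmap into a 2D occupancy grid. The plain-text PBM format and the tiny example map are my assumptions here, not the actual file format used on the robot.

```python
# Hedged sketch: parse a monochrome floorplan (plain-text PBM, "P1")
# into a grid of 0 (free floor) / 1 (wall). The format choice and the
# sample map below are illustrative assumptions.

def load_floorplan(pbm_text):
    """Parse a P1 PBM into rows of ints."""
    tokens = []
    for line in pbm_text.splitlines():
        line = line.split('#', 1)[0]   # strip PBM comments
        tokens.extend(line.split())
    assert tokens[0] == 'P1', "expected plain PBM"
    width, height = int(tokens[1]), int(tokens[2])
    bits = [int(ch) for ch in ''.join(tokens[3:])]
    return [bits[r * width:(r + 1) * width] for r in range(height)]

# Tiny 4x3 example: 1 = wall, 0 = open floor
plan = """P1
4 3
1111
1001
1111"""
grid = load_floorplan(plan)
```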

unix_guru's picture

Home made wheel encoders

QRD1114 encoder with an inkjet-printed encoder wheel.  2.49 mm per transition... not great, but it'll have to do.
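The 2.49 mm figure turns tick counts into distance directly. A quick sketch of that odometry arithmetic; the tick counts below are made-up example values, not readings from the robot:

```python
# Sketch: convert QRD1114 encoder tick counts into distance travelled.
# MM_PER_TICK is the 2.49 mm per transition quoted above; the sample
# tick counts are invented for illustration.

MM_PER_TICK = 2.49

def ticks_to_mm(ticks):
    return ticks * MM_PER_TICK

def wheel_odometry(left_ticks, right_ticks):
    """Average both wheels for a rough straight-line distance."""
    return (ticks_to_mm(left_ticks) + ticks_to_mm(right_ticks)) / 2.0

dist = wheel_odometry(402, 398)   # roughly a metre of travel
```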

 

Almost reassembled

Almost time for the inaugural run...

 

We need More Power Captain!

I'm using a DC-DC converter to drop to 5 V, so as not to waste too much power as heat. Seems to work well, but I'll know better once I've tested it for a bit.
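The rough arithmetic behind that choice, assuming a 12 V pack, a 1 A load, and ~90% buck-converter efficiency (all assumed values, not measurements from this build):

```python
# Linear regulator vs DC-DC buck converter heat, with assumed numbers.
V_IN, V_OUT, I_LOAD = 12.0, 5.0, 1.0   # volts, volts, amps (assumptions)
EFFICIENCY = 0.90                       # typical small buck converter

# Linear regulator: everything above 5 V burns off as heat.
linear_heat_w = (V_IN - V_OUT) * I_LOAD          # 7 W of heat

# Buck converter: loss is whatever the efficiency doesn't deliver.
p_out = V_OUT * I_LOAD
buck_heat_w = p_out / EFFICIENCY - p_out         # well under 1 W
```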

 

First dry run of this new rebuild "untethered" tomorrow morning...

airuno2l's picture

Wow, that thing is jam-packed! Can't wait to see what you do with it. I hadn't heard of TinySLAM before. Think I'll give that a read.

airuno2l's picture

Oh, I was thinking it was SLAM for cheap sensors, but it looks like it requires a lidar; it's just written in very few lines.

unix_guru's picture

I'm using tinySLAM as my base, and have bastardized... er... modified it in the hope of getting my MaxSonar sensors to do the work.  It means a fair bit more travel to find walls and edges to use for localizing, and because of the poor resolution and conical shape of the sonar ping, it is definitely my current biggest obstacle.

 

I'm working on a few smoothing/normalizing processes which make localizing to large obstacles better... but them damn chairs...

I've already broken off the front IR sensor twice, and just about sheared off the webcam on a chair-leg crossmember...

I'll put in another week or two on THIS rendition before I cave and try the "Parallax laser line / butchered webcam" method... ultimately I want a Kinect on a servo pod... I've just run out of money on this project...

 

 

Dan M's picture

Very cool. Looks like the Raspberry Pi is just running the GPS?

Anyway, nice layout. Hope you will have a video or two for us soon.

 

unix_guru's picture

The Raspberry Pi currently manages communications between:

  1. The GPS - parsing GPS strings and committing them to MySQL
  2. The Arduino UNO - commands sent to the UNO to manage the motors and sonar pod
  3. The same Arduino UNO - receives compass, accelerometer, temperature, humidity, ambient light, and sonar data
  4. The Arduino FIO - receives the left and right wheel encoders and front/rear/left/right proximity IR (lots of room left on this Arduino)
  5. The webcam feed (still-frame shots at the moment, scheduled via CRON every 30 seconds or when a "LOOK" command is issued)
  6. Stereo audio input to assess the direction of incoming sounds...
  7. The external website used for control and monitoring
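A minimal sketch of item 1 above: parsing an NMEA $GPGGA sentence into decimal lat/lon before it gets committed to MySQL. The sentence below is a textbook example, not output from this robot's GPS.

```python
# Hedged sketch: NMEA $GPGGA -> decimal degrees. Field layout follows
# the NMEA 0183 convention; the sample sentence is illustrative.

def nmea_to_degrees(value, hemisphere):
    """NMEA ddmm.mmmm (or dddmm.mmmm) -> signed decimal degrees."""
    dot = value.index('.')
    degrees = float(value[:dot - 2])
    minutes = float(value[dot - 2:])
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ('S', 'W') else decimal

def parse_gpgga(sentence):
    fields = sentence.split(',')
    assert fields[0] == '$GPGGA', "expected a GGA fix sentence"
    lat = nmea_to_degrees(fields[2], fields[3])
    lon = nmea_to_degrees(fields[4], fields[5])
    return lat, lon

lat, lon = parse_gpgga(
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
```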

The Raspberry Pi manages (is supposed to manage) the static floorplan map, the dynamic occupancy map, and a series of learned trajectory maps (working from point A to B).

 

I fully admit to having bugs (some significant) in each and every one of these processes...  but that's the fun of it, right?

 

bdk6's picture

I really like this project and think it well done.  The division of labor seems about right to me, and I like seeing these bots using the power of the Pi (I should trademark that!).  Anyway, your Unix heritage is showing with the MySQL and CRON jobs.  Nice.

ericteuh's picture

Can you describe your SLAM and post the code?

 

unix_guru's picture

This is the part where I admit having trouble with the tinySLAM ( http://openslam.org/tinyslam.html )  implementation I'm working on.

 

I currently have to augment it with coarse-grained position information and use that as an index to determine where I might be within that particular zone.  I'm seriously getting frustrated with sonar as a ranging device... I know... I've read dozens of articles on this same complaint.

Right now, I have front- and rear-facing sonar on a servo pod.  I measure distance in front and behind through a 180° arc. The problem is that with the conical shape of the sonar beam... I do not know how to "normalize" overlapping scans to get an average.
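One simplistic way to attack the overlapping-cone problem: spread each range reading across the beam width and keep only grid cells that a majority of the overlapping cones agree on. Everything here (grid size, cell size, ~30° beam, sample sweep) is an assumed illustration, not the robot's actual parameters.

```python
# Hedged sketch: fuse overlapping sonar cones by majority vote per cell.
import math

GRID = 20            # 20x20 cells (assumed)
CELL_M = 0.10        # 10 cm per cell (assumed)
BEAM_HALF_DEG = 15   # ~30 degree MaxSonar-style beam (assumed)

def cone_cells(x, y, heading_deg, range_m):
    """Cells along the arc at the measured range, across the beam width."""
    cells = set()
    for d in range(-BEAM_HALF_DEG, BEAM_HALF_DEG + 1):
        a = math.radians(heading_deg + d)
        cx = int((x + range_m * math.cos(a)) / CELL_M)
        cy = int((y + range_m * math.sin(a)) / CELL_M)
        if 0 <= cx < GRID and 0 <= cy < GRID:
            cells.add((cx, cy))
    return cells

def fuse_scans(scans):
    """Keep cells that at least half of the overlapping cones voted for."""
    votes = {}
    for x, y, hdg, rng in scans:
        for c in cone_cells(x, y, hdg, rng):
            votes[c] = votes.get(c, 0) + 1
    needed = (len(scans) + 1) // 2
    return {c for c, v in votes.items() if v >= needed}

# Three overlapping sweeps of the same wall ~1 m ahead of (0.5, 0.5)
wall = fuse_scans([(0.5, 0.5, 0, 1.0), (0.5, 0.5, 5, 1.0),
                   (0.5, 0.5, -5, 1.0)])
```

A fuller version would use log-odds occupancy updates rather than raw votes, but the voting form shows the idea in a few lines.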

Here is the best I'm able to do at the moment... each square is 1 meter.

 

Sonar sweep with front and rear MaxSonar EZ1

I'm thinking of treating myself to a Kinect for Christmas... 

thereturnofthewill's picture

You need a bigger photo.