Let's Make Robots!

My intro and a request for pointers for navigation pseudocode please.

Hello All

I am new here and am currently working on building my first robot. I think I bit off a bigger project than I should have for a first bot, but I have learned a load in the process.

I built a RepRap 3D printer and have since printed out some track links and wheels (a design I found on Thingiverse). I have designed my own chassis and am currently on version 4, which seems to be fairly stable.

My bot is a caterpillar.

At first my bot was way underpowered, using one of those hobby twin-motor gearboxes. The bot would not turn on carpet, and later tests found it would not even turn on a smooth wood floor. I then purchased a couple of TowerPro MG995 servos and modified them for continuous rotation. I obviously had to redesign and re-print the chassis, but now my bot has the power required to skid-turn on the carpet. Woohoo :)

I have also added a Sharp IR sensor (10–80 cm range) on a small TowerPro 9g servo. This is used to scan as the bot wanders around trying not to bump into things.

It kinda works, and I have included my current pseudocode, but I was hoping someone could point me to more grown-up navigation algorithms or strategies I could draw inspiration from.

Thanks

main loop {
  if first time in main loop { navigate() }
  else { scanWhileDriving() }
}

navigate() {
  while not clearAhead() {
    scanForPath()
    turnTowardsPath()
  }
  goForward
}

scanWhileDriving() {
  Scan 30 degrees each side of straight ahead (60 deg to 120 deg)
  if anything closer than collisionThreshold detected {
    stop
    navigate()
  }
}


clearAhead() {
  Scan at [60, 75, 90, 105, 120] degrees (90 being straight ahead)
  if anything closer than clearPathThreshold { return false }
  else { return true }
}

scanForPath() {
  Scan 80 degrees each side of straight ahead (10 deg to 170 deg)
  Return the furthest distance >= clearPathThreshold and the angle it was found at
}


turnTowardsPath() {
  lookStraightAhead
  turn leftOrRight (based on the angle from scanForPath) until facing that angle
}
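In case it helps anyone play with the logic off-robot, here is one way the two scanning functions above could look in Python. The sensor read is passed in as a function so canned data can stand in for the Sharp IR; all the names and the threshold value are mine, not from any library.

```python
CLEAR_PATH_THRESHOLD = 40  # cm; made-up value, tune for your sensor and speed

def clear_ahead(read_distance, threshold=CLEAR_PATH_THRESHOLD):
    """Scan the narrow forward arc; False if anything is closer than threshold."""
    return all(read_distance(a) >= threshold for a in (60, 75, 90, 105, 120))

def scan_for_path(read_distance, threshold=CLEAR_PATH_THRESHOLD):
    """Sweep 10..170 degrees and return (best_angle, best_distance) for the
    furthest clear reading, or (None, 0) if every direction is blocked."""
    best_angle, best_dist = None, 0
    for angle in range(10, 171, 10):
        d = read_distance(angle)
        if d >= threshold and d > best_dist:
            best_angle, best_dist = angle, d
    return best_angle, best_dist
```

On the real bot, `read_distance` would point the 9g servo at the angle and sample the IR sensor.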


This old post from frits might be useful. Understanding what your robot sees can help you define what to do with that data.

CtC also discussed an approach to navigation he called 'May I go?', as opposed to 'just go until you see something, then turn'.

There are more complicated mapping-type approaches, where your robot builds a map and memorizes the environment. This takes tighter control, such as wheel encoders or other sensors, because you have to know exactly how far you have turned or travelled.

Nice one ignoblegnome

The wavefront algorithm/route-planning process at http://www.societyofrobots.com/programming_wavefront.shtml looks pretty interesting...

Also the other links you provided certainly offer some food for thought.
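From what I can tell, the core of that wavefront idea is just a breadth-first flood from the goal over a grid map; each free cell gets its step count to the goal, and the robot then follows decreasing numbers. A rough sketch (my own toy version, not the code from that page):

```python
from collections import deque

def wavefront(grid, goal):
    """Breadth-first 'wavefront' out from the goal cell.
    grid: 2D list, 0 = free, 1 = wall. goal: (row, col).
    Returns a grid of step counts; walls/unreachable cells stay None."""
    rows, cols = len(grid), len(grid[0])
    dist = [[None] * cols for _ in range(rows)]
    dist[goal[0]][goal[1]] = 0
    q = deque([goal])
    while q:
        r, c = q.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and dist[nr][nc] is None:
                dist[nr][nc] = dist[r][c] + 1
                q.append((nr, nc))
    return dist
```

To drive, the robot would repeatedly move to whichever neighbouring cell has a lower number than its current cell.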


@birdmun

You are probably right. My bot is no more complicated than the SHR. I do think I should have started with a basic two-wheeled platform rather than a tracked one, because most of my initial grief came from having to construct a transmission to link my tracks to the motors I had chosen. That process had a number of failures purely due to the wrong choice of propulsion, and a simple wheeled bot would have gotten me to this programming phase much quicker.

Anyway, it is the navigation process I am interested in, and I have looked up the subsumption architecture you talked about. It seems like a methodology that makes sense, and I may explore it later once my bot gets some purpose besides wandering around not bumping into things. (I don't really know what my bot is good for, so I am a bit like my bot, aimlessly wandering around right now :) )

@Cactus.

I have been wondering how you implement this 'gap search' process of yours. I am guessing you record a number of points, and the gap is measured in terms of the angle your servo sweeps through without detecting an obstacle?

I can't imagine there are no resources or methodologies that would help with navigation control. Or are methods like the ones in my pseudocode simply about as effective as it gets?

All an adventure :)

I use a 'gap-finder' method.

Scan the horizon for multiple, consecutive distance readings greater than some "safe" threshold; this defines a circular sector (a pie-wedge shape) that should be clear for navigation. Then turn towards the center of that arc and take off, even if that is not where the absolute longest reading was. Since you are using skid steering and can't be very precise in your turn direction (yet), this points you into the middle of the largest "safe" area.
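In code, the gap-finder boils down to finding the longest run of consecutive clear readings in the sweep and aiming at its middle. A minimal sketch, assuming the sweep is already collected as (angle, distance) pairs (the function name and default are mine):

```python
def find_gap(readings, threshold):
    """readings: list of (angle, distance) pairs in sweep order.
    Returns the center angle of the longest consecutive run of readings
    at or above threshold, or None if nothing clears the threshold."""
    best_run, run = [], []
    for angle, dist in readings:
        if dist >= threshold:
            run.append(angle)
            if len(run) > len(best_run):
                best_run = run[:]
        else:
            run = []  # obstacle breaks the current gap
    if not best_run:
        return None
    return best_run[len(best_run) // 2]
```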

I also like to navigate towards an anomaly: look for a distance reading that is significantly different (further or closer) than the adjacent readings and then go towards that. It might be a chair leg, or it might be a secret gap passage into a hidden room!
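One way to sketch that anomaly hunt: compare each reading against the average of its two neighbours and head for the biggest jump. The `jump` threshold and names here are guesses of mine, not Cactus's actual code:

```python
def find_anomaly(readings, jump=30):
    """readings: list of (angle, distance) pairs in sweep order.
    Returns the angle whose distance differs most from the mean of its
    two neighbours, provided the difference exceeds `jump`; else None."""
    best_angle, best_diff = None, jump
    for i in range(1, len(readings) - 1):
        neighbours = (readings[i - 1][1] + readings[i + 1][1]) / 2
        diff = abs(readings[i][1] - neighbours)
        if diff > best_diff:
            best_angle, best_diff = readings[i][0], diff
    return best_angle
```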

Your bot doesn't sound much more involved than most of the Start Here Robots.

Your pseudocode looks fine to me. It really looks fairly similar to what the SHR runs. I don't know that there is a whole lot of wiggle room unless you consider something like a subsumption architecture. Even then, the functions are similar; just the way they are enabled is a bit different.
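For the curious, the enabling part of subsumption can be sketched as a priority-ordered list of behaviours where the first one whose trigger fires suppresses everything below it (this is a toy arbiter of my own, not from any particular framework):

```python
def subsumption_step(behaviours, state):
    """behaviours: list of (trigger, action) pairs, highest priority first.
    Runs the action of the first behaviour whose trigger fires; lower
    behaviours are suppressed. Falls back to 'idle' if nothing fires."""
    for trigger, action in behaviours:
        if trigger(state):
            return action(state)
    return "idle"

# Example layering: obstacle avoidance subsumes wandering.
behaviours = [
    (lambda s: s["distance"] < 20, lambda s: "turn"),   # avoid layer
    (lambda s: True,               lambda s: "forward"), # wander layer
]
```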