How-to for Thomas
February 12, 2013
First, let's state our goal:
We want a robot to drive to dark areas while avoiding stuff on the way using various sensors.
As with everything, we break it down into smaller pieces. We test each piece individually, record any numbers we get (ADC readings and the like), and then start putting them together. Testing each piece on its own also helps us figure out how we are eventually going to combine them.
If someone told me to accomplish the goal above, I would do it just like this:
- Test the LDRs by themselves. No mapping, raw data. Write these numbers down (dark, light, etc.)
- Code the robot to turn on its center until both LDR readings are identical or within a threshold --then I would play with this code and a flashlight for a while
- Code the robot to do basically the same thing as above, but go forward this time. See if you can steer to the dark side, and see if you can get it to turn proportionally to the difference in readings between the two LDRs
- Forget about the LDRs completely and move on to the sonar sensor
- Test the sonar sensor all by itself
- Code the robot to drive forward until it sees an object at a given distance and have it stop
- Forget about the sonar sensor completely
- Code the robot to drive off of its Sharp sensors (I assume these are left/right, pointing outward) --these can be coded to "scoot" you off of a wall (when you are coming in shallow and your main sonar misses it). Code the robot to drive forward, check its side sensors, and make a small correction (just a small correction) to "scoot" off the wall
- Add your sonar code to your Sharp code (still without the LDRs) and have your robot do two things --scoot off of the wall, and stop when the main sonar sees something too close
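The proportional LDR steering from the earlier steps can be sketched like this. This is a minimal sketch, not your final code: `read_left_ldr`/`read_right_ldr` would be your own ADC reads (assumed 0-1023, higher = brighter here), and the gain is a number you tune with the flashlight tests.

```python
# Sketch: steer toward the darker side, turning harder the bigger the
# difference between the two LDRs. All names, the 0-1023 ADC range, and
# the gain value are assumptions -- replace them with your own numbers.

BASE_SPEED = 100
GAIN = 0.25  # tune this during your flashlight tests

def ldr_steer(left_ldr, right_ldr):
    # Positive error means the right side is darker -> speed up the left
    # wheel and slow the right one to turn right, and vice versa.
    error = left_ldr - right_ldr
    correction = int(GAIN * error)
    left_speed = BASE_SPEED + correction   # clamp to your motor range
    right_speed = BASE_SPEED - correction  # in real code
    return left_speed, right_speed
```

With equal readings the bot drives straight; a big bright/dark split (say 800 vs. 400) turns it hard toward the dark side.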
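The scoot-and-stop step above could look something like this one "decision per loop" sketch. The sensor names, cm units, thresholds, and the small scoot offset are all assumptions to swap for your own measured values.

```python
# Sketch: each loop, decide one thing -- stop if the front sonar sees
# something too close, otherwise "scoot" off a too-close side wall with
# a small speed correction, otherwise just drive straight.
# Names, cm units, and thresholds are assumptions.

SONAR_STOP_CM = 30
SIDE_TOO_CLOSE_CM = 15
BASE_SPEED = 100
SCOOT = 20  # just a small correction

def drive_step(sonar_cm, left_sharp_cm, right_sharp_cm):
    """Return ("stop",) or ("drive", left_speed, right_speed)."""
    if sonar_cm < SONAR_STOP_CM:
        return ("stop",)
    if left_sharp_cm < SIDE_TOO_CLOSE_CM:
        # Left wall too close: left wheel faster -> veer right, away from it.
        return ("drive", BASE_SPEED + SCOOT, BASE_SPEED - SCOOT)
    if right_sharp_cm < SIDE_TOO_CLOSE_CM:
        return ("drive", BASE_SPEED - SCOOT, BASE_SPEED + SCOOT)
    return ("drive", BASE_SPEED, BASE_SPEED)
```

Note the order: the sonar check comes first, so a stop always wins over a scoot.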
At this point, you will need to stop and think about the actual operation of the bot. Right now, we just stop when we see something and have to reset the bot to make another run. Let's think about what we would want to do instead. We are driving forward, we see something --it might be good to turn. Which way? Well, as it happens, we have two Sharp sensors sticking out to the sides; let's ask them and see if either way is better (farther). Our code might look like this:
- Go forward, check all sensors
- If sonar is too close, turn. Base turn direction on readings from Sharps
- If both Sharps and the sonar are too close --all stop, go to the "stuck in a corner" routine
- If none of the above is the case, drive forward
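The four steps above can be sketched roughly like this. It's one decision per loop, with hypothetical distance readings in cm; the sensor names and thresholds are my assumptions, not measured values.

```python
# Sketch of the navigation decision: the front sonar has priority, the
# side Sharps break the tie on turn direction, and all three sensors
# blocked at once means we're stuck in a corner.
# Names, cm units, and thresholds are assumptions.

SONAR_TOO_CLOSE_CM = 30
SHARP_TOO_CLOSE_CM = 15

def navigate(sonar_cm, left_sharp_cm, right_sharp_cm):
    blocked_front = sonar_cm < SONAR_TOO_CLOSE_CM
    blocked_left = left_sharp_cm < SHARP_TOO_CLOSE_CM
    blocked_right = right_sharp_cm < SHARP_TOO_CLOSE_CM

    if blocked_front and blocked_left and blocked_right:
        return "stuck_in_corner"  # all stop, back out somehow
    if blocked_front:
        # Turn toward whichever side reads farther.
        return "turn_left" if left_sharp_cm > right_sharp_cm else "turn_right"
    return "forward"
```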
Ok, so now we have a robot that reads all three navigation sensors, knows which sensor is more important than the others, and drives accordingly. Let's go back to our LDR code.
- You still have a "read the LDRs" routine left over --great
- Add to this "read the LDRs" routine some code that will figure the "turn request", i.e. whether the LDRs want to turn left or turn right
- Add this "check the LDRs" call to your main loop --it can go right next to the code that reads all the other sensors
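A minimal sketch of that "turn request": compare the two raw readings against a dead-band threshold and report which way the dark side lies. The threshold value and the higher-reading-means-brighter convention are my assumptions.

```python
# Sketch: turn the two raw LDR readings into a turn request.
# Assumes higher ADC reading = brighter; the threshold is a made-up
# dead band so tiny differences don't cause constant twitching.

LDR_THRESHOLD = 50

def ldr_turn_request(left_ldr, right_ldr):
    diff = left_ldr - right_ldr
    if diff > LDR_THRESHOLD:
        return "turn_right"  # right side is darker
    if diff < -LDR_THRESHOLD:
        return "turn_left"   # left side is darker
    return None              # within threshold -- no request
```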
Now, here's the tricky part:
Add this turn request to the beginning of your navigation code. The navigation routine then continues (the stuff we talked about above: scooting off of walls, front sonar, etc.), and since all of that comes after your turn request in the code, if any of it fires it will "cancel out" the turn request --ignore it and just do regular navigation. However, let's say the robot is out in the wide open. It runs through the navigation routine (starting with the turn request from the LDRs), but this time obstacle avoidance is not the priority --there is nothing in front of the bot (or to the sides), so the robot will not correct its path. Thus, it will not "cancel out" your LDR turn request; the request will "get through" and be acted on. Now, we just turned to the dark.
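The whole priority scheme might be sketched like this: the LDR request seeds the decision, and the obstacle-avoidance checks that follow can override it. Every name, unit, and threshold here is a stand-in assumption for your own code and measured numbers.

```python
# Sketch: LDR turn request first, then navigation gets to override it.
# In the wide open, nothing overrides and the request "gets through".
# All names, cm units, and thresholds are assumptions.

SONAR_TOO_CLOSE_CM = 30
SHARP_TOO_CLOSE_CM = 15
LDR_THRESHOLD = 50

def decide(sonar_cm, left_sharp_cm, right_sharp_cm, left_ldr, right_ldr):
    # Start with what the LDRs want...
    diff = left_ldr - right_ldr
    if diff > LDR_THRESHOLD:
        action = "turn_right"  # right side is darker
    elif diff < -LDR_THRESHOLD:
        action = "turn_left"
    else:
        action = "forward"

    # ...then let obstacle avoidance cancel it out if it has to.
    if sonar_cm < SONAR_TOO_CLOSE_CM:
        if (left_sharp_cm < SHARP_TOO_CLOSE_CM
                and right_sharp_cm < SHARP_TOO_CLOSE_CM):
            return "stuck_in_corner"
        return "turn_left" if left_sharp_cm > right_sharp_cm else "turn_right"
    return action  # wide open: the LDR request gets through
```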
Wow, please tell me at least some of that helped...