Let's Make Robots!

UPDATE: Added a short video of hexed-bot moving around (on the leash) and sort of following the laser.

UPDATE: Added new video of light tracking training. Hexed-bot's camera eye is following a laser pointer and displaying on the small LCD.

UPDATE: Added some photos of additional shields added for clearance and easier wiring, and my little TV from Adafruit. Adding some output for the video shield seemed to make the light tracking more stable/reliable and, well, it just seemed cool. I probably won't be adding much more to hexed-bot because he's getting a little heavy.




UPDATE: Posted some updated code on github. The GP2D12 sensor and the video shield seem to be playing nicely now. I don't fully understand how the video shield operates: for it to work, the Arduino code needs to switch the on-board ADC off to enable the interrupt routines used to capture video. For now, I'm switching the ADC off when capturing the video input, following the shield designer's example code, and switching it back on when reading the distance sensor pin...is there a better way? The low-level bit flipping is melting my brain.

Capturing the video stream from the camera needs this:

ADCSRA &= ~_BV(ADEN); // disable the ADC
ADCSRB |= _BV(ACME);  // enable the analog comparator multiplexer
ADMUX &= ~_BV(MUX0);  // select A2 for use as AIN1 (the comparator's negative input)
ADMUX &= ~_BV(MUX2);
ACSR &= ~_BV(ACIE);   // disable analog comparator interrupts
ACSR &= ~_BV(ACIC);   // disable analog comparator input capture

When reading from the sensor, I'm doing this and then calling a function that sets the bits again as shown above:

ADCSRB &= ~_BV(ACME); // disable the analog comparator multiplexer
ADCSRA |= _BV(ADEN);  // re-enable the ADC
float val = read_gp2d12_range(gp2d12Pin);

UPDATE: Added some video of hexed-bot's vision/light-tracking training.


UPDATE: Keeping my driver code for Arduino on github: https://github.com/swoodrum/hexed-bot

Work in progress. HexedBot can walk forward and backward. I'm still working on the driver program. Hopefully I'll get the video processing working so that he can follow an object or a bright light, for example a laser pointer. I'll use the Sharp sensor for collision avoidance. I followed the plans pretty much literally out of the Robot Builder's Bonanza book and added a second deck to mount the Arduino Uno board, the Pololu servo controller, and the batteries. The construction material is 6mm Sintra PVC plastic from Solarbotics. Still a lot of work to do, but I wanted to share my progress :)

Here are a couple more pictures:



UPDATE: Added a video of HexedBot's obstacle avoidance training.

ChuckCrunch:


That is the robot I built with the video shield.

I expect you'll have less trouble than I did, since you have a serial servo controller. I needed to use a second Arduino to get around timer issues.

ChuckCrunch:

I swapped to the Wii cam: a much better sensor, and it can track 4 blobs, not just 1.



Project2501:

Thanks for the info...I'm using a small color camera that outputs NTSC. I've played around with the object tracker sketch and got it working, although for me the sketch freezes after a few minutes and I haven't had time to troubleshoot. I haven't quite figured out how I'm going to translate the object detection into the servo motion that controls the camera position, etc. I'm looking at how this face follower was implemented using OpenCV, although I want to try to do everything onboard the Arduino: http://marco.guardigli.it/2010/01/arduinoprocessing-face-follower.html

ChuckCrunch:

You can lose sync with the input signal and it will freeze. I have no fix for this.

If you map the X and Y to 0-180, you can use them directly with the servo command. It may need some tweaking.

Or work from the center pixel: calculate the difference between the center pixel and the blob, and add it to or subtract it from 90.

But I'm sure you'll come up with something.

The sensitivity pot is a big pain in the ass. I'm not fond of this shield; it's going to be used as an OSD (on-screen display) for my RC rover when I get a decent wireless cam and battery. Using it for blob detection is fun and cool, but not what I would call reliable. Auto gain and white balance play havoc with the trim pot setting, and going from a brightly lit to a dark space you may find you lose the blob altogether. Your camera will have a lot to do with that, though; a black and white camera would probably be the best choice.

Wii cams rule: 4-blob detection, settable gain, minimum blob size down to 1 pixel, 1024x768 resolution, and a simple output format (X, Y, and blob size for each blob). It uses I2C, so only 2 pins. On the downside, it's a bit tricky to debug; I needed to write a custom debug sketch in Processing to get a screen-like output.

TheRealDanielJ:

Good bot, I like this bot! And I'm glad you are going to use a camera to follow stuff! I'm excited to see the progress of this bot!