Let's Make Robots!

Cheap home made IR compound eye

Allows your robot to see and track nearby objects
Attachments:
- Mr._General.bas (5.15 KB)
- Compund_eye_instructions.jpg (1.24 MB)
- Mr__General.zip (3.24 KB)

The purpose of this cheap, easy-to-make eye is to allow your robot to track the movement of nearby objects (within 200mm). After much experimentation and varying degrees of success I have finally got a good working design for my IR tracking system, which is really a simple 4-element compound eye. Compound eyes are found in arthropods such as insects. They are of relatively low resolution compared to the human eye but more responsive to movement. Unlike insect eyes, my design includes its own light source and is blinded by excess ambient IR, making it better suited to indoor and nocturnal activities.

In my earlier designs I used a transistor to amplify the signal from the phototransistors, but this caused some problems with calibration and did not increase the range as much as I had hoped. When I did increase sensitivity to about 500mm I ran into other problems: a white wall in the background could reflect light better than an object such as my hand, causing my robot to look away from my hand instead of towards it.

The eye consists of 4 IR LEDs and 4 pairs of phototransistors. Each phototransistor pair is connected in parallel to increase its sensitivity and is then connected to one of your analog inputs the same way you would connect an LDR. The circuit is really 4 FritsLDRs using phototransistors instead of LDRs, for two reasons: the lens on each phototransistor makes it more sensitive to light directly in front of it, and LDRs are very slow to respond to changes in light.

The demonstration video is of a new robot being produced by DAGU called Mr. General. He is basically a "Start Here" robot based on my Bot 08M. Click on the schematic for a larger picture.

As you can see, the eye is very simple to make. Using it to guide two servos in a pan/tilt mechanism is a little more complicated, so I have included the sample program used in the demonstration video to help. Mr. General is designed to work with any processor, but unfortunately I can only provide a sample in PICAXE BASIC at this time.

The program compares the left and right inputs for pan, and the up and down inputs for tilt. The bigger the difference, the faster the servo needs to move to follow the object. The program also looks at the average value of the inputs to gauge distance: the closer the object, the higher the readings. This average is used to scale the results and prevent the servos from overcorrecting.
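The compare-and-scale loop described above can be sketched roughly as follows. This is an illustrative Python translation, not the original PICAXE BASIC program; the function name, the `gain` value, and the integer arithmetic are all assumptions.

```python
def track_step(left, right, up, down, pan_pos, tilt_pos, gain=10):
    """One update of the pan/tilt servo positions from the four eye readings.

    left/right/up/down are raw ADC values (higher = more reflected IR).
    pan_pos/tilt_pos are the current servo positions. All names and the
    gain constant are illustrative, not taken from the original program.
    """
    # Average reading gauges distance: the closer the object, the
    # higher the readings.
    distance_factor = (left + right + up + down) // 4
    if distance_factor == 0:
        return pan_pos, tilt_pos  # nothing in range; hold position

    # The bigger the left/right (or up/down) imbalance, the bigger the
    # step toward the object. Dividing by the distance factor shrinks
    # the step for close (bright) objects so the servos do not
    # overcorrect as badly.
    pan_step = gain * (left - right) // distance_factor
    tilt_step = gain * (up - down) // distance_factor
    return pan_pos + pan_step, tilt_pos + tilt_step
```

With balanced readings the servos hold still; an imbalance moves them toward the brighter side, with the step size damped by overall brightness.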

Having said that, I haven't perfected the scaling yet. At the moment the program divides the readings to get a scale factor, but since the light returning to the sensors is inversely proportional to the distance² I should really use a square root function in the calculations. As a result of my crude scaling technique the robot seems to have developed a bit of personality. It likes one of our technicians and behaves well for him in the first video, but does not like Claudia and shakes its head at her (due to servo overcorrection) in the second video.
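The square-root correction mentioned above would look something like this. It is a minimal Python sketch under the stated inverse-square assumption; `reference_reading` is a hypothetical calibration constant (the reading at some known reference distance), not a value from the original program.

```python
import math

def distance_scale(reading, reference_reading=512.0):
    """Convert a raw IR reading into a relative distance estimate.

    Reflected intensity falls off roughly as 1/d**2, so relative
    distance is the square root of the reading ratio, not the ratio
    itself. reference_reading is a hypothetical calibration value.
    """
    if reading <= 0:
        return float('inf')  # no return signal: treat as out of range
    # reading ~ k / d**2  =>  d ~ sqrt(k / reading)
    return math.sqrt(reference_reading / reading)
```

For example, quadrupling the reading halves the estimated distance, whereas simple division would report a quarter of the distance and overcorrect.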

For those who would rather buy than make one, DAGU will soon have these available as a robot accessory.

Comments
Yes, there is some leakage in this prototype, and I did try using heatshrink on the LEDs, but unless you get the heatshrink perfect it tends to require more calibration, as some sensors seem to get more reflected light than others. When the new PCBs arrive I will experiment with raising the LEDs off the board with a custom shield.
I'm a bit curious about that as well, seeing as shrinkwrap or black tape (harder to apply to just the emitter, though) works wonders for the stray light.
I like component level and have had similar ideas. You can make them more directional with heat shrink or tape around the sides of the detectors. I forgot about the old-style IR sensors that look like LEDs. The newer ones are flat with a bubble lens and a built-in transistor.

Nice to see the evolution of an OddBotEye. Have you done a lot of experimentation with the number of LEDs? Would a stronger spotlight help in Claudia's case?

In my earlier experiments (the photo is from my original splatbot) I had used a lot more LEDs. Apart from drawing a lot more power, the range was increased but the ability to track seemed compromised. If a stronger light were used, a lens would probably be required to keep the light focused into a relatively narrow beam.

[Image: IR_sensor_array__small_.jpg]

After watching the video with Claudia several times, I suspect it was shaking its head because of the angle at which she held her hand/wrist relative to the robot. The software still needs refinement.

Very nice, it seems to be quite capable based on the posted video.
I notice you only enable the IR LEDs for the duration of the ADC sampling, no doubt this saves a fair bit of power. Perhaps you could also scan the ADC values before enabling the IR LEDs and use those values to help calibrate out background IR interference?

This is true, and normally I do, but for this sample program it was not necessary.
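The background-subtraction idea suggested above (sample the ADC with the LEDs off, then on, and take the difference) can be sketched like this. It is an illustrative Python snippet; `read_adc` and `set_ir_leds` are hypothetical hardware hooks standing in for your board's actual ADC and pin functions, and the original sample program is in PICAXE BASIC.

```python
def read_eye_compensated(read_adc, set_ir_leds, channels=(0, 1, 2, 3)):
    """Sample each eye channel with the IR LEDs off, then on, and
    return the difference so background IR is calibrated out.

    read_adc(ch) and set_ir_leds(on) are hypothetical hardware hooks;
    substitute your processor's real ADC and output-pin calls.
    """
    set_ir_leds(False)
    ambient = [read_adc(ch) for ch in channels]  # background IR only
    set_ir_leds(True)
    lit = [read_adc(ch) for ch in channels]      # background + reflection
    set_ir_leds(False)                           # LEDs off between reads saves power
    # Clamp at zero in case ambient flicker makes a lit reading lower.
    return [max(0, on - off) for on, off in zip(lit, ambient)]
```

Subtracting the ambient sample leaves only the light contributed by the eye's own LEDs, which is what the tracking math actually cares about.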

wow awesome!

*bookmarked*

Calculon likes it a lot. Calculon will henceforth refer to his creations' eye sensors as "oddbots". He invites all of LMR to do the same.
Thank you Calculon, I think Oddballs might be a more accurate name :D