Let's Make Robots!

Sharp IR-Sensor Frustrating You?

How to relate the IR-sensor output to measurements you can understand.

You have probably noticed that the "distance" returned by a "Sharp" IR sensor is not in specific units of measure. As the IR sensor approaches an object (or vice versa), the readings climb higher and higher until you get within a couple of inches (roughly 2 to 10 centimeters, depending on the model).  Any closer and the numbers drop again.

Note: This also relates to a post from a year ago. http://letsmakerobots.com/node/6511


Latest change (28MAR2012): Just fixed some typos.


ADDED (17MAR2012): Now, I am sure you realise that the robot can work with any numbers; it doesn't care that the values are not in centimeters, inches, or any other standard unit. The key is that in order to program your μ-controllers for these sensors, you as a human need to understand what the values represent. The easiest way to do that is to translate these odd numbers into standard distances a human can understand.


I will note that these distances will be different for different models of IR sensors.  Consequently, the graph that I present below only works for the exact type of sensor I'm using (ADDED: which happens to be a GP2Y0A21YK0F).  Even if you use the exact same model, it may not give the same results -- for instance, you may be using a different processor or a different analog-read command.  What you will need to do is make your own graph.  I will explain.

Using a ruler, I set an object at exact distances from the sensor.  I happened to use inches, but you can make your graph using centimeters instead.  You could set an object at two centimeters away, then four centimeters, six centimeters, etc., and then run a test program on your microcontroller to get distance measurements from the IR sensor and display them on the screen.

Make a list of all the distances you checked and the value returned from the sensor at each one.  I took multiple readings at each distance so I could get an average, since the sensor does not always give exactly the same number with each reading.

Now you can use a draw program on the computer, or simply take a piece of graph paper (or draw a graph with a ruler).  Mark off the vertical axis of the graph in inches (or centimeters) and mark off the horizontal axis in sensor values, running up to or beyond the largest value returned by your sensor. I felt that the higher numbers coming from shorter distances were a little confusing, so I reversed the numbers so the values increased as distance increased.  It is easy to do this: just subtract the returned values from a number larger than the largest one.  For my sensor, I subtracted the returned values from 165.  Voilà!  I now had a list of numbers that increased as the distance increased.

I am including the following picture as an example to show you what I mean. My graph may not work for your sensor(s). For instance, I have seen other Sharp sensors that return values up over 550, and ones that have a minimum distance of four to six inches. You can mount your sensor back from the edge of the robot, so the "too close" readings will never occur.
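The averaging and value-reversal steps described above can be sketched in a few lines of Python on the PC side. This is only an illustration with made-up readings and an invented helper name, not code from the original post:

```python
# Sketch: average repeated readings per distance, then reverse them by
# subtracting from a constant just above the largest raw value, so the
# numbers grow as distance grows. Sample values are invented.

def build_table(samples, flip_base=165):
    """samples maps a measured distance to a list of raw sensor readings."""
    table = {}
    for distance, readings in samples.items():
        avg = sum(readings) / len(readings)   # smooth out sensor jitter
        table[distance] = flip_base - avg     # reverse: bigger = farther
    return table

samples = {2: [160, 158, 162], 6: [80, 78, 82], 12: [43, 45, 44]}
print(build_table(samples))
```

The `flip_base` of 165 matches the number used for the graph above; pick any value above your own sensor's maximum reading.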

Even though I did mine in Imperial measurements (inches), I also included centimeters on the right hand side of the graph below, for your convenience.

 (Remember, this graph was reversed just because I liked it better that way. Note the "actual readings" in the grayed-out numbers of the included table.)


ADDED NOTE (19 MAR 2012):

Within my PICAXE programming, I used the values in a subroutine called GetRange to tell me rough distances to the nearest object the IR sensor sees. (I used the PICAXE SELECT / CASE command to assign distance in inches to b25, and during testing the SERTXD command displayed the value on the PC terminal.)


readadc 1, w8 ' 1 refers to pin A.1 (raw distance comes back stored in w8, but only b16 is pertinent.)
pause 10

select case b16
case >= 155
b25 = 2 : goto inchset
case >= 135
b25 = 3 : goto inchset
case >= 105
b25 = 4 : goto inchset
case >= 90
b25 = 5 : goto inchset
case >= 75
b25 = 6 : goto inchset
case >= 65
b25 = 7 : goto inchset
case >= 58
b25 = 8 : goto inchset
case >= 53
b25 = 9 : goto inchset
case >= 49
b25 = 10 : goto inchset
case >= 46
b25 = 11 : goto inchset
case >= 43
b25 = 12 : goto inchset
case >= 41
b25 = 13 : goto inchset
case >= 39
b25 = 14 : goto inchset
case >= 37
b25 = 15 : goto inchset
case >= 36
b25 = 16 : goto inchset
case >= 34
b25 = 17 : goto inchset
case >= 32
b25 = 18 : goto inchset
case >= 29
b25 = 19 : goto inchset
case >= 26
b25 = 20 : goto inchset
case >= 23
b25 = 22 : goto inchset
case >= 20
b25 = 24 : goto inchset
case >= 16
b25 = 30 : goto inchset
case >= 14
b25 = 33 : goto inchset
case >= 11
b25 = 36 : goto inchset
else
b25 = 999
sertxd("Clear beyond 3 feet ")
     ' beyond 3 feet (about a meter) was far enough away to not matter to the robot.
endselect

inchset:
sertxd("Sensor reads object about ",#b25," inches away.")
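The SELECT / CASE ladder above is essentially a threshold lookup table. For readers on other platforms, here is a sketch of the same mapping in Python; the thresholds are copied from the PICAXE code, but the function name is invented and this is illustrative only:

```python
# Threshold table copied from the SELECT / CASE ladder:
# (minimum raw value, distance in inches)
THRESHOLDS = [
    (155, 2), (135, 3), (105, 4), (90, 5), (75, 6), (65, 7), (58, 8),
    (53, 9), (49, 10), (46, 11), (43, 12), (41, 13), (39, 14), (37, 15),
    (36, 16), (34, 17), (32, 18), (29, 19), (26, 20), (23, 22), (20, 24),
    (16, 30), (14, 33), (11, 36),
]

def get_range(raw):
    """Return approximate inches, or 999 for 'clear beyond 3 feet'."""
    for threshold, inches in THRESHOLDS:
        if raw >= threshold:
            return inches
    return 999

print(get_range(140))   # 3 -- matches the 'case >= 135' branch
```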



Here is a write-up about how to make a graph of sensor data in Excel and have Excel generate the equation for you. It works for an Arduino but not for a PICAXE, because it needs floating-point math.
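For anyone without Excel, the same kind of curve fit can be sketched in Python: fitting the one-parameter model distance ≈ k / raw by least squares. The calibration pairs below are invented for illustration:

```python
# Least-squares fit of distance ≈ k / raw. For a one-parameter model,
# minimising the squared error gives the closed form
#   k = sum(d/r) / sum(1/r^2)
# Calibration pairs are invented example data: (inches, raw reading).
pairs = [(2, 150), (4, 75), (6, 50), (12, 25)]

num = sum(d / r for d, r in pairs)
den = sum(1 / (r * r) for _, r in pairs)
k = num / den
print(round(k))   # 300 for this example data
```

With a real calibration list, `k` plays the same role as the 6000 scaling factor discussed in the comments below.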

Dan, using your data I've generated this graph which takes the reciprocal of the raw value and multiplies it by a scaling factor (in this case 6000) which gives a linear(ish) relationship.



Code for the Picaxe would be simply:

#picaxe 08m2
symbol IR = b0
symbol Rng = b1

    readadc 1,IR  : Rng = 6000/IR
    wait 1

You can get away with byte variables as Picaxe calculations are done with 16 bits, but you need to choose the scaling factor or limit the range values to stop 8-bit overflow in the result (in this case, when IR is less than 24).

Thanks Dan,

I am quickly reaching the stage where I will be putting the IR sensor on my LMR (First Robot) soon.  Seeing the steps you used to calibrate yours will allow me to calibrate mine.  Thanks for sharing the technique.



Nice work. You can get the graph for every model from its datasheet. For example, I have a SHARP distance sensor, model GP2Y0A21, so I search Google for "SHARP gp2y0a21 datasheet" and we see all the data and the graph (http://html.alldatasheet.com/html-pdf/412635/SHARP/GP2Y0A21YK0F/799/5/GP2Y0A21YK0F.html).

The datasheet you point to happens to be for the sensor I am using.  I saw those graphs before, and the problem is that they give distance based on output voltage. My processor (PICAXE 28X2) does not have a way of reading voltages directly; it returns numbers between 0 and 255 with the ReadAdc command, or between 0 and 1023 with the ReadAdc10 command. Consequently, I needed a graph that related the numbers output to a distance, to make it easier to understand what I was looking at.

I made this "Tip" sheet, thinking other people would likely come up against the same problem.

You've just flagged up a little issue with using the IR sensors here.

The Picaxe (and many other micros with ADC) gives a value of 0-255 or 0-1023, where the measured value is a proportion of the supply voltage.  However, the Sharp sensor block diagram shows a voltage regulator on the component.  The upshot is that if you're running off batteries, there will be some variation in the ADC reading of the sensor as the battery voltage drops over its life.  Probably not that significant, and fine for trolling around and avoiding stuff, but it could be an issue if you're trying for accurate measurements (in which case you'd probably be using a regulated supply anyway!)

A way around it on a battery powered circuit is to measure the battery voltage by comparing against the PicAxe internal reference, which the 28x2, the 08m2 and several others can do.  There's code snippets on the Picaxe forum to do this.

I know they are more expensive at first, but if you use rechargeable batteries you will have fewer problems with that.  Rechargeables tend to hold the same terminal voltage (only dropping slightly) until they reach the end of their charge; then the terminal voltage drops quickly and the robot stops moving and taking readings. Regular batteries or alkalines drop more evenly as they discharge and might give you a more pronounced problem with changing values.

Also, on my Schrödinger robot, I used the IR sensor for normal obstacle avoidance, but he also has two ultrasonic (HC-SR04) sensors. In his mapping subroutines, those readings are compared to the IR readings to try to get better accuracy for his map. In normal obstacle avoidance, he only takes 3 readings at 45° left, straight ahead, and 45° right, so it does not take him long to make a decision. In mapping he goes quite slowly, taking IR readings at 60°, 45°, 30°, and 15° left, straight ahead, and 15°, 30°, 45°, and 60° right, plus the ultrasonic readings (fixed forward, about 5° left and right, but with a wider beam width). The two ultrasonic readings are compared to each other for overlap to derive, in software, roughly 15-20° left and right plus a straight-ahead figure. These are then compared to the various IR readings as well. (As I mentioned, he does his mapping very slowly.)

I tried "fuzzy logic" comparisons for a while, but with the limited maths available to the picaxe 28X2, I finally decided it was just taking way too long and reverted to "straightforward" comparisons.

I do not use a regulated wall (mains) supply, as I wanted him autonomous and not tethered on a power cord leash.  :-)


(To AndyGadget:)  I'll put my answer here instead of under your entry, so you can still edit it to remove the extra lines.

In this case, by fuzzy logic, I just mean giving it the "appearance" of analog instead of digital. Instead of saying, if (ultrasonic left distance) == (Infrared distance) then save number (or whatever); I give it a range that it will allow as "close enough".

Example: In the following I'm comparing the right and left ultrasonic distance readings (InchesRight & InchesLeft) and get a temporary number (stored in b26 or b27) giving me the amount of difference between the two to apply later. [b26 tells me an object is closer on the right, whereas b27 means closer on the left.]


low 2 : low 3 : low 4

b26 = 0 : b27 = 0
If InchesLeft = InchesRight THEN
 HIGH 2 : goto Jump
endif
If InchesLeft > InchesRight then
 b26 = InchesLeft - InchesRight
endif
If InchesLeft < InchesRight then
 b27 = InchesRight - InchesLeft
endif
If b27 = 1 or b27 = 2 then
 high 2 : goto Jump
elseif b26 = 1 or b26 = 2 then
 high 2 : goto Jump
endif
If b27 = 3 or b27 = 4 then
 high 3 : goto Jump
endif
If b26 = 3 or b26 = 4 then
 high 4 : goto Jump
endif

There is a lot more programming that went into that section, but this should give you an idea of how I went about figuring out whether the readings were "close" to each other or not. Where I put high 3 or high 4, etc., that is just used in testing to light LEDs telling me whether the distance was close to the same, a little under, or a little over (but still close). LED2 lights if the readings are within 2 inches either way, LED3 lights if the readings differ by 3 or 4 inches with the object closer on the left, and LED4 lights if an object is similarly closer on the right.
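The "close enough" banding can be sketched in Python as well; the function and names below are invented for illustration and are not from the robot's code:

```python
def compare(left, right, tol=2):
    """Band the difference between two distance readings (inches).
    Mirrors the LED2/LED3/LED4 idea: 'match' within tol either way,
    otherwise report which side the object is closer on."""
    diff = abs(left - right)
    if diff <= tol:
        return "match"                               # within 2 inches: LED2
    if diff <= 4:
        return "left" if left < right else "right"   # 3-4 inches: LED3/LED4
    return "no match"

print(compare(10, 11))   # match
print(compare(10, 13))   # left -- object closer on the left side
```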


Thanks Dan . . . Extra lines gone!

I see what you're doing with the fuzziness there.

The long-term aim for what I had in mind was to think of a robot's active area divided into a low-resolution grid (6-inch squares or so) and have the robot 'look around' to get a series of distances to objects at various angles.   Its 'prime directive' will involve looking for small objects, so of special interest would be a closer value with longer values on each side of it.  Other interesting features would be barriers (a series of similar readings, or readings increasing or decreasing linearly after the polar maths calculations).  Also of interest would be vertical drops at the edges of the area (from another sensor).

From the initial sweep it would only be able to discern one dimension, but from that it could mark its internal map with 'blanked out' areas, 'guessing' what features might be there and using a coding to mark them on its map. Scanning from another point could then firm up or reduce the weighting of the validity of its guess, and mark other areas as interesting.  Subsequent scans from different points would be able to add information to the guesses and build up the map of features across the area.  (I think I'm getting into A.I. territory here!)

The robot will have a compass module for orientation information but I hadn't originally envisaged wheel encoders. Now having written this down and reading through I think they will be a must.  The processor will be a Picaxe 28X2 (probably with a 14M2 looking after some functions) so doing this with integer maths will be fun and games too.  As I said at the top, it's a long term aim.  I'll start off a lot simpler #;¬)

I know what you mean on the mapping.

     In Schrödinger's case, I set a grid distance that is more meaningful to him rather than a specific real-world measurement.  When he moves forward, the distance his wheels travel in one revolution works out to roughly a fourth of a meter. His wheels are exactly 3" in diameter, meaning he travels 3 times π inches (about 9.4 inches) with one turn of his wheels. He uses his wheel revolutions as a measurement in his mapping.  For better resolution, and because I have enough memory on board, I break that into 1/4-turn increments, which is about 1/16th of a meter (about 6 cm, or 2.35 inches). Each such distance is one side of a block in his memory map.  His original map was initialized to 255 (all ones) in each memory byte; 255 is by definition an "unknown" area.  As he moves about, he stores a "0" (zero) in each memory block he comes to that is open/unobstructed. When he comes to an obstacle, it is marked with "11". Each time he visits/passes that area (by dead reckoning -- he has no GPS), he increments the number, to show he has more "confidence" in his readings and that that area is indeed blocked by something.  He only increments the number up to 14, which indicates the highest level of confidence that his readings are correct. Other than 255 (unknown area) and zero (open area), any other number in the memory map indicates an object or a probable location of an object.

     The reason behind the confidence levels is that when he gets to a 14 confidence on at least three different memory blocks that are in a straight line and only a few blocks apart, even though he has not been to the blocks in between, he will make a "guess" that he is looking at a solid object that includes the blocks between. When he makes a guess like that, he will fill in the "guessed" object(s) on his map, but with low-confidence numbers. He puts a small number between 1 and 5 (explained more below) to show that it was an estimate and not a solid reading -- much lower confidence than a reading of 11 through 14.  He uses these in two ways.

 1) If he is trying to find the shortest route from Point A to Point B, he will "assume" there is no path in any area where the number in the memory block is not zero.  Even though the area he guessed at has a low confidence number, it is not zero and so he will consider that path blocked and he will look for a different route.

 2) If he cannot find a route from A to B, then he will go back to his map. He will first check areas marked 255 (unknown), and if there is still no path, he will test whether he can in fact get through the section he thought was blocked. If he can, he changes those memory blocks to a "0" (open). If he finds the way is blocked, he marks the block(s) with an "11", meaning already tested once and marked impassable.

     Also, if he goes near a previously checked area that now appears empty, he may be dealing with a movable object, such as a door, so rather than mark it "0" right off, he just decrements the number (14 becomes 13, etc.) by one to show there was something there, but now he is less confident that it will be there again.
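The map bookkeeping described above (255 = unknown, 0 = open, 11-14 = confidence in an obstacle) can be sketched in Python; the helper names here are invented for illustration, since the original runs on a PICAXE:

```python
# Sketch of the confidence-map scheme: 255 unknown, 0 open,
# 11-14 increasing confidence that a block holds an obstacle.
UNKNOWN, OPEN = 255, 0

def mark_open(grid, x, y):
    grid[y][x] = OPEN

def mark_obstacle(grid, x, y):
    cell = grid[y][x]
    if cell == UNKNOWN or cell < 11:
        grid[y][x] = 11            # first confirmed sighting
    elif cell < 14:
        grid[y][x] = cell + 1      # each revisit raises confidence, capped at 14

def mark_missing(grid, x, y):
    cell = grid[y][x]
    if OPEN < cell < UNKNOWN:
        grid[y][x] = cell - 1      # object gone? drop confidence by one

grid = [[UNKNOWN] * 4 for _ in range(4)]
mark_obstacle(grid, 1, 1)
mark_obstacle(grid, 1, 1)
print(grid[1][1])   # 12 -- seen twice, confidence rising toward 14
```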

     I am not sure all this was necessary, but it was one way of dealing with his mapping and giving him some confidence in his long-term-memory of what his surroundings are like.

     He has a major flaw in his mapping system, and that is that he does not make perfect 90° turns. He ends up at arbitrary angles. If he is in MAPPING mode, he tries to minimize the errors in the following way: if he saw an object at, say, 30° left, he will attempt to turn 90° and then check whether the object now seems to be 60° to the right. If not, he will try to adjust his position so his map readings will be as accurate as possible. This is not always possible, since the object may have a shape that hides its extent. He might be looking at a wall, for instance, that goes on a ways.

     If he is not successful, his next attempt to line himself up will be to pick the nearest object already on his map and try to go to it. If it is close to where he ends up, he will attempt to adjust his position.

     The last thing here is to mention that in his MOVEMENT mode, because he is not nearly so precise, he will use the map he has already collected, but will not alter it. He only does that in MAPPING mode, where he is double and triple checking himself.

Here is a SAMPLE memory map, showing him scanning the directions above him and seeing objects. Since they are all lined up (based on his distance readings), he makes a guess that the sections he did not scan are part of a single object. He will only take the time to confirm this if he runs out of options and must double check his guesses.

On the right hand side of the sample map, note the vertical row of "guesses". Using the explanation key on the left, note that the yellow square is the very lowest confidence (1) because, even though the known blocks (black) are lined up, that block is the furthest from them that a "guessed" block can be. If there were even one more unknown block between the confirmed known ones, he would not make any guess that the known blocks are part of the same object.
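The "guessing" step can be sketched in Python too. The scoring formula below is my own invention to illustrate the idea of confidence falling with distance from a confirmed block; the post itself only specifies that guessed blocks get values between 1 and 5:

```python
def fill_guesses(row, max_conf=5):
    """row holds map bytes: 255 unknown, 0 open, 11-14 confirmed object.
    Fill unknown cells near confirmed cells with a low-confidence guess
    (5 right next to a confirmed block, shrinking to 1 farther away)."""
    known = [i for i, v in enumerate(row) if 11 <= v <= 14]
    for i, v in enumerate(row):
        if v == 255 and known:
            dist = min(abs(i - k) for k in known)
            conf = max_conf + 1 - dist     # invented scoring: fades with distance
            if conf >= 1:
                row[i] = conf              # too far away? leave it unknown
    return row

print(fill_guesses([14, 255, 255, 255, 14]))   # [14, 5, 4, 5, 14]
```

Cells already marked open (0) are left alone, matching the post: only unknown blocks between confirmed ones get guessed at.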


     If we apply this to real-world measurements, we will see that 9 blocks is approximately 21 inches, or 9/16 of a meter (~56 cm). As I said above, this is not a round number in the "real world," but is based on his wheel movements. The robot doesn't care.