Let's Make Robots!

IR and Sensor rangefinding data visualisation

These images were created by stepping a pair of servo motors and recording sensor readings for each servo position into a file.
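The scan loop can be sketched roughly like this. The `scan` function and `read_sensor` callback are hypothetical stand-ins; on the real robot the servo commands and readings go over serial to the Arduino.

```python
def scan(pan_range, tilt_range, step, read_sensor, samples=3):
    """Step a pan/tilt servo pair through a grid and log
    `samples` readings per position.  `read_sensor` stands in
    for the real servo/sensor code talking to the Arduino."""
    log = []
    for tilt in range(*tilt_range, step):
        for pan in range(*pan_range, step):
            log.append((pan, tilt, [read_sensor(pan, tilt) for _ in range(samples)]))
    return log

# Dummy sensor for illustration; the real readings come back over serial.
records = scan((0, 180), (0, 90), 10, lambda p, t: 512)
print(len(records))  # 18 pan steps x 9 tilt steps = 162 positions
```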

I took 3 readings for each position from each sensor and stored them in a file, then read the data back in a separate pygame program running on a desktop to generate the image. RGB values were generated from the 3 readings from each sensor.
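Mapping the three samples onto a pixel might look something like this minimal sketch. The 10-bit reading range is an assumption (the Arduino's ADC returns 0-1023); the resulting tuple could be passed straight to pygame's `Surface.set_at`.

```python
def readings_to_rgb(samples, max_reading=1023):
    # Scale each of the three samples into 0-255 so they map
    # directly onto the R, G and B channels of one pixel.
    return tuple(min(255, s * 255 // max_reading) for s in samples)

print(readings_to_rgb((1023, 512, 0)))  # -> (255, 127, 0)
```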

The robot was positioned in a hallway looking at a door about a meter away. For the bottom half of the picture, the robot is looking down and scanning an area about the size of a CD, shrinking as it gets lower. This is due to the positioning/mounting of the servo.


I think I need to add a capacitor to the sonar's input line - both sensors are running from the Arduino's +5 V line, powered via a serial connection to an Eee PC.

All software is written in Python. Scanning took 10 hours or so and generated a 16 MB file.

Attachments: IRscan.png (1.49 MB), SonarScan.png (1.49 MB)


And really nice use of pygame and Python.

But you did not mention what you are going to do with it.

The original intention was more just to see what it would look like than anything else - I had already done sensor visualisation in a single plane before, and I wanted to test the accuracy/resolution of my Sharp IR sensor.

As far as Python/pygame goes, one of the real reasons I am doing this is to improve my coding. I suck at programming! Nothing cooler to see than an arm flailing about, or a robot wheelstanding, or a cool 3D-scanned image!


There are actually a few Python apps that go into making this work. Most of them handle pickled Python data via UDP and convert it into serial data to control assorted USB-serial attached devices, including an Arduino, a Pololu servo controller and a Pololu motor controller. Another daemon handles sending servo and motor commands, and receiving and logging sensor data from the Arduino. A separate app on a desktop handles de-pickling the logged data (3 samples from each sensor, along with its x,y position data and servo step size - the resolution could have been 5 times higher, but I assumed the mechanical tolerances of hobby servos would not be up to it) and then displays the selected data set.
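The pickle-over-UDP link between the apps can be sketched like this. The port number and command dict layout here are made up for illustration; the real daemons use their own message format before converting commands to serial bytes.

```python
import pickle
import socket

PORT = 9999  # hypothetical port; the real daemons use their own

# Receiver: the daemon that would talk to the serial devices.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", PORT))
rx.settimeout(1.0)

# Sender: pickle a command dict and fire it over UDP.
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
cmd = {"device": "servo", "channel": 0, "position": 1500}
tx.sendto(pickle.dumps(cmd), ("127.0.0.1", PORT))

# The daemon un-pickles the command, then writes serial bytes.
data, _ = rx.recvfrom(4096)
print(pickle.loads(data))
```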

I have one question.

How do you plan to use this?


It gives me an easy way to calibrate "handy" distances and servo positions. Since each pixel can be converted into a servo direction, it gives me an easy way to choose areas to scan for information. Also, the multiple samples and image comparison give me handy visual information on how accurate my sensors are.
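Since each pixel corresponds to one scan position, converting a pixel back to a servo direction is just undoing the stepping. A minimal sketch, with hypothetical offsets and step size:

```python
def pixel_to_servo(x, y, step, pan_min=0, tilt_min=0):
    # Each image pixel was written at one scan position, so
    # recovering the servo angles just reverses the stepping.
    return (pan_min + x * step, tilt_min + y * step)

print(pixel_to_servo(12, 4, 2))  # -> (24, 8)
```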

By placing the robot in different locations, I should be able to better calibrate its object detection/avoidance. Next, I want to point it the other way down the hallway, so there is a clear view for several meters, with doors on the sides.