# Face tracking with Arduino, Processing and OpenCV

Tracks faces and orients the camera so that the face stays in the middle of the frame.

UPDATE 1:

I have stripped the Processing code down to the bare basics so that it can be used however you want. I added lots of comments to make it easier to understand and to leave you room to put your own actions in when the criteria are met. I have also added an additional check for whether the face is in the middle of the screen (min_x, max_x, min_y and max_y all inside the middle square), so that further actions can be performed; in the case of a robot, it could move towards you.
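The "middle square" check described above can be sketched as follows. This is a minimal C++ sketch of the idea, not the original Processing code; the function name, the 640x480 screen size and the use of the middle third of the screen as the square are my assumptions.

```cpp
// Hypothetical sketch of the "face in the middle" check described above.
// The screen is split into thirds; the face bounding box (x, y, w, h)
// counts as centered when all four of its edges (min_x, max_x, min_y,
// max_y) fall inside the middle square.
const int SCREEN_W = 640;   // assumed capture size
const int SCREEN_H = 480;

bool faceCentered(int x, int y, int w, int h) {
    int min_x = x, max_x = x + w;
    int min_y = y, max_y = y + h;
    int left = SCREEN_W / 3, right  = 2 * SCREEN_W / 3;  // vertical lines
    int top  = SCREEN_H / 3, bottom = 2 * SCREEN_H / 3;  // horizontal lines
    return min_x >= left && max_x <= right &&
           min_y >= top  && max_y <= bottom;
}
```

When this returns true, the sketch can trigger whatever extra action you want, such as driving a robot forward.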

The stripped down processing code : https://www.dropbox.com/sh/vpvbsrpx43zzcen/S8Xi4MmMwj

UPDATE 2:

Made the face tracking code control a remote-control car to show its potential uses.

UPDATE 3:

This is the last update I will post on this. Playing around with the basic sketch today, I managed to work out how far away the face is from the screen. All I did was put my face 30 cm away from the screen and work out how many pixels wide the face bounding box was. I then did the same at 60 cm and used the two points to form a linear equation that gives the distance from the width of the bounding box. I recorded this and the link is below:

How this actually works is described below and also in the video. I suggest you watch the video, as my writing is terrible.

The Processing code: https://www.dropbox.com/sh/v9hkdxuoazoyb0d/fCGQzEfBAK The OpenCV and Arduino libraries are needed, and OpenCV also needs to be installed on the computer.

The Arduino code: https://www.dropbox.com/sh/ujjlahx83ilv1j2/QB0bu8E-EJ

This is something that I have wanted to do for a very long time: face tracking. When I look at all my favorite films I see a common factor, technology, whether it is R2D2, flying cars or JARVIS from the Iron Man movies. I have already made something inspired by JARVIS, a simple voice control script written in GlovePIE to control my computer, seen here: http://www.youtube.com/watch?v=i5S1H1nogpI&feature=plcp, but this time I wanted to make something really cool. I had the idea in the shower to try to make something that would follow my face, like something out of the I, Robot movie, along with a rough idea of how I would do it.

Now to talk about what I did. I broke the screen up into 9 quadrants representing different areas of the screen, and showed this visually by drawing 4 lines across the screen to mark where the areas are. The program draws a bounding box around each face with (x, y, width, height) values, so all I did was check which quadrant the box was in and print the result in Processing. Depending on which quadrant the box is in, Processing writes a case to the Arduino through the serial port. The Arduino code listens for cases and increments the servo positions accordingly.
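The quadrant check above can be sketched in a few lines. This is my own illustration, not the original sketch's code; the function name, the 640x480 screen size and the 0..8 case numbering are assumptions. In the real project the resulting number would be what Processing writes to the Arduino over serial.

```cpp
// Sketch of the quadrant logic described above. Four lines divide the
// screen into a 3x3 grid; the centre of the face bounding box picks the
// quadrant, numbered 0..8 left-to-right, top-to-bottom.
const int SCREEN_W = 640;   // assumed capture size
const int SCREEN_H = 480;

int quadrant(int x, int y, int w, int h) {
    int cx = x + w / 2;             // centre of the bounding box
    int cy = y + h / 2;
    int col = cx / (SCREEN_W / 3);  // 0, 1 or 2
    int row = cy / (SCREEN_H / 3);
    if (col > 2) col = 2;           // clamp boxes touching the far edges
    if (row > 2) row = 2;
    return row * 3 + col;           // the "case" sent over serial
}
```

On the Arduino side, each case just nudges the pan or tilt servo one step towards the centre, which is why the camera settles with the face in the middle quadrant.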

## Comment viewing options

I'm preparing something like that, but with a Raspberry Pi. I've got the face-tracking software running, but the hardware for the robot is expensive, so I'm going step by step: first the RPi, then the code for face and body tracking, then the pan-and-tilt hardware like yours, then the chassis and the rest of the sensors.

Yes, price is the only factor stopping me from implementing this in a robot at the moment. Let me know when you have got your face-tracking code on your robot; I would really like to see it in action. My pan-tilt hardware consisted of 2 x 5 gram servos and double-sided tape, which was later upgraded with zip ties :)

Well, that's a nice project. Will see if I can use it for my "Stray" when I ever get time to proceed with that project :-)

I would love to see it operational on a robot! I'm getting more components off eBay, so expect to see my implementation of this in a robot in the coming months, but I would love to see what people do with this.

Well, it might take some time since I am not familiar with OpenCV yet :-)

For a start I will just grab some code from here and try things out. I will get two cheap webcams by tomorrow, so the hardware is already ready to be mounted...

After playing around with some values, the vertical jumpiness of the servos can be corrected by changing the tiltpos variable in the Arduino code to 0.5 instead of 1.
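One way to picture why the smaller step helps: since Servo.write() takes whole degrees, keeping the position in a float and stepping by 0.5 means the servo only actually moves on every second update. This is my own sketch of that idea, not the project's Arduino code; the variable names are assumed to mirror its tiltpos.

```cpp
// Hedged sketch: a float position stepped by 0.5 degrees, truncated to
// the integer value that would be passed to Servo.write(). With a step
// of 0.5 the written value changes only every second call, halving the
// visible jumpiness compared with a step of 1.
float tiltpos = 90.0f;
const float TILT_STEP = 0.5f;  // was 1 in the original sketch

int stepUp() {
    tiltpos += TILT_STEP;
    return (int)tiltpos;       // value that would go to Servo.write()
}
```

The trade-off is speed: the camera takes twice as many updates to cross the same angle, which is usually fine for face tracking.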