YARP - YARP: Android-controlled Robot Project [APP SOURCE CODE AVAILABLE]
YOUTUBE VIDEO: http://www.youtube.com/watch?v=q3u6t3t0W4k
Now it's time to present YARP to you. The name is a recursive acronym (http://en.wikipedia.org/wiki/Recursive_acronym): YARP: Android-controlled Robot Project. I got the idea for the acronym while playing a game called DeathSpank with one of my buddies. If you have played the latest part of the series, you will know where it comes from.
Now let's turn to some more robot-specific information: The project is not really hardware-heavy and I spend most of my time working on the software side of things. So if you are looking for exciting hardware stuff, this is maybe not the best place to find it. Nevertheless, I am going to start off by presenting the hardware part of the robot:
I have used the DFRobot COMB0004 kit (http://www.dfrobot.com/index.php?route=product/product&product_id=570#.UJPdgMVmJfc) as a base for this robot. Since this is the first robot I am building, I was looking for an easy way to get started while still being able to add more and more functionality as the project progresses.
The first picture is just a shot of the front of YARP:
As you can see, the front ultrasonic proximity sensor is framed by a plastic "head". This head was the case of a USB alarm clock that I just had lying around somewhere. The next picture shows that there is an RGB LED in the "head".
In the upper picture you can also see the ANKER battery that powers the Arduino. The motors are powered by a separate energy source: the 5x AA battery pack that came with the kit. The next picture shows the head glowing blue (it can be set to any color, see the video for a demo of speech-controlled color changing).
Here you can see a messy shot of the board and the wiring:
Here is a shot of the side of the robot:
That's it for the hardware. If you have questions, feel free to leave a comment and I'll try to answer them soon! But keep in mind that this is a software-focused project, so my main focus will stay on the software.
So let's get to the interesting stuff! If you want a little teaser before reading through the next part, please watch the robot's video! It summarizes my progress so far and how the robot works.
YARP has an Android device for a brain. This means there is almost no logic on the Arduino board. All it does is forward the sensor data to the brain and receive and process commands from the brain, such as setting motor parameters. In my opinion this approach has several advantages:
- There is no need to change the Arduino code when the robot's behavior needs to be changed
- Several behaviors can co-exist; adding a new one doesn't require deleting anything
- Modern Android devices have fast processors, more resources in general and many libraries that can be used freely.
- The Android device's sensors, camera and microphone are easily accessible
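To make the brain/board split more concrete, here is a minimal sketch of what a brain-to-board command could look like. Note that the message format ("M:&lt;left&gt;,&lt;right&gt;" for motor speeds) and all the names are my own invention for illustration, not the actual protocol YARP uses.

```java
// Illustrative sketch of a brain-to-board command protocol.
// The "M:<left>,<right>" message format is hypothetical, not YARP's real one.
public class CommandProtocol {

    // Encode a motor command the way the Android "brain" might send it.
    public static String encodeMotorCommand(int leftSpeed, int rightSpeed) {
        return "M:" + leftSpeed + "," + rightSpeed;
    }

    // Decode on the (conceptual) Arduino side: extract the two motor speeds.
    public static int[] decodeMotorCommand(String line) {
        if (!line.startsWith("M:")) {
            throw new IllegalArgumentException("Not a motor command: " + line);
        }
        String[] parts = line.substring(2).split(",");
        return new int[] { Integer.parseInt(parts[0]), Integer.parseInt(parts[1]) };
    }

    public static void main(String[] args) {
        String msg = encodeMotorCommand(120, -120); // opposite speeds: spin in place
        int[] speeds = decodeMotorCommand(msg);
        System.out.println(msg + " -> left=" + speeds[0] + ", right=" + speeds[1]);
    }
}
```

The point is that all the decision-making stays on the Android side; the board only needs to understand a handful of dumb messages like this one.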
For the basic communication I use the Amarino API (http://www.amarino-toolkit.net/) and therefore Bluetooth. A Bluetooth shield was bundled with the robot kit and it was quite easy to get it working. Look at Amarino's web page for an explanation of what it does. You can also find a Master's thesis by the author for a deeper look.
On the Arduino I have a sketch that provides methods to control the motors, servo and LED, plus functionality to send the Arduino's sensor data to the Android device. This sketch is kept really simple and only needs to be changed when the hardware configuration of the robot changes. This means I haven't had to touch it for the last week, even though I did a huge amount of work on the behavior side of the robot. I will provide the sketch as soon as I am sure about how to release the rest of the code as open source. See the "Code" section below for details.
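The other direction, sensor data flowing up to the brain, can be sketched the same way. Again, the "&lt;id&gt;:&lt;value&gt;" line format and the class name are assumptions for illustration only, not the format the actual sketch sends.

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

// Sketch of how the brain might parse sensor readings forwarded by the board.
// The "<id>:<value>" line format is an assumption, not YARP's actual format.
public class SensorParser {

    public static Map.Entry<String, Integer> parse(String line) {
        int sep = line.indexOf(':');
        if (sep < 0) {
            throw new IllegalArgumentException("Bad sensor line: " + line);
        }
        String id = line.substring(0, sep);               // e.g. "US" for ultrasonic
        int value = Integer.parseInt(line.substring(sep + 1).trim());
        return new SimpleEntry<>(id, value);
    }

    public static void main(String[] args) {
        Map.Entry<String, Integer> reading = parse("US:42"); // 42 cm to obstacle
        System.out.println(reading.getKey() + " = " + reading.getValue());
    }
}
```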
I put the most work into the "roboBrain" Android app. "roboBrain" is the working title; I hope there is no problem if someone trademarked that name. I have already written several thousand lines of code to create a framework for controlling Arduino devices. The app is able to control multiple Arduino devices and run different behaviors on each of them. Device and behavior mappings are configured in XML files which reside in a "robobrain" folder on the device's SD card. Robot part types and behaviors can be defined by implementing an abstract Java class and putting the new class in the specified Java package.
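The plug-in idea, an abstract base class that each behavior subclasses, might look roughly like this. All class and method names here are my guesses for the sake of a readable sketch, not the actual roboBrain API, and the command string it returns is likewise invented.

```java
// Hypothetical sketch of the plug-in behavior idea; the names and the
// "M:<left>,<right>" command string are invented, not roboBrain's actual API.
abstract class Behavior {
    abstract String getName();

    // Called whenever new sensor data arrives; returns a command for the robot.
    abstract String onSensorUpdate(int frontDistanceCm);
}

// A minimal obstacle-avoidance behavior: back up when something is close.
class ObstacleAvoidance extends Behavior {
    private final int thresholdCm;

    ObstacleAvoidance(int thresholdCm) {
        this.thresholdCm = thresholdCm;
    }

    @Override
    String getName() {
        return "ObstacleAvoidance";
    }

    @Override
    String onSensorUpdate(int frontDistanceCm) {
        // Reverse both motors if an obstacle is closer than the threshold,
        // otherwise keep driving forward.
        return frontDistanceCm < thresholdCm ? "M:-100,-100" : "M:100,100";
    }
}

public class BehaviorDemo {
    public static void main(String[] args) {
        Behavior b = new ObstacleAvoidance(20);
        System.out.println(b.onSensorUpdate(12)); // obstacle close: reverse
        System.out.println(b.onSensorUpdate(80)); // path clear: drive forward
    }
}
```

Because each behavior is just another subclass, several of them can sit side by side in the package and be selected via the XML configuration, which is exactly the "nothing has to be deleted" advantage mentioned above.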
As you can see in the video, I implemented three different behaviors for YARP. One of them just makes the robot go back and forth; it is really simple and just for testing purposes. There is also an obstacle avoidance mode, which would also be possible without the Android app. But the third one (with a lot more like it to come) allows the user to change the robot's head RGB LED color by voice, using the speech recognition functionality of Android devices. Doing something similar with just the Arduino would mean a huge amount of work and time spent by several clever heads.
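The core of such a voice-to-color behavior is just a lookup from recognized words to RGB values. The sketch below shows the idea; the word list, the color values and the class name are illustrative assumptions, not YARP's actual color table (Android's speech recognizer hands back a list of candidate phrases, which is what the method scans).

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of mapping recognized speech to RGB values for the head LED.
// The word list and color values are illustrative, not YARP's actual table.
public class VoiceColorMapper {
    private static final Map<String, int[]> COLORS = new HashMap<>();
    static {
        COLORS.put("red",   new int[] {255, 0, 0});
        COLORS.put("green", new int[] {0, 255, 0});
        COLORS.put("blue",  new int[] {0, 0, 255});
    }

    // The speech recognizer returns several candidate phrases; pick the
    // first one containing a known color word.
    public static int[] pickColor(List<String> candidates) {
        for (String phrase : candidates) {
            for (String word : phrase.toLowerCase().split("\\s+")) {
                int[] rgb = COLORS.get(word);
                if (rgb != null) {
                    return rgb;
                }
            }
        }
        return null; // no color word recognized
    }

    public static void main(String[] args) {
        int[] rgb = pickColor(Arrays.asList("make the head blue"));
        System.out.println(rgb[0] + "," + rgb[1] + "," + rgb[2]);
    }
}
```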
My next steps will be to allow behavior changing by voice and also a way for behaviors to switch to other behaviors themselves. This will make everything even more dynamic. I will also implement a speech-controlled movement behavior. Stay tuned for more!
The app is also built in a very generic way, meaning that it can quite easily be used to control other Arduino devices with different layouts. At some point I want to implement ways for different robots with different behaviors to communicate with each other, so a little robot swarm can be controlled by one phone. This is still some time away though. I will try to make this the topic of my next paper -> early 2013.
Here is the current class diagram, so you can get a basic overview of what I am talking about:
I've tested the application with an HTC Wildfire S (Android 2.3.5) phone and an HTC One X (Android 4.1.2). I guess it should work with every Android version in between, but it shouldn't be much lower than the Wildfire's, otherwise the speech recognition would probably not work. The only additional software needed is the Amarino apk, which you can get on the Amarino page mentioned above.
Before you ask this question a lot: right now there is no way for you to get the roboBrain app; see below for details.
This section will also provide some code snippets as soon as I am clear about the open-source licensing, see below.
As I am writing this code as the contribution to a study thesis for the Cooperative State University of Baden-Württemberg Karlsruhe, I have to find out what I need to do to be allowed to release the code to you under an open-source license. I don't think this will be a problem, so you should be able to look through the code and use it soon enough. As long as I am not sure about this matter, I will not show large amounts of code, since it is unlicensed and I don't want it to be "stolen" by someone who could easily publish it under his/her own name.
You are now able to get all the code that I wrote and will write for this project. You will also be able to follow the work, help me develop, report bugs, etc!
The code is released under the GPLv3 license. Basically this means you can use and modify it, even commercially, but any derivative work you distribute must be released under the same license.
I created a small website where you can find all the different links and information:
This is the current logo of the project:
Right now there is no detailed tutorial on how to use the framework for your own purposes! I am working on it. The code is pretty well documented though, so if you are experienced, you can give it a try.
I also added some new features that YARP can use. I implemented a Dance behavior that makes the robot play music, "dance" around and flash the LED in random colors taken from a color table. I'll probably add a new video some time soon.
This page will be updated again soon; I will present more features of the app and, if I have the time, upload more videos.
Comments for info box in upper right corner
*Time to build includes time to build the Android App which really is the biggest amount of work by far
*Costs depend on Android device used. I did not really add the costs of the device to the mix as it's not really built into the robot (yet)