
MiniEric robot

Plays fetch, fights fires, and navigates by map using different sensors for localisation.
Attachments:
- MiniEric.txt (311.01 KB)
- ScanMap.txt (10.24 KB)
- MotorController.txt (15.37 KB)
- WInterrupts.c (6.43 KB)
- TTSzip.txt (16.74 KB)
- SpeechControllerzip.txt (5.9 KB)
- R-Dev-Ino.zip (24.13 KB)

This is my latest robot, still a work in progress. It will take a while to finish, as I want to make it more and more complex; I intend to add every functionality I can to this robot. But here is the description:

MiniEric was born because I needed a multipurpose test platform to develop code for my big butler robot (Eric). I wanted a small replica of the big robot, incorporating almost all of its features: object retrieval, interaction with humans, mapping, object recognition, text to speech, self charging, eventually voice recognition (simple commands), and the ability to compete in several types of robotic competitions (line, maze, fire...). Some of the features are implemented, some are on the way. I am a weak coder, so I am slowly testing out bits of code and getting ideas from people on the net (I already got some from you guys, thanks a lot!).

I am using the Arduino platform, the brain being a Roboduino board. I am using all the pins on it, so I'll have to hook up a Duemilanove over I2C and downgrade the Roboduino to a servo controller. For the moment I am using a self-made dual DC motor controller over the UART (I haven't got the I2C slave working yet), built around a Tiny2313 and an SN754410, to drive a couple of Faulhaber motors with built-in encoders.

The robot has 8 servos: one for the waist, 2 for the shoulders, 2 for the arms, 2 for the pan/tilt head and one for a scanning sensor. On the head it has a Ping))) sensor and a thermopile array, and it will have an AVRcam. The sensor mounted on the scanner is a GP2D120, used for wall following or object retrieval. The arms can move independently (to point or wave) or together as a claw (to pick up objects). On the tips of the arms are suction cups that I want to attach to FSRs to sense when an object is grabbed (but I could steal the ASF idea...). The robot has a 2x16 serial LCD (custom made) that I should upgrade to a graphic LCD for mapping purposes. It has some programmed moves (stored in the EEPROM) and is able to play a small tune or beep. Did I mention that I hooked up an IR sensor and I can teach the robot new moves with the TV remote? The process is not so easy, but it beats the PC control method.
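For the curious, here is a minimal sketch of how the Arduino can send drive commands to a UART motor controller like mine. The 3-byte packet format ('M', left, right) and the speed encoding are made up for illustration; they are not necessarily what my Tiny2313 firmware expects:

// Minimal sketch: drive commands sent over the UART to a dual motor
// controller. The packet format ('M', left, right) is hypothetical,
// shown only to illustrate the idea.
void setMotors(int left, int right) {
  Serial.write('M');                      // command header (assumed)
  Serial.write((uint8_t)(left + 128));    // left speed, 0..255 (128 = stop)
  Serial.write((uint8_t)(right + 128));   // right speed
}

void setup() {
  Serial.begin(9600);                     // UART link to the motor controller
}

void loop() {
  setMotors(60, 60);                      // forward
  delay(1000);
  setMotors(0, 0);                        // stop
  delay(1000);
}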

I have decided to add a few pictures to highlight the building process of this robot. I started by CAD-ing it in Google SketchUp, then cut the parts from a poplar board I bought from Home Depot (it took me one afternoon to make and mount most of the parts) and used a piece of a big automotive hose clamp to fabricate the servo brackets. I used small wood screws (you can find them only at the hardware store, in packs with tiny hinges for small jewelry boxes), but I had to drill small pilot holes first so the wood would not crack. At first I used servos for driving, but they were too noisy for my ears; I hated it when the robot was running all over avoiding objects. So instead of installing a quadrature encoder, a small H-bridge and an ATtiny microcontroller inside each servo's case, I decided to get geared motors with built-in encoders: the Faulhaber motors from Electronic Goldmine. Over time the robot has suffered many small mods, and I guess it will happen again with the arms, as I am not happy with the current design. I need to re-shape them, perhaps add one more micro servo per arm for an elbow bend or for a hand... Here are some early pictures:

[early build pictures]

I hope you'll like it!

UPDATE: (Nov. 14th)

I have redesigned the robot's head and added 2 long-range Sharp IR sensors, mounted at 90 degrees from each other and 45 degrees from the head axis. I also added the AVRcam and a LED bar to act as a mouth when the robot speaks. I will use an ATtiny to drive the LEDs, using an ADC pin to read the voltage on the speaker; I've seen it done somewhere some time ago. I have added a color Nokia LCD to my motor controller board. I wanted to make the robot scan using the head pan servo and send the LCD commands over I2C, but that didn't work. So I moved the servo and the Ping))) sensor over to the motor controller board for testing purposes, and I finally got proper results. The colors on the LCD are still crap (for some reason this LCD is hard to set up properly), but I can display the distance and draw the pixels on the screen. Another weird thing: it seems the Ping))) sensor's max distance is 102-103 cm, but I didn't have time to see why. After I got the scan displayed properly I eliminated all the delays in my code and, to my surprise, it scans madly fast! Then I made it scan left to right and right to left, with a one-second delay between directions, to be able to see the map on the screen. You can see the result in the video. I also attached the code and the NokiaLCD library in the zip file (change the extension from txt to zip). I had to use the SoftwareServo library because the original Servo library causes problems with the display. Enjoy!
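The core of the scan is just a sweep loop: step the pan servo, refresh SoftwareServo, read the Ping))) sensor, plot the point. Here is a stripped-down sketch of the idea; the pin numbers are assumptions, and the LCD plotting call is left as a stub since the NokiaLCD setup lives in the attached zip:

#include <SoftwareServo.h>

SoftwareServo panServo;       // head pan servo driving the scan
const int pingPin = 7;        // Ping))) signal pin (assumed wiring)

// Standard Ping))) read: trigger a pulse, time the echo, convert to cm.
long readPingCm() {
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);
  pinMode(pingPin, INPUT);
  long us = pulseIn(pingPin, HIGH);
  return us / 29 / 2;         // ~29 us per cm, out and back
}

void setup() {
  panServo.attach(9);         // pan servo pin is an assumption
}

void loop() {
  // Sweep left to right, reading the distance at each angle. With no
  // delays in the loop, the scan runs surprisingly fast.
  for (int angle = 0; angle <= 180; angle += 2) {
    panServo.write(angle);
    SoftwareServo::refresh(); // SoftwareServo needs frequent refresh calls
    long cm = readPingCm();
    // drawScanPixel(angle, cm);  // hypothetical LCD plotting routine
  }
  delay(1000);                // pause before sweeping back
}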

Nov. 17th. Another update:

I have finally received the new R-Dev-Ino boards I designed for the robot; they will split all the functions over 4 or 5 modules. I'm using I2C for communications, and I have to say I'm pleased with how well it works. By the end of this week the robot will be ready for Fire Fighting, and all that will remain is to complete the mapping code and the vision code. Hmm, actually there are more things to do afterwards...
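As a rough idea of the inter-module traffic, the master module can poll a slave module with the Wire library like this (the 0x10 address and the one-byte payload are just for illustration):

#include <Wire.h>

// Master module: poll a slave module for one byte (e.g. a distance).
void setup() {
  Wire.begin();                 // join the I2C bus as master
  Serial.begin(9600);
}

void loop() {
  Wire.requestFrom(0x10, 1);    // ask the slave for 1 byte
  if (Wire.available()) {
    Serial.println(Wire.read(), DEC);
  }
  delay(100);
}

// The slave module would call Wire.begin(0x10) and register an
// onRequest() handler that sends the byte back.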

Nov. 27th.

After the Fire Fighting competition my next challenge is mapping. But until then, I want to make speech work. So far it's not intelligible, but I think I can tweak it a bit. It doesn't have to be perfect, just a bit better. Then I'll make the LED mouth work for a much more realistic feel. When the new Nokia color LCD shield gets here I'll continue my mapping efforts.

So here's a new video demonstrating the Voice and Speech. 

Update: Dec. 31st 2009.

I have attached the code for the Speech controller, which runs on a Mega328 with the Arduino bootloader installed, on one of my R-Dev-Ino modules. You need to download the SpeechControllerzip.txt file and the TTSzip.txt library, rename the files to change the extension from .txt to .zip, unzip them, copy the library to the proper place, and put the SpeechController code wherever you keep your Arduino sketches.

Update: Jan. 30th 2010.

I've decided to replace the two HXT900 servos in my robot's neck, and I got some Turnigy TGY-S3101S mini servos, a bit bigger and a bit stronger. After taking the robot halfway apart to remove a body part that I needed to cut to fit the new servo, and bending the servo bracket in weird ways to fit the new length, I managed to get it all back together and it was ready for the test. I loaded the ScanMap code on the micro and my jaw dropped in awe!!! The head moved perfectly, jitter free, 180 perfect degrees, and SILENT, like it was some sort of stealth robot... I poked the head to tilt it; it came back smoothly, with no extra oscillations... OMG!

I also took sensor measurements every 5 cm from 20 cm to 250 cm and had Excel come up with a new equation, so my Sharp sensors' measurements now fall right on top of the Ping))) sensor's measurements, and perfectly on top of each other (I have 2 Sharp sensors at 90 degrees to each other, so the measurements overlap in the middle portion). Great Gods of Robots! I am now ready to start the mapping stuff!
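For anyone trying the same calibration: the fit Excel gives for these Sharp sensors is usually a power law, so applying it in code is one line. The coefficients below are placeholders, not my actual fitted values:

// Convert a raw analogRead() value from a long-range Sharp IR sensor
// to centimeters with a power-law fit (the trendline type Excel
// suggests for these sensors). Coefficients are PLACEHOLDERS; use the
// ones fitted from your own measurements.
float sharpToCm(int raw) {
  return 10650.0 * pow(raw, -0.935);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(0);      // Sharp sensor on analog pin 0 (assumed)
  Serial.println(sharpToCm(raw));
  delay(100);
}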


Last week and weekend I worked hard on MiniEric to prepare him for Fire Fighting. I had to make him shorter to fit the height limitations, I gave him a paint job, changed the electronics and sensors, and managed to render the Nokia color LCD worthless... Oh well, lots of work and frustrations. So many things to do and so little time left before the competition. I tried to adapt Mike Ferguson's code for his Crater robot, but I couldn't figure out some of the logic and it wasn't working on my bot.

Trying to make the robot run in a straight line and turn precisely 90 degrees, I made a list of commands that would take him over the entire course, node by node. Of course, nothing was precise enough, and after going through half of the course the robot would hit the walls. I tried adding sensor measurements, but one Sharp sensor failed to work entirely, the other one was giving me odd measurements, and the Ping))) sensor was mounted too high (it read too close to the top of the walls) to be reliable. It works in a real house, just not in the Fire Fighting prop. If I tilt it, I can read the distance to the front wall, but I can't use it for side walls at an angle; the measurements are completely weird. So I ended up with a sensorless robot. But he was able to detect the candle flame using the thermopile array sensor, and the spraying mechanism worked fine.

So I went to the competition just to talk with the people there and show my robot for fun. After the competition started, I saw a robot that was just running from wall to wall, not knowing where it was; eventually, by luck, it would find the candle and try to put out the flame using a fan. It just hit me: it doesn't have to be a perfect bot to compete. I can do it with simple commands and, once in a room, look for the candle and put it out! I can do that! So I rushed to my laptop, changed the code, did a small test right there on the floor and headed to the officials to enter him in the competition. But it was too late; the competition was already halfway through... I did some more tests between rounds, directly on the course, and managed to get close to the candle, but the sprayer couldn't put it out for some reason. Of course, at home it works! Spooky candle they use, hard to blow out even with the fan.
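To give an idea of the "list of commands" approach, here is a sketch of how such a scripted run can be structured. The steps, distances and helper functions are made up for illustration; they are not my actual course script:

// Scripted, node-by-node run through the course. Everything here is
// illustrative: the real script, distances and motion routines differ.
struct Step { char cmd; int arg; };

Step course[] = {
  {'F', 46},   // forward 46 cm to the next node
  {'L', 90},   // turn left 90 degrees
  {'F', 92},   // forward into the room
  {'S', 0},    // scan for the candle with the thermopile, then spray
};

void driveForward(int cm) { /* send the distance to the motor controller */ }
void turnLeft(int deg)    { /* turn in place by deg degrees */ }
void scanForCandle()      { /* sweep the thermopile, aim, fire the sprayer */ }

void setup() {
  for (unsigned i = 0; i < sizeof(course) / sizeof(course[0]); i++) {
    switch (course[i].cmd) {
      case 'F': driveForward(course[i].arg); break;
      case 'L': turnLeft(course[i].arg);     break;
      case 'S': scanForCandle();             break;
    }
  }
}

void loop() {}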

All in all, it was a good day: I talked with the experts, met Mike Ferguson, saw his Izzy bot doing pretty well, talked with Jon Hylands, and saw his super-powerful robots (first and second place in the Mini Sumo experts division). I came home and did more tests, and this time I got the robot doing it almost perfectly. I took a video to show you how it was supposed to work at the competition (well, minus the small nudge).

Looks even better now, and the fire fighting works really nicely. Bravo!

As for the Fire Fighting competition, better luck next time. I see you as a perfectionist and I'm sure those candles will have no chance next time ;)

Thanks! I'm sure a year from now I'll be a much better programmer and this competition will be a piece of cake. I'll also try line following using the AVRcam. Having a year to prepare, I'll make sure that this time I won't get caught short, and I'll have it working flawlessly. (Yeah, I'm a perfectionist.)
I have updated the project with a new video. Let me know what you think.

Very nice work with the onboard voice recognition. Not many people have got it done without a PC interface.

Do you have more information or links about the VR and synthesis chips? I couldn't quite catch what they were called; your accent is very interesting.

Thanks!

The VR chip is this one: http://www.tigal.com/1770 (it comes with a demo for Arduino). I adapted the code from their demo to work on my robot.

The Speech "chip" is actually C code developed by Webbot from SoR as a speech synthesizer. I adapted that code (with a bit of help from the Arduino forum) to work in Arduino and added it to the Arduino module that takes care of the VR chip. I had to use a mega328 because the code grew past the 14 KB available in a mega168 with the bootloader; without the bootloader it would fit in a mega168, since it's almost 15 KB. Webbot told me that if I generate the phonemes from text and then play a bit with the parameters, I can improve the way the text is spoken. I'll try that and see how it goes. Since my robot speaks stored text, he also said it's better to store that text as phonemes rather than as actual text: faster response and better speech quality.
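To make the difference concrete, here is roughly what the two modes look like in a sketch. I'm assuming sayText()/sayPhonemes() style entry points like the Arduino TTS library ended up with; the names in the attached SpeechController code may differ, and the phoneme string is only a rough example:

#include <TTS.h>

TTS speech;   // speech output on the library's default PWM pin

void setup() {
  // Text mode: the text-to-phoneme conversion runs on the AVR each time.
  speech.sayText("Hello, I am MiniEric.");

  // Phoneme mode: pre-converted (and hand-tweaked) phonemes, so the
  // response is faster and the pronunciation can be tuned.
  speech.sayPhonemes("hxEHlOW");   // rough phoneme string, illustrative
}

void loop() {}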

The benefit of using a dedicated speech module is that it can listen for voice commands at all times while the robot does whatever it needs to do. I can stop its actions any time I want and give it different commands. The speech module on my robot is mainly an I2C master device but, depending on the situation, it can become a slave device.

Here is the original C code:  http://www.societyofrobots.com/member_tutorials/node/211

Oh, and my accent  is Romanian.

MiniEric is looking better and more advanced every time I check, now with the AVRcam and VR chip. And I see you got the speech synthesizer working too, though it doesn't sound that good yet. It should be possible to improve it, I think.

Anyway keep up the good work :)

Thanks!

Yeah, the speech code gave me some headaches, but I managed to make it work with a little help. It needs some improvements; I'll see what I can do. But even if the speech is not perfectly intelligible, I know what he wants to say, so I can understand it. It's like with small kids: the parents know what they're saying, but outsiders always have trouble understanding them.

It will take a while until I use the AVRcam. I have an Arduino-compatible library for it and I made some attempts to use it, but I had poor lighting, and that creates problems. In the apartment where I currently live there is plenty of light, be it day or night, so I hope I won't have as many problems.

I've been asked on SoR to make a tutorial about the VR and speech, so I'll do it over the next few weeks and link it here. Perhaps I'll enter it in the 5th tutorial contest to win an Axon II, who knows...

I guess we all feel that our bots are like our kids, and I imagine even more so with an advanced, and might I add cute, one like MiniEric. However, it should be possible to get some intelligible speech out of it. I recall playing with a speech synth as a kid on my Commodore 64; it didn't have much more resources available than an ATmega328.

On the other hand, image processing takes up quite a lot of resources. Some years ago I was playing around with image processing using my webcam, and it was quite heavy even on a PC, so I'm not sure what you intend to do with it, given the limited resources available in your current setup?