Let's Make Robots!
Attachments:
MiniEric.txt (311.01 KB)
ScanMap.txt (10.24 KB)
MotorController.txt (15.37 KB)
WInterrupts.c (6.43 KB)
TTSzip.txt (16.74 KB)
SpeechControllerzip.txt (5.9 KB)
R-Dev-Ino.zip (24.13 KB)

This is my latest robot, still a work in progress. It will take a while to finish, as I want to make it more and more complex and I intend to add every bit of functionality I can. Here is the description:

MiniEric was born because I needed a multipurpose test platform to develop code for my big butler robot (Eric). I wanted a small replica of the big robot, incorporating almost all of its features: object retrieval, interaction with humans, mapping, object recognition, text to speech, self charging, eventually voice recognition (simple commands), and the ability to compete in several types of robotic competitions (line, maze, fire...). Some of the features are implemented, some are on the way. I am a weak coder, so I am slowly testing out bits of code and getting ideas from people on the net (I already got some from you guys, thanks a lot!).

I am using the Arduino platform, the brain being a Roboduino board. I am using all the pins on it, so I'll have to hook up a Duemilanove over I2C and downgrade the Roboduino to a servo controller. For the moment I am using a self-made dual DC motor controller over the UART (I haven't got the I2C slave working yet), built around a Tiny2313 and an SN754410, driving a couple of Faulhaber motors with built-in encoders.

The robot has 8 servos: one for the waist, 2 for the shoulders, 2 for the arms, 2 for the pan/tilt head and one for a scanning sensor. On the head it has a Ping))) sensor, a thermopile array, and it will have an AVRcam. The sensor mounted on the scanner is a GP2D120 and is used for wall following or object retrieval. The arms can move independently (to point or wave) or together as a claw (to pick up objects). On the tips of the arms there are suction cups that I want to attach to FSRs to sense when an object is grabbed (but I could steal the ASF idea...).

The robot has a 2x16 serial LCD (custom made) that I should upgrade to a graphic LCD for mapping purposes. It has some programmed moves (stored in EEPROM) and is able to play a small tune or beeps. Did I mention that I hooked up an IR sensor and can teach the robot new moves with the TV remote? The process is not so easy, but it beats the PC control method.
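To give an idea of how the scanning sensor gets used (this is an illustration, not the code from MiniEric.txt), a bare-bones read of the GP2D120 on an Arduino looks like the sketch below; the analog pin and the linearization constants are assumptions, so calibrate against your own sensor:

// Minimal GP2D120 read (illustrative only, not the MiniEric code).
// The analog pin and the linearization constant are assumptions.
const int kIrPin = 3;            // GP2D120 output on the scanning servo (assumed analog pin)

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(kIrPin);  // 0..1023 from the 10-bit ADC
  // Common approximation for the GP2D120: distance (cm) ~ k / reading.
  // k ~ 2914 with a small offset works for many units; calibrate yours.
  float cm = 2914.0 / (raw + 5) - 1.0;
  Serial.print("Distance: ");
  Serial.print(cm);
  Serial.println(" cm");
  delay(100);
}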

I have decided to add a few pictures to highlight the building process of this robot. I started by CAD-ing it in Google Sketchup, then cut the parts from a poplar board I bought from Home Depot (it took me one afternoon to make and mount most of the parts) and used a piece of a large automotive hose clamp to fabricate the servo brackets. I used small wood screws (you can find them only at the hardware store, in packs with tiny hinges for small jewelry boxes), but I had to drill small pilot holes first so the wood would not crack. At first I used servos for driving, but they are too noisy for my ears; I hated it when it was running all over avoiding objects. So instead of installing a quadrature encoder, a small H-bridge and an ATtiny microcontroller inside each servo's case, I decided to get geared motors with built-in encoders: the Faulhaber motors from Electronic Goldmine. Over time the robot has gone through many small mods, and I guess it will happen again with the arms, as I am not happy with the current design. I need to re-shape them, and perhaps add one more micro servo per arm for an elbow bend or for a hand... Here are some early pictures:

[early build photos]

I hope you'll like it!

UPDATE: (Nov. 14th)

I have redesigned the robot's head and added 2 long range Sharp IR sensors, mounted at 90 degrees from each other and 45 degrees from the head axis. I also added the AVRcam and a LED bar to act as a mouth when the robot speaks. I will use a tiny to drive the LEDs, using an ADC pin to read the voltage on the speaker; I've seen it done somewhere some time ago.

I have added a color Nokia LCD to my motor controller board. I wanted to make the robot scan using the head pan servo and send the LCD commands over I2C, but it didn't work. So, for testing purposes, I had to move the servo and the Ping sensor over to the motor controller board, and I finally got proper results. The colors on the LCD are still crap (for some reason this LCD is hard to set up properly), but I can display the distance and draw the pixels on the screen. Another weird thing: it seems that the Ping sensor's max distance is 102-103 cm, but I didn't have time to see why. After I got the scan displayed properly, I eliminated all the delays in my code and, to my surprise, it scans madly fast! Then I made it scan left to right and right to left, with a one second delay between directions, to be able to see the map on the screen. You can see the result in the video. I also attached the code and the NokiaLCD lib in the zip file (change the extension from txt to zip). I had to use the SoftwareServo lib because the original Servo lib causes problems with the display. Enjoy!
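If you'd rather see the idea than dig through ScanMap.txt, here is a stripped-down sketch of the scanning loop: sweep the pan servo with SoftwareServo and take a Ping))) reading at each step. The pin numbers are assumptions and the Nokia LCD plotting is left out, so it just prints over serial:

#include <SoftwareServo.h>    // the standard Servo lib conflicted with the display

const int kPingPin = 7;       // Ping))) signal pin (assumed)
const int kPanPin  = 9;       // head pan servo pin (assumed)

SoftwareServo panServo;

// Trigger the Ping))) and return the echo distance in centimeters.
long readPingCm() {
  pinMode(kPingPin, OUTPUT);
  digitalWrite(kPingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(kPingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(kPingPin, LOW);
  pinMode(kPingPin, INPUT);
  long us = pulseIn(kPingPin, HIGH, 30000UL);  // 30 ms timeout
  return us / 29 / 2;          // ~29 us per cm, out and back
}

void setup() {
  Serial.begin(9600);
  panServo.attach(kPanPin);
}

void loop() {
  // Sweep left to right, one reading per degree; plot to the LCD here
  // instead of Serial if you have the NokiaLCD library installed.
  for (int angle = 0; angle <= 180; angle++) {
    panServo.write(angle);
    SoftwareServo::refresh();  // SoftwareServo needs this called regularly
    delay(15);                 // give the servo time to reach the position
    Serial.print(angle);
    Serial.print(" deg: ");
    Serial.print(readPingCm());
    Serial.println(" cm");
  }
  delay(1000);                 // pause before sweeping back
}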

Nov. 17th. Another update:

I have finally received the new R-Dev-Ino boards I designed for the robot, which will split all the functions over 4 or 5 modules. I'm using I2C for communications and I have to say I'm pleased with how well that works. At the end of this week the robot will be ready for Fire Fighting, and all that will remain is to complete the mapping code and the vision code. Hmm, actually there are more things to do afterwards...
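To give an idea of how the modules talk to each other (the address and the command bytes below are made up for illustration, not the actual R-Dev-Ino protocol), the master pushes a couple of bytes to a slave module with the Wire library; newer Arduino cores use Wire.write()/Wire.read(), while older ones used send()/receive():

#include <Wire.h>

const byte kMotorModuleAddr = 0x10;   // assumed I2C address of the motor module

// --- Master side (main brain) ---
void setup() {
  Wire.begin();                       // join the bus as master
}

void loop() {
  Wire.beginTransmission(kMotorModuleAddr);
  Wire.write('F');                    // hypothetical "drive forward" command
  Wire.write((byte)128);              // hypothetical speed value
  Wire.endTransmission();
  delay(100);
}

/* --- Slave side (runs on the motor module instead) ---
#include <Wire.h>
void receiveCommand(int howMany) {
  char cmd   = Wire.read();
  byte speed = Wire.read();
  // ...act on cmd/speed here...
}
void setup() {
  Wire.begin(0x10);                   // join the bus as a slave at this address
  Wire.onReceive(receiveCommand);
}
void loop() {}
*/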

Nov. 27th.

After the Fire Fighting competition my next challenge is Mapping. But until then, I want to make Speech work. So far it's not intelligible, but I think I can tweak it a bit. It doesn't have to be perfect, but at least a bit better. Then I'm making the LED mouth work for a more realistic feel. When the new Nokia color LCD shield arrives, I'm going to continue my mapping efforts.
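The LED mouth itself is a simple trick: sample the speaker line with an ADC pin and light more of the LED bar as the signal gets louder. Here is a rough illustration of the approach (pins and scaling are made up, and the real thing will run on the tiny, not on the Arduino):

// Illustrative "LED mouth": the louder the speech signal, the more LEDs light up.
// Pins and scaling are assumptions.
const int kAudioPin  = 0;                    // ADC pin tapped off the speaker line
const int kLedPins[] = {2, 3, 4, 5, 6};      // LED bar segments
const int kNumLeds   = 5;

void setup() {
  for (int i = 0; i < kNumLeds; i++) pinMode(kLedPins[i], OUTPUT);
}

void loop() {
  // Take a quick burst of samples and keep the peak, so short syllables register.
  int peak = 0;
  for (int i = 0; i < 50; i++) {
    int sample = analogRead(kAudioPin);
    if (sample > peak) peak = sample;
  }
  // Map the peak (0..1023) to how many LEDs to light.
  int lit = map(peak, 0, 1023, 0, kNumLeds);
  for (int i = 0; i < kNumLeds; i++) {
    digitalWrite(kLedPins[i], i < lit ? HIGH : LOW);
  }
}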

So here's a new video demonstrating the Voice and Speech. 

Update: Dec. 31st 2009.

I have attached the code for the Speech controller, which uses a Mega328 with the Arduino bootloader installed on one of my R-Dev-Ino modules. You need to download the SpeechControllerzip.txt file and the TTSzip.txt library, rename the files to change the extension from .txt to .zip, unzip them, then copy the library to the proper place and the SpeechController code to wherever you keep your Arduino sketches.
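If you just want to see how the controller gets called before downloading everything, a minimal test sketch looks roughly like this (treat the class name, the constructor argument and the sayText() call as placeholders and check the headers inside TTSzip.txt for the exact names):

// Rough test sketch for the Speech Controller; the class name, constructor
// argument and sayText() call are assumptions -- see TTS.h in TTSzip.txt.
#include <TTS.h>

const int kSpeakerPin = 10;      // PWM pin driving the speaker (assumed)

TTS voice(kSpeakerPin);

void setup() {
  voice.sayText("Hello, I am Mini Eric");
}

void loop() {}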

Update: Jan. 30th 2010.

I've decided to change the two HXT900 servos in my robot's neck, and I got some Turnigy TGY-S3101S mini servos, a bit bigger and a bit stronger. After taking the robot halfway apart to remove a body part that I needed to cut to fit the new servo, and bending the servo bracket in weird ways to fit the new length, I managed to get it all back together and it was ready for the test. I loaded the ScanMap code on the micro and my jaw dropped in awe!!! The head moved perfectly, jitter free, 180 perfect degrees and SILENT, like it was some sort of stealth robot... I poked the head to tilt it and it came back smoothly, with no extra oscillations... OMG!

I also took sensor measurements every 5 cm from 20 cm to 250 cm and had Excel come up with a new equation, so my Sharp sensor measurements now land right on top of the Ping sensor measurements and perfectly on top of each other (I have 2 Sharp sensors at 90 degrees to each other, so the measurements overlap for the middle portion). Great Gods of Robots! I am now ready to start the mapping stuff!
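For anyone doing the same calibration, the Excel trendline typically comes out as a power function of the raw ADC reading, and applying it on the Arduino looks something like the sketch below; the two coefficients are placeholders, so plug in the numbers from your own fit:

#include <math.h>

// Convert a long-range Sharp reading to centimeters using a power-law fit
// (distance = a * reading^b). The coefficients below are placeholders --
// use the ones Excel gives you for your own sensor.
const float kFitA = 65000.0;     // placeholder coefficient from the trendline
const float kFitB = -1.10;       // placeholder exponent from the trendline
const int   kSharpLeftPin = 1;   // analog pin for one of the two Sharps (assumed)

float sharpToCm(int raw) {
  if (raw <= 0) return 999.0;    // avoid pow(0, negative exponent)
  return kFitA * pow((float)raw, kFitB);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(sharpToCm(analogRead(kSharpLeftPin)));
  delay(200);
}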

Comments:
droidbuilder:

One of the more awesome robots here! Very impressive. I was bummed to hear you weren't able to compete in the fire-fighting contest, especially when you have put so much time and work into MiniEric. In any case, your effort is not wasted. Many here, I think, have noticed what a great robot this one is. Next year you will certainly have MiniEric ready to fight fires!

Thanks for telling us about the VR chip. The ideas are already churning here... :-) Now I just need the time!

 

Ro-Bot-X:

MiniEric went through a major makeover the week before the contest, including a wheel replacement to reduce its height, a paint job, replacing all the electronics boards, and making the sprayer work. All the low level functions had to be adjusted; nothing worked properly anymore. So I spent more time fixing and debugging low level functions than working on the contest-specific functions. And it still doesn't move the way I want it to. Now I'm taking a small break, just lurking the forums and answering questions, and preparing an article about MiniEric for Circuit Cellar magazine (digital version) that is going to be published soon. I'm also waiting for a new color LCD so I can continue the mapping efforts.

I'm going to make a tutorial about the voice recognition and speech for SoR pretty soon, so there will be more in depth information on this subject. 

Thanks for your comments guys! Keep them coming, and give me more ideas on how to improve this robot.

jgillick:
This is wicked cool. As your features get more advanced, you might want to consider having it run on top of a PC platform like the BeagleBoard (http://beagleboard.org/). It's a little bigger than the Arduino board but is a full 600 MHz computer with audio/video (i.e. text to speech). You could have the Arduino read the sensors and then send the data back to the PC for heavier processing or user interaction. If you want to get really slick, the Gumstix Overo (http://gumstix.com/) is the same thing as the BeagleBoard but smaller than a stick of gum. Just an idea. Great work, though!
Ro-Bot-X:

Thanks for the idea, but I'm not a computer programmer, and I had a hard time trying to do something too advanced for my capabilities with my butler robot Eric. You can see that robot here: http://seriousrobotics.wordpress.com/eric-the-butler-robot/

I decided to build MiniEric to get better at microcontroller programming, to be able to do all the low level functions for the big robot and then move the Navigation and A.I. to the PC. And of course, a smaller robot allows me to participate in different competitions too. So, when I'm done with MiniEric, I'll get back to re-building the big robot, since I had to give up some parts when I moved to Canada.

An important construction improvement was made in MiniEric: the ability to bend, grab an object from the floor and then lift it up (I want Eric to be able to put objects on a regular table or counter top). I am also considering making Eric like a Segway, but this is just wishful thinking at the moment. A preliminary experiment was done with this robot: http://seriousrobotics.wordpress.com/2008/12/22/balancing-roboduino/ but I could not make it drive around because of some limitations of the system. An accelerometer and a gyro are necessary to make a real balancer. But I'll experiment more later, perhaps make MiniEric balance.

robotnics997:
I think I misheard the voice coming out of the speaker. Anyway, now you can make MiniEric bow, he he he.
Ro-Bot-X:

Thanks for the comments guys!

Yes, MiniEric was already voice activated when I was doing the Fire Fighting tests. At the competition, I could not use the voice commands because of too much background noise; there was a FIRST competition with live amplified sound effects and music in another part of the Great Hall. The voice command works as you train it: some people trained it with the TV on in the background and it worked for them. I did it in silence, so I have problems if there is noise around me.

You can read an article about the VRbot chip in the latest issue of Robot Magazine. The guy used it on his Robonova, as this chip was designed for that robot, but as you can see, it works on any robot that has a serial port (I'm using software bit-bang on 2 digital pins, as that was the sample code I got directly from Tigal). The "bridge" mode did not work for me, so I had to use some wires to connect the chip to the Basic FTDI board I'm using for programming my boards. This way I could train it perfectly.
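For anyone who wants to skip the extra wires, a tiny relay sketch can do roughly what the hardware bridge does: pass bytes both ways between the PC serial port and the two pins the VRbot hangs off. Something along these lines (pins and baud rate are assumptions, so check your own wiring and the chip's documentation):

#include <SoftwareSerial.h>   // called NewSoftSerial on the older IDEs

const int kVrRxPin = 12;      // VRbot TX -> Arduino (assumed pins)
const int kVrTxPin = 13;      // Arduino -> VRbot RX

SoftwareSerial vrbot(kVrRxPin, kVrTxPin);

void setup() {
  Serial.begin(9600);         // to the PC / FTDI board
  vrbot.begin(9600);          // assumed VRbot baud rate -- check the datasheet
}

void loop() {
  // Relay everything both ways so the PC training tool talks straight to the chip.
  if (Serial.available()) vrbot.write(Serial.read());
  if (vrbot.available())  Serial.write(vrbot.read());
}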

I am not sure what you mean about the "miniEric bow" command, as I don't have one. I was just undecided for a moment about what command to say, and when I said "move" it had already dropped back to the main "say trigger" mode. There are 2 indicators that the robot is listening for commands: a beep and a red LED shining. The LCD also prints "say trigger" or "say command" and then displays what command it understood. If it doesn't understand, it jumps back to "say trigger".

The Speech library is almost ready; I am waiting for some input from the original developer, as I have asked him whether any improvements could be made. As soon as I have it completely done, I'll attach the code for my Speech Controller module, which includes monitoring the VR chip and speaking.

robotnics997:

The voice activation control was cool, but in the video it doesn't reply to the command "mini Eric bow". What was the problem? And was it already voice activated in the "fire test" video?

Aniss1001:

MiniEric is looking better and more advanced every time I check. Now with an AVRcam and a VR chip. And I see you got the speech synthesizer working too, though it doesn't sound that good yet. It should be possible to improve it, I think.

Anyway keep up the good work :)

Ro-Bot-X:

Thanks!

Yeah, the speech code gave me some headaches, but I managed to make it work with a little help. It needs some improvements; I'll see what I can do, but even if the speech is not perfectly intelligible, I know what he wants to say, so I can understand it. It's like with small kids: the parents know what they're saying, but outsiders always have trouble understanding them.

It will take a while until I use the AVRcam. I have an Arduino-compatible library for it and I made some attempts to use it, but I had poor lighting and that creates problems. In the apartment where I currently live there is plenty of light, be it day or night, so I hope I won't have as many problems.

I've been asked on SoR to make a tutorial about the VR and Speech, so I'll do it over the next few weeks and link it here. Perhaps I'll enter it in the 5th tutorial contest to win an Axon II, who knows...

robologist:
Speech is an awesome addition, hope the improvements go well. I really like this little guy, has some personality!