Let's Make Robots!

Using gestures and voice commands to control the robot

A couple of months ago we decided to participate in Intel's Perceptual Computing Challenge (http://perceptualchallenge.intel.com). The idea was to extend our driver cockpit application to control Veterobot (http://veterobot.org) with gestures and voice recognition. We use the detected thumb position in 3D space: the position along the horizontal axis serves as the steering signal, and the distance to the camera (depth) controls acceleration (forward and backward).
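The mapping from thumb position to drive commands can be sketched roughly as below. The axis normalization, dead zone, and gain values are illustrative assumptions for the sketch, not the values used in the actual application:

```python
def thumb_to_drive(x_norm, depth_m,
                   neutral_depth=0.5, dead_zone=0.05,
                   steer_gain=1.0, accel_gain=2.0):
    """Map a detected thumb position to (steering, acceleration).

    x_norm  -- horizontal thumb position, normalized to [-1, 1]
               (0 = image center)
    depth_m -- distance from thumb to camera, in meters

    Returns (steering, acceleration), each clamped to [-1, 1].
    A thumb closer than `neutral_depth` drives forward, farther
    away drives backward; a small dead zone suppresses jitter.
    """
    def clamp(v):
        return max(-1.0, min(1.0, v))

    steering = clamp(steer_gain * x_norm)
    if abs(steering) < dead_zone:
        steering = 0.0

    accel = clamp(accel_gain * (neutral_depth - depth_m))
    if abs(accel) < dead_zone:
        accel = 0.0

    return steering, accel
```

In the real application these two outputs would be fed to the same drive interface the cockpit application already uses for manual control.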

It was a fun exercise, but some more tuning is still required to achieve a smooth control experience, so it is work in progress.

Control Blender3D objects remotely

I just made a very simple example using ZeroC's Ice which illustrates how to control Blender objects remotely. Since Blender can run Python scripts, the idea is to write an Ice server that can modify certain object properties (for example, location). A corresponding client application can then invoke the server's methods remotely and thereby control the objects. Sources are available on GitHub. The first video illustrates how it works.
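Conceptually, the servant running inside Blender implements a method such as a location setter that writes into the scene. The sketch below shows only that core update logic, with a plain dictionary standing in for Blender's `bpy.data.objects` and the Ice transport code omitted; the class and method names are illustrative assumptions, not the actual project code:

```python
# Core logic of a servant that a remote client would invoke via Ice.
# A dict stands in for Blender's bpy.data.objects; inside Blender the
# assignment would be bpy.data.objects[name].location = (x, y, z).
class ObjectMoverI:
    def __init__(self, scene_objects):
        self.scene_objects = scene_objects  # name -> (x, y, z)

    def setLocation(self, name, x, y, z, current=None):
        # `current` mirrors the Ice.Current argument Ice passes to servants
        if name not in self.scene_objects:
            raise KeyError("unknown object: %s" % name)
        self.scene_objects[name] = (x, y, z)

    def getLocation(self, name, current=None):
        return self.scene_objects[name]

# Locally the servant is exercised like this; a remote client would
# obtain an Ice proxy and call proxy.setLocation("Cube", 1.0, 2.0, 0.0).
scene = {"Cube": (0.0, 0.0, 0.0)}
mover = ObjectMoverI(scene)
mover.setLocation("Cube", 1.0, 2.0, 0.0)
```

In the full example, an Ice object adapter hosts the servant inside Blender's Python interpreter, so the scene updates as soon as the remote call arrives.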

Codemotion Berlin 2013 conference

On 10-11 May 2013 there will be a Codemotion conference in Berlin. Quoting their website: "Codemotion is an innovative tech event engaging developers of all languages and technologies in presentations & conversations about: mobile, web, Makers, startup ideas, sustainability, game development and creative coding."

Testing the wheeled version

As announced in our previous post, we are working on a wheeled version of our robot. The tires and shaft adapters arrived today, and we mounted them to conduct a first test drive. Here are some photos:

Kinect and wheeled versions under development

It was pointed out to us that the tracked version might have problems on thick carpet or wet tiles. That is why we want to try the wheeled version and compare performance on different surfaces. Any thoughts on whether it will drive better than the tracked version? Some pros/cons of wheels vs. tracks?

The new "Shark" model of our robot

Last year we spent a considerable amount of time building the new version of our robotic vehicle. Here is the result: a small tracked vehicle which might be interesting for researchers in robotics, AI, and computer vision, as well as for DIY enthusiasts. The model, assembly instructions, and software are all open source.


Design of 3D printed enclosure - first printout

Here is a quick update related to this post about 3D printed robot enclosure.

The two videos below illustrate the goal - how it will look soon :-). And here is the first step in this direction: my first 3D-printed part of the enclosure, shown in the video. It took me about a week from receiving the package with the Ultimaker kit until the first reasonable printout came out :-).

How to build BeagleBoard based WiFi robot

Some time ago I announced my remotely (over the Internet) controlled robot. However, there was not much documentation or detail available. So with this post I would like to announce more detailed project documentation and give a brief overview of some interesting aspects of the project. All available documentation can be found on the project Wiki.

Design of 3D printed enclosure

We are currently working on the new version of our BeagleBoard-xM based robot. The main reason for the redesign was the lack of wheel encoders, which prevented us from developing more sophisticated navigation algorithms. That is why we decided to use the Dagu Rover tracked chassis. However, not all of our electronics fit into the chassis, hence the need for an additional enclosure on top of it.
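To illustrate why wheel encoders matter for navigation: with tick counts from both sides, a differential-drive (or tracked) robot can integrate its pose from odometry alone. A minimal dead-reckoning sketch follows; the ticks-per-revolution, wheel radius, and track width values are illustrative assumptions, not our robot's actual parameters:

```python
import math

def update_pose(x, y, theta, dticks_left, dticks_right,
                ticks_per_rev=360, wheel_radius=0.03, track_width=0.15):
    """Dead-reckoning pose update for a differential-drive robot.

    (x, y, theta) -- current pose in meters and radians
    dticks_*      -- encoder tick deltas since the last update

    Returns the new (x, y, theta).
    """
    meters_per_tick = 2 * math.pi * wheel_radius / ticks_per_rev
    d_left = dticks_left * meters_per_tick
    d_right = dticks_right * meters_per_tick

    d_center = (d_left + d_right) / 2.0         # distance traveled by center
    d_theta = (d_right - d_left) / track_width  # change in heading

    # Integrate along the average heading over the interval
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta
```

For example, equal tick counts on both sides advance the robot in a straight line, while unequal counts rotate it; without encoders, neither quantity is observable and the pose estimate drifts immediately.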