Let's Make Robots!

Would anyone be interested in an XMOS challenge?

UPDATE: If you have a YouTube account or use YouTube, add/view our "MyXMOS" channel for the latest videos and XMOS news!

 

UPDATE:

Version 9.9 of the XMOS Development Tools is now available:

http://www.xmos.com/technology/design-tools

It is VITAL that XK-1 users use this release or later, as the XK-1 is not supported in earlier versions.

As for everyone else, you will probably appreciate some of the cool new features:

http://www.xmos.com/system/files/releaseNotes9.9.0.txt

 

 

The XMOS challenge winners have now been announced!

 

Hi everyone!

I was curious to know if anyone would be interested in participating in a challenge to make something AWESOME with a new breed of processor that is particularly great for DSP, networking, USB and motor-control type applications - or for more basic things if that is overkill for what you want.

XMOS event-driven processors allow you to execute code in parallel, which could open up a number of possibilities for evolving your robot's "brain" to the next level!

I potentially have some development kits to give away, but before I share all the details I just wanted to see whether there is enough interest before adding anything to the challenges area.

People have made some cool stuff with our tech before (See videos).

 

Some details of the processor that will be on the dev kit (to be released soon):

 

- Single core device (Although we do have quad core versions - ask me)

- 400 MIPS per core.

- 8 Threads per core.

- 64KB RAM

- 8KBytes OTP memory for applications, boot code or security keys, with security mode

- 64 user I/O pins

- Support for high-performance DSP (32 × 32 → 64-bit MAC) and cryptographic functions

- Time aware ports provide up to 10ns timing resolution

- Designs implemented using a software-based design flow (you can program in C or XC, which is very similar to C but adds extras such as parallel execution - see the short sketch after this list)

- Scalable - can connect many kits together for crazy amounts of processing power using Xlinks.
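
To give a flavour of the XC extras mentioned in the list, here is a minimal, purely illustrative sketch; the LED ports (XS1_PORT_1A/1B) and the timing constants are assumptions, not the real dev-kit pinout.

// Illustrative XC sketch only: the LED ports and timing constants are
// assumptions, not the real dev-kit pinout. Each call inside "par" runs as
// its own hardware thread, and the timer wait is event-driven - the thread
// sleeps until the deadline instead of polling.
#include <xs1.h>

out port led_a = XS1_PORT_1A;    // hypothetical LED pins
out port led_b = XS1_PORT_1B;

void blink(out port led, unsigned period) {
  timer tmr;
  unsigned t, value = 0;
  tmr :> t;                             // read the current time
  while (1) {
    led <: value;                       // drive the pin
    value = !value;
    t += period;
    tmr when timerafter(t) :> void;     // wait for the next deadline
  }
}

int main(void) {
  par {                                 // two independent threads
    blink(led_a,  50000000);            // 0.5 s at the 100 MHz reference clock
    blink(led_b, 100000000);            // 1.0 s
  }
  return 0;
}

Channels (chan/chanend) then give you synchronised communication between threads, which is what the robot-to-robot and sensor-sharing ideas in the comments below come down to.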

 

For detailed spec see: http://www.xmos.com/products/xs1-l-family/l1lq128

 

EDIT: As there has been some interest shown, and to save time later, please post below with the following info if you would like to be considered:

1) Project Title

2) Project Description

3) How many dev kits you think it may require (e.g. you may need two if you are demonstrating one robot using image recognition to track another)

4) Are you willing to keep a video and/or photo diary of your progress if we choose you? Please state which, or both.

 

If anyone is interested, please feel free to post below with your project ideas (and subscribe to this forum topic for updates so I can contact you nearer the release date) so I can see if there is enough interest. I hope I have posted this in the right place; if not, please feel free to move it to the right part of the forum!

Welcome to the XMOS parallel world! A whole new set of possibilities is now open to you! I am glad it is all making sense to you now! Are you reading the XC tutorial PDF? That is indeed very nice, taking you through all the basics you need to turn yourself into one mean parallel programmer :)

1) Title unknown...

2) Two robots navigating and mapping an area (for example a house). I'd like to turn the 'bots loose on opposite ends of the house. When the two finally meet up in the middle (or wherever) they would share their maps with each other, so each bot will already know what it's going to find as it continues its journey.

3) Two dev boards would be required, one for each bot.

4) Photo diary would be kept, with occasional video.

 

I gotta say, I'm really interested in the parallel execution part. This could be used for the 'bot to process whatever it's just "seen" while already continuing on its journey. Also with your mention of scalability, when the two bots meet up, perhaps instead of just sharing information they could "tag-team" the information processing as well. Kind of like they'd just collect data, then crunch it as a pair.

Very interesting sounding processor. Sounds a lot more powerful than the AVRs I'm getting used to using.
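
Purely as a hedged sketch of that "collect data, then crunch it" split - get_range(), update_map() and plan_path() below are made-up placeholders, not a real API:

// Hedged sketch of the "collect data, then crunch it" split. get_range(),
// update_map() and plan_path() are made-up placeholders, not a real API.
// A plain XC channel is a rendezvous, so this acts as a two-stage pipeline:
// the sensing thread reads sample N+1 while the cruncher works on sample N.
#include <xs1.h>

int  get_range(void)          { return 42; }   // placeholder sensor read
void update_map(int reading)  { /* placeholder */ }
void plan_path(void)          { /* placeholder */ }

void sense(chanend c) {
  while (1) {
    int reading = get_range();    // keep sampling while the bot drives
    c <: reading;                 // hand the sample to the cruncher
  }
}

void crunch(chanend c) {
  while (1) {
    int reading;
    c :> reading;                 // take the next sample
    update_map(reading);
    plan_path();                  // the heavier work happens here
  }
}

int main(void) {
  chan c;
  par {
    sense(c);
    crunch(c);
  }
  return 0;
}

The tag-team idea is the same pattern with more crunch threads added to the par.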

This is really funky. It could be quite cool to extend this type of logic to a Roomba vacuum cleaner, if a user had two of them, so that they could clean more efficiently because they would know where all the obstacles are in half the time!

Sounds like a great parallel project as you said. Hopefully it will prove its worth vs your AVRs!

Hey, the Roomba idea's pretty good. Not only could they share where the obstacles are, they'd also know what has already been done by the other vacuum.

While I'd have to start a little lower-budget than a pair of Roombas, I'd definitely try to make it modular enough to easily expand to that kind of idea. I guess that would refine the overall idea from "communicating robots" to using a pair of robots to develop a library/module for data sharing and processing.

Another tidbit I'd like to comment on: thank you for using a Linux-supported IDE. Even if I don't end up working with this, y'all have extra points in my book for not making us Linux users jump through hoops to use your chips like some other guys (*cough* Parallax *cough*).

No problem! Given that most of us at XMOS come from a Computer Science or Electronic Engineering background, we all have some form of respect for Linux! I am glad we earnt some points in your book!

Indeed, for now you can just pretend the area the bot has gone over has been cleaned, instead of actually investing in lots of expensive vacuum technology lol.

I think it could work very nicely though if you did make it modular enough to port to a Roomba when you have a bit of extra cash some day! I am sure it would get blogged about and such.

Project - Outdoor Nav robot                                  

I'd like to build a robot using the XMOS that would read data from MEMS accelerometer and gyro sensors and combine them into IMU data. This data would be used to corroborate odometry data taken from incremental encoders, and possibly GPS, for accurate robot positioning in an outdoor environment. The XMOS device may also read ranging-sensor data, as well as provide PID motor control signals to H-bridges.

What form of analog inputs are available on the XMOS devices, if any?

Estimated requirement would probably be one device.

Project progress can be documented with video, photos, or both.

 

As stated possibly before, my background is more on the Computer Science side of things, though this should be doable.
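
Purely as a hedged illustration of the PID motor control piece mentioned above - the gains, the fixed-point scaling and the 0..1023 PWM range are all made-up assumptions, not from any XMOS library - the per-period update is only a few lines of integer maths:

// Hedged PID sketch: gains, scaling and the PWM range are illustrative
// assumptions, not from any XMOS library. State lives in a small struct so
// it can be kept by whichever thread runs the control loop.
typedef struct {
  int integral;       // accumulated error
  int last_error;     // error from the previous update
} pid_state;

// Called once per control period with speeds derived from the encoders.
// Returns a PWM duty in an assumed 0..1023 range for the H-bridge driver.
int pid_update(pid_state &s, int target_speed, int measured_speed) {
  const int KP = 200, KI = 10, KD = 50;      // gains, scaled by 1/256

  int error = target_speed - measured_speed;
  s.integral += error;                       // integrate steady-state error
  int derivative = error - s.last_error;     // change since the last update
  s.last_error = error;

  // >> 8 undoes the 1/256 gain scaling so everything stays in integer maths
  int output = (KP * error + KI * s.integral + KD * derivative) >> 8;

  if (output > 1023) output = 1023;          // clamp to the PWM range
  if (output < 0)    output = 0;
  return output;
}

A loop like this could sit in one thread, paced by a timer, while other threads read the encoders, the IMU and the ranging sensors and pass their results over channels.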

We already have GPS code, which you should be able to reuse, from our open-source XMOS community over at Xlinkers. I am sure people have used gyros etc. before too, so I highly advise asking any specific hardware-related questions there; they can give you a much more in-depth answer than I can, as I come from more of a software background.

I believe you would need to attach some form of ADC if you wanted to get data from an analog source on one of our dev cards, in the same way one of the videos uses an ADC hooked up to a LEGO Mindstorms color sensor as input.

Is that ok?

 

 

Adding an ADC would be OK. It is a very useful peripheral, along with timers and communication methods. It looks like there are some good timing capabilities. Communications would include SPI and I2C; those may have already been worked out as well.

Check out our community site for existing code related to SPI/I2C:

 

SPI:

http://www.xlinkers.org/node/47

 

I2C:

 

http://www.xlinkers.org/node/67
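
And just to make the ADC idea above concrete, here is a very rough, bit-banged sketch of clocking a reading out of a generic SPI ADC. The port choices, the 12-bit word length and the edge on which data is sampled are assumptions about a hypothetical part; check the datasheet, and prefer the SPI component linked above for real use.

// Hedged sketch: bit-banging a reading out of a generic SPI ADC in XC.
// Port choices, the 12-bit word length and the sampling edge are assumptions
// about a hypothetical ADC - use the xlinkers.org SPI component for real work.
#include <xs1.h>
#include <print.h>

out port adc_sclk = XS1_PORT_1A;   // assumed pin assignments
out port adc_cs   = XS1_PORT_1B;
in  port adc_miso = XS1_PORT_1C;

int read_adc(void) {
  int value = 0;
  adc_cs <: 0;                     // assert chip select
  for (int i = 0; i < 12; i++) {   // shift in 12 bits, MSB first
    int bit;
    adc_sclk <: 0;                 // clock low
    adc_sclk <: 1;                 // clock high - assume data is valid here
    adc_miso :> bit;               // sample the data line
    value = (value << 1) | (bit & 1);
  }
  adc_cs <: 1;                     // release chip select
  return value;
}

int main(void) {
  printintln(read_adc());          // print one raw conversion result
  return 0;
}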

I'm in the planning stages for a bot that is essentially an intelligent beer cooler on big wheels. It will seek out (and serve, photograph, videotape, and follow) the humans it encounters as it maps my backyard or another environment. It will then upload its (mostly visual) data to a blog describing its adventures. I haven't settled on a processor yet ... does this sound like something XMOS might be useful for?