This looks like a really exciting development in the field of emotion-based robotics.
If it's not a rigged demo, I think it is technologically possible. Look at mtrplette's Super Droid Anna.
I really believe they are onto something. The Jibo has about the minimum needed for emotional attachment. It can't move on its own (except to rotate its body and move its head). It has no hands. They didn't make the mistake of putting any kind of human face on it. Therefore it stays well outside the Uncanny Valley and can be thought of as cute.
I'm still hoping to program something interesting, which is why I've been hip deep in PDF files from universities on natural language processing for the past week or so. Though it would be great if there were a secure web service that would help us with the conversational and emotive parts of our bots.
I think they are onto something with Jibo. I think ultimately the women out there will show us that all this effort on moving around was overrated, and that what sells is something like Jibo. The minimalist approach of eliminating arms and legs was brilliant in my humble opinion (it avoids so many issues), and probably leaving off a face too. I think avoiding the uncanny valley is doable if the face is a bit cartoony or simple, but their idea of not risking it at all is probably better; R2-D2 didn't have one. I felt like I immediately had to have a Jibo when I saw the ad. I have long argued that a legless robot can easily climb stairs and move around a home (by asking a human to carry it) if the bot is endearing enough. I guess it all comes down to how good the software is at getting us attached.
As for the web service idea: this may not be the place to bring it up, but I have been thinking of putting one up myself. My thought is that version 1 would have a web service that bots and devices would call, plus a website that would let developers control behaviors, emotions, and settings, configure agents, maintain conversational elements, and test the AI without needing a bot. The service could handle sensor and actuator data, but probably wouldn't be intended to; instead it would pass back "commands" to be handled by the bots.
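A minimal sketch of what that command-passing contract might look like, assuming the service replies with JSON; the field names (`commands`, `name`, `args`) and the example handlers are hypothetical, not part of any real API:

```python
import json

def handle_response(payload, handlers):
    """Dispatch each command in the service's JSON reply to a local handler.

    The service never drives sensors or actuators directly; it only passes
    back named commands for the bot itself to interpret.
    """
    results = []
    for cmd in json.loads(payload).get("commands", []):
        handler = handlers.get(cmd["name"])
        if handler:  # silently skip commands this bot doesn't support
            results.append(handler(cmd.get("args", {})))
    return results

# Example reply asking the bot to speak and rotate toward the user.
reply = json.dumps({
    "commands": [
        {"name": "say", "args": {"text": "Hello!"}},
        {"name": "rotate", "args": {"degrees": 45}},
    ]
})

# Each bot registers whatever behaviors its hardware actually supports.
handlers = {
    "say": lambda a: f"speaking: {a['text']}",
    "rotate": lambda a: f"rotating {a['degrees']} degrees",
}

print(handle_response(reply, handlers))  # → ['speaking: Hello!', 'rotating 45 degrees']
```

The nice property of this shape is that the service stays hardware-agnostic: a Jibo-style desktop bot and a wheeled bot could share the same conversational backend and just register different handlers.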
Please search for GIT-CC-95-10
This is a thesis (250 pages or so) about natural language parsing. I'm reading the proposal now, and it's got a lot of your ideas in it also.
Thanks for the paper. I always appreciate your references. I started into it about 1:30am last night. Try as I did, I can't say I understood much of what I was reading. I'll give it another go tomorrow probably.
It has a feminine appeal alright. I would guess it is no accident that it takes a few design cues from Eve in the WALL-E movie.
I like it. Dr. Cynthia may have cracked the mass marketable digital assistant code.
I hope it sells better than the Wowwee Alive Chimp head.
Graft this neural network learning code and graphics into Jibo and watch the newest Tamagotchi "key chain pet" sweep the world.
The creator thinks babyX has "sailed over the uncanny valley". I am not so sure.