LIREC at CeBIT, some thoughts.

Last week in Berlin, I gave a talk about LIREC, describing some of my thoughts on how we ended up exhibiting our work and presenting robots at CeBIT.

CeBIT is a small city full of the latest in business technology and academic research, and it's mostly targeted at a German audience. Many of the exhibits weren't in English, and most of the content this year focused on cloud computing for businesses. It's a fairly dry environment in which to present robots, but thankfully we weren't the only ones doing so, and that made our hall a bit of an attraction.

We presented four elements of LIREC, namely EMYS, Little Mozart, Pleo and iCat, as live demos. As every technologist knows, live demos are very tricky, but all things considered we were able to keep everything working for five straight days, which was brilliant.

EMYS was set up as a demo that imitated the facial expressions of anyone sitting in front of it. This isn't how EMYS usually works, but it was an easy demo considering we only had the head and not FLASH, the rest of the body. People sat in front of it, a camera in EMYS's nose identified their facial features, and with the help of some open-source software those features were tracked in real time so the motors in EMYS's head could move to imitate that person. It was a rather slow process (around 10 seconds for the camera to locate the person and then focus on their face), but that didn't seem to bother people, who were genuinely fascinated by the head. Not unlike dogs sniffing each other, the test user would try to figure out how EMYS was reading their features, accentuating a smile, raising their eyebrows and so on. This proved tricky of course, as EMYS doesn't have a large number of facial features to play with, but there was enough resemblance for people to recognise themselves in the robot.
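For anyone curious what that kind of imitation loop looks like in code, here's a minimal sketch in Python. The track_landmarks function and the EmysHead motor interface are hypothetical placeholders; the post doesn't name the actual open-source tracker or control stack used in the demo, only that a webcam feed drives the head's motors.

    # Rough shape of an imitation loop like the EMYS demo: webcam frames in,
    # a few facial measurements out, head motors driven from those measurements.
    # track_landmarks() and EmysHead are hypothetical placeholders for whatever
    # open-source tracker and motor interface the real demo used.

    import cv2  # OpenCV, used here only to grab webcam frames


    def track_landmarks(frame):
        """Placeholder: return coarse facial measurements found by a face
        tracker, or None while the tracker is still locking onto a face."""
        raise NotImplementedError


    class EmysHead:
        """Placeholder motor interface: one method per degree of freedom."""
        def set_smile(self, amount): ...
        def set_eyebrows(self, amount): ...


    def imitation_loop(camera_index=0):
        cap = cv2.VideoCapture(camera_index)
        head = EmysHead()
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            measurements = track_landmarks(frame)
            if measurements is None:
                continue  # still locking on (the ~10 second wait people saw)
            # Map a handful of coarse measurements onto the head's few features.
            head.set_smile(measurements["mouth_curvature"])
            head.set_eyebrows(measurements["eyebrow_raise"])

The point is simply that a handful of coarse measurements (smile, eyebrows) can give the impression of imitation, even on a head with very few degrees of freedom.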

Secundino Correia was there to present Little Mozart on the iPad, which launched that week and has been very popular since. Little Mozart is a learning companion that teaches children how to compose music that sounds good. They get to feel supported by this "virtual agent", which uses the same principles of empathy as the other LIREC demos.

We also presented the Pleo migration hack, where the toy dinosaur was hacked to communicate via Bluetooth with a mobile Android game (instructions coming soon!). This is incredibly exciting work and shows that our relationship with robots can extend beyond a single physical object. This is what people often call the "internet of things", but in the project we call it "migration": the ability of an agent to move from one platform to another.
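To make the idea of migration a bit more concrete, here's a conceptual sketch in Python of an agent packing up its state and handing it over Bluetooth to a phone. Everything specific in it, the state fields, the Bluetooth address and the RFCOMM channel, is invented for illustration; the real hack's protocol will be in the instructions when they're published.

    # A conceptual sketch of "migration": the agent's state is packaged up on
    # one platform and unpacked on another, so the same companion can move from
    # the Pleo body into a phone game. The state fields, Bluetooth address and
    # RFCOMM channel below are invented for illustration only.

    import json
    import socket


    def pack_agent_state():
        """Collect whatever the agent needs to carry across platforms."""
        return {
            "name": "pleo",
            "mood": 0.7,                          # current affective state
            "memories": ["played fetch", "was petted"],
        }


    def migrate_to_phone(bt_address, channel=1):
        """Send the packed state over a Bluetooth RFCOMM socket
        (AF_BLUETOOTH sockets are available in Python on Linux)."""
        payload = json.dumps(pack_agent_state()).encode("utf-8")
        with socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                           socket.BTPROTO_RFCOMM) as sock:
            sock.connect((bt_address, channel))
            sock.sendall(payload)

The receiving game would do the reverse: accept the connection, decode the state and carry on as the same companion on the new platform.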

Finally, our Portuguese partners at INESC ran an ongoing contest for attendees to beat iCat at a game of chess. Some Russian attendees got especially involved. The cat was not only playing against contenders but also reading their facial expressions with the help of a webcam and the emotion-tracking software developed by Queen Mary University. This made the cat give out clues or say encouraging things depending on how stressed the player looked and how the game was going.
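The interesting bit is the combination of the two signals, how stressed the player looks and how the game is going. Here's a toy sketch of that commentary logic in Python; the thresholds and phrases are made up, and the real system's rules are certainly more subtle.

    # A toy version of the iCat's commentary logic: combine an estimated stress
    # level from the emotion tracker with a chess evaluation of the position and
    # pick something to say. Thresholds and phrases are invented; they only
    # illustrate reacting to both the player and the game at once.

    def choose_utterance(player_stress, score_for_player):
        """player_stress in [0, 1]; score_for_player in pawns, positive when
        the human is ahead."""
        if player_stress > 0.7 and score_for_player < -1.0:
            return "Take your time. Have another look at your bishop."  # give a clue
        if player_stress > 0.7:
            return "You're doing fine, no need to rush."                # reassure
        if score_for_player < -1.0:
            return "Careful, I think I have the upper hand here."       # tease gently
        return "Nice move!"                                             # encourage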

What we learnt from this process is that robots in a semi-academic context like LIREC's are hard to explain quickly. People want to know what the project hopes to achieve, and the exploratory nature of the project makes that hard to answer. Businessmen wanted to know where the opportunities lay, and what we told them was that this type of research is the tip of the iceberg when it comes to predicting future consumer behaviours. We are surrounding ourselves with more and more smart products which will act more and more like everyday companions. The ideal robot might be a washing machine that reads our faces, not only the robots of science-fiction writing. This is an ongoing exploration both of what we are capable of and of what humans do naturally and effortlessly, and of how both worlds can work together to create more natural and empathic tools. The best is still to come.

Posted by designswarm on Thursday, 24 May, 2012
