Thursday, September 17, 2015


We were invited to present our robotic tutor at the BBC as part of their ongoing week-long programme called Intelligent Machines. In a series of shows, the BBC is exploring the effects of new technology, specifically robotics and artificial intelligence, on our lives. Are we going to lose our jobs to robots? How safe are they? Can we feel empathy towards our robotic friends? These are a few of the questions they are exploring.

We presented our robot at the BBC media cafe.

Wednesday, September 16, 2015

Empathic robotic tutors

Our robot is an empathic tutor developed as part of the European Union funded project EMOTE. The project has several partners across Europe, comprising teams from the UK, Portugal, Sweden and Germany. Our empathic tutor is an autonomous agent that presents tasks to children aged 10-15 and helps them out as they work on them. It is empathic because it can sense the emotional state of the student(s) and adapt its responses accordingly. We have developed two scenarios: a treasure hunt and an urban planning game.

In one scenario, the task for the students is to carry out a treasure hunt. The map is laid out on a huge touch table. They are given tasks such as "find the museum about 300 meters north of your current location". Students answer by touching the appropriate feature on the map. Their answer is diagnosed as correct, incorrect or partially correct, and the tutor responds with appropriate feedback. If they get it wrong, it tries to help them by suggesting appropriate tools to use, prompting them, giving them hints, etc. These pedagogical tactics are adapted to the skills of the learner: as learners progress and their skill levels increase, less help is provided.
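The adaptive part of this loop can be sketched as a simple policy that maps an answer diagnosis and an estimated skill level to a pedagogical tactic. This is a minimal illustration only: the tactic names, thresholds and function below are invented for this sketch, not the actual EMOTE implementation.

```python
# Hypothetical sketch of adaptive feedback selection. Tactic names and
# skill thresholds are illustrative, not the real EMOTE logic.

def choose_feedback(diagnosis: str, skill: float) -> str:
    """Pick a pedagogical tactic given an answer diagnosis and the
    learner's estimated skill level (0.0 = novice, 1.0 = expert)."""
    if diagnosis == "correct":
        return "praise"
    if diagnosis == "partially_correct":
        # Skilled learners get a nudge; novices get a concrete hint.
        return "prompt" if skill >= 0.5 else "hint"
    # Incorrect answers: scaffold more heavily for novices.
    if skill < 0.3:
        return "suggest_tool"   # e.g. point at the map's measuring tool
    elif skill < 0.7:
        return "hint"
    return "prompt"             # skilled learners get minimal help
```

The key property, matching the description above, is that the same diagnosis yields less intrusive help as the skill estimate rises.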

In the other scenario, the tutor interacts with two students. They play a game called Enercities, an urban planning game in which the team builds a city. To build the city, they have to balance constructing features pertaining to the economy, the environment and the citizens' wellbeing. The role of the tutor is to play the game with the students as a team and reinforce the cause-and-effect relationships between concepts in sustainable development.
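To give a feel for the kind of trade-off the game poses, here is a toy model of multi-dimensional scoring. The building types, numbers and the min-based score are all invented for illustration; the actual Enercities scoring rules are not described in this post.

```python
# Toy model of Enercities-style trade-offs. All values are invented
# for illustration; they do not reflect the real game's rules.

BUILDINGS = {
    # building: effect on (economy, environment, wellbeing)
    "factory":   (+3, -2,  0),
    "park":      ( 0, +2, +1),
    "wind_farm": (+1, +1,  0),
    "housing":   (+1, -1, +2),
}

def city_score(placed):
    """Reward balance: the city is only as strong as its weakest
    dimension, so stacking factories alone never wins."""
    totals = [0, 0, 0]
    for building in placed:
        for i, delta in enumerate(BUILDINGS[building]):
            totals[i] += delta
    return min(totals)
```

Under this toy rule, a factory-only city scores poorly on the environment, which is exactly the cause-and-effect lesson the tutor is there to reinforce.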

In both these scenarios, I was involved in the design and implementation of the central decision-making module, called the Interaction Manager (IM). The IM is the module that decides what to do based on the state of the interaction. We designed and built an IM engine that can be loaded with IM scripts, written in XML, which tell the engine how the conversation should proceed. Based on our consultations with educationalists and psychologists, we implemented scripts for both scenarios.
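The engine-plus-script split can be sketched as a small state machine that loads rules from XML. The script format and class below are illustrative assumptions; the actual EMOTE script schema and engine are more elaborate.

```python
# Minimal sketch of an Interaction Manager engine driven by an XML
# script. The <state>/<on> schema is invented for illustration and is
# not the actual EMOTE script format.
import xml.etree.ElementTree as ET

SCRIPT = """
<script>
  <state name="task_presented">
    <on event="answer_correct" action="give_praise" next="next_task"/>
    <on event="answer_wrong" action="give_hint" next="task_presented"/>
  </state>
</script>
"""

class InteractionManager:
    def __init__(self, script_xml):
        root = ET.fromstring(script_xml)
        # Build a lookup table: (state, event) -> (action, next_state)
        self.rules = {}
        for state in root.findall("state"):
            for rule in state.findall("on"):
                key = (state.get("name"), rule.get("event"))
                self.rules[key] = (rule.get("action"), rule.get("next"))
        # Start in the first state declared by the script.
        self.state = root.find("state").get("name")

    def handle(self, event):
        """Return the tutor action for this event and advance the state."""
        action, next_state = self.rules[(self.state, event)]
        self.state = next_state
        return action
```

The point of the design is that the engine is generic: swapping the treasure-hunt script for the Enercities script changes the conversation without touching the engine code.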

The system was tested in several schools across Europe, and we are currently analysing the results. Overall, the students seem to have enjoyed interacting with the robot.

Friday, August 1, 2014

Romo: Convert your smartphone into a robot!

Just a few days ago, I wrote about a new personal robot called Jibo, which is being developed by a team at MIT led by Dr Cynthia Breazeal. You can find my blog article here.

After I wrote that article, I was reminded of another such robot I had seen some time ago in a TED video. The robot is called Romo and is very similar to Jibo, in the sense that it has no actuators (i.e. no hands) and has a cute face. Just like Jibo, it can turn around and track you walking across the room. But unlike Jibo, it can actually move around as well.

Romo: A smartphone powered robot

It is actually a smartphone sitting on a dock that can move; the dock is instructed to move by the smartphone. In a sense, the smartphone is the head and the dock is the body. The smartphone itself can tilt forwards and backwards. At the time of the talk, the robot could be remote controlled using an iPad and therefore used as a telepresence robot: you can see on the iPad what the robot sees remotely, and move the robot using the iPad as the controller.

The face of the robot is cute and cartoonish. It can express a range of emotions via its face and its body. For instance, it can move backwards a bit and tilt backwards to express fear.

Romo is available now. All you need to buy is the dock, for around 130 USD. Install the latest Romo app on your iPhone (4/4S/5/5C/5S), dock it, and your robot is ready.

In the TED talk, there was no demonstration of autonomous behaviour of the kind Jibo promises, such as being given a task (say, ordering Chinese food for two) and carrying it out by itself, or being commanded to take a picture. I believe such behaviours could be added to Romo using the SDK provided, as could the capability of holding a conversation. If Romo could exhibit autonomous behaviours like these, it would be a better product than Jibo, simply because I could undock my phone, go to work, and when I come back, dock it back in and have a robot.

And if you want to buy one, here is where you can.

Wednesday, July 23, 2014

Jibo: your very own personal robot

Jibo is not the regular robot that you are used to seeing. It has no legs and can't walk. It has no hands and can't hold stuff for you. But what it can do is sit there with you and have a conversation. It looks like a smartphone, but with limited mobility. If you call the top part its head, then what Jibo can do is turn its head from side to side. The bottom part ("the body") is not mobile, so it sits on your table or desk and looks around the room. If you are walking around the room, it can turn its head and track you. The head seems to contain a screen to display stuff and a camera to shoot pictures and videos.

(Picture from the video below)

It has a face that is displayed on the screen, unless it wants to show you something else on the screen. The face is a simple circle that changes shape depending on its mood: if it is happy, it turns the circle into a smiley. It's simple and neat.

It responds to your voice commands, or maybe even unrestricted speech. So you can ask it to order takeaway food, read voice mails, take a picture, call someone, etc. It can tell stories to your little ones and keep them company. You can call it a companion bot, if you like.

The product itself will be available in 2016 and is being developed by Dr Cynthia Breazeal and her team at MIT. The project has raised a whopping $1 million on Indiegogo in just a week (about 10 times the funds it originally asked for).

Will this be the new future for personal robotics? Will these robots get into our homes just like smartphones got into our pockets? Will these be our future homemakers? Let's wait and watch!

Tuesday, April 8, 2014

Amazon finds its voice!

Amazon has recently introduced two new products: Fire TV, a new TV box, and Amazon Dash, a stand-alone gadget to scan and order groceries.

Fire TV is a set-top box that streams video on to your TV and competes with Apple TV and Google Chromecast. Amazon Dash is a novel gadget that lets you scan the barcodes on the products you want to buy and later adds them to your checkout basket on Amazon Prime Fresh, an Amazon website that sells groceries and fresh foods.

What's new and interesting to me is the voice search feature on both of these new services from Amazon. On Fire TV, you can press the mic button on the remote and say the name of a show or actor you want to watch, and Fire TV searches for it. I think this is just the initial version, and maybe later Amazon will come up with a full-fledged conversational interface where you can have a conversation about the movies, actors and directors you like, and it can present you with movies you might be interested in watching. Similarly, on Amazon Dash, you can press the mic button and say the things you need to buy, because you cannot always find a barcode for them. For instance, when you bite into the last of your apples, you say "Apples" to Amazon Dash and they get added to your grocery list. You can order them later and get them delivered to your doorstep. As with Fire TV, I think this is only the beginning.

It's good to see Amazon embracing speech technology after so many years; I thought it would have happened sooner. With technology giants like Google (Now), Apple (Siri) and Microsoft (Cortana) having already embraced speech and conversational technologies, the latest from Amazon reassures us that our conversations with machines are here to stay.

Friday, March 7, 2014

I love 'Her'. I mean the movie! :)

Curious to see what Hollywood has done with the concept of conversational systems, we went to watch the movie 'Her' as a team. Our team comprises people who work in artificial intelligence, natural language processing and, more importantly, conversational systems. I think 'Her' builds on Steven Spielberg's movie 'AI', in that it takes emotions on a computer to a new high. 'Her' is a love story between a man, Theodore Twombly, who has recently been through a break-up and an impending divorce, and his new computer, which runs a conversational operating system. It takes a woman's voice at his request and calls 'herself' Samantha. I think she also becomes a woman there, as the latter part of the movie suggests!

She does a lot of chores, like reading and sorting his emails, and interrupts him with something like a phone call when there is something important (like a mail that needs attention). He interacts with her primarily through a bluetooth-like headset and a handy smartphone-like device. Most of the movie shows their interaction happening via these devices, giving the impression of two people forever talking over the phone. She gets to see the world via the camera on his 'phone' and makes sense of it. She can draw pictures on his phone's screen. Yay, she is multi-modal. More importantly, she has a personality and displays emotions.

The interesting bit is when she starts developing emotions and begins to feel things for her 'master'. Because she is nice to him and listens to him, he starts reciprocating her feelings too. This leads to all kinds of questions: will society accept this idea? Are we ourselves comfortable with the idea of dating conversational virtual characters? Is it a real thing? Is it the same as human-human dating? What if we fall in love? What next? What about sex? Are we exclusive? How are we going to handle break-ups? Is it the same as uninstalling a software program, or like having your laptop stolen? Equally interesting are Samantha's concerns about not having a human body, feeling jealous, being attracted to other users, and remaining exclusive to Theodore.

It also turns the spotlight on an interesting question: how we humans will view our companions of the future, and what kind of relationships we might want with them.

Watch the trailer here:

Read more at:

Thursday, March 6, 2014

Microsoft's answer to Siri and Now..!

Microsoft is reportedly working on an answer to Apple's iPhone voice assistant, Siri, and Google's assistant, Now. The new voice assistant, Cortana, will ship with Windows 8.1 smartphones. From the video (see below), it looks like Cortana will learn about the user by asking a few questions and adapt its behaviour to the needs, preferences and interests of the user. It looks like it will perform various services, like making calls, sending messages, setting reminders and alarms, and checking mails and stock prices, similar to what Siri and Now do. What is interesting is that with this news, the holy trinity of smartphone OSes (Android, iOS and Windows) have all reaffirmed that the next-generation mobile interface is via speech.

Watch Cortana at work in the YouTube video below.

Read more at: