Friday, March 16, 2007

Human-Robot Interaction

Today Brian Scassellati from Yale came and spoke about his research in human-robot interaction. He said he is interested in how humans behave, and in making robots act accordingly in our presence, so that even one's grandparents could interact with them. His research has also turned up theories about how humans learn certain skills we take for granted, and findings about autism in children from experiments that use robots as therapy or diagnostic tools. So here are some brief notes, from what I can remember:

  • The most interesting part, I thought, was the last section, about using robots as therapy for children with autism. Autism is the second most common genetic mental disorder; about 1 in 250 kids are born with it (much higher than I thought). Diagnosing it is pretty subjective, as children are just taken into a room with a clinician who checks whether they exhibit certain behavioral traits. The rate of autism in the U.S. has also multiplied tenfold in the last 8 years, but that may be because more people are being diagnosed, using these subjective measures that differ between doctors. So he wants to develop some objective ways to diagnose autism.

    He's been studying how autistic children interact with robots, and the results are incredible. There are children who are completely asocial, who don't speak, who ignore their parents completely, but who exhibit many social behaviors when placed in a room with a robot. Even if their parents are in the room with them, they ignore the parents and focus on the robot. The researchers are trying to figure out why this is the case. They've ruled out the robot's responsiveness to the child by having a robot play back a series of actions with no relation to what the child is doing: the normal children get bored of it pretty quickly, but the autistic ones don't. They've also ruled out anthropomorphism as the reason by using robots that have misplaced facial features, or that don't resemble people at all. (One of the cute ones is a snowman-like creature from Japan, Keepon; watch the brief video at the bottom!) So they're still working on why the children react to the robots, yet revert to their former selves and stop social interaction completely when they leave the room. (One child has been seeing the Keepon robot weekly for 3 years, but is still only social in its presence.) They're also looking at what the children pay attention to when, say, watching a scene in a movie: normal kids focus on the faces and the eyes, while the autistic kids look at seemingly random things.

    Isn't it fascinating how those kids perceive the world? They don't seem to recognize human-ness/life in other people, but do in robots. Or they don't see the value of interaction with people. The researchers seem to want to fix them, to make them see things differently, but maybe there's some bit of truth, something valuable, in how they perceive.

  • Study in Vocal Prosody: teaching a robot to recognize speech is hard, but training one to recognize emotion should be easier. People can recognize whether someone is angry or happy even if they're speaking another language, and animals can do that too. So they trained a robot to do so (Kismet, from MIT), and from the video it seemed to be doing pretty well, lowering its eyes and ears when it was being scolded (the first sketch after this list gives the flavor of the idea). The strange thing was that when they trained it on voice data from one gender, they could not get it to recognize the same emotions when the opposite gender was speaking to it, though it did well with speakers of the same gender. This was the case even when corrected for differences in pitch and the like between male and female voices. So that suggests that humans might have separate learning schemes for different voices, something for developmental psychologists to test.

  • Study in Gazes and Language Patterns: teaching a robot to figure out where a person is looking. Earlier studies trained the robot on thousands of examples of people gazing at things. He improved the method by having the robot reach for an object and then study the person who looked at it; it only took a couple hundred of these trials to get it right (the second sketch below shows the trick).

    With language, his students taught a robot to distinguish between pronouns in sentences like "I have the ball." One can easily teach the robot to recognize tangible objects, but it's hard for robots, as it is for children at first, to learn how to use pronouns properly. So they had two people toss a ball between them, saying "I have the ball", "Now you have the ball", and so on. By the end the robot could look for where the ball was and say "You have the ball" or "I have the ball" (the third sketch below plays the same game in miniature).
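
If I were to sketch the prosody idea in code, it might look something like the following. To be clear, the feature set, the toy numbers, and the nearest-neighbor classifier are all my own guesses, not what Kismet actually used; the point is just that the words are ignored and only the melody of the voice matters.

```python
# A minimal sketch of prosody-based affect recognition, assuming a handful of
# summary features per utterance. All numbers here are invented.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Each utterance is reduced to (mean pitch in Hz, pitch variance, mean energy);
# the actual words never enter the picture.
utterances = np.array([
    [310.0, 2500.0, 0.85],  # exaggerated, sing-song praise
    [300.0, 2200.0, 0.80],
    [180.0,  300.0, 0.90],  # low, flat, loud scolding
    [170.0,  250.0, 0.95],
    [200.0,  400.0, 0.30],  # soft, gentle soothing
    [210.0,  350.0, 0.35],
])
labels = ["praise", "praise", "scold", "scold", "soothe", "soothe"]

clf = KNeighborsClassifier(n_neighbors=1)
clf.fit(utterances, labels)

# A new utterance with low, flat, loud prosody lands near the scolding
# examples, so the robot should lower its eyes and ears.
print(clf.predict([[175.0, 280.0, 0.92]]))  # -> ['scold']
```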
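
Second, the reach-and-look trick. This is just my reconstruction of the idea with invented numbers: each time the robot reaches for an object at a known spot and the person looks at it, the robot gets one free (head angle, target) pair, and a simple fit replaces thousands of hand-labeled examples.

```python
# Self-supervised gaze following, sketched with made-up data: the robot's own
# reach provides the label for where the person is looking.
import numpy as np

head_angles = np.array([-30.0, -15.0, 0.0, 12.0, 28.0])   # degrees, seen by camera
object_x    = np.array([-0.50, -0.25, 0.00, 0.20, 0.45])  # meters along the table

# Fit a linear map from head angle to gaze target. A couple hundred
# reach-and-look trials would populate these arrays in practice.
slope, intercept = np.polyfit(head_angles, object_x, 1)

def gaze_target(angle_deg):
    """Predict where on the table the person is looking."""
    return slope * angle_deg + intercept

print(round(gaze_target(20.0), 2))  # about a third of a meter to the right
```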
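
And third, the ball-tossing game. Again a toy of my own making, not his students' actual system: the robot tallies which pronoun co-occurs with "the speaker holds the ball" versus "someone else holds it", and then speaks from its own point of view.

```python
# Grounding "I" vs. "you" from observed ball tosses. Names and data are made up.
from collections import Counter

# Observations: (speaker, ball holder, pronoun the speaker used).
observations = [
    ("Alice", "Alice", "I"),
    ("Bob",   "Alice", "you"),
    ("Bob",   "Bob",   "I"),
    ("Alice", "Bob",   "you"),
]

# Tally pronouns by whether the speaker and the holder are the same person.
counts = {"same": Counter(), "other": Counter()}
for speaker, holder, pronoun in observations:
    relation = "same" if speaker == holder else "other"
    counts[relation][pronoun] += 1

# Adopt the majority pronoun for each relation...
learned = {rel: c.most_common(1)[0][0] for rel, c in counts.items()}

# ...then describe the scene from the robot's own perspective.
def robot_says(holder, me="robot"):
    relation = "same" if holder == me else "other"
    return learned[relation].capitalize() + " have the ball"

print(robot_says("robot"))  # I have the ball
print(robot_says("Alice"))  # You have the ball
```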
