Can you feel sorry for a robot? Research indicates you can


A pitiful sound from tinny speakers, sad virtual eyes, trembling robot arms: It doesn’t take much to feel sorry for a robot. This is the conclusion of a study by Marieke Wieringa, who will be defending her Ph.D. thesis at Radboud University on 5 November. But she warns that our human compassion could also be exploited; just wait until companies find a revenue model for emotional manipulation by robots.

Objectively, we know that a robot cannot experience pain. Still, under the right circumstances, people can become somewhat more inclined to believe that a robot is in pain, provided the situation is staged in the right way.

“If a robot can pretend to experience emotional distress, people feel guiltier when they mistreat the robot,” Wieringa explains.

Boring task or shaking robots

Through several tests, Wieringa and her colleagues studied how people respond to violence against robots.

“Some participants watched videos of robots that were either mistreated or treated well,” she says. “Sometimes, we asked participants to give the robots a shake themselves. We tried out all the variations. Sometimes the robot didn’t respond, sometimes it did—with pitiful sounds and other responses that we associate with pain.”


In the tests, it soon appeared that the tormented robot triggered more pity: Participants were less willing to shake the robot again. “If we asked the participants to shake a robot that showed no emotion, then they didn’t seem to have any difficulty with it at all.”

In one of the tests, the participants were asked to choose between completing a boring task and giving the robot a shake: the longer they shook the robot, the less time they had to spend on the task.

“Most people had no problem shaking a silent robot, but as soon as the robot began to make pitiful sounds, they chose to do the boring task instead,” notes Wieringa.


Tamagotchi

Wieringa warns that it is only a matter of time before organizations exploit this kind of emotional manipulation.

“People were obsessed with Tamagotchis for a while: virtual pets that successfully triggered emotions. But what if a company made a new Tamagotchi that you had to pay to feed? That’s why I am calling for governmental regulations that establish when it’s appropriate for chatbots, robots and other variants to express emotions.”

But Wieringa doesn’t think that a complete ban on emotions would be good either.

“It’s true that emotional robots would have some advantages,” she explains. “Imagine therapeutic robots that could help people to process certain things. In our study, we saw that most participants found it good when the robot triggered pity. According to them, it signaled that violent behavior is not okay. Still, we need to guard against risks for people who are sensitive to ‘fake’ emotions. We like to think we are very logical, rational beings that don’t fall for anything, but at the end of the day, we are also led by our emotions. And that’s just as well; otherwise we’d be robots ourselves.”

Provided by
Radboud University Nijmegen

