The findings indicated that humans have such a strong tendency to anthropomorphize -- that is, to attribute human traits, emotions, or intentions to non-human entities such as robots -- that we can fall prey to emotional manipulation by such entities.
In 2007, a team of researchers carried out a study called "Begging computer does not want to die" in which volunteers were asked to switch off a robot cat but were unsure what to do when the cat begged them not to turn it off, Tech Xplore reported on Friday.
In this new effort, the researchers replicated this experiment using more volunteers and a different robot, the report said.
For the study, published in the journal PLOS ONE, the research team recruited 89 volunteers, who were asked to interact with a robot under the guise of helping it become more intelligent.
At the conclusion of the interaction, a researcher would ask the volunteer to turn off the robot, only to have the robot beg them not to do so. In addition to voice requests, the robot also displayed bodily actions meant to bolster its plea.
Some volunteers served as controls and were asked to turn off the robot but did not experience begging from the robot.
The researchers reported that 43 of the volunteers were confronted with the choice between complying with the researchers' request and heeding the robot's plea. Of those, 13 chose to heed the robot's wishes, and the rest took longer to turn off the robot than those in the control group.
Each of the volunteers was interviewed after their interaction with the robot, and those who had refused to turn it off were asked why.
The researchers noted that many of the volunteers refused simply because the robot had asked them not to. Others reported feeling sorry for the robot or worrying that they might be doing something wrong.