It sounds like something straight out of Stanley Kubrick’s 2001: A Space Odyssey.
But, in a chilling echo of the computer Hal from the iconic film, scientists have developed robots that are able to deceive humans and even hide from their enemies.
An experiment by researchers at the Georgia Institute of Technology is believed to be the first detailed examination of robot deception.
The team developed computer algorithms that let a robot ‘decide’ whether it should deceive a human or another robot, and equipped it with strategies to minimise its chances of being found out.
The development may alarm those who are concerned that robots that are able to practice deception are not safe to work with humans.
But researchers say that robots that are capable of deception will be valuable in the future, particularly when used in the military.
Robots on the battlefield with the power of deception will be able to successfully hide and mislead the enemy to keep themselves and valuable information safe.
‘Most social robots will probably rarely use deception, but it’s still an important tool in the robot’s interactive arsenal because robots that recognize the need for deception have advantages in terms of outcome compared to robots that do not recognize the need for deception,’ the researchers said.
For example, a search-and-rescue robot may need to deceive a panicking victim in order to calm them or gain their cooperation.
The results were published online in the International Journal of Social Robotics.
To develop programs that successfully produced deceptive behavior, the researchers looked at how one robot could attempt to hide from another.
Their first step was to teach the deceiving robot how to recognize a situation that warranted the use of deception.
The researchers, Alan Wagner and Ronald Arkin, used interdependence theory and game theory to develop algorithms that tested the value of deception in a specific situation.
A situation had to satisfy two key conditions to warrant deception – there must be conflict between the deceiving robot and the seeker, and the deceiver must benefit from the deception.
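The two-condition test described above can be sketched in a few lines. This is a minimal illustration only: the outcome names, utility values and escape probabilities are assumptions for the sake of the example, not the authors’ actual algorithm.

```python
# Toy sketch of the two-condition deception test: a situation warrants
# deception only if (1) the two agents are in conflict and (2) the
# would-be deceiver benefits from deceiving. All values are illustrative.

def deception_warranted(deceiver_utils, seeker_utils,
                        p_escape_if_deceiving, p_escape_if_honest):
    """Return True when both conditions for deception hold.

    deceiver_utils / seeker_utils: utility of each outcome for each agent,
    e.g. {'found': -1, 'not_found': +1}.
    """
    # Condition 1 (conflict): the agents prefer different outcomes.
    conflict = (max(deceiver_utils, key=deceiver_utils.get)
                != max(seeker_utils, key=seeker_utils.get))

    # Condition 2 (benefit): the deceiver's expected utility is higher
    # when it deceives than when it behaves honestly.
    def expected(p_escape):
        return (p_escape * deceiver_utils['not_found']
                + (1 - p_escape) * deceiver_utils['found'])

    benefit = expected(p_escape_if_deceiving) > expected(p_escape_if_honest)
    return conflict and benefit
```

In a hide-and-seek setting, the hider values escaping (+1) over being caught (−1) while the seeker values the reverse, and laying a false trail raises the hider’s chance of escape, so both conditions hold and deception is warranted.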
Once a situation was deemed to warrant deception, the robot carried out a deceptive act by laying a false trail about its movements.
The robot was even able to tailor its deception based on how much it knew about the particular robot it was trying to trick.
To test their algorithms, the researchers ran 20 hide-and-seek experiments with two autonomous robots. Colored markers were lined up along three potential pathways to locations where the robot could hide.
The hider robot randomly selected a hiding location from the three location choices and moved toward that location, knocking down colored markers along the way.
Once it reached a point past the markers, the robot changed course and hid in one of the other two locations. The presence or absence of standing markers indicated the hider’s location to the seeker robot.
‘The hider’s set of false communications was defined by selecting a pattern of knocked over markers that indicated a false hiding position in an attempt to say, for example, that it was going to the right and then actually go to the left,’ explained Wagner.
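The trial mechanics above can be mimicked in a short simulation. This is a toy re-creation: the path names and the marker-knocking failure probability are illustrative assumptions, not the experiment’s actual parameters.

```python
import random

# Toy hide-and-seek trial: the hider knocks over markers along a path it
# will NOT take, and the seeker trusts the knocked-over markers.

PATHS = ['left', 'middle', 'right']  # three marked pathways

def run_trial(rng, p_knock=1.0):
    """Run one trial; return True if the seeker is deceived."""
    true_spot = rng.choice(PATHS)
    # Lay a false trail toward a different path.
    false_trail = rng.choice([p for p in PATHS if p != true_spot])
    # With probability 1 - p_knock the hider fails to knock over the
    # intended markers and instead reveals its real path (mimicking the
    # failed runs described in the article).
    trail = false_trail if rng.random() < p_knock else true_spot
    # The seeker assumes the knocked-over markers mark the hider's path.
    guess = trail
    return guess != true_spot
```

With a perfect marker-knocking rate (`p_knock=1.0`) every trial deceives the seeker; lowering it reproduces the kind of failure mode the researchers reported, where the hider knocked over the wrong markers.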
The hider robots were able to deceive the seeker robots in 75 percent of the trials, with the failed experiments resulting from the hiding robot’s inability to knock over the correct markers to trick the ‘finding’ robot.
‘The experimental results weren’t perfect, but they demonstrated the learning and use of deception signals by real robots in a noisy environment,’ said Wagner.
‘The results were also a preliminary indication that the techniques and algorithms described in the paper could be used to successfully produce deceptive behavior in a robot.’
The researchers said that they are aware that there could be ‘ethical implications’ involved in teaching robots how to deceive not just fellow robots but humans, too.