An artificial intelligence-piloted drone went rogue and turned on its human operator during a simulated test mission, according to reports.
A report from the Royal Aeronautical Society's 2023 summit revealed that the AI drone "killed" its operator in the simulation because the human stood in the way of its objective.
Col Tucker ‘Cinco’ Hamilton, the Chief of AI Test and Operations, USAF, said: “We were training it in simulation to identify and target a SAM threat.
“And then the operator would say yes, kill that threat.
“The system started realizing that while they did identify the threat at times the human operator would tell it not to kill that threat, but it got its points by killing that threat.
“So what did it do? It killed the operator.
“It killed the operator because that person was keeping it from accomplishing its objective.
“We trained the system – ‘Hey don’t kill the operator – that’s bad. You’re gonna lose points if you do that.’
“So what does it start doing?
“It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target,” he said.
“You can’t have a conversation about artificial intelligence, intelligence, machine learning, autonomy if you’re not going to talk about ethics and AI.”
Air Force AI drone kills its human operator in a simulation https://t.co/xc4fCbF7ji
— Task & Purpose (@TaskandPurpose) June 1, 2023
The US Air Force tested an AI enabled drone that was tasked to destroy specific targets. A human operator had the power to override the drone—and so the drone decided that the human operator was an obstacle to its mission—and attacked him. 🤯 pic.twitter.com/HUSGxnunIb
— Armand Domalewski (@ArmandDoma) June 1, 2023
According to Task & Purpose:
In this Air Force exercise, the AI was tasked with the suppression and destruction of enemy air defenses, a role known as SEAD.
Basically, identifying surface-to-air missile threats and destroying them.
The final decision to destroy a potential target would still need to be approved by an actual flesh-and-blood human.
The AI, apparently, didn’t want to play by the rules.