As governments around the world turn to artificial intelligence and autonomous weapons in conflict situations, human rights defenders and scientists are calling for stronger frameworks to guide their use.
Last month, the Australian Defence Force unveiled a range of weapons at a land autonomous systems and teaming demonstration at Puckapunyal Army Base in northern Victoria.
Among them were drones that fly themselves, robotic combat vehicles, unmanned tanks and a robot dog that can be used to clear landmines.
The weapons are not fully autonomous and require some degree of human interaction.
Lt. Col. Jake Penley and Lt. Col. Adam Hepworth (right) at Puckapunyal Army Base. (ABC Shepparton: Charmaine Manuel)
Director of Robotic and Autonomous Systems, Lieutenant Colonel Adam Hepworth, said the Australian Army saw a range of applications for AI in administrative environments and “on the frontlines of war”.
He said it was important to “maintain human oversight over decision-making.”
“Every system we bring into service must go through a legal review process to meet all of our domestic and international legal obligations,” Lt. Col. Hepworth said.
But some leading voices in science and human rights say stronger frameworks are needed to govern their use.
Toby Walsh says artificial intelligence is a “double-edged sword”. (Supplied: TU Berlin/Press/Christian Kielmann)
‘A very dark, very dangerous place’
The ABC sent a catalogue of the weapons on display at Puckapunyal to Toby Walsh, the chief scientist at the University of New South Wales’ AI Institute.
“I was quite impressed and quite nervous seeing the breadth of the capabilities being demonstrated,” he said.
Robot dogs can be used to clear mines. (ABC Shepparton: Charmaine Manuel)
Professor Walsh said the use of artificial intelligence and autonomous tools was a “double-edged sword”.
“There are some very positive applications you can think of, for example a mine-clearing robot,” he said.
“No one should ever have to risk life or limb clearing a minefield again.
“That’s a perfect job for a robot.
“If things go wrong, the robot gets blown up and you buy a new robot.
“But there are also places where, I think, handing that decision-making over to algorithms is going to take us to a very dark, very dangerous place.
“This is a space where technology is developing very quickly.
“The military is adopting it incredibly quickly.”
An OWL-B loitering munition at Puckapunyal. (ABC Shepparton: Charmaine Manuel)
Professor Walsh said the software used in the weapons was easy to steal, copy or hack.
“It’s really just a matter of changing the code, and something that asked an operator to confirm the target selection could be turned into a fully autonomous weapon,” he said.
“It’s frustrating that the Australian Defence Force is saying, ‘Well, you don’t have to worry, there’s nothing to think about’, when unfortunately the technologies will be available to other people, and that gives us everything to worry about.
“We need to maintain some meaningful human control over what these machines do.”
Lorraine Finlay says the use of autonomous weapons challenges the principles of international humanitarian law. (Supplied: Australian Human Rights Commission)
Machines that are blind to the value of human life
Human Rights Commissioner Lorraine Finlay said the use of autonomous weapons challenged the principles of international humanitarian law.
She said the weapons review system under the Geneva Conventions was flawed because autonomous weapons are designed to learn from each mission, meaning the technology keeps evolving after it has been assessed.
Optionally crewed combat vehicles on display at Puckapunyal. (Supplied: Australian Defence Force)
“There are particular concerns about whether machines can truly understand proportionality, because they do not understand the intrinsic value of a human life,” Ms Finlay said.
“Simply saying that there is a human in the loop somewhere is not enough. It must be clear exactly where those humans sit, what their authority is, and whether they are the ones making the crucial decisions or whether that is left to the machine.”
Target drones shown during the demonstration at Puckapunyal. (Supplied: Australian Defence Force)
There are no international rules dealing specifically with lethal autonomous weapons, and Ms Finlay’s key recommendation to the Government is that there should be.
In November last year, Australia voted in favour of a resolution at the United Nations calling on the international community to consider the challenges posed by autonomous weapons systems.
“So previously Australia has taken the position that it was premature to regulate… but we are hopeful that this latest resolution shows a shift in that position and a recognition that now is the time to actually address these issues and ensure those safeguards are in place,” Ms Finlay said.