"The problem is not with the weapon, it is with the AI"

Our fear of drones that carry out attacks is misguided - the real danger is the technology that runs them, explains Liran Antebi, Research Fellow at the Institute for National Security Studies (INSS)

"I know not with what weapons World War III will be fought, but World War IV will be fought with sticks and stones.”
In 1949, Albert Einstein may not have predicted what weapons we would one day use to fight each other, but as we approach 2023 it has become clear that warfare is shifting from the threat of nuclear attack to the cyber domain. Artificial Intelligence is penetrating every aspect of our lives, and it was only a matter of time before it was deployed in our weapons systems, both over the internet and on the battlefield.

Liran Antebi, Research Fellow at the Institute for National Security Studies (INSS) (Photo: Chen Galili for INSS)
CTech spoke to Liran Antebi, Research Fellow at the Institute for National Security Studies (INSS), about some of the ways AI is being used in weapons, notably the ability to make them autonomous. She argues that skeptics should not fret over the weapon itself, but offers a stark warning that we must get to grips with the technology before it is too late.
“We have to understand the subject deeply but also the ecosystem and the holistic topics surrounding it,” Antebi said. “[Autonomy] is relevant to both military forces and commercial companies, and for small operators of technology. In each arena, you have different challenges that will appear.”
Of course, AI is already in our everyday lives, helping to automate parts of our daily routines. Private companies are building autonomous vehicles, which, despite some public hesitancy, are widely seen as a progressive step toward saving lives and improving the environment. However, Antebi argues that when autonomy is applied to military weapon systems, and the technology is adopted by governments and armed forces, views change drastically.
“When we speak about the military issue we are speaking about human lives,” she explained. “If there is firepower, there is an issue of life, of taking life, and of saving a life. Most of the time it is framed as the most critical one. I have been arguing for a long time that people look at lethal autonomous weapons systems in a very critical way just because they're lethal. But autonomous cars are lethal as well. Because autonomous weapons systems have weapons on them, they're framed as weapons. But many other things with autonomy can become lethal. They're not lethal by design, but they can become lethal.”
As a Research Fellow at the INSS, Antebi studies the implementation of autonomy and the challenges and opportunities that come with it. She has in the past advised the UN as part of a group of experts called IPRAW (the International Panel on the Regulation of Autonomous Weapons) and has advised the Israeli Ministry of Defense - although she couldn't discuss the details.
“I was one of the first, at least in Israel, to argue that the problem is not with the weapon, it is with the AI. We have to be certain that the AI is fine and perfect before operating something - and it doesn't matter if it has a weapon on it, or whether it can cause or lead to damage or death by mistake. I'm speaking about autonomous cars, I'm speaking about robotics like those of Tesla and whoever is presenting humanoid robots.”
Antebi explained that the ethical decisions raised by autonomy are often left to the programmers at the private companies creating these technologies. Systems that are expected to transport children in autonomous vehicles or conduct drone strikes on battlefields overseas must be perfect before they can continue to operate with the full confidence of citizens and governments. The famous dilemma of whom a robot will save in a fire, or where a drone will strike to hit its target, will need rigorous oversight and regulation to make sure autonomy remains a convenience and not a lethal weapon.
“Who will decide for me, Tesla engineers? I need my people in Israel in the Knesset to create laws. But they don't know how to deal with those questions because the technology is hard,” she concluded.