VIENNA (AFP) - The world should establish a set of rules to regulate AI weapons while they are still in their infancy, a global conference concluded on Tuesday, calling the issue this generation’s “Oppenheimer moment”.
Like gunpowder and the atomic bomb, artificial intelligence (AI) has the capacity to revolutionise warfare, analysts say, making conflicts radically different and far deadlier.
“This is our generation’s ‘Oppenheimer moment’ where geopolitical tensions threaten to lead a major scientific breakthrough down a very dangerous path for the future of humanity,” read the summary at the end of the two-day conference in Vienna.
US physicist Robert Oppenheimer led the development of the first atomic bomb during World War II.
Austria organised and hosted the conference, which brought together some 1,000 participants, including political leaders, experts and members of civil society, from more than 140 countries.
A final statement said the group “affirms our strong commitment to work with urgency and with all interested stakeholders for an international legal instrument to regulate autonomous weapons systems”.
“We have a responsibility to act and to put in place the rules that we need to protect humanity... Human control must prevail in the use of force,” said the summary, which is to be sent to the UN secretary general.
Using AI, all sorts of weapons can be transformed into autonomous systems, thanks to sophisticated sensors governed by algorithms that allow a computer to “see”. This would enable them to locate, select and attack human targets, or targets containing human beings, without human intervention.
Most such weapons are still at the idea or prototype stage, but Russia’s war in Ukraine has offered a glimpse of their potential. Remotely piloted drones are not new, but they are becoming increasingly autonomous and are being used by both sides.
“Autonomous weapons systems will soon fill the world’s battlefields,” Austrian Foreign Minister Alexander Schallenberg said on Monday when opening the conference.
He warned that now was the “time to agree on international rules and norms to ensure human control”. Austria, a neutral country keen to promote disarmament in international forums, introduced the first UN resolution to regulate autonomous weapons systems in 2023, which was supported by 164 states.
‘Uncorrectable errors’
A Vienna-based privacy campaign group said it would file a complaint against ChatGPT in Austria, claiming the “hallucinating” flagship AI tool has invented wrong answers that creator OpenAI cannot correct.
NOYB (“None of Your Business”) said there was no way to guarantee the programme provided accurate information. “ChatGPT keeps hallucinating — and not even OpenAI can stop it,” the group said in a statement.
The company has openly acknowledged it cannot correct inaccurate information produced by its generative AI tool and has failed to explain where the data comes from and what ChatGPT stores about individuals, said the group.