Autonomy in weapons systems, and AI in security and defence
- Vatican/International: Vatican presses world leaders at UN to work on rules for lethal autonomous weapons: This week at the UN, Archbishop Paul Gallagher, the Holy See’s foreign minister, ‘called for starting talks toward a legally binding pact to govern lethal autonomous weapons systems’ and for ‘a moratorium on them pending the conclusion of negotiations.’ Archbishop Gallagher also stated that ‘It is imperative to ensure adequate, meaningful and consistent human oversight of weapon systems.’ For more on the Vatican’s position on autonomous weapons, see here.
- Philippines/International: Philippines lobbies UN for rules on development, use of killer robots: The Philippines Foreign Affairs Secretary Enrique Manalo ‘has called on the international community to launch negotiations for legally binding rules on the development and use of artificial intelligence-powered killer robots’, and the Philippines will be hosting a meeting among Indo-Pacific partners in December on the issue. For more on the Philippines’ position on autonomous weapons, see here.
- Singapore/International: World must prepare for ‘profound implications’ of AI, distribute its benefits fairly: Also at the UN, Singapore’s Foreign Affairs Minister Vivian Balakrishnan raised the issue of autonomy in weapons systems, noting that ‘There will be many occasions when humans may not even be in the firing loop, but we will be on the firing line. This would inevitably heighten the risks of unintended conflicts or the escalation of conflicts.’ This is Singapore’s first statement on autonomy in weapons systems at the UN General Assembly.
- Iran: Iran unveils jet-powered version of Shahed kamikaze drone: In Forbes, David Hambling reports that Iran has unveiled ‘a jet-powered version of the Shahed series’, which will ‘come in at much greater speed’ than the 120mph at which the Shahed 136 cruises. The jet-powered version also ‘appears to have a nose-mounted camera’, which Hambling notes ‘may be used in two ways’: it ‘might be there to enable visual navigation’ or it ‘may enable terminal guidance, either controlled by an operator or with a machine-vision system to identify and lock on to a target feature such as a specific building.’
- Finland: Automated Decision Research manager Catherine Connolly visited Finland this week for events with Stop Killer Robots Finland, including interviews with Finnish press. Coverage here and here (in Finnish).
- US: Palantir wins $250 million US Army AI research contract: The U.S. Army has awarded Palantir Technologies ‘a contract worth as much as $250 million to research and experiment with artificial intelligence and machine learning.’ Further details as to what exactly Palantir is developing for the U.S. Army have not been disclosed.
- US/International: Why the Pentagon’s ‘killer robots’ are spurring major concerns: The Hill carries a piece on U.S. plans for autonomous weapons systems, featuring comment from Peter Asaro, the co-founder of the International Committee for Robot Arms Control and a spokesperson for Stop Killer Robots, who notes that ‘AI use will be hard to prove and easy to deny’. A further piece in The Hill features quotes from numerous defence industry representatives, with the CEO of AeroVironment telling The Hill that the company will be able to ‘meet demand and make autonomous drones “in large volumes at the levels of reliability” expected from the Pentagon.’
- US/China: In US-China contest, the race is on to deploy killer robots: A lengthy piece by Reuters focuses on ‘how automation powered by artificial intelligence is poised to revolutionize weapons, warfare and military power – and shape the escalating rivalry between China and the United States.’
- US: This new autonomous drone for cops can track you in the dark: Wired reports on the Skydio X10, a new small drone for police forces which ‘has infrared sensors that can be used to track people and fly autonomously in the dark. Four payload bays on the X10 can carry accessories like a speaker, spotlight, or a parachute for emergency landings.’ The system is not weaponised and does not carry munitions. However, as Wired notes, new capabilities like those the X10 features ‘could encourage wider use of drones in law enforcement at a time when policy concerning their use is still developing.’ Skydio’s CEO has said that the company ‘doesn’t support weaponizing drones or robots’, but added that it’s difficult to stop people from making hacks or custom modifications.
- Europe/Russia: Revealed: Europe’s role in the making of Russia’s killer drones: The Guardian carries a report on ‘a 47-page document submitted by Ukraine’s government to the G7 governments in August’ in which it is claimed that ‘there were more than 600 raids on cities using unmanned aerial vehicles (UAVs) containing western technology in the previous three months.’ This includes components used in the Shahed 136, a loitering munition type system which Russia has used extensively in Ukraine.
Regulation and enforcement
- EU: AI lie detectors at borders: Who does the EU’s AI Act actually protect? This piece notes that while the latest draft of the EU’s AI Act labels emotion recognition and AI lie detectors for border or law enforcement use as high-risk technologies, meaning that they would need to comply with a range of requirements, experts and civil society organisations have called for a complete ban on such technologies, as ‘Emotion recognition technologies – tested in the iBorderCtrl research project – raise numerous challenges for fundamental rights, especially at borders, where identity-based profiling is already common practice.’
Facial recognition, biometric identification, surveillance
- UK: New report says government should ‘urgently stop live facial recognition’ in the UK: A new report published by the civil liberties group Big Brother Watch ‘shows a significant increase in the use of facial recognition for surveillance purposes in the UK by both the police and the private sector. This is despite the technology not being mentioned in any UK laws or debated by the House of Commons.’ The report, ‘Biometric Britain: The Expansion of Facial Recognition Surveillance’ recommends that ‘the use of live facial recognition by police forces and private companies for public surveillance must be immediately stopped in the UK.’
- US: The maker of ShotSpotter is buying the world’s most infamous predictive policing tech: Wired reports that SoundThinking, the company behind the gunshot-detection system ShotSpotter, ‘is quietly acquiring staff, patents, and customers of the firm that created the notorious predictive policing software PredPol’, marking its ‘latest step in becoming the Google of crime fighting—a one-stop shop for policing tools.’ Wired notes that ‘experts who study law enforcement use of technology say the bundling of two controversial technologies signals a new era for the cop-tech industry and has the potential to shape the future of policing in the United States’, and that ‘while SoundThinking has rebranded “predictive policing” as resource management for police departments, a WIRED analysis of one of the company’s apps found that crime-forecasting technology remains one of its key offerings.’
- US: State Education Department issues determination on biometric identifying technology in schools: New York State Education Commissioner Betty A. Rosa has issued an order prohibiting schools in New York State ‘from purchasing or utilizing facial recognition technology. Schools can decide whether to use biometric identifying technology other than facial recognition technology at the local level so long as they consider the technology’s privacy implications, impact on civil rights, effectiveness, and parental input.’
- Africa: African nations spending $1 billion a year on harmful surveillance: Research by the Institute of Development Studies and the African Digital Rights Network, the first comprehensive study of the supply of surveillance technology to governments in Africa to monitor their own citizens, has found that ‘Governments in Nigeria, Ghana, Morocco, Malawi, and Zambia are collectively spending at least $1bn a year on digital surveillance technology contracts with companies in the US, UK, China, EU and Israel.’ The report, ‘Mapping the supply of surveillance technologies to Africa’, calls for the abolition of rights-violating surveillance technologies and for the defunding of mass surveillance of citizens.
AI, algorithms and autonomy in the wider world
- International: Amazon will use real user conversations to train Alexa’s AI model: Amazon has recently announced ‘new AI capabilities for its Alexa products’. John Davisson, the director of litigation and senior counsel at the Electronic Privacy Information Center, said ‘consumers should question Amazon’s interest in keeping and using voice data.’
- International: The automated hunt for cybergroomers: Algorithmwatch reports that algorithms have been developed to help track down cybergroomers, who stalk minors online, but that ‘the reliability of automated systems is controversial – and they can even criminalize children and teens.’
Research and reports
- Research blog: Five questions we often get asked about AI in weapon systems and our answers: The AutoNorms project has published a blog addressing five questions they commonly receive about autonomous weapons systems, sketching out their answers and noting that ‘As AI technologies are increasingly diffuse across many areas of our social lives, we argue that this will entail setting fundamental guardrails around the types of tasks that we should use such technologies and those that should be reserved for human judgment. The latter category needs to include taking life—and not just in military settings.’
- Academic paper: The surveillance AI pipeline: This paper finds ‘stark evidence of close ties between computer vision and surveillance. The majority (68%) of annotated computer vision papers and patents self-report their technology enables data extraction about human bodies and body parts and even more (90%) enable data extraction about humans in general.’ A thread on the paper by one of the co-authors, Abeba Birhane, can be read here. 404 Media also covers the study: ‘I think it’s dehumanizing people quite literally,’ co-author Myra Cheng told 404 Media on a call. ‘You’re literally objectifying people. I think that makes it easier to divorce whatever application or paper or method you’re building from the very real consequences that it might have on people and especially marginalized populations.’