Autonomy in weapons systems, and AI in security and defence
- US: Pentagon vows to move quickly to buy more drones, citing China threat: Speaking to press at the NDIA Emerging Technologies for Defense Conference, Deputy Secretary of Defense Kathleen Hicks announced the Pentagon’s Replicator program, which seeks to field attritable autonomous systems at a scale of multiple thousands, in multiple domains, within the next 18 to 24 months. The systems in question include ‘the kinds of small drones that carry bombs and loiter in the air until they find a target, or that can gather images and other intelligence, sharing it with other autonomous drones that carry out an attack.’
- Brazil/UAE: Brazil, UAE to collaborate on autonomous systems for defence: The Brazilian Air Force ‘will collaborate with Emirati defense conglomerate EDGE Group to develop state-of-the-art autonomous systems for defense’, to include ‘unmanned vehicles, smart weapons, and other defense items for air and space domains over an undisclosed period.’
- Australia: How the Russia-Ukraine conflict will change all future wars: In the Australian Law Society Journal, Australian Human Rights Commissioner Lorraine Finlay and Patrick Hooten, Human Rights Advisor (Business and Technology) at the Australian Human Rights Commission, write on autonomous weapons in the Russia-Ukraine conflict. Finlay and Hooten argue that ‘the use of autonomous weapons poses an existential threat to combatants and civilians alike, and is likely to result in war crimes which violate the principles of warfare and human rights.’
- US: AI brings the robot wingman to aerial combat: In The New York Times, Eric Lipton reports on the U.S. Air Force’s pilotless XQ-58A Valkyrie, ‘a prototype for what the Air Force hopes can become a potent supplement to its fleet of traditional fighter jets, giving human pilots a swarm of highly capable robot wingmen to deploy in battle. Its mission is to marry artificial intelligence and its sensors to identify and evaluate enemy threats and then, after getting human sign-off, to move in for the kill.’ Lipton quotes Mary Wareham of Human Rights Watch, who notes that ‘You’re stepping over a moral line by outsourcing killing to machines — by allowing computer sensors rather than humans to take human life’.
- Russia/Ukraine: Russia’s new arsenal of deadlier FPV drones is coming – if they can get through the bureaucracy: In Forbes, David Hambling writes on the use of FPV kamikaze drones, and how ‘Russian engineers are now working to make them more powerful with longer range, smarter guidance and less need for a human operator.’
- International: Autonomous Weapons and the Future of War: In this podcast hosted by Prabu David, Paul Scharre, Executive Vice President and Director of Studies at the Center for a New American Security, discusses the ‘ethical, legal and technological implications of the use of autonomous weapons in modern warfare.’ Scharre differentiates between ‘what is legal’ and ‘what is right’, arguing that a combatant’s decision to apply force is not motivated by legality alone but also depends on moral and ethical values. He expresses doubt that these moral and ethical judgements could be left to the discretion of a machine.
- US: US Army Mulls Outfitting Robot Dogs With Next-Gen Squad Weapon: The Defense Post reports that ‘the US Army is considering equipping its Vision 60 Quadruped Unmanned Ground Vehicles (Q-UGVs) with the Next-Generation Squad Weapon (NGSW)’, a move aimed at increasing ‘their lethality and ability to respond to frontline threats.’
- US: Shield AI drones demonstrate autonomous teaming under USAF contract: Defense One reports that ‘Shield AI V-BAT drones have demonstrated autonomous teaming’ using Shield AI’s autonomous piloting software, Hivemind. The demonstration included ‘Detect, Identify, Locate, and Report missions’, and future plans for Hivemind include enabling it to function in ‘GPS denied and comms denied’ environments by 2024.
Facial recognition, biometric identification, surveillance
- US: Metro COB shared concerns over police surveillance: A surveillance technology called Fusus is being used in Nashville, Tennessee. The technology gives law enforcement access to private citizens’ and business owners’ surveillance cameras with those community members’ permission. Members of Metro’s Community Oversight Board (COB) only recently learned of the technology, which they called ‘alarming’ — until then, they were unaware that the Metro Nashville Police Department had been using Fusus for nearly a year.
- UK: UK government seeks expanded use of AI-based facial recognition by police: The UK Home Office has ambitions to ‘deploy new biometric systems nationally over the next 12 to 18 months’ to enable police and other security agencies to ‘track and find criminals’, reports the Financial Times. The systems the Home Office seeks to procure include facial recognition software and live facial recognition technologies. These technologies have prompted debate over privacy concerns, and several activists have urged better regulation. Meanwhile, Biometric Update reports that the British Security Industry Association has sought clarity on how the UK government ‘intends to fill the void left after the recent resignation of its Biometric & Surveillance Camera Commissioner (B&SCC) and the abolition of its office.’
- Iran: Hacktivists claim to expose facial recognition used by Iranian regime to catch dissenters: Hacktivist group GhostSec has exposed 26 GB of data from Behnama, a video surveillance system with facial recognition capabilities used in Iran, reports Biometric Update. The breach also uncovered data on ‘BehCard, a face biometrics system for printing ID cards, a car GPS and tracking system called BehYab as well as number-plate recognition system BehKhan.’
- US: New York police will use drones to monitor backyard parties this weekend, spurring privacy concerns: The Associated Press reports that ‘the New York City police department plans to pilot the unmanned aircrafts in response to complaints about large gatherings, including private events, over Labor Day weekend’. The announcement drew ‘immediate backlash from privacy and civil liberties advocates, raising questions about whether such drone use violated existing laws for police surveillance.’
- US: There’s A Cop In My Pocket: Policymakers Need to Stop Advocating Surveillance by Default: A blog post from the Council on Foreign Relations raises concerns over proposed anti-privacy bills. A bill proposed in France would essentially give French police the right to ‘remotely turn on anyone’s phone camera or microphone’, and several countries globally are proposing similar surveillance mechanisms. The authors state that ‘policymakers keep trying to pass damaging technology policy bills, and they ignore technologists when we say that the thing they want is impossible without destroying something far more valuable: our civil liberties.’
AI, algorithms and autonomy in the wider world
- UN Security Council and AI: As the United Nations Security Council prepares for ‘convening its first-ever talks on AI risks’, Raiyan Shaik, writing for Athena (an edtech startup), notes that ‘while the potential of AI is awe-inspiring, it is equally crucial to address the risks and ethical dilemmas associated with its development and deployment.’ Shaik emphasizes that ‘the importance of responsible regulation cannot be overstated.’
Government regulation
- US: Proposing the CASC: A Comprehensive and Distributed Approach to AI Regulation: Tech Policy Press provides a summary of a new paper authored by Alex C. Engler, a Fellow at the Brookings Institution. The paper delves into the challenges of governing algorithms, and Engler proposes a new regulatory approach, the Critical Algorithmic System Classification (CASC), which aims to guide ‘federal regulators to flexibly govern algorithms used in critical socioeconomic determinations.’