During a parliamentary debate in the House of Lords, the UK Government repeatedly refused to rule out the possibility that the UK may deploy lethal autonomous weapons in the future.
At the dedicated discussion on autonomous weapons (also known as “killer robots”), members of the Liberal Democrat, Labour and Conservative parties all expressed concern about the development of autonomous weapons. Several parliamentarians, including Lord Coaker speaking for Labour and Baroness Smith speaking for the Liberal Democrats, asked the Government to state unequivocally that there will always be a human in the loop when decisions over the use of lethal force are taken. Responding to these calls, defence minister Baroness Goldie stated only that the “UK does not use systems that employ lethal force without context appropriate human involvement”.
This formulation, which was used twice by the Minister, offers less reassurance over the UK’s possible future use of these weapons than the UK’s previous position that Britain “does not possess fully autonomous weapon systems and has no intention of developing them”. Furthermore, the UK's new posture does not rule out the possibility that the UK may consider it appropriate in certain contexts to deploy a system that can use lethal force without a human in the loop.
The debate was triggered by Lord Clement-Jones, former chair of the Lords Artificial Intelligence Committee and member of the UK Campaign to Stop Killer Robots’ parliamentary network, who described the Minister’s unwillingness to rule out lethal autonomous weapons as “disappointing”. He went on to describe the UK’s refusal to support calls for a legally binding treaty on this issue as “at odds with almost 70 countries, thousands of scientists [...] and the UN Secretary-General”.
Former First Sea Lord, Lord West, raised the alarm that “nations are sleepwalking into disaster... engineers are making autonomous drones the size of my hand that have facial recognition and can carry a small, shaped charge and they will kill a person”, warning that thousands of such weapons could be unleashed on a city to “horrifying” effect.
Speaking from the Shadow Front Bench, Lord Coaker asked for an unequivocal commitment from the Government that there will “always be human oversight when it comes to AI [...] involvement in any decision about the lethal use of force”. This builds on Labour’s position articulated by Shadow Defence Secretary, John Healey MP, in his September 2021 conference speech where he stated that a Labour Government would “lead moves in the UN to negotiate new multilateral arms controls and rules of conflict for space, cyber and AI.”
Former Defence Secretary, Lord Browne, asked the UK to publicly reaffirm its commitment to ethical AI and questioned the Minister on the MOD’s forthcoming Strategy on AI. Minister Goldie explained that the MOD’s planned Defence AI Strategy had been slated for Autumn; while conceding that Autumn “had pretty well come and gone”, she added that the Strategy would be published “in early course”. Conservative peer Lord Holmes asked a related question on the need for public engagement and consultation on the UK’s approach to AI across sectors. UNA-UK agrees with this imperative for public consultation, and was concerned to learn through a recent FOI request that “the MOD has carried out no formal public consultations or calls for evidence on the subject of military AI ethics since 2019”.
Today’s activity in the House of Lords follows similar calls made in December 2020 in the House of Commons, when Alyn Smith MP, the SNP’s spokesperson on foreign affairs and another member of our parliamentary network, made a powerful call for the UK to support a ban on lethal autonomous weapons.
On 2 December 2021, the Group of Governmental Experts to the Convention on Conventional Weapons (CCW) will decide whether to proceed with negotiations to establish a new international treaty to regulate and establish limitations on autonomous weapons systems. So far, nearly 70 states have called for a new, legally binding framework to address autonomy in weapons systems. However, after almost eight years of talks at the UN, progress has stalled, largely due to the stance of a small number of states, including the United Kingdom, that regard the creation of new international law as premature.
Growing momentum in the UK on this issue is not confined to parliament. Ahead of the UN talks, the UK campaign will release a paper voicing concerns identified by members of the tech community, as well as a compendium of research by our junior fellows showing that the ethical framework within which research in UK universities is proceeding is not fit for purpose (early findings from the research have been published here, and related research from the Cambridge Tech and Society group can be found here).
The UK Campaign to Stop Killer Robots, a coalition of NGOs, academics and tech experts, believes that UK plans for significant investments in military AI and autonomy, as announced in the UK Integrated Review of Security, Defence, Development and Foreign Policy, must be accompanied by a commitment to work internationally to upgrade arms control treaties to ensure that human rights, ethical and moral standards are retained.
While we welcome the assertion made in the Integrated Review that the “UK remains at the forefront of the rapidly-evolving debate on responsible development and use of AI and Autonomy, working with liberal-democratic partners to shape international legal, ethical & regulatory norms & standards”, we are concerned that urgency is missing from the UK’s response to this issue. We hope the growing chorus of voices calling for action will help convince the UK to support the UN Secretary-General’s appeal for states to develop a new, binding treaty to address the urgent threats posed by lethal autonomous weapons systems at the critical UN meeting in December 2021.
A transcript of the debate is available here.
Image: Lord Clement-Jones introducing the debate on lethal autonomous weapons in UK Parliament