A student campaign to '#StopCambridgeKillerRobots' was launched on Tuesday (26/10). Minkus (Unsplash)

A student campaign to ‘#StopCambridgeKillerRobots’ has been launched in response to a report linking Cambridge University research to the development of lethal autonomous weapons (LAWs), dubbed 'Killer Robots'.

The campaign was officially launched at a Cambridge Tech and Society event on Tuesday 26th October, following the release of an open letter whose signatories include St Catharine’s College JCR, Cambridge University China Forum and Extinction Rebellion Cambridge Universities.

The letter, addressed to Vice-Chancellor Professor Stephen Toope, responds to the findings of a report conducted alongside the Stop Killer Robots campaign, and states that their research found the University to be “contributing to the development of LAWs through military-funded research collaborations, close relationships with commercial LAWs developers, and through the encouragement of student recruitment to LAWs developers.”

The report, which has been seen by Varsity and was compiled by Cambridge Tech and Society in conjunction with the Stop Killer Robots Campaign, states that Cambridge University has received research funding from several organisations linked to the development of LAWs. These include Silicon Microgravity, a “sensor technology spin-out” of the University, which granted £567,000 in research funding to Cambridge during the period 2015-19. The report links the company to LAWs due to its development of gyroscope and accelerometer technology; a statement taken from the company’s website says this "will be integrated into a wide range of inertial navigation systems delivering MEMS based tactical and navigation grade sensing", and lists "defence, aerospace, autonomous vehicles and robotics" as possible areas of application.

Other major funding sources include ARM Ltd. and Trimble Europe, both of which have links to the defence industry and whose grants to University research amount to £455,000 (2017-19) and £193,000 (2016-19) respectively.

The open letter criticises the facilitation of student recruitment by LAWs developers via the Computer Science Department ‘Supporters Club’, which allows companies to have contact with students in return for a fee. Members include AI specialists Rebellion Defence and Xilinx, both of whom the report describes as “explicitly developing LAWs”; Rebellion Defence’s main areas of AI development are listed as “comprehensive battlespace awareness”, “autonomous mission execution”, and “cyber readiness”, while Xilinx are reported to have worked with “Turkish drone manufacturer Baykar Makina, who have supplied drones deployed in Armenia, Syria and Libya”. The campaign objects to these companies taking part in student recruitment activities such as talks and the advertisement of internships via mailing lists.

Other key points from the report include revelations around a collaboration between the Whittle Laboratory (based in Cambridge’s Department of Engineering) and Blue Bear Systems Research on a project called ‘Project InCEPTion’, which aims to “develop a novel all-electric propulsion module” for aircraft. Regarding this partnership, the report states: “The involvement of Blue Bear Systems Research, a leading unmanned aerial systems developer, indicates this project will contribute to the development of LAWs.”

The report also describes projects associated with the Centre for Photonic Systems based in the University’s Department of Engineering which involve collaboration with companies linked to the defence industry, such as Arquit and Tethered Drone Systems.

The campaign describes links with these organisations as an “active endorsement of lethal autonomous weapons systems” and calls upon the University to take action: their demands include halting all activities “directly contributing” to LAWs development, greater transparency regarding the potential applications of researchers’ and students’ work, and endorsing the prohibition of LAWs by signing the Future of Life Pledge.

Laying out the basis for their objection to the development of lethal autonomous weapons, the campaign’s letter states:


"The decision to take life cannot be delegated to algorithms. Such a decision lacks the intention, understanding and moral reasoning necessary to evaluate the proportionality of an attack, where human life is reduced to merely a factor within a predetermined computation.”

It continues: “A machine is incapable of exercising discretion, and lacks the compassion and empathy needed to make morally complex decisions.”

The letter goes on to state that LAWs are “incompatible with international human rights law, namely, the Right to Life (‘no one shall be arbitrarily deprived of life’), the Right to a Remedy and Reparation, and the principle of human dignity.”

They also cite concerns regarding “a lack of a clear line of accountability for unlawful civilian deaths” and the potential for bias to be coded into LAWs, “where people of colour, women and non-binary people are at greater risk of misidentification and unlawful killing.”

A University spokesperson told Varsity that the University has a “robust system for reviewing strategic relationships and donations.”

They continued: “The University of Cambridge Committee on Benefactions, External and Legal Affairs (CBELA) scrutinises sources of funding that might be inappropriate on ethical grounds or pose a reputational risk to the University.”

Cambridge Tech and Society’s campaign forms a part of the global Stop Killer Robots movement, which comprises more than 180 NGOs across 66 countries. The calls for change come against the backdrop of the UN Convention on Certain Conventional Weapons, which is currently debating an international ban on lethal autonomous weapons.