The Computer Laboratory at the Department of Computer Science and Technology (Christian Richardt / Wikimedia Commons)

Cambridge Pro-Vice-Chancellors have expressed concern over claims that the University is “contributing to the development of” lethal autonomous weapons, otherwise known as ‘Killer Robots’.

The claims were made in a report compiled by Cambridge Tech and Society in conjunction with the Stop Killer Robots Campaign.

The Pro-Vice-Chancellor for Research, Professor Anne Ferguson-Smith, and the Pro-Vice-Chancellor for Enterprise and Business Relations, Professor Andy Neely, told Varsity that they “protest in the strongest terms” the “misrepresentation” of the projects of Cambridge researchers referenced by the report:

“These projects do not involve research into lethal autonomous weapons (LAWs), neither do they directly contribute to the development of LAWs.”

They continued: “The research being carried out, be it in sensor chips, communications, or robotics, was envisioned to address specific challenges facing society, and, as part of the University’s mission, to benefit all.”

The Pro-Vice-Chancellors went on to outline to Varsity the ways in which they believed certain projects had been “misrepresent[ed]” by the report, including in its references to Silicon Microgravity, which they say “has never had any involvement with projects relating to LAWs and has no intention of entering this space.” They also stated that none of the projects sponsored by the company “has anything to do with the development of LAWs.”

Regarding the report’s claims that the involvement of Blue Bear Systems Research on project ‘INCEPTION’ “indicates this project will contribute to the development of LAWs”, Ferguson-Smith and Neely told Varsity that The Whittle Laboratory collaboration, funded by the Aerospace Technology Institute, “aims to develop a zero emission electric propulsion system for a future civil aircraft”, which will be “key to decarbonising the aviation sector by 2050”. They stated that this project was “urgent[ly] need[ed].”


They also said that the two projects being carried out by the Centre for Photonic Systems described in the report are Innovate UK (IUK) funded, with all Cambridge funding coming from IUK - “nothing from the companies involved.” They stated that “both projects - Aquasec (involving Tethered Drone Systems), and 3QN (led by Arqit) - are looking at quantum technology in civilian communications, and are not applicable to LAWs.” They went on to say that the research being carried out by the Centre is “primarily focused on optical communications and passive sensing for civilian applications”, stating that the Centre “has never carried out any research into LAWs and has absolutely no intention of doing so.”

In response to criticisms of the Department of Computer Science and Technology’s Supporters’ Club, which the report claims includes member organisations who are “explicitly developing LAWs”, the Pro-Vice-Chancellors said: “The Department of Computer Science and Technology’s Supporters’ Club has close to 100 members, whose fees are used - among other things - for outreach activities and relieving student hardship.”

They continued that “members are given the opportunity to take part in an annual recruitment fair, which is one of a number of activities used to offer students as broad a picture as possible of the career opportunities available to them, and provide them with as much information as possible to make informed choices”, but that it is “entirely for our students – as individuals – to decide which careers they want to explore or which companies they might wish to join.”

Responding to the Pro-Vice-Chancellors’ letter, Cambridge Tech and Society told Varsity that “component technologies of lethal autonomous weapons (LAWs)” are most often “dual-use”: “while research into technologies such as sensor chips, communications, or robotics has much potential to create positive impact, there also exists the potential for the same research to contribute to the development of LAWs.”

They went on to warn that institutions must be “vigilant” when taking measures to “safeguard against the unethical use of their research”: “The purpose of our report was to identify those ongoing research projects developing technologies that are applicable to LAWs, that are liable to be applied to LAWs systems. We want to emphasise that we do not aim to condemn individual researchers. Rather, we want to alert the University community to the fact that, by maintaining research relationships with companies developing lethal autonomous weapons, the University is allowing its research, for whatever original purpose, to be used to advance LAWs.”