Autonomous weapons systems have been under growing international scrutiny since 2014 (Campaign to Stop Killer Robots / Wikimedia Commons)

Cambridge undergraduates may be contributing to the development of autonomous weapons systems (AWS), according to a recently published report.

The report into the ‘British military-university nexus’ flagged links between the ‘Prorok Lab’, based in Cambridge’s Department of Computer Science, and the development of autonomous robots for the US Army. It also stated that past projects at the Prorok Lab have been funded by the US Army’s corporate research laboratory.

The investigation, led by Stop Killer Robots in UK Universities, alleges that undergraduate computer science students are involved in project work for the Prorok Lab, “which is developing autonomous robots”, alongside research groups affiliated with the US Army.

Concerns were also raised that “the ethics teaching within Cambridge’s Computer Science MSc-BSc core curriculum does not consider the military applications of computer science research”. The investigation also found a “lack of transparency” in ethical decision-making procedures at the University.

This was said to raise “questions about the university’s ability to ensure that its funding sources and the potential outcome of its research fully respect and reinforce International Human Rights Law principles”.

The report states that not all projects under investigation “directly contribute to autonomous weapons systems”, but that they nonetheless have the potential to do so, “whether intentionally or unintentionally”.

The allegations come a year after pro-vice-chancellors condemned claims that Cambridge contributes to the development of lethal autonomous weapons as a “misrepresentation” of university research projects. Anne Ferguson-Smith, pro-vice-chancellor for research, and Andy Neely, pro-vice-chancellor for enterprise and business relations, said at the time that the research in question was “envisioned to address specific challenges facing society, and, as part of the University’s mission, to benefit all”.

Autonomous weapons systems, which select and engage targets without meaningful human input, have been under growing international scrutiny since 2014. Over 40 countries, the majority of which are in the Global South, have called for the implementation of international laws restricting the new forms of high-tech weaponry. Negotiations to bring the new technologies under the UN Convention on Conventional Weapons framework were brought to a halt last year by opposition from major powers including the United States, Russia, India and Israel.

Speaking to Varsity, Stop Cambridge Killer Robots said: “we’re asking Cambridge to establish a clear policy and commit publicly to not contributing to the development of LAWS [lethal autonomous weapons systems] and to ensure university staff and researchers are fully aware of what their technology may be used for and understand the possible implications of their work, and allow open discussions about any related concerns”.

The University of Cambridge was contacted for comment.