CoverDrop hides real whistleblowing messages amongst fake ones to make them untraceable

In a capitalist world where toxic work culture thrives, business malpractice often goes unchecked. Whistleblowing offers a crucial outlet for workers to expose wrongdoing – but how safe is it to send that email to a journalist, and does anyone ever receive it?

Researchers at the University of Cambridge Computer Laboratory, in partnership with The Guardian, have developed an anonymous secure messaging service called CoverDrop. I spoke to Professor Alastair Beresford, Head of Department and Professor of Computer Security, to find out more.

“Metadata can be just as revealing as the message itself”

“Our aim was to provide confidentiality and integrity of messages, but also to keep metadata private,” says Beresford. Metadata – additional information such as who sent a message, when, and from where – can be just as revealing as the message itself. While platforms like WhatsApp or Signal use end-to-end encryption to protect message content, metadata can remain exposed. Securing it is significantly harder: a message must still be routed to its destination without revealing its origin.

CoverDrop tackles this problem through ‘cover traffic’. All users of The Guardian app send periodic dummy messages. Only when a real whistleblower submits information is a genuine message hidden among thousands of fake ones, making it effectively untraceable. There are also additional security features to further protect anonymity.
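The cover-traffic idea can be sketched in a few lines of Python. This is an illustration only, not CoverDrop's actual implementation (which is open source): the function names and the fixed message size are invented for the example, and the encryption step that makes real and dummy messages indistinguishable on the wire is omitted. The key point is that every copy of the app sends a same-sized payload on the same schedule, whether or not the user has anything to say.

```python
import secrets

MESSAGE_SIZE = 512  # every payload is padded to one fixed length


def pad(plaintext):
    """Pad a real message to MESSAGE_SIZE so it matches the dummies.

    Uses a 0x01 marker followed by zero bytes (one simple reversible
    scheme; assumes the plaintext does not end in a zero byte).
    """
    if len(plaintext) > MESSAGE_SIZE - 1:
        raise ValueError("message too long")
    return plaintext + b"\x01" + b"\x00" * (MESSAGE_SIZE - 1 - len(plaintext))


def unpad(payload):
    """Strip the zero padding and the 0x01 marker."""
    return payload.rstrip(b"\x00")[:-1]


def next_outgoing_payload(real_message=None):
    """Called on a fixed timer by every copy of the app.

    With no real message queued, random bytes of the same size go out;
    once encrypted (not shown), a network observer cannot tell the two
    cases apart.
    """
    if real_message is None:
        return secrets.token_bytes(MESSAGE_SIZE)
    return pad(real_message)
```

Because dummy and genuine payloads are identical in size and timing, an observer watching the network sees the same behaviour from every reader of the app, whether or not they are a source.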

“So, with this technology, we moved from the scenario where people may communicate with a journalist via emails or text messages, which are not very secure, to a much safer alternative,” explains Beresford. “This is particularly true if, say, you were sending an email from your corporate account about bad behaviour inside your company.”

“While this protects privacy, it also introduces safety concerns: encrypted platforms can be used to organise and perform harmful or illegal activity”

The team – including Beresford and postdoctoral researcher Daniel Hugenroth – were keen to ensure the system could be widely adopted. CoverDrop is open source, so other news organisations can integrate it into their own platforms. This raises a broader question: what are the ethics of encryption?

“Major messaging platforms like iMessage, WhatsApp and Signal all offer so-called end-to-end encryption,” explains Beresford. “Which means that the service provider cannot read the messages.” In practice, this involves exchanging cryptographic keys so that only the intended recipient can decrypt the message. While this protects privacy, it also introduces safety concerns: encrypted platforms can be used to organise and perform harmful or illegal activity. “The question for us as a society is: what’s the right trade-off?”
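The key exchange Beresford describes can be illustrated with a toy Diffie–Hellman exchange, one common way two parties agree on a secret without ever transmitting it. The parameters below are deliberately tiny and insecure, chosen only so the arithmetic is visible; real systems use much larger parameters and authenticated protocols.

```python
import secrets

# Classic toy Diffie-Hellman parameters (far too small for real use):
# a public prime P and generator G that everyone, including any
# eavesdropper, is allowed to know.
P, G = 23, 5


def keypair():
    """Generate a private exponent and the public value derived from it."""
    private = secrets.randbelow(P - 2) + 1
    public = pow(G, private, P)
    return private, public


# Each party publishes only its public value...
alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# ...yet both compute the same shared secret, which never crosses the
# network. At realistic parameter sizes, recovering it from the public
# values alone is computationally infeasible.
shared_alice = pow(bob_pub, alice_priv, P)
shared_bob = pow(alice_pub, bob_priv, P)
assert shared_alice == shared_bob
```

That shared secret then serves as the key for encrypting messages, which is why the service provider in the middle, who only ever sees the public values, cannot read them.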

Beresford suggests that policymakers and researchers must work in tandem to balance privacy with public safety. This ties into a wider debate about whether online platforms should be responsible for the content they host. With further complications introduced by AI, this question is only becoming more urgent, particularly as the social and mental health impacts of technology come under increasing scrutiny. What role, then, can the Computer Laboratory play?

“We like to treat our staff and students as a community of scholars working to address major technical and societal challenges,” Beresford states proudly. He highlights how the department’s research – spanning from hardware to artificial intelligence – addresses many of the defining challenges of the 21st century.


Since becoming Head of Department in 2023, Beresford has prioritised collaboration and building links between academia, industry, alumni, staff, and students. “When Daniel and I were doing our research [on CoverDrop], we wanted to have real-world impact,” he says, noting that this applied focus has been part of the department’s ethos since the 1940s. He recounts many anecdotes of researchers who created start-ups based on their work in the department.

Our conversation ended on one of the biggest challenges: funding. “Funding for PhD students is expensive,” Beresford notes, “but PhD students are also the real engine of research in the department.” Expanding support, he argues, is essential to sustaining the work needed to tackle the technological problems ahead.