Facebook founder Mark Zuckerberg wields a lot of power (Alessio Jacona)

Whether we like it or not, Facebook provides the backdrop to much of our lives. In the age of social media, the ageing titan, the weary juggernaut, still retains its prominence. Its policies matter because they affect the lives of its billions of users, in ways both big and small. And something which may seem small, but is actually rather significant, is Facebook’s policy towards news and images. Some websites – many of them irritatingly modern and faddish – derive most of their traffic from Facebook shares, using it to generate millions of clicks.

Facebook has led to certain websites becoming significant almost overnight; the sheer weight of numbers, captured through likes and shares, has made that an inevitability. That’s why there are so many videos of disembodied hands making unrealistic meals in under a minute; it’s why so many viral videos, little more than compilations of less well-known clips, exist. The power of Facebook is assured, and unlikely to diminish in the coming years, even as its average user gets older and younger people, offered the same connectivity and sociability by so many other apps and websites, become less likely to sign up.

Upworthy, BuzzFeed, and any number of side-projects or imitators – none of these would exist, at least in their current form, without Facebook. As annoying and utterly insubstantial as they can prove, it is difficult to argue that such websites are actively malign; they are simply dull.

Because what happens on Facebook is perceived as newsworthy, any change in its policies merits coverage: it directly affects the news, both as it is created and as it is consumed. This is what makes any claim of censorship on Facebook so disturbing, and why so many will, perhaps ironically, weigh in on social media with their perspective on whether such measures are merited, or whether they are a threat to civilisation as we know it.

The latest round of this perennial debate began when Facebook deleted a photograph. Deletion itself is not exactly rare, and although Facebook sometimes fails to remove spectacularly violent content – gangland beheadings in Mexico, suicide bombings in Iraq – its intentions and mechanisms in such cases are well known and well understood. But this was different: the image in question was a work of real historical importance.

It was the so-called ‘napalm girl’ photograph, a shot from 1972 of Phan Thi Kim Phuc, badly burnt, fleeing a South Vietnamese napalm attack during the Vietnam War. The photograph won the Pulitzer Prize. It was shared by several politicians in Norway, including the Prime Minister. But that, it seems, was not enough to stop Facebook from deeming it in breach of its terms of service and deleting it.

One can, in a way, see why this happened, especially in a world where such decisions are made by two different but occasionally complementary mechanisms: the algorithm-heavy way Facebook trawls the millions of posts made every day, and the manual way in which images are reported and flagged for deletion by concerned users. Regardless of all this, however, and regardless of the leaden tools with which Facebook polices its service, it cannot seriously be argued that the image warranted deletion. It is unpleasant, with its evident pain and emotion and its unvarnished portrayal of human misery, but such is the lot of war, and such, too, is life. These things cannot be edited out or glossed over. Facebook itself recognised as much, and the image was restored soon after.

But this was still something of a big deal. It represented an editorial decision to remove content shared by elected officials in a democratic, modern state. It was also, in a way, an American firm interfering, however unintentionally, in the domestic politics of a fellow democracy. Some said this constituted censorship rather than an honest mistake; that it was part of a long history of Facebook appearing to censor news – normally, of course, news appealing to the critics’ own political persuasion; and that such things are sinister, and altogether too common.

This is not entirely accurate. The counterpoint is Facebook’s recent decision to stop curating news stories for its ‘trending’ bar, which can direct many millions of people to news of varying degrees of seriousness and respectability. This has problems of its own: conspiracy theories and outright lies, always common on social media, now have more traction than ever before. They can acquire a seemingly official rubber stamp simply by being popular, and websites like infowars.com and americanmilitarynews.com, both of which promote active lies to a partisan, paranoid audience, no doubt stand to benefit from such changes.

As bad as things have apparently got, however, there is no way Facebook can avoid appearing malign. Either it employs people to select news stories and censor them, blocking out unpleasant or untrue or unhelpful content, or it does not – in which case truly unpleasant far-right anti-news can thrive just as easily as all the absurdist fluff promoted by NowThis or AJ+.

More important than the ‘napalm girl’ story, which is the sad but inevitable result of algorithms determining what can and cannot be posted, is Facebook’s trending news bar, and the effect it can have on the billions of people who use the site regularly.

The effect Facebook has on its users is immense and still not entirely understood. The site recognised its error and restored the offending image in this case, but its policies on news could poison the well of public discourse in years to come. And what happens next may shock you.