From conspiracy theories linking 5G networks to the coronavirus, to misleading advice about the best way to battle Covid-19, social media sites have once again been awash with misinformation.
On the Tech Tent podcast this week, we look at the pressure on the likes of Facebook, WhatsApp and YouTube to do more to stem the flow of falsehoods.
- Listen to the latest Tech Tent podcast on BBC Sounds
- Listen live every Friday at 15:00 GMT on the BBC World Service
At first it seemed we could laugh off or ignore some of the outlandish falsehoods about the coronavirus spreading across the internet: that it did not exist; that it was engineered in a lab in China, or perhaps the United States; that it was all part of a plan by Bill Gates to insert microchips in our brains to control us.
But it has become clear such misinformation can have damaging effects.
Researchers at King’s College London questioned people about Covid-19 conspiracy theories, such as the false idea the virus was linked to the rollout of 5G mobile phone networks.
And they found those who believed them were less likely to believe there was a good reason for the lockdown in the UK, with potentially damaging effects on health if they chose to ignore government instructions.
In the past week, there have also been arson attacks on phone masts, apparently inspired by the conspiracy theories.
Campaigns against 5G have been around for a while, with plenty of Facebook groups in the UK lobbying local councils to ban the new networks on health grounds, despite a lack of evidence they could cause harm.
Then in late January, the idea 5G might be linked to the coronavirus started popping up on some of these groups. The idea was even featured in a couple of newspaper articles.
While the social networks have previously been content to allow misinformation about 5G or vaccines or various conspiracy theories, this week they began to act.
WhatsApp made it harder to forward all kinds of stories, while YouTube brought in a policy of deleting videos linking symptoms of Covid-19 to 5G. Among the first videos to be taken down was a lengthy interview with the conspiracy theorist David Icke.
Emily Bell, director of the Tow Centre for Digital Journalism, at New York’s Columbia University, tells Tech Tent there is a good reason the companies have taken a harder line.
“The pressure has been intense, not least because this is much more directly correlated to outcomes which affect life and death,” she says.
“If people don’t take the science seriously, if they don’t do what they’re being asked to do in terms of social distancing, then people die.”
In the past, Facebook has come under fire from American conservatives for alleged bias when it has moved to ban people or limit the spread of misinformation.
But for now, while a few free-speech advocates are pushing back against the crackdown on the conspiracy theorists, the social media companies are facing far less opposition than when they have silenced figures from the extreme right.
A far bigger problem, as Emily Bell points out, is dealing with the sheer volume of misinformation about the virus flowing across the various platforms.
“What I hear from all of them is that they still don’t really have the internal mechanisms to deal with the tsunami of false information that they’re currently seeing,” she says.
The other question is whether it is ever possible to change the minds of people convinced that 5G caused the coronavirus, that vaccines are deadly, or that Apollo 11 never landed on the Moon.
Some argue that rather than banning such material, the answer is to provide rigorous fact-checking and detailed scientific information.
But, having spent far too much time in recent weeks lurking in various conspiratorial Facebook groups, I am not so sure.
A member of one such group posted a message explaining how to type the words "fact check" and "fact checker" into Facebook's settings, so that all such material was blocked.
Facts, it seems, are not required by those pushing eccentric and damaging narratives.