
Danny Stone

It’s not the social media user, it’s the platform

The Online Safety Bill provides an opportunity to change the way these companies operate

December 17, 2021 08:07

Last week, Twitter’s lawyers were in action in Paris, arguing against a court order that would force the company to reveal the phone numbers and addresses of the staff in charge of processing French tweets flagged for removal. The company had been successfully sued by a coalition of French NGOs, including the Union of French Jewish Students, all seeking to prove that Twitter was failing in the fight against hate speech, having found that only some 11 per cent of content that was “obviously unlawful” was being removed.

On Friday, we at the Antisemitism Policy Trust revealed the full scale of antisemitic horror on the platform. Together with the Community Security Trust and with research undertaken by the Woolf Institute, we have produced data which estimates that there are up to 495,000 explicitly antisemitic tweets in English per year.

With the UK Jewish population standing below 290,000, this is nearly two tweets annually for every single Jew in the country.

It isn’t just Twitter. This is the third report in a series, which found that Google’s public-facing ‘SafeSearch’ facility has no impact on the amount of antisemitic content returned when people search for jokes about Jews. The company’s response, which boiled down to explaining that the system wasn’t designed to capture racism, underlines the problem.

We have also found that antisemitism is extensive on Instagram. It is associated with a chaotic trolling phenomenon in which users mix antisemitic hashtags with a range of others on the platform, and it demonstrates strong links to conspiracy theories and anti-Israel attitudes.

Taken as a whole, it is fair to say that despite there being fantastic and inspiring Jewish content online, being a Jew in the digital world can be a depressing, frightening, isolating and tiring experience.

It doesn’t have to be this way. Across the world, governments are waking up to the need for regulation. In the US, Canada and Europe, changes are afoot which are forcing social media companies to reassess the cost of having put profit over people.

In the UK, the government has already published a draft Online Safety Bill. Though many have identified problems with it, there is general agreement that it is a bold and necessary piece of legislation.

Anti-Jewish racism on social media is widespread. Some of it is illegal, while some is harmful but within the law. The Bill, if well designed, could have a significant impact on both. The harm needs to be addressed not where it is deployed but by the systems which should capture it upstream.

So, when the grime artist Wiley broadcast antisemitic tweets to hundreds of thousands of people, it wasn’t the content of the tweets per se, but the failure of the system that would need to be addressed. That is, what was the company doing to stop accounts with large numbers of followers broadcasting hate? Where was the friction in the system? And how, given that Wiley returned to the platform recently to spread further hate, does Twitter stop repeat offenders?

If the Bill remains in its current form, there are limited requirements for the companies to act — merely having Terms and Conditions to “deal with” harm. And some services (including Google search) are entirely exempt from having to tackle such content. An overarching duty forcing all services to address reasonably foreseeable harms would be a far better model.

Furthermore, we should be holding senior executives to account for serious failures to apply duties of care. The Bill should go further in this regard, matching the penalties of seven years in prison that we have in financial services.

There is a way to go with the Online Safety Bill, but it offers a glimpse of what a brighter future may look like. We will continue to work with Twitter. We have to have honest and open conversations with representatives of the platforms, and we know there are good people working there who want to improve the service, but at present it isn’t good enough.

For too long, social media companies have been experimenting on us. The results, as our report makes clear, can be extremely harmful. It’s time for the companies to ensure that they are safe by design.

Danny Stone MBE, Chief Executive, Antisemitism Policy Trust

