
Policing the web is not an easy task

Antisemitism is rife online - but no one wants to take responsibility for it

    Amid all the arguments over antisemitism, abuse, harassment and hate on social media, one thing that everyone seems to share is an extreme reluctance to take responsibility for policing the internet.

    The police and Crown Prosecution Service are daunted by the scale of the problem and the difficulties in acquiring the necessary evidence for a prosecution.

    In London the Metropolitan Police now has a specialist Online Hate Crime Hub, a model that is being copied nationally by the Home Office.

    CPS guidelines try to meet demands for prosecutions without unnecessarily restricting free speech or over-burdening their already stretched prosecutors. Both services will admit that just keeping up with the problem is an ongoing challenge.

    Social media companies insist that they are not publishers and therefore are not responsible for the content on their sites, but they are hardly a neutral space either. They increasingly decide what you see in your feed, the order in which you see it, and the adverts you see alongside it, and they do all of this in the way they think will generate the most income from your presence on their site.

    More a curator than a publisher perhaps, but still with rules limiting what you can and cannot post.

    Facebook, for example, allows Holocaust denial but not nudity. One unintended consequence of these two separate rules is that Facebook allows posts denying the Holocaust, but if you then post photos of Holocaust victims to argue back against the deniers, those photos will be removed because the bodies piled up at Belsen when the British army liberated it were not clothed.

    Nobody said determining the limits of free speech online would be easy.

    We often make the mistake of thinking social media is just another public space. It isn’t: these are privately owned companies that monitor and monetise everything that happens on their platforms.

    Consequently, this isn’t about social media companies setting the rules for free speech across society. Facebook, Twitter, YouTube or Snapchat can’t tell me what I can say to my friends in the pub or shout at a football match.

    However, they do have the right, and a responsibility, to decide what I can and can’t write on their platforms.

    The main social media companies increasingly accept this point. Whether due to pressure from politicians, advertisers, adverse media coverage or just a change of heart, Facebook, Twitter and YouTube have tightened up their rules considerably over the past year and are removing more hateful and abusive content as a result.

    This welcome step brings its own problems, as poor training and inconsistent removal of content can cause confusion and further undermine trust, but these changes and the shift in attitude they reflect are vital.

    Because be in no doubt: if the social media companies themselves don’t accept this responsibility, they may find that governments force it on them through legislation.


    Dave Rich is Head of Policy at the Community Security Trust
