Facebook is “inconsistent” in the way it deals with terrorism-related and antisemitic material appearing on its platforms, the Community Security Trust has said.
Leaked manuals containing Facebook policies that guide moderators on what can and cannot be published have revealed how the company formulates its rules on postings about violence, hate speech, pornography, self-harm, animal cruelty and other topics.
The documents, which were leaked to the Guardian newspaper, showed that comments such as “#stab and become fear of the Zionist” were considered unacceptable.
Meanwhile, comments such as “To snap a b****’s neck make sure to apply all of the pressure to the middle of her throat” were deemed permissible because they were not seen as credible threats.
CST’s director of communications Mark Gardner told the JC: “CST works quite closely with Facebook on these issues and it is good to see that they are addressing the problem in a serious and thoughtful way. There are, however, still inconsistencies in the guidelines and in how they are applied by moderators, and there is still far too much antisemitism on their platform.”
Facebook guidelines suggest that videos of violent deaths do not always have to be deleted if they are deemed to raise awareness of issues such as mental health.
Photos depicting physical abuse or bullying can also be left unless they are deemed to be celebratory or sadistic.
Earlier this month, Parliament's influential Home Affairs Select Committee strongly criticised Facebook and other social media companies as being "shamefully far" from tackling the spread of hate speech and other illegal and dangerous content.
The government should consider making sites pay to help police content, it said.
Soon after, Facebook revealed it was hiring more than 3,000 more people to review content.
Critics claim Facebook can no longer control its content because it has grown too big, too quickly.
Last month an investigation by The Times found it was possible to befriend more than 100 jihadists online and to obtain pro-Daesh propaganda.
In a statement on Monday, Monika Bickert, Facebook's head of global policy management, said: "We work hard to make Facebook as safe as possible, while enabling free speech.
"This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously," she added.