Google faces renewed criticism over its filtering tools, which anti-racism charities warned on Tuesday were not fit for purpose in the fight against online antisemitism.
A new report by the Antisemitism Policy Trust (APT) and Community Security Trust (CST) found Google’s filtering tool, SafeSearch, had little to no impact on antisemitic imagery.
With the function enabled, over a third of images (36 per cent) returned for the search “Jewish jokes” were deemed antisemitic by the report’s authors.
The figure rose to 57 per cent for the search “Jew jokes” with the safety tool enabled.
Both searches returned similar levels of antisemitic content with the safety feature switched off, at 33 per cent for “Jewish jokes” and 48 per cent for “Jew jokes.”
The report, based on research from Cambridge University’s Woolf Institute, also found that a separate Google tool for developers lacked the ability to accurately identify antisemitic content.
The findings were unveiled as the government published its draft online safety bill on Tuesday to be scrutinised by a joint committee and brought to Parliament later this year.
Changes would include empowering Ofcom to block access to websites or fine companies up to £18m or 10 per cent of their annual global turnover if they fail in a new duty of care.
The report’s authors say their findings “underline the need for internet regulation rather than relying on internet companies themselves to address the problem of online hate.”
Danny Stone, APT chief executive, warned the “report proves why we urgently require that bill to place a duty of care on companies like Google”, while the CST’s Dr Dave Rich said failings were “yet another example of Internet companies simply not doing enough to proactively stop the spread of hateful material online.”
A Google spokesperson said: “In Google Search, we strive to provide open access to information while also not exposing people to potentially shocking or offensive content if they have not explicitly searched for it. While SafeSearch can be used to block explicit content from search results, it is not designed to block offensive or hateful results.
“When people are looking for images online, search engines largely rely on matching the words in the query to the words that appear next to images on the web page. In some cases, these searches match content on web pages that contain offensive and hateful images. We’ve made considerable improvements to address low quality content, and will continue to improve our systems.”
©2024 The Jewish Chronicle