When people talk about online antisemitism, they usually point to the big platforms: X, TikTok, Instagram, YouTube. That’s where Jews see it, experience it, report it.
But our new research, a collaboration between the Antisemitism Policy Trust and Mozaika, a Barcelona-based Jewish cultural platform, shows something crucial – and often missed. The worst antisemitism doesn’t always start there. It can and does start elsewhere, in the digital shadows, before being laundered into the mainstream.
Small, high-harm platforms – sites like Gab, BitChute and anonymous forums – have become incubators for the most extreme antisemitic content online. These spaces operate with minimal moderation, weak enforcement and, in some cases, near-total impunity.
What circulates there is not “just” prejudice or dog whistles, but open conspiracy theories, Holocaust distortion, glorification of violence and calls for harm against Jews.
And crucially, it doesn’t stay there.
Our research tracks how antisemitic memes migrated from 9gag, a medium-sized meme-hosting platform, into the much larger Reddit, where they are amplified to its one billion monthly users. Each time our researcher browsed 9gag, they encountered at least six antisemitic memes, and antisemitic content was always present in the comment sections. Alarmingly, we found that 99.1 per cent of the antisemitic memes that originated on 9gag ended up on Reddit. The result is a pipeline of hate, flowing steadily from the margins into everyday discourse: antisemitic supply, rather than demand. These memes often evade automated and human moderation because their ironic humour, innuendo and ambiguous symbolism are designed to bypass detection systems while normalising hate.
This matters because online antisemitism does not exist in a vacuum. It shapes attitudes, normalises hostility, and fuels real-world harm. Community Security Trust figures show antisemitic incidents in Britain at record levels. While no single factor explains that rise, the role of online radicalisation is impossible to ignore.
When antisemitic conspiracies become familiar, shareable and socially acceptable, the consequences are felt offline. On 9gag, antisemitism is frequently delivered through humour, increasing its accessibility and appeal while obscuring its harmful intent.
One of our most troubling findings is how poorly current regulation is equipped to deal with this ecosystem. Laws and platform policies tend to focus on individual services in isolation, while the smaller platforms that generate and refine the most extreme content fall through the cracks.
Some deliberately position themselves as “free speech” havens to avoid scrutiny. All benefit from the fact that regulation has not kept pace with how online harms actually spread.
This is not just a British problem. The same patterns appear across Europe and beyond. Antisemitism online is transnational, adaptive and fast-moving. Tackling it requires regulators to think less about individual platforms and more about systems, networks and circulation, and to assess those systems for the risks they actually create.
So what can be done?
First, regulators need to stop pretending that small platforms don’t matter. The safety frameworks around them should be expanded, with proportionate but meaningful obligations around moderation, transparency and cooperation with authorities.
Second, platforms must be required to assess not just the harm on their own services, but how their systems enable the amplification of content that originates elsewhere.
If a mainstream platform becomes the delivery mechanism for extremist antisemitism, it cannot shrug and claim the problem started somewhere else.
Third, moderation systems must be improved to account for coded antisemitism, including memes, irony and symbolism, which the research identifies as key methods used to evade detection.
Finally, antisemitism must be recognised as a specific and serious form of harm within online safety regimes – not treated as a niche issue or folded into generic hate categories that miss its distinctive patterns and risks.
Antisemitism online is evolving. Our response must evolve with it. If we continue to focus only on what Jews see on the biggest platforms, we will always be one step behind. To confront the problem properly, we need to follow it back to its source – and shut the pipeline down.
Danny Stone is the chief executive of the Antisemitism Policy Trust