Uncovering the Roots of the Right-Wing Misinformation and Propaganda Driving the UK Riots
In the summer of 2024, a series of disturbing events unfolded across the UK, sparked by the violent protests outside the Southport mosque in Merseyside. The riot, which left 39 police officers injured, was one of many incidents highlighting the alarming role of unregulated social media in the organization and recruitment of Islamophobic and anti-immigrant groups.
Telegram, a popular messaging app and social network, played a significant role in the spread of fake news and anti-Muslim rhetoric during the riots. Prominent right-wing figures such as Andrew Tate, Laurence Fox, and Darren Grimes advanced uncorroborated theories linking the Southport stabbings to Islam and government refugee policy.
Elon Musk, the tech mogul, also contributed to the escalation of hate speech on social media platforms. He dismantled Twitter's trust and safety council, replaced the previous blue-tick verification system with a paid subscription, and reinstated Stephen Yaxley-Lennon, also known as Tommy Robinson, on Twitter in November 2023. Musk's own political positions have shifted sharply to the right, and he has posted apocalyptic, factually inaccurate, and conspiracist content to his nearly 200 million followers.
Yaxley-Lennon, who had built a significant presence on Telegram before returning to Twitter, leveraged both audiences, cross-posting between platforms and exploiting the relative privacy of Telegram to organize at street level. Europe Invasion, a pseudo-news account on X, produced a stream of Islamophobic, anti-immigrant posts; it has gained over 43,500 followers and regularly receives thousands of likes and retweets.
The false rumor that the Southport attacker was an immigrant named "Ali al-Shakati" spread rapidly on social media. This fabricated claim fuelled the violent protests outside the Southport mosque, demonstrating the real-world consequences of unchecked misinformation.
Experts believe that the unchecked spread of such rhetoric and the continued existence of far-right social media networks pose a significant future threat. Disinformation expert Marc Owen Jones believes that figures such as Yaxley-Lennon follow a well-established playbook when responding to events like the Southport killings.
While governmental and civil society bodies actively monitor and work to tackle these hate crimes, the structural challenges posed by relatively unregulated social media environments allow extremism and hate speech to proliferate. The lack of stringent platform oversight means that far-right activists and Islamophobic groups can recruit members, coordinate activities, and amplify hate narratives with limited accountability.
In Rotherham, South Yorkshire, a far-right mob attempted to set a Holiday Inn Express on fire, while the violence spread to Manchester, Sunderland, Middlesbrough, Hartlepool, Portsmouth, Belfast, and elsewhere, targeting centers and hotels housing people seeking asylum.
These incidents underscore the urgent need for social media platforms to take responsibility for the content they host and the impact it has on society. Until stricter regulations are in place, it is crucial for communities and civil society to remain vigilant and report any instances of hate speech or misinformation they encounter online.
- Technology, through platforms like Telegram and Twitter, has been used by extremist groups and individuals to spread hate speech, fake news, and anti-Muslim rhetoric, as evidenced during the violent protests outside the Southport mosque in Merseyside in 2024.
- The unregulated nature of social media has allowed Islamophobic and far-right groups to organize, recruit members, and amplify hate narratives with limited accountability, as highlighted by the Southport mosque attack and subsequent incidents in Rotherham, South Yorkshire, Manchester, Sunderland, and other cities.
- In the aftermath of events like the Southport killings, public figures such as Elon Musk and Stephen Yaxley-Lennon have contributed to the escalation of hate speech on social media, demonstrating the urgent need for stricter regulations to hold platforms accountable for the content they host and its impact on society.