Discord's Dark Side: Illicit Communities Thrive, Security Experts Warn
Discord, initially designed for gamers, is increasingly being adopted by illicit communities and threat actors. Security experts warn of its potential for harm, pointing to illicit marketplaces, extremist discussions, and event planning by protestors on the platform.
Discord's server-based structure, while convenient for users, poses challenges for investigators. Harmful discussions and illegal activity can hide inside these servers, making them difficult to track. Specialized search engines such as dyskadia.com and discord.me can help locate publicly listed servers, but operational security (OPSEC) and ethical considerations remain crucial throughout an investigation.
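As an illustration of how much a publicly listed server can expose, the sketch below queries Discord's widget endpoint, which returns basic server metadata when the server owner has enabled the widget. This is a minimal sketch, assuming the third-party requests library and a placeholder guild ID; it is not the workflow of any specific tool named above.

```python
# Minimal sketch: check whether a Discord server exposes public widget data.
# Assumes the `requests` library and a known guild (server) ID; the endpoint
# only returns data if the server owner has enabled the widget.
import requests

WIDGET_URL = "https://discord.com/api/guilds/{guild_id}/widget.json"

def fetch_widget(guild_id: str) -> dict | None:
    """Return the public widget JSON for a server, or None if it is not exposed."""
    resp = requests.get(WIDGET_URL.format(guild_id=guild_id), timeout=10)
    if resp.status_code == 200:
        return resp.json()
    return None  # 403/404: widget disabled or server not found

if __name__ == "__main__":
    data = fetch_widget("000000000000000000")  # placeholder guild ID
    if data:
        print(data.get("name"), "-", data.get("presence_count"), "members online")
        print("Invite:", data.get("instant_invite"))
    else:
        print("No public widget data available.")
```

Because the widget is opt-in, a null result here says nothing about whether a server exists, only that it is not exposing member and invite data publicly.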
Threat actors are exploiting Discord's features to run illicit marketplaces where contraband is coordinated and sold. Extremist groups use the platform for ideological discussions and for planning real-world activity, and even protestors and rioters leverage it for event planning and logistics. Some communities also share sensitive personal information, creating risks for executive protection.
Recorded Future, a security company, hosted an on-demand webinar on using Discord for open-source intelligence (OSINT). The presenters highlighted pivoting techniques that use other social media platforms to uncover hidden Discord communities, and noted that Discord's internal search function can filter through thousands of messages to surface specific information.
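One common form of that pivoting is harvesting Discord invite links from posts and profiles on other platforms, then resolving each invite to the server behind it. The sketch below is a hedged example of that idea, assuming the requests library, placeholder input text, and that Discord's public invite-lookup endpoint remains reachable without authentication.

```python
# Minimal sketch of OSINT pivoting: extract Discord invite codes from text
# gathered on other platforms, then resolve each invite to basic server
# metadata. The sample text and invite code are placeholders.
import re
import requests

INVITE_PATTERN = re.compile(r"(?:discord\.gg|discord\.com/invite)/([A-Za-z0-9-]+)")
INVITE_API = "https://discord.com/api/v10/invites/{code}?with_counts=true"

def extract_invite_codes(text: str) -> list[str]:
    """Pull Discord invite codes out of scraped text (posts, bios, comments)."""
    return INVITE_PATTERN.findall(text)

def resolve_invite(code: str) -> dict | None:
    """Look up an invite code and return the server name and approximate counts."""
    resp = requests.get(INVITE_API.format(code=code), timeout=10)
    if resp.status_code != 200:
        return None  # expired, revoked, or invalid invite
    data = resp.json()
    guild = data.get("guild", {})
    return {
        "server": guild.get("name"),
        "members": data.get("approximate_member_count"),
        "online": data.get("approximate_presence_count"),
    }

if __name__ == "__main__":
    sample = "Join us here: https://discord.gg/example123"  # placeholder post text
    for code in extract_invite_codes(sample):
        print(code, "->", resolve_invite(code))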
Discord's popularity among illicit communities and threat actors underscores the need for vigilance and responsible investigation. Security experts must stay informed about evolving threats and the tools used to facilitate them. As Discord continues to grow, so too must our understanding of its potential risks and how to mitigate them.