New Study Reveals Mastodon’s Dark Side: Availability and Spread of Child Sexual Abuse Material

Mastodon, the decentralized social network often positioned as an alternative to Twitter, hosts a troubling amount of child sexual abuse material (CSAM), according to a recent study from Stanford’s Internet Observatory. Within just two days, researchers found 112 instances of known CSAM across 325,000 posts on the platform, identifying the first instance only five minutes into the investigation.

To conduct the research, the Internet Observatory scanned the 25 most popular Mastodon instances for CSAM, using Google’s SafeSearch API to flag explicit imagery and PhotoDNA, Microsoft’s hash-matching tool, to identify known, previously flagged CSAM. The search also surfaced 554 pieces of content matching hashtags or keywords commonly used by online child sexual abuse groups, all of which Google SafeSearch classified as explicit with the “highest confidence.”
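
The study does not publish its code, but the SafeSearch step can be approximated with Google’s Cloud Vision client library. The sketch below assumes a `google-cloud-vision` install and application-default credentials; PhotoDNA itself is a licensed Microsoft service and is not shown.

```python
# Sketch: classify a local image with Cloud Vision's SafeSearch detection.
# Assumes `pip install google-cloud-vision` and configured credentials;
# this illustrates only the SafeSearch step described in the study.
from google.cloud import vision


def safesearch_verdict(image_path: str) -> dict:
    """Return SafeSearch likelihoods for a local image file."""
    client = vision.ImageAnnotatorClient()

    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())

    annotation = client.safe_search_detection(image=image).safe_search_annotation

    # Each field is a Likelihood enum; VERY_LIKELY corresponds to the
    # "highest confidence" classification cited in the study.
    return {
        "adult": annotation.adult.name,
        "racy": annotation.racy.name,
        "violence": annotation.violence.name,
        "flag_for_review": annotation.adult == vision.Likelihood.VERY_LIKELY,
    }


if __name__ == "__main__":
    print(safesearch_verdict("sample_post_media.jpg"))
```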

Understanding the CSAM Problem

CSAM is a serious and illegal problem that plagues many online platforms, including social networks. Mastodon’s decentralized design creates particular challenges in curbing its spread. Unlike centralized platforms, where a single trust-and-safety operation can enforce policy across the whole service, Mastodon’s federated model distributes moderation responsibility among independently run instances. That structure promotes independence and diversity, but it also makes uniform content moderation difficult.
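
To make the federated model concrete: Mastodon’s API lets a user report content to their own instance’s moderators and, optionally, forward the report to the moderators of the remote instance hosting the offending account. A rough sketch using the public `POST /api/v1/reports` endpoint follows; the instance URL and access token are placeholders.

```python
# Sketch: file a report with the home instance's moderators and forward it
# to the remote server hosting the reported account, via Mastodon's
# POST /api/v1/reports endpoint. URL and token below are placeholders.
import requests

HOME_INSTANCE = "https://example.social"   # placeholder home instance
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"         # placeholder OAuth token (write:reports scope)


def report_account(account_id: str, status_ids: list[str], comment: str) -> dict:
    """Report an account locally and forward the report to its home server."""
    response = requests.post(
        f"{HOME_INSTANCE}/api/v1/reports",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        data={
            "account_id": account_id,
            "status_ids[]": status_ids,
            "comment": comment,
            "category": "violation",  # report a rule violation
            "forward": "true",        # also notify the remote instance's admins
        },
    )
    response.raise_for_status()
    return response.json()
```

Even with the `forward` flag set, whether anything happens next depends entirely on the moderators of each individual server, which is precisely the inconsistency the study highlights.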

CSAM on Mastodon raises serious legal and ethical concerns. Hosting such material violates instance rules, perpetuates harm against the minors depicted, exposes instance operators to legal liability, and damages the network’s reputation as a safe space for users.

The study also found 713 uses of the top 20 CSAM-related hashtags across the Fediverse in posts containing media, as well as 1,217 text-only posts referring to “off-site CSAM trading or grooming of minors.” This volume of open posting is deeply concerning. Notably, the study cites a recent incident in which the mastodon.xyz server went offline after CSAM was posted to it. The server’s sole maintainer acknowledged being alerted to the content, but because moderation there is handled single-handedly, it can sometimes take a few days to address such reports. Unlike giants such as Meta with their extensive moderation teams, the mastodon.xyz instance relies on the efforts of just one person.

Although the content in question was eventually addressed, the mastodon.xyz domain was temporarily suspended, leaving users unable to reach the server until the situation was resolved. The registrar subsequently added the domain to a “false positive” list to prevent future takedowns; however, as the researchers point out, the material itself was no false positive. David Thiel, one of the study’s researchers, put it bluntly: “We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close.” He stressed the need for more effective child-safety tooling in decentralized networks like Mastodon.

The rising popularity of decentralized networks like Mastodon has brought safety concerns with it. Unlike mainstream platforms, decentralized networks leave moderation decisions to individual instances, which leads to inconsistent enforcement across the Fediverse. To address this, the researchers recommend that networks like Mastodon give moderators more robust tooling, integrate PhotoDNA scanning of uploaded media, and implement reporting to NCMEC’s CyberTipline, along the lines sketched below.
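
PhotoDNA access requires a license from Microsoft and the underlying hash lists are tightly controlled, so the sketch below uses a plain SHA-256 blocklist purely to show where such a check could sit in an instance’s upload path; every name here is hypothetical, and it is not the researchers’ or Mastodon’s implementation.

```python
# Sketch: a pre-publication hash check in an instance's media upload path.
# A plain SHA-256 blocklist stands in for a robust hash service such as
# PhotoDNA; all names and file formats here are hypothetical.
import hashlib


def load_blocklist(path: str) -> set[str]:
    """Load known-bad hashes, one hex digest per line."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}


def should_block_upload(media_bytes: bytes, blocklist: set[str]) -> bool:
    """Return True if the uploaded media matches a known-bad hash."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    return digest in blocklist


# A real deployment would call a perceptual, tamper-resistant hash service
# (such as PhotoDNA) instead of an exact hash, quarantine the upload, and
# file a CyberTipline report with NCMEC rather than silently dropping it.
```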

As the study sheds light on the disturbing prevalence of CSAM on Mastodon, it is clear that urgent action is required to protect the vulnerable from exploitation on decentralized networks.

 
