    New Study: Child Sexual Abuse Material Found on Mastodon

By admin | July 25, 2023 | News

    New Study Reveals Mastodon’s Dark Side: Availability and Spread of Child Sexual Abuse Material

Mastodon, the decentralized network often regarded as an alternative to Twitter, has been found to be a hub for child sexual abuse material (CSAM), according to a recent study from Stanford’s Internet Observatory. Shockingly, within just two days, researchers discovered 112 instances of known CSAM across 325,000 posts on the platform, with the first case identified a mere five minutes into the investigation.

To conduct the research, the Internet Observatory analyzed the 25 most popular Mastodon instances for CSAM, using Google’s SafeSearch API and Microsoft’s PhotoDNA, a hash-matching tool designed to identify known CSAM. The results were alarming: 554 pieces of explicit content matched hashtags and keywords associated with online child sexual abuse groups, and Google SafeSearch classified all of these findings as explicit with the “highest confidence.”
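For readers curious what the automated scanning step looks like in practice, below is a minimal sketch of flagging an image with the SafeSearch annotation exposed through Google’s Cloud Vision API. It illustrates the general technique only, not the Internet Observatory’s actual pipeline; the file name, threshold, and surrounding logic are assumptions.

```python
# Minimal sketch: rate an image with Google Cloud Vision's SafeSearch
# annotation. Illustrative only -- not the Stanford Internet Observatory's
# actual pipeline. Requires the google-cloud-vision package and
# application-default credentials.
from google.cloud import vision


def is_flagged_explicit(image_path: str) -> bool:
    """Return True if SafeSearch rates the image 'adult' at the highest likelihood."""
    client = vision.ImageAnnotatorClient()

    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())

    annotation = client.safe_search_detection(image=image).safe_search_annotation

    # VERY_LIKELY corresponds to the "highest confidence" rating the study cites.
    return annotation.adult == vision.Likelihood.VERY_LIKELY


if __name__ == "__main__":
    print(is_flagged_explicit("sample.jpg"))  # hypothetical local file
```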

    Understanding the CSAM Problem

CSAM is a serious and illegal problem that plagues many online platforms, including social networks. Due to its decentralized nature, Mastodon faces particular challenges in curbing the spread of CSAM. Unlike centralized platforms, where content moderation can be enforced from a single point, Mastodon’s federated model distributes moderation responsibilities among independent instances. While this promotes independence and diversity, it also makes uniform content moderation difficult.

    CSAM content on Mastodon poses significant legal and ethical concerns. Hosting such material not only violates the platform’s guidelines but also contributes to the perpetuation of harm against minors. Additionally, it puts Mastodon at risk of legal repercussions and damages the platform’s reputation as a safe space for users.

The study also uncovered 713 instances of the top 20 CSAM-related hashtags being used across the Fediverse in posts containing media, as well as 1,217 text-only posts referring to “off-site CSAM trading or grooming of minors.” This widespread posting of CSAM is deeply concerning. Notably, the study references a recent outage of the mastodon.xyz server, which was caused by CSAM being posted on Mastodon. The server’s sole maintainer acknowledged being alerted to the CSAM content but explained that, with moderation handled by a single person, it can sometimes take a few days to address such reports. Unlike larger platforms such as Meta, which employ extensive moderation teams, the mastodon.xyz instance relies on the efforts of just one person.

Although the content in question was removed, the mastodon.xyz domain was temporarily suspended, preventing users from accessing the server until the situation was resolved. The registrar subsequently added the domain to a “false positive” list to prevent future takedowns; however, as the researchers point out, what triggered the suspension was not a false positive. David Thiel, one of the researchers behind the study, said, “We got more photoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close.” He emphasized the need for more effective tools to address child safety issues in decentralized networks like Mastodon.

    The rise in popularity of decentralized networks like Mastodon has brought about safety concerns. Unlike mainstream platforms, decentralized networks grant moderation control to individual instances, leading to inconsistencies across the Fediverse. To combat this, researchers recommend that networks like Mastodon adopt more robust tools for moderators, integrate PhotoDNA, and implement CyberTipline reporting.
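The researchers’ call to integrate PhotoDNA essentially means comparing incoming media against a list of hashes of already-known abuse material. PhotoDNA itself is proprietary and available only to vetted organizations, so the sketch below uses an open perceptual-hash library as a simplified stand-in, purely to show the shape of such a moderation hook; the hash value, file name, and distance threshold are illustrative assumptions.

```python
# Simplified illustration of hash-list matching for instance moderators.
# PhotoDNA is proprietary; this stand-in uses an open perceptual hash (pHash)
# only to show the shape of the check. Requires Pillow and the "imagehash"
# package. The hash value and threshold below are placeholders.
from PIL import Image
import imagehash

# In a real deployment this list would come from a vetted hash-sharing
# program, never be assembled locally.
KNOWN_HASHES = {imagehash.hex_to_hash("d1d1d1d1e0e0e0e0")}  # placeholder value
MAX_DISTANCE = 4  # differing bits still counted as a match (assumption)


def matches_known_hash(image_path: str) -> bool:
    """Return True if the image's perceptual hash is close to a listed hash."""
    candidate = imagehash.phash(Image.open(image_path))
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)


if __name__ == "__main__":
    if matches_known_hash("upload.jpg"):  # hypothetical uploaded file
        print("Hold the post for review and file a CyberTipline report.")
```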

    As the study sheds light on the disturbing prevalence of CSAM on Mastodon, it is clear that urgent action is required to protect the vulnerable from exploitation on decentralized networks.

     
