Mastodon has a problem with child sexual abuse material, researchers say

Tech news | 26 July 2023

In just two days, Stanford University researchers found hundreds of posts on Mastodon containing child sexual abuse material.

Mastodon, a social media platform that has become popular as a possible alternative to Twitter, is full of child sexual abuse material, according to a new report from researchers at Stanford University.

In just two days, researchers found 112 instances of child sexual abuse material out of nearly 325,000 analyzed posts.

They also found “554 instances identified by Google SafeSearch as sexually explicit content with the highest confidence,” reported Stanford’s Internet Observatory.

It only took about five minutes to find the first instance of child sexual abuse material.

Unlike big companies like Facebook, Twitter and YouTube, Mastodon is a decentralized social media site.

This means the network is made up of many independently run servers, called instances, on which users create their accounts. Each instance sets its own code of conduct and rules.

Mastodon states that these are “implemented at a local level, not top-down like corporate social media, making it the most flexible to meet the needs of different groups of people.”

The Stanford researchers analyzed the top 25 Mastodon instances, ranked by total user count. Media from those instances was submitted to PhotoDNA and Google SafeSearch for analysis.

They also looked at the broader Fediverse, a group of decentralized social media platforms that includes Mastodon, Bluesky, Pleroma and Lemmy.

Across the Fediverse, the researchers found 713 uses of the top 20 hashtags related to child sexual abuse on posts containing media, and 1,217 uses on posts without media.

Decentralized social media presents moderation problems

“At a time when the intersection of moderation and free speech is a difficult topic, decentralized social networks have attracted significant attention and many millions of new users,” Stanford researchers David Thiel and Renée DiResta wrote in a report published Monday and first reported by The Washington Post.

They say this decentralized approach to social media presents safety challenges, as there is no central moderation team to remove images of child abuse.

Thiel and DiResta write, “Although Mastodon allows user reports and has moderator tools to review them, it has no built-in mechanism for reporting CSAM (child sexual abuse material) to relevant child protection organizations.”

“Nor does it have tools to help moderators who are exposed to traumatic material, for example grayscaling and fine-grained blurring mechanisms.”

Mastodon has grown in popularity since Elon Musk took over Twitter.

Last week, founder and CEO Eugen Rochko said that Mastodon’s monthly active user count had climbed to 2.1 million, which is “not too far off” from its previous peak.

Thiel and DiResta argue that decentralized social media can help “foster a more democratic environment”, but that it will need to solve these safety problems in order to “prosper”.
