Stanford researchers find Mastodon has a massive child abuse material problem

1 Like

That's shocking; I had never even considered this as a drawback of the federated approach.
Maybe Threads isn't such a bad idea. Maybe I'd prefer to leave Mastodon. I don't know.
Shocked.

1 Like

Evil knows no geographic and technical boundaries.

4 Likes

Content moderation is not an easy problem.

1 Like

I checked that the Mastodon instance I'm on (mstdn.social) has a loooong list of other instances it's blocking for a variety of reasons, so I feel good that people administering instances are openly blocking this sort of thing.

2 Likes

I deleted a post I made before reading the article. I should know better.

It's a real problem. If you put up a public server, there needs to be a mechanism for jumping on CSAM and other egregious abuse as it happens. It's a problem the fediverse needs to solve.

It can be shocking what crawls out of the woodwork when you put up a service that allows user-generated content. Being able to de-federate helps a lot, but that doesn't prevent someone from making an account on an instance that isn't known for CSAM and starting to post some.

Meta can throw money at the problem, but a volunteer-run Mastodon instance probably can't hire content moderators. Maybe Microsoft could make a Mastodon plugin (if such things exist) for PhotoDNA that automatically scans posts, and make integration super simple. Or maybe someone could make a system or service to scan for commonly used phrases. Basically, could someone make doing what Stanford did super easy and free? If so, I believe many instance admins would adopt it. Kind of like how blog comment spam was a problem in the early days of blogging and everyone was hand-moderating, then Akismet came along and basically solved the comment spam problem.
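Just to make that concrete, here's a very rough sketch of the shape such a scanning hook could take: hash each new upload and ask a hash-matching service whether it's known material. To be clear, everything below is hypothetical - PhotoDNA access is gated behind Microsoft's vetting process, Mastodon has no plugin API like this today, and the service URL, field names and check_upload helper are made up purely for illustration:

```python
# Hypothetical sketch only: check a newly uploaded file against a
# hash-matching service (e.g. something PhotoDNA-like). The endpoint,
# payload fields, and the idea of wiring this into Mastodon's media
# pipeline are assumptions, not a real Mastodon or Microsoft API.
import hashlib
import requests

MATCH_SERVICE_URL = "https://example.org/v1/match"  # hypothetical service
API_KEY = "..."  # credentials issued by the hash-matching provider

def check_upload(path: str) -> bool:
    """Return True if the file matches known abusive material and should be blocked."""
    with open(path, "rb") as f:
        data = f.read()
    # Real systems use perceptual hashes (PhotoDNA, PDQ), not cryptographic ones;
    # SHA-256 here is just a stand-in for the sketch.
    digest = hashlib.sha256(data).hexdigest()
    resp = requests.post(
        MATCH_SERVICE_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"sha256": digest},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("match", False)
```

The HTTP plumbing is the easy part; the hard parts are getting vetted access to the hash databases and doing perceptual rather than exact-hash matching, which is exactly why a ready-made, Akismet-style service would be so valuable.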

4 Likes

There's a surprise... not.

I was on Mastodon for 3 weeks and was appalled at the content. People may not like the restrictions of Twitter, but the unrestricted content of Mastodon is worse. Big corporate organisations are legally obligated to clean up their feeds. Personal federated instances of Mastodon, not so much.

1 Like

I realised that my use case is finding people I want to follow through other channels, then following them on Mastodon.

So, I won't see the bad stuff because I don't ever look at the stream.
Here's hoping law enforcement will be able to follow up on things.

I think the issue is more complicated than what's briefly discussed in this article - one of the researchers did a thread about it on Mastodon to explain in more detail.

The most important point is that a lot of this content is coming from specific servers, many of which are already on blocklists. The researchers themselves state 87% of the content isn't available on most main western Mastodon servers, where common blocklists are being run. This is a problem with a specific part of Mastodon, and most western users will never see it unless they're not on a local server or they're on a server with poor administration/moderation.

There is an issue that new western servers may not know that there are "pre-made" blocklists they can use, the new admins may not know how these work, or they may simply not care. My understanding (I am not an admin on Mastodon) is that there is already discussion about how to make these pre-made blocklists more accessible to new servers, but the problem will still boil down to whether admins actually implement them. This will always be a problem, but it's why you should choose a server with sensible admins, or pick a big established one that suits your preferences.
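Purely for illustration (again, I am not an admin), applying one of these pre-made lists could look roughly like the sketch below. Recent Mastodon versions expose an admin API for domain blocks, but the exact endpoint, parameters and token scope shown here are assumptions and will vary by version, so check your own server's documentation first:

```python
# Minimal sketch: push a plain-text blocklist (one domain per line) to a
# Mastodon server's admin API. Assumes a recent Mastodon (4.x) exposing
# POST /api/v1/admin/domain_blocks and an admin-scoped access token;
# both are assumptions to verify against your version's docs.
import requests

INSTANCE = "https://example.social"   # your instance (placeholder)
TOKEN = "..."                          # admin-scoped access token (placeholder)

def block_domain(domain: str, severity: str = "suspend") -> None:
    """Ask the server to block (suspend) the given remote domain."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={"domain": domain, "severity": severity},
        timeout=10,
    )
    resp.raise_for_status()

with open("blocklist.txt") as f:
    for line in f:
        domain = line.strip()
        if domain and not domain.startswith("#"):  # skip blanks and comments
            block_domain(domain)
```

I believe newer releases also let admins import a CSV of domain blocks directly from the moderation interface, which would make even a small script like this unnecessary.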

The researcher notes that the majority of the servers identified are hosted in Japan, and that there has been an issue with the content of Japanese servers for many years now (not just illegal content, but also content that requires warnings, etc.). Because of this, many of the more established servers take a precautionary approach to Japanese servers anyway. There is a specific issue here about law enforcement that international users of the internet cannot fix.

Finally, as @GraemeS notes, many folk curate their own feeds, so won't be seeing content like this, but as mentioned many of the bigger established servers already have the servers hosting content like this blocked.

@svsmailus your comment isn't reflective of Mastodon as a whole, which is decentralised. If the server you joined was full of rubbish, you could look for another server. There isn't "one Mastodon". There are just different servers, some speaking to each other, some not. Saying that Mastodon is worse because of your experience on one server is like saying all of Reddit is rubbish because of one subreddit. With regards to your final point, hosting illegal content is still illegal whether you're a company or an individual, so it makes no difference who runs the servers - the law is the law. There is a specific issue here that one country appears not to be enforcing laws to the standard expected by other countries, but many Mastodon admins have long implemented rules to try and limit the impact this has on their own users.

6 Likes

It is not a good habit to comment on an article without having read the article. I am aware of that. To comment on the headline and on the sentence "Researchers found CSAM within five minutes of searching Mastodon":

I do not post anything on social media platforms, to be honest - with one exception (if you consider it social media): this forum. I have been a Mastodon user since November. Reading "on Mastodon", that is. Daily. This is where I learn about tech stuff and where I am able to see where indie developers are headed. I enjoy nature stuff. I enjoy stuff about dogs and what not. I never ever have encountered anything offensive. Why?

Mastodon is not like Twitter. And I am not referring to the banter about whether it is "as good" or whether there is "nobody there" and what not. Mastodon is the open internet. "Stanford researchers find the internet has a massive child abuse material problem" - that would be more honest as far as I am concerned. Look for CSAM on the internet and you will probably find it.

Everybody can set up his or her own server running a Mastodon instance.

So, I wrote that I never have encountered anything offensive and no child abuse material for sure "on Mastodon". How did that happen? Well, @Pupsino already has explained it pretty well. "There isn't 'one Mastodon'."

I have consciously made the decision to choose a Mastodon server that fits my needs:

  • a big instance with a good reputation
  • an instance run by an admin who knows his stuff
  • an instance hosted in Europe (protected by and - yes - obliged to abide by European law)

My choice was mstdn.social, run by stux (@stux@mstdn.social) and hosted at Hetzner in Falkenstein, Germany.

The big advantage of Mastodon is that you can make of it what you want. I do not use the federated timeline (equals "all public posts that the server you are on knows about"). Even if I did, I am confident that "my" instance's admin has done a good job blocking shady instances. I do not even really use the local timeline (equals "public posts created on your server") any longer. Just the stuff "my" authors post - on whatever instance they are on. Which is a very enjoyable experience. I am not bothered with offensive stuff, no adverts, no algorithm. Just the content the authors and bots I have chosen provide me with, which leads to new authors and new bots - and sometimes to me unsubscribing from an author or bot. Yay!
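For those who like to see the distinction in concrete terms, the three feeds map roughly onto Mastodon's public REST API like this - a minimal sketch, not production code (the instance URL is just an example, and the home timeline needs an access token of your own):

```python
# A minimal sketch of the three Mastodon feeds, using the public REST API.
import requests

INSTANCE = "https://mstdn.social"

# Federated timeline: all public posts this server knows about.
federated = requests.get(f"{INSTANCE}/api/v1/timelines/public", timeout=10).json()

# Local timeline: only public posts created on this server.
local = requests.get(
    f"{INSTANCE}/api/v1/timelines/public", params={"local": "true"}, timeout=10
).json()

# Home timeline: just the authors and bots you follow (requires an access token).
home = requests.get(
    f"{INSTANCE}/api/v1/timelines/home",
    headers={"Authorization": "Bearer <your-access-token>"},  # placeholder token
    timeout=10,
)

for status in local[:5]:
    print(status["account"]["acct"], "-", status["url"])
```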

I am a strong advocate of having an open internet. I do not think that billionaire-controlled entities (no matter if X, Y or Z :wink: ) do provide me with a better experience than the fediverse does. mstdn.social has proven the exact opposite to me: a decentralized open social media platform can and does work.

The big downside of Mastodon is that it has this image of being complicated, boring and what not. This is not my experience. Which does not mean that I do not understand those who have had a different, not so positive experience "on Mastodon". I get it. But it really works for me. :slight_smile:

4 Likes

Sorry, I really donā€™t want to digress much on this, but just wanted to comment on this one aspect.
Saying "the law is the law" is simply not true. The law is different in different countries, sometimes wildly. Sometimes differences come from how rigorously laws are enforced, sometimes they come from different laws.

Now, this discussion is sadly focused around CSAM, but even with this I wouldn't be surprised if it is not illegal to host the material in some parts of the world. Serving it globally, well that's another matter.

I'll also add that it is fair to say Reddit is rubbish because of one subreddit if that subreddit was allowed to persist, purely on the basis that it is a private company which I believe hosts everything and therefore, in my eyes, has a responsibility. The equivalent in Mastodon is that the host of a server has some responsibility for what is permitted to persist.

What you wrote in the preceding paragraphs was really interesting and informative :slight_smile:

Mastodon is not nice and tidy. :rofl:

1 Like

Same experience for me. It's fantastic. The chronological timeline with no algorithm is perfect. The website design and features are actually quite nice, but I've been using the Mona app for many months now and far prefer it. I'm on an excellent server, carefully managed, and have had zero problems. But also, I just spend my time reading the people I've chosen to follow, and if someone gets onto a topic I tire of I can easily mute them or the topic. So nice. And the culture of Mastodon is so much friendlier and more authentic, in that those I'm following feel like real people sharing life rather than influencers, broadcasters, etc. who are in a space to promote themselves or their product. More of that has seeped in over the past 10 months, but it's easy to avoid.

By comparison, when Threads opened up I set it up because I have an Instagram account (not posted to in a couple of years) and have a few friends and family on Instagram. If Threads adds ActivityPub support I'll follow a few folks from my Mastodon account. But in the hour I looked at Threads it was mostly algorithm. I went back a few days later and it was a little better, in that more of the feed was my ported-over Instagram contacts, but still pretty yuck. Having not used it or Twitter in recent months, I'd somewhat forgotten the influence of the algorithm on a timeline. Ugh.

1 Like

When I first read the headline, but hadn't yet read the article, my initial response was something along the lines of "No sh*t. Is the next article going to be 'Researchers find Apache has a massive child abuse material problem' or '… Linux has a massive…'?" - you get the picture. Mastodon is software, which can be used for good or evil.

Then I read the article, and they talked about CSAM appearing on larger, more mainstream servers, so I figured the main problem was sorting the good from the bad on any given server. But if the researchers did a thread saying the main problem is bad instances, then that's mostly a solved problem through defederation or blocking of instances. Yes, making pre-built blocklists more discoverable is a good idea, but at least a solution exists - unlike unacceptable content mixed in with acceptable content on a single server, which can be reported but can't (that I'm aware of) be proactively blocked.

I guess this comment is a disguised rant against a poorly-written article?

Itā€™s also worth noting that law enforcement is frequently both slow and expensive, which leads them to prioritize things.

Reporting an instance of CSAM on a server might get your complaint on somebody's desk, who then issues a takedown notice. It's not like they can just "arrest" the server. Then the platform/server owner responds, (possibly) bans an account, (possibly) hands over the user's info, etc. The user may claim to have been hacked, the information may or may not even be trackable in a meaningful way, etc. Meanwhile, another anonymous user (possibly the same person!) pops back up somewhere else with a different account. Wash/rinse/repeat.

Playing whack-a-mole with people that are good at dodging technical restrictions is a Hard Problem. And much of the time, law enforcement is poorly-equipped to handle it.

Same here. I love the options Mona provides me with. And the syncing between devices works very well. :blush:

With sensational reports on research findings, my first reaction is to ask "who conducted the research?" and second "who funded it?" Stanford is a reputable institution, so okay, they are probably using good methodologies. There is no indicator of who paid for this specific research, but the list of major donors looks okay, although many of them are related to computer companies.

So, thirdly, what does the research actually say? The researchers found 112 instances of CSAM out of 325,000 posts. It isn't completely clear from the Verge report what they mean by "instance". Is it 112 individual images of abuse, or 112 specific Mastodon servers hosting such material?

Fourthly, however one interprets "instances", what are the percentages? Their 112 out of 325,000 is about 0.03%.

Fifthly, is this anything new? From the moment mass delivery of media began (with Hill's invention of the postage stamp) and converged with the invention of photography (by Fox Talbot), the combination has been used to distribute salacious material. Charles Dodgson might be remembered as an author, but he had a highly dubious sideline in photographs of young/prepubescent girls, some of whom were barely if at all clothed; some have argued that his photographs are abusive and were taken coercively, which might explain why Alice Liddell reputedly held a rather critical opinion of Dodgson.

Which makes me return to the first two questions: who performed and paid for the research? What are their backgrounds? A couple of years ago there was a research report published by the WHO that said we should all stop eating red meat immediately. There is some sense in that, as causal links between consumption of red meat and particular forms of cancer are known. But, and here's the kicker, the lead researcher is an evangelical vegan. One is forced to question his objectivity over the conclusions drawn.

I am not arguing for abusive material, but sensational headlines like this do not help prevent the creation nor stop the distribution of such material.

1 Like

This is a problem with decentralization that most don't consider. The good news is that moderation can be splintered. Don't like your current instance? Make your own and connect it up. The bad news is that any instance can be blocked and blacklisted.

However, if your intent for a Mastodon instance is illegal stuff then why would you federate it to begin with? You probably want a nice private isolated server. Of course, you can do this with hundreds of different pieces of technology, not just the Mastodon code base.

X (formerly known as Twitter) has complete control over any content and can track and trace and push anything bad into oblivion. Hate speech is way down:

https://twitter.com/Safety/status/1681661118018666496?s=20

The same has been reported for child abuse material.

So I guess there's something to be said for centralization…

https://www.washingtonpost.com/technology/2023/07/27/twitter-csam-dom-lucre-elon-musk/

Centralisation under an unaccountable leader can have its own difficulties.

2 Likes