Just catching up with the WeDistribute.org articles about the harassment of fediverse developers. I noticed at the time the vitriol against Ryan for adding ATProto support to BridgyFed...
https://wedistribute.org/2024/02/tear-down-walls-not-bridges/
... and stuck up for him.
But I had no idea what was happening to that poor guy from ContentNation;
https://wedistribute.org/2024/03/contentnation-mastodons-toxicity/
Seriously WTF?!?
Wow. I just can't imagine what goes through these people's heads.
"Ha, I'm so pious and justified in everything I do, and I dislike what this person is doing, so there's nothing wrong with me literally uploading CSAM to his service in order to get him shut down"
They tell you that there is no neutral ground, that you have to pick a side, and you're either with them or against them.
Do you want to be on the side that posts fucking child porn to get what they want?
Let me put it to you this way... I have some serious connections and can get access to some really fucked up shit, physical and digital, pretty readily.
Even given all that, I don't think I could get my hands on CSAM if I wanted to pull a stunt like that. So that makes me think a few thoughts about the crowd that's so anti-public-meaning-public, and just so happens to have CSAM on hand (*cough* mastodon.art)
Maybe... just maybe... They're trying to hide child abuse?
@Hyolobrika @strypey Anyway, this is why as:Public will **NEVER** store images. It's a storage nightmare.
We *may* do CLIP interrogation on every image to allow you to search a *description* of the image, but storing the image itself is a fucking no-go.
I'll also point out that all of these losers stopped targeting as:Public literally as soon as they figured out I wasn't going away.
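The "CLIP interrogation" idea above boils down to: run each image through CLIP once at ingest, keep only an embedding or text description, and search against that, never the file itself. A minimal sketch of that kind of pipeline, assuming the Hugging Face CLIP model (an illustration of the idea, not as:Public's actual code):

```python
# Sketch: index a *description* of an image, never the image bytes.
# Model name and helper functions here are assumptions for illustration.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def embed_image(path: str) -> torch.Tensor:
    """Embed an image once at ingest time; only the vector is stored, not the file."""
    inputs = processor(images=Image.open(path), return_tensors="pt")
    with torch.no_grad():
        vec = model.get_image_features(**inputs)
    return vec / vec.norm(dim=-1, keepdim=True)

def embed_text(query: str) -> torch.Tensor:
    """Embed a search query, e.g. 'a cat sitting on a keyboard'."""
    inputs = processor(text=[query], return_tensors="pt", padding=True)
    with torch.no_grad():
        vec = model.get_text_features(**inputs)
    return vec / vec.norm(dim=-1, keepdim=True)

# Cosine similarity between a stored image vector and a query vector
# ranks results without the image itself ever being retained.
score = (embed_image("example.jpg") @ embed_text("a cat on a keyboard").T).item()
print(score)
```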
@strypey @Hyolobrika https://aspublic.org/
A search engine that sits on instances' websocket streaming endpoints, so it doesn't need to scrape.
The existence of this search engine led to multiple police visits and the forced-disabling of public timeline access on Mastodon at the software level.
Remember that part of the allure of Mastodon in the first place was that Twitter was locking down APIs so they could be sold. Mastodon is now doing the same thing, causing the same problems, for "safety"
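Concretely, "sits on websocket endpoints" means subscribing to each instance's streaming API and indexing posts as they arrive, rather than crawling HTML. A rough sketch of that approach, assuming Python's websockets library and a placeholder instance URL (not the actual as:Public implementation):

```python
# Sketch: listen to an instance's public streaming websocket and handle each
# new post as it arrives. The instance URL is a placeholder; many Mastodon
# servers now require an access token for this (or disable it entirely),
# which is the lockdown described above.
import asyncio
import json
import websockets  # pip install websockets

STREAM_URL = "wss://mastodon.example/api/v1/streaming?stream=public"

async def listen() -> None:
    async with websockets.connect(STREAM_URL) as ws:
        async for message in ws:
            frame = json.loads(message)
            if frame.get("event") != "update":
                continue  # skip deletes, edits, etc.
            status = json.loads(frame["payload"])  # payload is a JSON-encoded status
            # Hand the post off to the indexer; here we just print it.
            print(status["account"]["acct"], status["url"])

asyncio.run(listen())
```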
@r000t
> part of the allure of Mastodon in the first place was that Twitter was locking down APIs so they could be sold. Mastodon is now doing the same thing, causing the same problems, for "safety"
I guarantee I've been in as many frustrating arguments as you, with people who want the online equivalent of suspending object permanence so they can teleport, without any of the other logical consequences. This is a downside of trying to make decentralised social software ; )
(1/?)
@Hyolobrika
The root problem (no offence) is the lack of a pan-fediverse set of standards, defining what kinds of expectations can be met by social software, and how to label them so people can pick the settings that match their expectations.
I don't think it's unreasonable to want a posting type visible to anyone logged into a fediverse app, but not on the web. But "Public" is not a sensible word to use in describing that scope.
(2/2)
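For reference, the scope being argued about is the ActivityStreams Public collection: software treats a post as "public" when it's addressed to that special collection, which in practice means visible to anyone, logged in or not, including the open web. A hedged illustration of the addressing (identifiers are placeholders):

```python
# Hypothetical example of ActivityPub addressing (all identifiers are placeholders).
public_note = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Note",
    "content": "Hello, world",
    # The special Public collection: addressing a post here is what
    # "as:Public" / "Public" scope means at the protocol level.
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "cc": ["https://example.social/users/alice/followers"],
}

# A "visible to logged-in fediverse users, but not the open web" scope has no
# standard addressing or label today, which is the gap described above.
```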
@r000t @Hyolobrika
@mint
> A few old-timers like @p could tell you about the reason why archiveteam.org abandoned their efforts at archiving fedi in its early days
I would like to read more about this. I'd love for some of the public fediverse to be archived, but it's pretty obvious that it needs to be selective and opt-in. Given the ability to opt-in to Mastodon posts being keyword searchable now, this has become more tractable.
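One hedged sketch of what "selective and opt-in" could look like, assuming the account-level indexable flag Mastodon added alongside opt-in search (the field name and whether the endpoint needs a token should be checked against the instance's version; archive() is a stand-in for whatever storage is used):

```python
# Sketch: only keep posts whose authors have opted in to search indexing.
import requests

INSTANCE = "https://mastodon.example"  # placeholder

def archive(status: dict) -> None:
    # Placeholder for whatever storage the archiver actually uses.
    print("keeping", status["url"])

# Note: many instances now require authentication for the public timeline API.
resp = requests.get(f"{INSTANCE}/api/v1/timelines/public", params={"limit": 40})
resp.raise_for_status()

for status in resp.json():
    account = status["account"]
    if account.get("indexable"):  # author has opted in to being searchable (assumed field)
        archive(status)
    # everything else is dropped
```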
@r000t
> just so happens to have CSAM on hand
I believe lolicon counts as CSAM in Germany, where CN is hosted, so that's not a high bar. I'm pretty sure anyone with a net connection could find lolicon if sufficiently motivated.
Also, differing age-of-consent laws create grey areas. In NZ it's 16, so legitimate porn made here with a 17-year-old performer could technically be considered CSAM in a country with an age of consent of 18 or over.
@r000t
> Maybe... just maybe... They're trying to hide child abuse?
Let's not add fuel to the fire by throwing wild accusations around. Isn't that the kind of thing this thread is objecting to?
@strypey @Hyolobrika I mean these are the same people that sent the FBI to my house twice over a search engine, and I have actually run into CSAM on mastodon.art where the admin actively refuses to do anything about it, choosing instead to harass me anywhere I have an account.
@r000t
> sent the FBI to my house twice over a search engine
I utterly condemn this, on so many levels. But two wrongs don't make a right. Let's not descend to their level.
(1/2)
Although if this is the case;
@r000t
> I have actually run into CSAM on mastodon.art where the admin actively refuses to do anything about it
... I hope you reported it to the appropriate authorities. There is no excuse for hosting CSAM (1). The only effective way to prevent its production is old-fashioned detective work to find and free the children.
(1) CSAM is images of real humans. Lolicon, while distasteful, doesn't count, because no human is abused in its production.
@lamp
> mastodon is for sickos!!!
All generalisations are false ; )
... posted from a Mastodon account.
@r000t@ligma.pro
haha fuck no. who would?
@Hyolobrika@social.fbxl.net @strypey@mastodon.nzoss.nz