- cross-posted to:
- technology@lemmy.world
I'm not going to comment on whatever he's commenting on.
I'm just going to reaffirm that Tim Sweeney is a fucking moron, in any and all cases.
Did Covid-19 make everyone lose their minds? This isn't about corporate folks being cruel or egotistical. This is just a stupid thing to say. Has the world lost the concept of PR??? Genuinely defending 𝕏 in the year 2026… over deepfake porn, including of minors??? From the Fortnite company guy???
Unironically this behaviour is just “pivoting to a run for office as a Republican” vibes nowadays.
It's no longer even ‘weird behaviour’ for a US CEO.
For some reason Epic Games just lets Tim Sweeney say the most insane things. If I were a shareholder I'd want someone to take his phone off him.
They could learn a lesson from Tesla.
Just look at that guy. If you asked 1,000 people to describe what they thought a typical CSAM viewer looked like and averaged their responses together, you would get something like this photo of Tim Sweeney.
That’s not even what gatekeeping means. Unless he’s trying to stand up for the universal right to participate in the child porn fandom.
If my political opponents are actually sexual predators and their speech is sexual harassment, I’m down with censoring them. That should be the least of their problems.
TIL Tim Sweeney is into child porn. Not surprising tbh.
The fall of Rome… The fall of the perverse…
Literally this meme again

It helps that Tim Sweeney seems to always be wrong about everything.
I believe it’s called Let Them.
If you wait by the river long enough, the bodies of your enemies will float by
It’s called being so effective at marketing, and spending so much money on it, that people believe you can do no wrong.
If you can be effectively censored by the banning of a site flooded with CSAM, that’s very much your problem and nobody else’s.
Nothing made-up is CSAM. That is the entire point of the term “CSAM.”
It’s like calling a horror movie murder.
It’s too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.
I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.
I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment at applying this to other forms of porn as well, ones less universally hated.
Not supporting this use case at all and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.
You can insist every frame of Bart Simpson’s dick in The Simpsons Movie should be as illegal as photographic evidence of child rape, but that does not make them the same thing. The entire point of the term CSAM is that it’s the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.
The *entire point* of the term CSAM is that it’s the actual real evidence of child rape.
You are completely wrong.
https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/
“CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.”
“Any content that sexualizes or exploits a child for the viewer’s benefit” <- AI goes here.
RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.
We will inevitably develop some other term like LPEOAEWACWR, and confused idiots will inevitably misuse that to refer to drawings, and it will be the exact same shit I’m complaining about right now.
Dude, you’re the only one who uses that strict definition. Go nuts with your prescriptivism, but I’m pretty sure it’s a lost cause.
Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or depicts persons under the designated age of majority.
[…]
Laws regarding child pornography generally include sexual images involving prepubescent, pubescent, or post-pubescent minors and computer-generated images that appear to involve them.
(Emphasis mine)
‘These several things are illegal, including the real thing and several made-up things.’
Please stop misusing the term that explicitly refers to the real thing.
‘No.’
The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”
Were you too busy fapping to read the article?
Is it a sexualized depiction of a minor? Then it’s CSAM. Fuck all y’all pedo apologists.
AI CSAM was generated from real CSAM
AI being able to accurately undress kids is a real issue in multiple ways
AI can draw Shrek on the moon.
Do you think it needed real images of that?
It used real images of Shrek and the moon to do that. It didn’t “invent” or “imagine” either.
The child porn it’s generating is based on literal child porn, if not itself just actual child porn.
You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?
Like combining unrelated concepts isn’t the whole fucking point?
No, I think these billion dollar companies are incredibly sloppy about curating the content they steal to train their systems on.
True enough - but fortunately, there’s approximately zero such images readily-available on public websites, for obvious reasons. There certainly is not some well-labeled training set on par with all the images of Shrek.
It literally can’t combine unrelated concepts though. Not too long ago there was the issue where one (Dall-E?) couldn’t make a picture of a full glass of wine because every glass of wine it had been trained on was half full, because that’s generally how we prefer to photograph wine. It has no concept of “full” the way actual intelligences do, so it couldn’t connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.
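The wine-glass story above is essentially a claim about sampling from a training distribution. Here's a toy sketch in Python of that one idea (purely illustrative; this is not how DALL-E or any real diffusion model works, and all the numbers and names are made up for the example):

```python
import numpy as np

# Toy sketch of the "half-full wine glass" argument: a generator can
# only sample from (roughly) the distribution it was trained on.
rng = np.random.default_rng(0)

# Pretend training set: fill levels of photographed wine glasses,
# clustered around half full (0.5 on a 0..1 scale).
training_fill = rng.normal(loc=0.5, scale=0.05, size=10_000)

# "Train" the world's dumbest generative model: fit a Gaussian.
mu, sigma = training_fill.mean(), training_fill.std()

# Generate a million glasses and count how many come out "full".
samples = rng.normal(mu, sigma, size=1_000_000)
print("full glasses (fill >= 0.95):", int((samples >= 0.95).sum()))
# 0.95 is ~9 standard deviations from the learned mean, so this
# prints 0: "full" simply isn't in the distribution the model learned.
```

Real models interpolate in a far richer learned space, so this toy only illustrates the distribution argument; it doesn't settle whether a given model can combine concepts it never saw together.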
And you think it’s short on images of fully naked women?
TIL Tim Sweeney doesn’t know what gatekeeping is
If there is one gate that definitely needs keeping, it is the kindergarten’s gate. Don’t let those creeps get away with it…
From making CSAM? Makes you wonder about this Sweeney guy.
Who else just did a search on the Epstein files for “Tim Sweeney”?
I didn’t find anything on jmail, but there’s still a lot that haven’t been released, and a lot of stuff is still redacted.
Man I just ran into 3 site blockages trying to open this. Somebody REALLY doesn’t want this to be read.
I use a VPN for 99.9% of personal Internet usage and had no issues connecting. So you’re probably correct that it’s being blocked by some means, intentional or not.
this guy is out of his mind
There’s this old adage, “never attribute to malice that which can be explained by stupidity”.
Tim Sweeney is very ignorant. However, he’s also pretty malicious. His fedora’d waffling should probably be taken exactly for what it is.
I absolutely hate Hanlon’s razor. It is only ever used to try to protect obviously malicious people.
I bet he’s kind of right. Here in the UK we just lost a whole bunch of rights and privacy online under the guise of “protect the kids”, but it’s kind of weird to be piping up against it when there are actually protections needed.
It would be weird if it were the only time he’s called out the Google and Apple monopolies and their control over apps, but it’s been a running theme for him (and his legal battles). Two examples from a quick lookup:
- His tweet where he calls out Apple for removing privacy apps at the request of Russia, and for allegedly threatening to remove Twitter in 2024.
- His tweet on Apple removing the Russian social media app VK following the US sanctions related to the Russian invasion of Ukraine in 2022.
Personally, I see no issue with platforms removing content they deem to be problematic, and I’m sure Sweeney agrees, given that the Epic store prohibits pornography, for example. However, as he’s said repeatedly, Apple in particular is unique in that its removing an app means there’s practically no way for an iPhone user to access it, since there’s no sideloading.
If it were an app dedicated to CSAM, I don’t think anyone would take issue, but his argument is that removing the app would deplatform all of its 500M users, most of whom are probably not pedos. I’m critical of people being on X, but it’s also undeniable that despite the far-right leaning and CEO, there are still leftists and people belonging to minority groups who are on it, for whatever reason. Are they pedophile and Nazi enablers too? I’m inclined to say yes, but I don’t know how many people would agree.
Edit: Format and details.
This isn’t really a change, though, I’m pretty sure. People have been able to make photo-realistic depictions a lot longer than AI has existed and those have rightfully been held to be illegal in most places because the confusion it causes makes it harder to stop the real thing.
I think the difference here is that Twitter has basically installed a “child porn” button. If their reaction had been to pull the product and install effective safeguards, it wouldn’t be as bad. It’s a serious fuckup, but people screw up every day.
Instead, they’ve made it so you can pay them to have access to the child porn generator.
It’s not really a change so much as it’s suddenly incredibly easy for anyone of any ability to do it as much as they want, with near-seamless results.
Every year it’s gotten easier and easier to do it more and more believably, but suddenly all you have to do is literally ask the computer and it happens. The line has to be drawn somewhere.