1,000 AI child porn pictures an hour is the current estimate, being made on Musk's Twitter, now known as shit.
Some are suggesting Musk is creating lists of child porn makers on Twitter who don't realise they're being filmed, 👈 so they can be blackmailed into criminal acts, like Putin compromised Trump. 👈


Before Elon, Twitter had a problem with not removing reported CSAM at all. It's not good vs. bad, it's shit all the way down; just the flavor is different.
But at least I've heard of fewer people stumbling into CSAM. Then again, judging by your avatar, you might just be defending your beloved AI slop at all costs.
Don't blame the tool, blame the user for misusing it.
Still, guardrails should be set up, which Epstein island visitor Elon "on Mars the age of consent could be 14" Musk is clearly not interested in…
I don't disagree. Anti-nudity filters in most common image generation models have been available since day one.
Elon fucked up hard.