But if we can stop people from looking at the illegal/dangerous stuff by using AI to create substitutes and letting them watch that instead, I think that would be a net positive. Of course you’d want to identify them, tag them, and keep them separate from everyone else; it’s not a solution to the problem they create, but if you can reduce the demand for the real thing… I dunno, I want nothing to do with that kind of stuff, but I feel like there’s a solution in there somewhere.
CP detectors got really good well before image gen was even a thing. They had to; image hosting sites needed some way to filter it. So that part is quite solvable.
Look at CivitAI as a modern example.
They filter deepfakes. They filter CP. They correctly categorize and tag NSFW, all automatically and (seemingly) very accurately. You are describing a long solved problem in any jurisdiction that will actually enforce their laws.
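For a sense of how the known-image side of that filtering works: the classic approach is perceptual hashing, where an image is reduced to a small fingerprint that survives resizing, recompression, and brightness tweaks, then matched against a database of known material. Production systems (PhotoDNA, PDQ) are far more robust than this, but a toy "average hash" shows the core idea; the pixel grids below are made-up stand-ins for already-downscaled images:

```python
# Toy "average hash": reduce an image to a 64-bit fingerprint that
# survives small edits, so near-duplicates of known images can be
# flagged by Hamming distance. (Real systems like PhotoDNA/PDQ are
# far more robust; this only illustrates the core idea.)

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (already downscaled)."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # one bit per pixel: brighter or darker than the mean
    return sum(1 << i for i, p in enumerate(flat) if p > avg)

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# made-up stand-ins for 8x8 downscaled images
original   = [[x * y for y in range(8)] for x in range(8)]
brightened = [[p + 10 for p in row] for row in original]  # same image, edited
unrelated  = [[255 * ((x + y) % 2) for y in range(8)] for x in range(8)]

h1, h2, h3 = map(average_hash, (original, brightened, unrelated))
print(hamming(h1, h2))  # 0  -- uniform brightening doesn't change the bits
print(hamming(h1, h3))  # 32 -- an unrelated image differs in ~half the bits
```

Matching against a known-hash database handles re-uploads of existing material; the ML classifiers sites layer on top handle novel images.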
If you’re worried about power/water usage, that’s already solved too. See frugal models like this one, which could basically serve porn to the whole planet for pennies: https://arxiv.org/html/2511.22699v2
IMO the biggest sticking point is datasets… The Chinese labs are certainly using some questionable data for the base models folks tend to use, though the porn finetunes tend to train on publicly hosted booru data and such.

Eh, I disagree with the power usage point specifically. Don’t listen to Altman lying through his teeth; generation and training should be dirt cheap.
See the recent Z Image, which was trained on a shoestring budget and costs basically nothing to run: https://arxiv.org/html/2511.22699v2
The energy per generated image is less than what it took me to type out this comment.
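For scale, a back-of-envelope calculation. Every number here is my own assumption, not a measurement from the paper: say a few-step model takes ~1 second per image on a ~350 W consumer GPU, at typical US electricity prices:

```python
# Back-of-envelope electricity cost per generated image.
# All three inputs are assumptions, not measurements:
GPU_POWER_W = 350        # assumed GPU board power under load
SECONDS_PER_IMAGE = 1.0  # assumed wall time for a few-step sampler
USD_PER_KWH = 0.12       # assumed electricity price

kwh_per_image = GPU_POWER_W * SECONDS_PER_IMAGE / 3.6e6  # 1 kWh = 3.6 MJ
usd_per_image = kwh_per_image * USD_PER_KWH

print(f"{kwh_per_image:.1e} kWh/image, ${usd_per_image:.1e}/image")
print(f"~{1 / usd_per_image:,.0f} images per dollar of electricity")
```

Under those assumptions it’s on the order of a thousandth of a cent per image; even if the real numbers are 10x worse, the point stands.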
As for whether we “need” it: yeah, that’s a good point, and it’s what I was curious about.
But then again… I don’t get why people use a lot of porn services. As an example, I just don’t see the appeal of OF, yet it’s a colossal enterprise.