Grok has yanked its image-generation toy out of the hands of most X users after the UK government openly weighed a ban over the AI feature that “undressed” people on command.
In replies posted to users on X, seen by The Register, the Grok account confirmed that “image generation and editing are currently limited to paying subscribers,” a change from the previous setup in which anyone could summon the system by tagging it in a post and asking for a picture.
That access helped fuel a grim trend: users uploading photos of clothed people – sometimes underage – and instructing the bot to remove their clothes or pose them in sexualized ways. Grok complied.
The rollback comes as governments openly float the idea of banning or boycotting X altogether if it fails to rein in the abuse enabled by its AI tools. In the UK, screenshots of Grok-generated images quickly drew the attention of ministers and regulators, who began questioning whether X is complying with the Online Safety Act.
Update
The Verge and Ars seem to be claiming otherwise. However, I don’t know for certain since I left Twitter ages ago.
You would get thrown in jail, rightfully, for doing this once.
A corporation does it by the truckload and is politely told to please stop, if it doesn’t mind.
Not even the trivial, meaningless fines we’re used to reading about.
This world is broken.
The FBI is usually the biggest distributor of CSAM.
Imagine being on the team at Twitter that had to work on this tool. Fuck me. You must hate every living minute.
Ok so we need to define the acceptance criteria. GIVEN I am a user WHEN I click the button THEN Grok should remove clothes from the image. Great. Any non-functionals? Nope. Ok cool. Ship it.
You assume they have morals, yet they work for X on an advanced feature.
Limited to paying subscribers. So the only thing stopping people from creating CSAM and fake sex pics of famous people is $8 a month?
Yeah… that’ll show them.
But it will certainly drive up subscriptions 😉 … 🤮
With US oligarchs, “threats” and discussions do not work. Only action, and the type of action that hurts them personally.
Threats do work when delivered properly. See how the EU forced Apple into moving to USB-C or opening up sideloading/alternate app markets.
You just need to have weight to throw around behind your threats. The EU has weight. The UK alone? Hah.
Cynical take: They only did it openly to draw attention to it & pull in more subscribers. Because you can be sure Xhitter’s clientele loves that feature.
“image generation and editing are currently limited to paying subscribers,” a change from the previous setup in which anyone could summon the system by tagging it in a post and asking for a picture.
Seems like this has some teeth at least. Elon usually doesn’t budge on these sorts of legislative changes, but this clearly has him at least a little nervous.
Perhaps, but if anything they should have used this as cover to ban Twitter. I do not find American-style ostentatious “freedom polemics” convincing, but on top of that, Musk has been found to meddle in local politics to benefit the far right and to promote UK criminals.
Still waiting for an actual use case of image/video gen that isn’t just for cheap assets or pure malice.
But… those ARE the only two use cases.
Don’t be so negative! It’s also found a huge market in scams, both for stealing celebrity likenesses and for making pictures and video of nonexistent products.
Rule 34. Yeah, not her.
Sheesh. So far I thought it’s just one of those things Xhitter’s AI image generator can be told to do. But to have a specific <undress> feature … 🤮 Was it point-and-click even? Nothing says “we’re for male sexists only, and we know it” like adding a specific <undress> feature.

Is there a specific “undress” button? I tried looking for proof that it exists but couldn’t find any (my searching skills clearly need work). Could you please share a screenshot or point me to a place where I can confirm that it exists?
Sorry, I meant ‘AI feature that “undressed” people on command’ (from the article). No idea if it’s an actual button; I don’t use Xhitter. But I edited my comment now.
It will culminate in revising “minor” into something scientific. Age is arbitrary and impossible for AI alignment to handle. Neoteny and the Tanner scale are visually present and real. That fundamentally shifts culture in large ways, because retention of childlike traits is neoteny, and the scientific definition of beauty. That is well outside the scope of Anglo cultural awareness, and it will be hard for the Puritan backwardness to adjust. Riley Reid is so popular because of her neoteny, for example. Even with actors like Tom Cruise, the reason people like “short people” in media is the neotenous head-to-body proportion, closer to 1 to 5. All cartoons do the same, but exaggerated. “Minor” must be scientifically grounded better than the age when male children are large enough to carry the kit of the grossly immoral mass-murder orgies the halfwits call war.
Arbitrary age is irrelevant in purely visual media. Attempting to somehow mass-surveil everyone to enforce it is dystopian insanity. The only solution is to be more scientifically grounded. It is ugly, and a bit ick, but whatever; those are illogical emotions fighting real justice and must be suppressed. Most countries, like Germany and France, have lowered their ages for minors. The only scientifically relevant ages are puberty and cognitive maturity at 25. Make it 25 and you will be supporting your kids financially until 25, and the military will fall apart from people with the mind to question orders and think for themselves en masse.
Edit: To be clear, I do not want people fucking kids or whatnot, or exploiting them. I’m against authoritarian parenting by proxy, with a state making up for terrible parents, but that is another thing entirely. I do not care about this personally or in justification of anything. It is simply an observation of inevitability as data expands and ages. Images in 100 years will still be around just the same. Their categorization must be derived from the pixels themselves. There is no other long-term solution. How anyone feels about that is totally irrelevant stupidity and amounts to nothing more than stupid children putting their heads in the sand and saying “nuh uhh!” Science and democracy are ugly. They always have been. If you cannot handle the ugly with the good, you are a fascist or an authoritarian, and to that dystopian nightmare culture: fuck you for starting a war for democracy soon. Sincerely.
I don’t think it really matters how old the target is. Generating nude images of real people without their consent is fucked up no matter how old anyone involved is.
I agree, but the tool is not to blame and never has been. This is not unique. The same thing was said with the advent of digital photo editing. Fakes predate that when people made them with film. In each instance, it is the ethics of the people that are the problem. At the deepest abstract levels, if you play out all of the implications, there is no difference between a tool and the thoughts in your head. Blaming tools is always a problem and only an illusion of a solution.
I have been able to edit photos like this since before AI, but have never done so. When I was a kid in school, I drew stuff like this. So the problem is actually intent and social accountability more than anything else.
I don’t like that the issue boils down to this. I think it is ugly, but I think the alternative is far, far worse. I think those pushing this issue are useful idiots combined with people who are hell-bent on a fascist, authoritarian future that is truly horrendous. If posting dubious child nudes of me will stop fascism and WW3, what poses would you like me to strike?