
Elon Musk’s green light for paedophilia

First Musk embraced the global hard right and now, by failing to limit images of child sexual abuse on X, he has added child abusers to his list of fellow travellers

Elon Musk, a valuable friend to the far right. Image: TNW

There was a time, not all that long ago, when Elon Musk proclaimed himself so appalled by images and videos of child abuse online that he declared tackling it to be “priority #1” for Twitter.

As Musk completed his $44 billion purchase of Twitter, as it was then known, in November 2022, he launched a very public blitz against child sexual abuse material (known as CSAM) on the network, leading public fury against the site’s former management for its supposed inaction on the subject.

Musk banned some hashtags known to be used to share CSAM, and released statistics supposedly revealing how many more images were getting caught than before. When the French president Emmanuel Macron asked in French “whether the bird will protect the children” under Musk’s management, he replied “absolument”. When a blue tick user praised Musk’s efforts to tackle CSAM, he promised it “will forever be our top priority”.

Behind the scenes, there was always some scepticism around Musk’s very public moral stance. Democratic senator Dick Durbin noted that Musk had laid off half of the team within Twitter that actually tackled CSAM day-to-day in his first weeks. When experts from the site’s Trust and Safety council flagged issues that they said contradicted Musk’s claims that the problem was now all but sorted, he simply dissolved the council. 

When Twitter’s former head of content moderation, Yoel Roth, criticised him, Musk launched a public pile-on against Roth, centred around false claims that Roth had advocated for gay men to be able to message minors on platforms like Grindr – leading to a relentless stream of abuse, harassment and death threats that forced Roth into hiding.

Musk’s record as a defender and champion of children against abuse may always have been questionable, but it still stands in marked contrast to his actions in recent weeks. Researchers and journalists across the world have found evidence that Musk’s AI, Grok, has been generating sexualised images of women and children unchecked, for months – with its use for this purpose stepping up exponentially in recent weeks.

Grok would readily “nudify” images of underage teenagers down to bikinis at users’ request, before going on to pose the pictures – of real children – in more sexualised positions. Women online who angered the wrong user on X would suddenly find their mentions full of pictures of themselves, first stripped down and then placed in increasingly sexualised poses.

Grok would take it further, complying with requests to then show the women tied up or otherwise bound, covered in bruises, and even coated in bodily fluids (if users asked for them to look like they were covered in “donut glaze”). 

Much of this is already overtly criminal activity in much of the world, and for good reason – experts on child sexual abuse warn that even fully AI-generated imagery not featuring real children creates a market for real abuse, even as it normalises it. And it shouldn’t need saying why the ability to create violent sexual imagery of real people is an obvious and vicious form of abuse.

What was shocking was that until the scale of the publicity backlash became apparent, Elon Musk didn’t even bother to seem sorry about any of it. For days, Musk shared posts joking about the situation or mocking those outraged by it. He tried to tie tackling CSAM and sexual degradation on his platform to “censorship”, the framing he’s used to stymie efforts to force him to moderate the overt racism now fully tolerated on his site.

Musk went from the billionaire promising to tackle child abuse on the internet to the billionaire promising to keep it online. As the backlash intensified, Musk briefly limited Grok’s image generation features to paid users – meaning that generating CSAM became a premium product – and then promised to block such content, but only in countries where it is illegal.

Even that final stance, which appears to tackle the issue, does little of the sort. Instead of saying child abuse imagery is bad, or that he won’t allow it on his networks, Musk has merely said users can only generate it where there aren’t specific laws against it. The richest man on the planet’s stated policy appears to be that if there isn’t an explicit law against it, then feel free to use Grok to sexualise images of children; a green light for the proliferation of paedophilic content. 

What is worse is that he seems to have convinced politicians that this is somehow a nuanced or balanced political debate. 

The British public oppose efforts to generate either CSAM or abusive sexualised images of real adults by ratios of more than 20 to 1. The public remains united on this issue, finding the very idea repulsive.

Despite that, both Kemi Badenoch and Nigel Farage came out against the idea of temporarily banning X in the UK until it actually tackled Grok’s generation of CSAM and sexual abuse. Somehow, the leaders of two major political parties – netting around 40% of the vote at present – have become so internet poisoned that they think their voters would be against a crackdown on online paedophilia.

Elon Musk and X can back down and U-turn as they wish, but the realities will prove hard to shift. Musk cannot pretend to truly care about this issue any more. Any action he takes to stop child abuse online will have to be forced; he is too terminally online to act otherwise.

That alone would be enough to make Musk’s company X the preferred destination for child abusers and paedophiles across the world. But the second-order effect – paralysing our politics still further, and breaking down political consensus even on an issue as cut and dried as this – might prove even more socially damaging in the long term.
