If you know anyone who’s under 14, chances are they’re on Roblox. The latest breakdown of the UK’s online habits from Ofcom showed that the platform is now the country’s most popular gaming app, used by roughly 3.4 million UK children and accounting for 5% of all time spent online for under-14s.
As a result, it’s big business: while technically free to play, the company made $3.5bn in revenue in 2024, partly from people paying real cash for “Robux”, the in-game currency players use to buy virtual goods for their avatars (there’s a non-zero chance the Roblox enthusiast in your life will have asked for Robux for Christmas). Roblox turns 20 in 2026. It is time we paid more attention to what goes on there.
The Roblox experience is ostensibly simple: you sign up, create a character you can customise to your heart’s desire (though you may have to shell out for the fancier outfits), and play one of the literally tens of thousands of games on the platform. The vast majority of these games are made by other players, ranging from Mario-esque platformers to driving games to job simulators to largely inexplicable recent crazes like “Steal a Brainrot” (ask a child).
There’s a significant social layer letting players talk to each other (voice chat is available for users aged 13 and over). Brands also have a presence on the platform, and for enough Robux you can even buy Gucci drip for your avatar. Charli XCX, The Weeknd and 2hollis are among the artists who’ve performed virtual concerts there. While Meta might be scaling back its spending on the metaverse, for Gen Alpha the metaverse is very much already here, in Roblox.
The platform is estimated to have over 100m daily active users worldwide. But there’s growing evidence that it isn’t quite as focused on their safety as it perhaps ought to be.
Multiple investigations have documented organised exploitation groups operating on Roblox, using the platform to contact children before moving conversations to Discord or other services; since 2018, at least 30 people in the US have been arrested for abducting or sexually abusing children they first groomed on Roblox.
A 2024 Bloomberg investigation found that Roblox reported more than 13,000 incidents of child exploitation to the US National Center for Missing and Exploited Children in 2023 alone, while an investor report from the same year labelled the platform “a paedophile hellscape”. Reports of sexually themed game rooms – known as “condo games” – of sexual roleplay and of sexualised chat are a regular occurrence.
There are questions, too, about the platform’s moderation of the experiences its users create. The Anti-Defamation League’s Center on Extremism uncovered multiple games on the platform created by a company calling itself Active Shooter Studios, allowing players to recreate notorious mass shootings, including Columbine and Uvalde, complete with realistic maps of each location and the option to either fend off the police or commit suicide at the end of a “run”.
Rather than keeping its users safe, the company’s primary focus seems to be making money out of them. Beyond the brands and the microtransactions, the platform is increasingly seen by advertisers as an excellent place to get the attention of lots of young eyeballs. Earlier this year, Roblox announced a partnership with Google, meaning brands can use the company’s ad networks to pump promotional messages into game worlds at vast scale. Even better, creators who allow ads in their games can link watching promos to in-game rewards, neatly incentivising children to stare at messages telling them to BUY THIS or CLICK HERE.
It’s also a multibillion-dollar business model built on user-generated content produced through the largely unremunerated labour of literal children. A 2021 investigation by the YouTube channel People Make Games highlighted the distance between the Roblox narrative (“earn money making games on Roblox!”) and the reality, in which most game creators earn nothing at all. Those who do earn something see their takings curtailed by the platform’s in-game economics and by the fact that Robux is effectively company scrip by another name. Can this be good for nine-year-olds, or for society more broadly? It doesn’t feel good.
The company’s leadership has done little to reassure those who feel child safety may not be a priority. When asked by the NYT whether Roblox had a problem with predators, the company’s CEO David Baszucki gave the following less-than-convincing response: “I think we’re doing an incredible job at innovating relative to the number of people on our platform and the hours, in really leaning into the future of how this is going to work.”
Roblox did recently announce a suite of additional safety and age-verification measures, driven in part by legislation such as the UK’s Online Safety Act. But, as we have already seen here and with the rollout of the under-16s social media ban in Australia, such measures are not always as simple to implement as they are to promise.
Indeed, the company’s long-term priorities seem focused elsewhere. Speaking in August, Baszucki outlined his hopes that “someday we’ll have dating on Roblox”, and that “lots of people who might find it hard to go on a real-life date could go on a virtual date instead.” Dating on a platform used by millions of under-13s? What could possibly go wrong?
It is vital that we think harder about how to ensure that Roblox and platforms like it are designed to protect the safety of their users, because, whether you like it or not, they are where so much of 21st-century socialising now takes place.
Entire generations see little difference between socialising via WhatsApp or voice note and sharing a three-hour shift at the virtual Roblox IKEA. These are the new gathering places, just as we once hung out at the mall, the park or the rotunda, or whatever the fuck old people did.
But the companies developing these platforms should at least be required to ensure that they’re not ultracapitalist hellscapes full of sexual predators.
