Roblox to Ban Young Children from Messaging Others

Roblox, the immensely popular online gaming platform, is rolling out new safety measures to protect its youngest users. Starting Monday, children under the age of 13 will no longer be able to send private messages within games unless a verified parent or guardian grants permission.

What’s Changing?

Under the new policy, direct messaging (DM) will be disabled by default for users under 13. Young players can still participate in public conversations, which are visible to everyone in a game, but private chats will be off-limits unless a parent or guardian steps in.

In addition, parents will have expanded control over their child’s Roblox account. These controls include the ability to:

  • Review their child’s friend list.
  • Set daily limits on playtime.
  • Manage privacy and communication settings.

Roblox plans to implement these changes fully by March 2025.

Growing Focus on Child Safety

Roblox, which is especially popular with eight- to 12-year-olds in the UK, has faced calls to improve child safety on its platform. The platform boasts 88 million daily players, and the company has dedicated over 10% of its workforce to enhancing safety features.

Matt Kaufman, Roblox’s chief safety officer, explained the company’s evolving approach:

“As our platform has grown in scale, we have always recognised that our approach to safety must evolve with it.”

Roblox is also introducing identity verification for parents and guardians. To access parental controls, adults must verify their age and identity using a government-issued ID or a credit card.

Expert Reactions

The new measures have been welcomed by child safety advocates like Richard Collard from the NSPCC, who called them “a positive step.” However, Collard stressed the importance of robust age verification to ensure the changes are effective.

“Roblox must make this a priority to tackle the harm taking place on their site and protect young children,” he said.

Simplifying Content Guidelines

In addition to the messaging restrictions, Roblox is revamping its content labels. Instead of age-based recommendations, games will now feature descriptive labels outlining their content, allowing parents to make decisions based on their child’s maturity.

The labels range from Minimal to Restricted:

  • Minimal: Occasional mild violence or fear.
  • Restricted: Strong violence or realistic blood, available only to verified users aged 17 and over.

By default, users under nine can access only minimal or mild content. Parents can grant access to moderate games if deemed appropriate.

Adapting to New UK Online Safety Laws

The changes coincide with the upcoming Online Safety Act, which requires platforms like Roblox to better protect children from harmful content. Ofcom, the UK watchdog overseeing the law, has warned companies that failure to comply could result in penalties.

Roblox’s new measures, combined with stricter content guidelines, aim to create a safer and more transparent experience for its millions of young users.

📧 Reach out to our news team by emailing us at news@thetechblog.co.uk.

