
WhatsApp Lowers Minimum Age From 16 to 13 Following Meta’s Protest Against UK Online Safety Act

Last Updated April 12, 2024 4:27 PM
James Morales

Key Takeaways

  • WhatsApp has lowered the minimum age for users in the UK and EU from 16 to 13. 
  • Some campaigners in the UK have criticized the move.
  • Meta is increasingly at odds with the UK government’s stance on online safety.

On Wednesday, April 10, WhatsApp lowered the minimum age for users in the UK and EU from 16 to 13. 

The move comes as Meta finds itself increasingly at odds with the UK’s online safety movement, which has embraced a prohibitive approach to children’s use of social media. 

Campaigners Condemn WhatsApp Age Change

Responding to Meta’s policy change, the campaign group Smartphone Free Childhood condemned the move:

“Officially allowing anyone over the age of 12 to use their platform (the minimum age was 16 before today) sends a message that it’s safe for children,” it said in a statement.

“But teachers, parents and experts tell a very different story. As a community, we’re fed up with the tech giants putting their shareholder profits before protecting our children.”

Groups like Smartphone Free Childhood are part of a movement that has seen parents push back against the popular practice of giving children smartphones at the age of 11, when they start secondary school in the UK. 

The group’s concerns center on social media addiction, exposure to harmful content and the developmental consequences of excessive smartphone usage. 

Children’s Social Media Use Central to Online Safety Discussion

In recent months, calls for a new approach to children using smartphones have caught the attention of the UK government, which is reportedly considering banning the sale of smartphones to under-16s.

It is also said to be conducting a consultation into restricting social media access for young people, a concept that has already gained legislative traction in several US states.

Responding to questions about the consultation, the prime minister’s deputy official spokesperson told reporters the government was “looking broadly at this issue of keeping children safe online.”

The government has already clashed with Meta on the issue of child safety, arguing that end-to-end encryption protocols used by WhatsApp and Facebook Messenger could hinder law enforcement’s ability to identify and prosecute abusers.

Legal Status of End-to-End Encryption

Meta’s disagreement with the government relates to a controversial clause in the Online Safety Act.

As currently worded, the Act gives the media regulator Ofcom the power to force operators of encrypted messenger apps to introduce back doors for law enforcement or face penalties.

After a fierce backlash from Meta, Apple and various privacy advocates, the government backed down at the eleventh hour, issuing guidance that prevents Ofcom from enforcing the provision.

However, the underlying conflict has not been resolved. 

While the guidance was enough to keep services including WhatsApp, iMessage and Signal legal in the UK, the Online Safety Act remains fundamentally incompatible with end-to-end encryption.

The government’s statements on the matter suggest it believes a workaround can be found that would ensure any back door is used only to screen communications for evidence of child abuse. But regardless of the intent, such mass surveillance of communications undermines the privacy of all users. 
