Key Takeaways
Six months after Meta introduced dedicated Instagram accounts for teens, the company has expanded the offering to Facebook and Messenger.
With built-in protections for young users, Teen Accounts are a key part of Meta’s strategy to comply with regulations like the U.K.’s Online Safety Act.
Across its social media platforms, Meta’s Teen Accounts offer several features to protect users under 16.
Teen Accounts are private by default and restrict exposure to sensitive content, such as posts that show people fighting or promote cosmetic procedures. Teen users can also only be messaged by people they follow or are already connected to.
The latest update will bar teens from Instagram Live unless their parents grant them permission. Meanwhile, a new feature will automatically blur images containing suspected nudity in direct messages.
Meta initially launched Teen Accounts after coming under fire for not doing enough to protect young people from social media harms like addiction, online bullying and sexual exploitation.
In the U.S., Congressional scrutiny and a rising tide of litigation have prompted the firm to enhance its child protection measures.
Meanwhile, in the U.K., the 2023 Online Safety Act codified platforms’ responsibility to provide age-appropriate services.
By one measure, Teen Accounts have been broadly successful, with 97% of users aged 13–15 keeping default restrictions in place.
However, the parental control system relies on users being truthful about their age.
The Online Safety Act requires platform operators to use some form of “age verification or age estimation.”
Age verification tools generally require users to show some form of identification before they can access features reserved for adults. Age estimation, by contrast, relies on artificial intelligence to assess a user’s age from an uploaded video.
With key compliance deadlines for the Online Safety Act looming, the U.K.’s media regulator, Ofcom, has warned social media companies that they must provide “highly effective age assurance” measures or face penalties.
As more jurisdictions enact similar legislation, debates over the best way to verify age are ongoing.
How Meta and its peers respond to the Online Safety Act is therefore being watched closely in countries like Australia that are pursuing their own online safety agendas.