Key Takeaways
Meta announced on Thursday, Oct. 17, a series of new safety features to protect Instagram users from sextortion scammers.
The news comes as Meta and other leading social media companies face a wave of lawsuits accusing them of harming the mental health of young people.
A U.S. judge has rejected a series of leading social media companies’ attempts to block lawsuits accusing them of fueling mental health issues in young people.
Meta, TikTok, YouTube, and Snapchat have all had their attempted dismissals denied, paving the way for more potential legal action.
Meta faces lawsuits from multiple U.S. states accusing the company of purposefully making its Facebook and Instagram platforms addictive.
One lawsuit against Meta involves 30 states, while Florida has filed the other.
Both lawsuits accuse Meta of contributing to the negative mental health of its young users, including depression and anxiety.
“Meta needs to be held accountable for the very real harm it has inflicted on children here in California and across the country,” California Attorney General Rob Bonta said in a statement.
Meta said it disagrees with the ruling overall and defended its platforms, claiming it had “developed numerous tools to support parents and teens.”
On Thursday, Meta announced a range of new safety features to help protect teenagers from sextortion scams.
Sextortion scams are a type of online blackmail where scammers coerce or manipulate victims into sharing sexually explicit images or videos and then threaten to release this content publicly unless a ransom is paid.
The social media giant said it uses technology to help identify potential scam accounts and block them from following or interacting with teenagers on the platform.
Meta said it will also alert teenagers if they are chatting with someone from a different country, as sextortion scammers often misrepresent where they live to trick teens into trusting them.
The features will work in conjunction with Instagram’s previously announced Teen Accounts. Meta said the newly designed accounts give teens built-in protections, limiting who can contact them and the content they can see online.
Hundreds of personal injury lawsuits have been filed against Meta, TikTok, YouTube, and Snapchat.
The lawsuits claim that the algorithms used by these platforms prioritize content that can trigger negative emotions, such as body image concerns and peer comparison.
TikTok is facing a lawsuit filed by 14 U.S. states, which accuse the Chinese-owned social media company of fueling the rise of mental health issues in teenagers.
As in the lawsuits against Meta, the claimants allege that the social media platform purposefully designed its algorithm to be addictive to children.
The lawsuit, filed last week in New York, said: “TikTok knows that compulsive use of and other harmful effects of its platform are wreaking havoc on the mental health of millions of American children and teenagers.”
With studies linking social media use to mental health issues, mental health professionals have been raising alarms about the amount of time young people spend on these platforms.
The Australian government is considering introducing legislation before the end of the year that would ban young people from using social media.
These concerns have led to widespread calls for greater transparency from tech companies regarding how their algorithms function.
As more lawsuits pile up against leading social media platforms, it is clear that lawmakers are becoming more active in holding these companies accountable.