Meta will hide content about suicide and eating disorders from teens

Content like nudity and drugs for sale is already not recommended to any user on Meta’s platforms. The company says it will now restrict teens from even coming across much of this content, including when it’s posted by a friend or someone they follow.

In addition to hiding content in sensitive categories, teen accounts will be defaulted to restrictive filtering settings that adjust what kind of content they see on Facebook and Instagram. The change affects recommended posts in Search and Explore that could be “sensitive” or “low quality,” and Meta will automatically set all teen accounts to the most stringent settings, though users can change them.

Tech companies like Meta are under intense government scrutiny over how they handle children on their platforms. In the US, Meta CEO Mark Zuckerberg, along with a roster of other tech executives, will testify before the Senate on child safety on January 31st. A wave of legislation in the US attempts to prevent kids from accessing adult content.

Even beyond porn, lawmakers have signaled they are willing to age-gate large swaths of the internet in the name of protecting kids from certain (legal) content, like material dealing with suicide or eating disorders. For years, there have been reports about how teens’ feeds are filled with harmful content. But blocking all material besides what platforms deem trustworthy and acceptable could also cut young people off from educational or support resources.

Meanwhile, in the EU, the Digital Services Act holds Big Tech companies, including Meta and TikTok, accountable for content shared on their platforms, with rules around algorithmic transparency and ad targeting. And the UK’s Online Safety Act, which became law in October, now requires online platforms to comply with child safety rules or risk fines.

A group of more than 40 states also filed lawsuits against Meta in October, accusing it of designing its social media products to be addictive. The suits rely in part on evidence from Facebook whistleblowers.

The move came as a bipartisan group of federal lawmakers, led by Sen. Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn., ramped up their campaign to pass the Kids Online Safety Act as quickly as possible. If the legislation passes, tech companies could be held responsible for feeding teens toxic content.

In May of last year, the U.S. Surgeon General warned about the dangers of social media for kids. He said the technology was helping fuel a national youth mental health crisis.

A Meta spokeswoman acknowledged that people can misrepresent their age on social media. She told NPR that the company is increasing investment in age verification tools and in technology to detect users who lie about their age.

Jean Twenge, a psychology professor at San Diego State University who wrote Generations, says the change is a step in the right direction, but that it is still hard to police who is actually a teen on Facebook.

“You do not need parental permission to sign up for a social media account,” Twenge says. “You check the box saying you’re 13 or enter a different birth year, and you’re on.”

Meta faces many state lawsuits and is under pressure from child safety advocacy groups to make its social networks safer for kids.

Meta said in a blog post that when people search for terms related to suicide, they will be directed to expert resources for help.
