Children will need approval from their parents with Utah’s new social media law
The Facebook Papers revisited: surveillance and parental controls after the Haugen leaks
A series of congressional hearings, which followed disclosures in what became known as the “Facebook Papers” from whistleblower Frances Haugen about Instagram’s impact on teens, prompted the companies to vow to change. The four social networks discussed here — Instagram, Snapchat, TikTok and Discord — have since introduced more tools and parental control options aimed at better protecting younger users. Some have also made changes to their algorithms, such as defaulting teens into seeing less sensitive content, and have stepped up their moderation efforts. But lawmakers, social media experts and psychologists say the new solutions are still limited.
A digital security director at a market research firm said the platforms have offered little of substance to counter the harms of their services. The solutions, she said, put the onus on guardians to use various parental controls, such as those for blocking access and monitoring, along with quieter options, such as monitoring tools that aren’t visible to the teen.
Discord, for its part, recently refreshed its Safety Center, where parents can find guidance on how to turn on safety settings, FAQs about how Discord works, and tips on how to talk with teens about online safety. Some of its parental control tools offer an option to prohibit a minor from receiving a friend request from someone they don’t know.
After the fallout from the leaked documents, Meta-owned Instagram paused its much-criticized plan to release a version of Instagram for kids under age 13 and focused on making its main service safer for young users.
Instagram has made it easier for parents to see how much time their kids spend on the app and introduced a tool that lets parents set time limits for their children. Parents can also receive updates on which accounts their teens follow and which accounts follow them, see which accounts their teens have blocked, and be notified if their child changes their privacy or account settings. The company also provides video tutorials on how to use the new supervision tools.
Another feature encourages users to take a break from the app, such as suggesting they take a deep breath, write something down, check a to-do list or listen to a song, after a predetermined amount of time. Instagram also said it’s taking a “stricter approach” to the content it recommends to teens and will actively nudge them toward different topics, such as architecture and travel destinations, if they’ve been dwelling on any type of content for too long.
Snapchat already had safety measures for young users, such as barring them from having public profiles and requiring users to be friends with each other before they can start communicating. Teens can also use the location-sharing tool, which is off by default, to tell a friend or family member where they are, even when the app is closed, as a safety measure. Meanwhile, a Friend Check Up tool encourages Snapchat users to review their friend lists and make sure they still want to be in touch with certain people.
Parental controls on TikTok, and Utah’s new social media law
The company told CNN Business it will use feedback from the community, policymakers and experts to improve the tools over time.
In July, TikTok announced new ways to filter out mature or “potentially problematic” videos. The new safeguards assign a “maturity score” to videos detected as potentially containing mature or complex themes. TikTok also rolled out a tool to help people decide how much time to spend on the app; it lets users set regular screen-time breaks and provides a dashboard detailing the number of times they opened the app, a breakdown of daytime and nighttime usage, and more.
In addition to parental controls, the app restricts younger users’ access to some features, such as Live and direct messaging. A pop-up also surfaces when teens under the age of 16 go to publish their first video, asking them to choose who can watch it. Push notifications are curbed after 9 p.m. for users ages 13 to 15, and after 10 p.m. for users ages 16 to 17.
Discord did not appear before the Senate last year, but the popular messaging platform has faced criticism over how difficult it is to report problematic content and how easily strangers can get in touch with young users. A child can connect with strangers on a public server, or in a private chat if the invite came from someone else in the room. By default, users ages 13 to 17 can receive invitations from anyone in the same server, which then allows those people to send them private messages.
SALT LAKE CITY — Utah became the first state to enact laws limiting how children can use social media after Republican Gov. Spencer Cox signed a pair of measures Thursday that require parental consent before kids can sign up for sites like TikTok and Instagram.
The laws, passed by Utah’s Republican-supermajority Legislature, reflect how politicians’ views of technology companies have shifted.
Similar proposals are in the works in other red states, as well as in New Jersey. California, meanwhile, enacted a law last year requiring tech companies to put kids’ safety first, barring them from profiling children or from using personal information in ways that could harm children physically or mentally.
Children’s advocacy groups generally welcomed the law, with some caveats. Common Sense Media applauded the law’s aim of reining in social media’s addictive features. It “adds momentum for other states to hold social media companies accountable to ensure kids across the country are protected online,” said Jim Steyer, the CEO and founder of Common Sense.
He pointed to similar legislation in the works in California and New Jersey — and said the safety and mental well-being of kids and teens depend on legislation like this to hold big tech accountable for creating safer and healthier experiences online.
The laws are the latest effort by Utah lawmakers focused on children and the information they can access online. Two years ago, Cox signed legislation that called on tech companies to automatically block porn on cellphones and tablets sold in the state, citing the dangers it posed to children. Amid concerns about enforcement, the law was revised so that it would not take effect unless five other states passed similar laws.
“Utah will soon require online services to collect sensitive information about teens and families, not only to verify ages, but to verify parental relationships, like government-issued IDs and birth certificates, putting their private data at risk of breach,” said Nicole Saad Bembridge, an associate director at NetChoice, a tech lobby group.