Parents and Teens on Social Media: Platforms Roll Out More Parental Controls
Other platforms including Instagram and Snapchat have similarly rolled out additional parental controls and features that encourage teens to take a break and set boundaries.
Michela Menting, a digital security director at market research firm ABI Research, agreed that social media platforms are “offering very little of substance to counter the ills their platforms incur.” She said the solutions put the onus on guardians to activate various parental controls, such as those intended to filter, block and restrict access, and more passive options, such as monitoring and surveillance tools.
In response, Discord recently refreshed its Safety Center, where parents can find guidance on how to turn on safety settings, FAQs about how Discord works, and tips on how to talk about online safety with teens. Some existing parental control tools include an option to prohibit a minor from receiving a friend request or a direct message from someone they don’t know.
After the fallout from the leaked documents, Meta-owned Instagram paused its much-criticized plan to release a version of Instagram for kids under age 13 and focused on making its main service safer for young users.
It has since introduced an educational hub for parents with resources, tips and articles from experts on user safety, and rolled out a tool that allows caretakers to see how much time their kids are spending on the app. If a child updates their privacy and account settings, parents will be notified, and they can receive updates on which accounts their teens follow and which accounts follow them. Parents can see which accounts their teens have blocked, and the company provides videos about how to use the new supervision tools.
Another feature encourages users to take a break from the app after a predetermined amount of time, suggesting they take a deep breath, write something down, check a to-do list or listen to a song. Instagram also said it’s taking a “stricter approach” to the content it recommends to teens and will actively nudge them toward different topics, such as architecture and travel destinations, if they’ve been dwelling on any type of content for too long.
While this was Snapchat’s first formal foray into parental controls, the app previously had a few safety measures for young users, such as requiring teens to be mutual friends before they can start communicating with each other and prohibiting them from having public profiles. Location sharing is off by default for teens, but they can use it to share their real-time location with a friend, even while the app is closed, as a safety measure. Snapchat also recommends that users review their friend lists from time to time to make sure they still want to be in touch with certain people.
TikTok Moves to Limit Teens to One Hour of Daily Screen Time
The company told CNN Business it will continue to build on its safety features and consider feedback from the community, policymakers, safety and mental health advocates, and other experts to improve the tools over time.
TikTok announced Wednesday that every user under 18 will soon have their accounts default to a one-hour daily screen time limit, in one of the most aggressive moves yet by a social media company to prevent teens from endlessly scrolling.
The app’s parental controls also restrict livestreaming and direct messaging for younger users. A pop-up surfaces when teens under the age of 16 are ready to publish their first video, asking them to choose who can watch it. Push notifications are curbed after 9 p.m. for users ages 13 to 15, and after 10 p.m. for users ages 16 to 17.
The popular messaging platform Discord has come under fire over the difficulty of reporting problematic content and the ability of strangers to get in touch with young users.
Still, it’s possible for minors to connect with strangers on public servers or in private chats if the person was invited by someone else in the room or if the channel link is dropped into a public group that the user accessed. By default, all users — including users ages 13 to 17 — can receive friend invitations from anyone in the same server, which then opens up the ability for them to send private messages.
In a report published Wednesday, the non-profit Center for Countering Digital Hate (CCDH) found that it can take less than three minutes after signing up for a TikTok account to see content related to suicide and about five more minutes to find a community promoting eating disorder content.
“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” Imran Ahmed, CEO of the CCDH, said in the report.
A TikTok spokesperson pushed back on the study, saying it is an inaccurate depiction of the viewing experience on the platform for varying reasons, including the small sample size, the limited 30-minute window for testing, and the way the accounts scrolled past a series of unrelated topics to look for other content.
TikTok does not allow content that glorifies or promotes suicide or self-destructive behavior. Of the videos removed for violating its policies on suicide and self-harm content from April to June of this year, 93.4% were removed at zero views, 91.5% were removed within 24 hours of being posted and 97.1% were removed before any reports, according to the company.
The spokesperson said the CCDH does not distinguish between positive and negative videos on given topics, adding that people often share empowering stories about eating disorder recovery.
Not the First Test: Sen. Richard Blumenthal’s Instagram Experiment
This is not the first time a platform’s recommendations have been put to the test. In 2021, the staff of US Sen. Richard Blumenthal registered an account on the photo-sharing site as a 13-year-old and followed a number of pro-eating disorder and diet accounts. Instagram’s algorithm soon began almost exclusively recommending that the young teenager’s account follow more and more extreme dieting accounts, the senator told CNN at the time.
An Instagram spokesperson told CNN that when someone searches for banned words or phrases, such as #selfharm, they will not see any results and will instead be redirected to local support resources.
If the limit is reached, users will need to decide whether they want to extend their time on the app.
“While there’s no collectively-endorsed position on how much screen time is ‘too much’, or even the impact of screen time more broadly, we recognize that teens typically require extra support as they start to explore the online world independently,” Cormac Keenan, TikTok’s head of trust and safety, wrote in a blog post.
The Family Pairing feature allows a parent to link their TikTok account to their teen’s. Parents will be able to adjust when their teen receives TikTok notifications, set a daily screen time limit and create custom content filters for their teen.
Keenan added that digital experiences should bring joy and play a positive role in how people express themselves, discover ideas and connect.
The explosion of social media in the past two decades has contributed to a mental health crisis among young people, experts say. Depression rates are surging, and a third of teen girls reported considering suicide in 2021. Research has shown that excessive screen time can make young people feel worse about themselves.
Users Under 13 Get a 60-Minute Daily Limit, Too
Users under 13 will also have a 60-minute daily limit, and a parent or guardian can enter a passcode that extends their daily usage for another half hour.
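TikTok hasn’t published details of how the new limits are enforced. Purely as an illustration, here is a minimal Python sketch of the decision logic described above; every name, value and flow in it is an assumption, not TikTok’s actual implementation.

```python
# Illustrative sketch only -- all names, thresholds and the passcode flow
# are assumptions based on the defaults described in this article.

DEFAULT_LIMIT_MINUTES = 60     # default daily cap for every user under 18
CHILD_EXTENSION_MINUTES = 30   # extra time a parent can grant an under-13 user

def may_keep_scrolling(age: int, minutes_used_today: int,
                       teen_chose_to_continue: bool,
                       parent_passcode_entered: bool) -> bool:
    """Return True if the user may keep using the app today."""
    if age >= 18:
        return True  # no default limit applies to adults
    if minutes_used_today < DEFAULT_LIMIT_MINUTES:
        return True  # still under the daily cap
    if age < 13:
        # Under 13: only a parent or guardian can extend, and only by 30 minutes.
        return (parent_passcode_entered and
                minutes_used_today < DEFAULT_LIMIT_MINUTES + CHILD_EXTENSION_MINUTES)
    # Teens 13 to 17 are prompted to make an active choice to continue.
    return teen_chose_to_continue
```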
The company said it settled on the 60-minute default limit after consulting academic research and experts from the Digital Wellness Lab at Boston Children’s Hospital.
Concerns about TikTok being unsafe for its young users are not the only issues facing the company and its parent company, ByteDance. US officials have also raised national security concerns over the app’s ties to China; TikTok has denied sharing data with the Chinese government.
The White House said this week it was giving federal agencies 30 days to delete TikTok from government devices, and Canada and the European Parliament recently instituted similar bans.