TikTok tried to keep some of the documents secret

TikTok and the Kentucky Attorney General’s Investigation: Sealed Documents Reviewed by NPR and Kentucky Public Radio

It was one of a number of disturbing accounts that came to light in a trove of sealed documents reviewed by NPR and Kentucky Public Radio. TikTok executives were aware of the potential harms the app can cause, the documents suggest, but they didn’t seem to care.

One of the lawsuits was filed by the Kentucky Attorney General. Roughly 30 pages of material that was meant to stay secret became readable when Kentucky Public Radio copied and pasted excerpts of the faultily redacted text.

On Thursday, TikTok spokesman Alex Haurek criticized the decision to report on the sealed material. “It is highly irresponsible of NPR to publish information that is under a court seal,” Haurek said, adding that the complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”

Under a new federal law, TikTok must divest from its Chinese parent company, ByteDance, or face a nationwide ban, and the company is fighting the law in court. Meanwhile, the new lawsuits from state authorities have cast fresh scrutiny on the app and its ability to police content that harms minors.

In one internal assessment, a TikTok official acknowledged that the content driving the highest engagement might not be the content the company wants on its platform.

That’s what TikTok learned when it launched an internal investigation after a report in Forbes. Young streamers on TikTok were receiving virtual “gifts” or “coins,” often in the form of a plush toy or a flower, in exchange for stripping.

The company is aware that these young users have accounts but doesn’t remove them, according to the previously redacted portions of the suit.

TikTok for Younger Users, a service that the company says includes strict content safeguards, is limited to children under 13.

The unredacted filing shows that some of the suicide and self-harm content escaped the first round of moderation. It cites an internal study of self-harm videos that had more than 75,000 views before TikTok identified and removed them.

The first round of moderation typically uses artificial intelligence to flag pornographic, violent or political content. The documents show that subsequent rounds rely on human moderators, but only once a video reaches a certain number of views, and those view thresholds make no allowance for the type of content.
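To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of a moderation pipeline gated on view counts, as the documents describe it. None of this is TikTok’s actual code; the class names, flag categories and thresholds are invented for illustration.

```python
# Hypothetical sketch of a tiered moderation pipeline: an automated first
# pass, then human review triggered only once a video crosses view-count
# thresholds. All names and numbers are invented for illustration.

from dataclasses import dataclass, field

# View counts at which each successive round of human review kicks in
# (invented figures; the documents do not give the real thresholds).
HUMAN_REVIEW_THRESHOLDS = [5_000, 25_000, 100_000]

@dataclass
class Video:
    video_id: str
    views: int
    ai_flags: set = field(default_factory=set)  # e.g. {"violent"}
    rounds_reviewed: int = 0

def first_round_ai(video: Video) -> bool:
    """Round 1: an automated classifier flags broad categories."""
    # Stand-in for a real model; flags pornographic/violent/political content.
    return bool(video.ai_flags & {"pornographic", "violent", "political"})

def due_human_rounds(video: Video) -> int:
    """Later rounds run only after the video crosses view thresholds."""
    eligible = sum(1 for t in HUMAN_REVIEW_THRESHOLDS if video.views >= t)
    return max(0, eligible - video.rounds_reviewed)

def moderate(video: Video) -> str:
    if first_round_ai(video):
        return "removed by AI in round 1"
    pending = due_human_rounds(video)
    if pending == 0:
        # The gap the lawsuit highlights: content that slips past the AI
        # gets no human eyes until it has already accumulated views.
        return "no human review yet"
    video.rounds_reviewed += pending
    return f"queued for {pending} human review round(s)"

if __name__ == "__main__":
    clip = Video("v1", views=80_000, ai_flags=set())
    print(moderate(clip))  # -> queued for 2 human review round(s)
```

The sketch makes the gap visible: a video that slips past the automated pass gets no human review at all until it has already accumulated a large audience.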

TikTok knows about so-called filter bubbles. The company’s internal documents define them as what happens when a user “encounters only information and opinions that conform to and reinforce their own beliefs, caused by algorithms that personalize an individual’s online experience.”

The Complaint Over TikTok Promoting Self-Harm and Suicide to Teens: An Internal Investigation

Another employee said, “there are a lot of videos mentioning suicide,” including one asking, “If you could kill yourself without hurting anybody would you?”

The employee said it took 20 minutes of following several such accounts to drop into the “negative filter bubble.” “The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life,” the employee wrote.

The internal documents cite statements from top executives who appear well aware of the app’s harmful effects yet did not take the steps needed to address them.

Employees suggested internally the company “provide users with educational resources about image disorders” and create a campaign “to raise awareness on issues with low self esteem (caused by the excessive filter use and other issues).”

One popular feature, known as the Bold Glamour filter, uses artificial intelligence to rework people’s faces to resemble models with high cheekbones and strong jawlines.

TikTok has beauty filters that can be applied to videos to make users look thinner and younger, or give them bigger eyes.

TikTok nudges users to step away from the app with videos prompting them to take a break, but the company did not think the videos amounted to much. One executive said they are “useful in a good talking point” with policymakers, but “they’re not altogether effective.”

The tool had little effect, accounting for a decrease of only about a minute and a half, from 108.5 to 107 minutes of usage per day. The complaint states that TikTok didn’t revisit the issue.

The app gives parents the ability to limit their children’s usage to between 40 minutes and two hours per day. TikTok also created a tool that set the default screen-time prompt at 60 minutes per day.

An internal document shows that the company was aware that its many features designed to keep young people on the app produced a constant and irresistible urge to keep opening it.

Haurek also said: “We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”

Underage Users on TikTok and the Dispute Over NPR’s Reporting

After a two-year investigation, 14 state attorneys general filed lawsuits against TikTok.

An internal document about users under 13 instructed moderators not to take action on reports of underage users unless the account’s bio specifically states the user is under 13.

