How Does TikTok Hook Kids? What Internal Documents Cited by State Attorneys General Reveal
TikTok’s own research concluded that kids were the most susceptible to getting hooked on the app. “As expected, across most engagement metrics, the younger the user, the better the performance,” a 2019 TikTok document reads.
In each of the separate lawsuits filed by state regulators, dozens of internal communications, documents and pieces of research data were redacted, or blacked out from public view, because the authorities entered into confidentiality agreements with TikTok.
After Kentucky Public Radio published excerpts of the redacted material, a state judge sealed the complaint in its entirety.
Separately, under a new law, TikTok has until January to break away from its Chinese parent company, ByteDance, or face a nationwide ban. TikTok is fighting the looming crackdown. The new lawsuits from state authorities have put the app under even more scrutiny.
One TikTok official concluded that the content drawing the highest engagement may not be the content the company wants on its platform.
TikTok launched its own investigation after learning of reporting by Forbes, and it discovered a large number of underage streamers receiving gifts in exchange for stripping, with viewers spending real money on virtual items like toys or flowers.
According to the previously redacted portions of the suit, the company is aware of accounts belonging to young users but does little to remove them.
Kids under 13 cannot open a standard TikTok account, but the company has found a new way to reach younger users.
According to TikTok’s own studies, cited in the unredacted filing, some suicide and self-harm content escaped the app’s early rounds of content moderation. Self-harm videos had racked up more than 75,000 views before TikTok identified and removed them.
Artificial intelligence is typically used in the first round of moderation to flag pornographic, violent or political content. The documents indicate that subsequent rounds rely on human reviewers only if a video reaches a certain number of views. Those additional rounds do not always take content-specific or age-specific rules into account.
TikTok is aware that its recommendation system can pull users into “filter bubbles.” According to one internal document, users can be placed into such a bubble after 30 minutes of use in one sitting. The company wrote that having more human moderators label content is possible but “requires large human efforts.”
TikTok Employees Describe Suicide Videos, Negative “Filter Bubbles” and Half-Hearted Fixes
One employee said there were a lot of videos discussing suicide, including one that asked, “If you could kill yourself without hurting anyone, would you?”
“After following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 mins to drop into ‘negative’ filter bubble,” one employee wrote. The employee added that the sheer density of negative content lowered their mood and deepened their sadness, even though they had otherwise been in good spirits.
The internal documents show that the company’s executives are aware of the app’s harmful effects but have not taken significant steps to address them.
Employees suggested that the company create a campaign to raise awareness of the low self-esteem and other harms that excessive filter use can cause.
TikTok offers beauty filters that users can apply to their videos to appear thinner and younger, or to have larger eyes. One popular filter uses artificial intelligence to remake users’ faces to resemble models with strong jawlines.
TikTok has publicized its “break” videos, prompts meant to get users to stop endlessly scrolling and take a break. Internally, the company did not think much of them. One executive said the videos are a useful talking point with policymakers but are not entirely effective.
In one document, a project manager says he does not want to reduce the time users spend on the app. In a chat message, another employee said the goal was to contribute to DAU, or daily active users, and to retain users.
The app allows parents to limit their kids’ usage to between 40 minutes and two hours per day. TikTok also created a default time-limit prompt intended to fight excessive use of the app.
The company was aware that the many features designed to keep young people on the app were exactly what made it so irresistible to keep opening.
A spokesman for TikTok said the company has put in place robust safeguards, including removing suspected underage users and launching safety features like screen-time limits, family pairing and privacy-by-default settings for kids under 16.
A Multi-State Investigation of TikTok, and the Company’s Criticism of NPR for Reporting Sealed Information About Underage Users
The information emerged from a more than two-year investigation into TikTok by 14 attorneys general, which culminated in state officials suing the company on Tuesday.
An internal document about users under 13 instructed moderators not to take action on reports of underage users unless their bio specifically states they are under 13.
On Thursday, a spokesman for TikTok criticized NPR for reporting information that is now under a court seal and said the complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”