TikTok is setting a default one-hour daily screen time limit for users under 18


What Parents Can Do to Keep Their Kids Safe on Social Media

Alexandra Hamlet, a New York City-based clinical psychologist, recalls being invited to a roundtable discussion roughly 18 months ago about ways to improve Instagram, particularly for younger users. “I don’t see many of our ideas being implemented,” she said. Social media platforms, she added, need to work on “continuing to improve parental controls, protect young people against targeted advertising, and remove objectively harmful content.”

One digital security director agreed that social media platforms offer little of substance to counter the ills of their platforms. She said the solutions put the onus on guardians to actively regulate access and use, rather than offering more passive options, such as monitoring tools that run in the background.

Still, it is worth learning how to use parental controls, while keeping in mind that teens can often circumvent them. Here’s a closer look at what parents can do to help keep their kids safe online.

After the fallout from leaked internal company documents, Meta-owned Instagram paused its much-criticized plan to release a version of Instagram for kids under age 13 and focused on making its main service safer for young users.

The popular short-form video app currently offers a Family Pairing hub, which allows parents and teens to customize their safety settings. A parent can link their TikTok account to their teen’s and set parental controls, including how much time the teen can spend on the app each day, what content they are exposed to, whether they can search for videos, hashtags, or Live content, and whether their account is private or public. TikTok also offers a Guardian’s Guide that highlights how parents can best protect their kids on the platform.

Another Instagram feature encourages users to take a break from the app after a predetermined amount of time, suggesting they take a deep breath, write something down, check a to-do list, or listen to a song. Instagram also said it’s taking a “stricter approach” to the content it recommends to teens and will actively nudge them toward different topics, such as architecture and travel destinations, if they’ve been dwelling on any type of content for too long.

In August, Snapchat introduced a parent guide and hub to give parents more insight into how their teens use the app, including who they have been talking to within the last week. To use the feature, parents must create their own Snapchat account, and teens have to opt in and give permission.

TikTok’s new default screen time limits for teens

The company told CNN Business that it will continue to build on its safety features and that it will consider feedback from the community and policymakers.

Teenage TikTok users will be able to turn off this new default setting, which will roll out in the coming weeks. But the feature change could bolster the digital well-being of younger users by requiring them to opt out of stricter screen time limits rather than clearing the higher bar of opting in to them.

In addition to parental controls, the app restricts access to some features for younger users, such as Live and direct messaging. When teens under the age of 16 are ready to publish their first video, they are asked to choose who can watch it. Push notifications are curbed after 9 p.m. for users ages 13 to 15, and after 10 p.m. for users ages 16 to 17.

Discord, a popular messaging platform, did not appear before the Senate last year, but it faces criticism over the difficulty of reporting problematic content and the ability of strangers to communicate with young users.

On Discord, users can connect with strangers on a public server or in private chats if one person invites another into the room. By default, all users, including those ages 13 to 17, can receive friend invitations from anyone in the same server, which then opens up the ability to exchange private messages.

In a report published Wednesday, the non-profit Center for Countering Digital Hate (CCDH) found that it can take less than three minutes after signing up for a TikTok account to see content related to suicide and about five more minutes to find a community promoting eating disorder content.

“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” Imran Ahmed, CEO of the CCDH, said in the report.

A TikTok spokesperson pushed back on the study, calling it an inaccurate depiction of the viewing experience on the platform, citing the small sample size, the limited 30-minute testing window, and the way the accounts scrolled past a series of unrelated topics to look for other content.

TikTok said it does not allow content depicting, promoting, normalizing, or glorifying activities that could lead to suicide or self-harm. From April to June of this year, the company took down 93.4% of videos found to be in violation of its policies on suicide and self-harm.

The spokesperson also said the CCDH study does not differentiate between positive and negative videos on given topics, noting that people often share empowering stories about eating disorder recovery.


It isn’t the first time a social media platform has been tested this way. A staff member for US Sen. Richard Blumenthal assumed the persona of a 13-year-old girl on Instagram and followed a variety of dieting and pro-eating disorder accounts. Instagram’s algorithm soon began almost exclusively recommending that the young teen’s account follow more and more extreme dieting accounts, the senator told CNN at the time.

The TikTok spokesperson told CNN that when someone searches for banned words or phrases, such as #selfharm, they will not see any results and will instead be redirected to local support resources.


If the limit is reached, users will be prompted to enter a passcode, requiring them to make an active decision to extend their time on the app.

While there is no universally endorsed position on how much screen time is “too much,” or even on the impact of screen time more broadly, we know teens need extra support as they start to explore the online world independently.

Cormac Keenan, TikTok’s head of trust and safety, also announced updates to the Family Pairing feature, which lets a parent or caregiver link their TikTok account to their teen’s. Parents will be able to filter out videos with words or hashtags they don’t want to appear in their teen’s feed, set a custom daily screen time limit for their teen, and set a custom schedule to mute TikTok notifications sent to their teen.

Keenan said the company believes digital experiences should bring joy and play a positive role in how people express themselves.

The explosion of social media in the past two decades has contributed to a mental health crisis among young people, experts say. Depression rates are rising, and nearly a third of teen girls report having seriously considered suicide. Research suggests that limiting screen time can help people feel better about themselves.

Screen time limits and passcodes for TikTok users under 13

Users under 13 will also have a 60-minute daily limit, and a parent or guardian can enter a passcode to extend their usage for another half hour.

The company said it settled on the 60-minute default limit after consulting academic research and experts from the Digital Wellness Lab at Boston Children’s Hospital, though Keenan added that “there’s no collectively-endorsed position on the ‘right’ amount of screen time or even the impact of screen time more broadly.”

TikTok, whose parent company ByteDance is based in Beijing, also faces security concerns that go beyond the safety of young users. The company has denied sharing any user information with the Chinese government.

The White House said this week it was giving federal agencies 30 days to delete TikTok from government devices, and Canada and the European Parliament recently instituted similar bans.