What Parents Can Do to Protect Their Kids on Social Media: A Closer Look After the Facebook Papers
After the disclosures in the “Facebook Papers,” the companies vowed to change. The four social networks have since introduced more tools and parental control options aimed at better protecting younger users. Some now show teens less sensitive content by default and have stepped up their moderation efforts. Critics say the new solutions are still limited and that more needs to be done.
But despite these very real privacy concerns, it’s simply too risky for parents not to know what their kids are seeing on social media. Parents and caregivers have to supervise kids’ accounts to make sure they don’t end up in harmful corners of social media.
For now, guardians must learn how to use the parental controls while also being mindful that teens can often circumvent those tools. Here’s a closer look at what parents can do to help keep their kids safe online.
After the fallout from the leaked documents, Meta-owned Instagram paused its much-criticized plan to release a version of Instagram for kids under age 13 and focused on making its main service safer for young users.
Meta and Instagram: Parental Supervision and Safety Tools for Teens, on Social Media and in Virtual Reality
The hub also offers a guide to Meta’s VR parental supervision tools, produced by ConnectSafely, a nonprofit aimed at helping kids stay safe online, to assist parents in discussing virtual reality with their teens. Guardians can see which accounts their teens have blocked and access supervision tools. They can also approve their teen’s download or purchase of an app that is blocked by default based on its rating, or block specific apps that may be inappropriate for their teen.
Another feature encourages users to take a break from the app, such as suggesting they take a deep breath, write something down, check a to-do list or listen to a song, after a predetermined amount of time. Instagram also said it’s taking a “stricter approach” to the content it recommends to teens and will actively nudge them toward different topics, such as architecture and travel destinations, if they’ve been dwelling on any type of content for too long.
Parents can already see which new friends their teens have added and can report any accounts that appear to be in contact with their child. Instagram is also working on a tool that would give younger users the option to notify their parents when they report an account or piece of content.
The company told CNN Business that it will continue to add safety features and consider feedback from the community, policymakers, safety and mental health advocates.
In July, TikTok announced new ways to filter out mature or “potentially problematic” videos. The new safeguards assign a “maturity score” to videos detected as potentially containing mature or complex themes. TikTok also rolled out a tool to help people decide how much time they want to spend in the app: users can schedule screen-time breaks and view a dashboard showing how many times they opened the app and how much time they spent in it.
The app restricts some features for younger users, like Live and direct messaging. A pop-up also surfaces when teens under 16 are ready to publish their first video, asking them to choose who can watch it. Push notifications are curbed after 10 p.m. for users ages 13 to 15.
Discord did not appear before the Senate last year but the popular messaging platform has faced criticism over difficulty reporting problematic content and the ability of strangers to get in touch with young users.
Users can connect with strangers on a public server or in private chats if another person invites them. By default, all users, including those ages 13 to 17, can receive friend requests from anyone in the same server, which then opens up the ability to exchange private messages.
Utah, unlike China, can’t simply switch off kids’ phones, so its new law instead requires social networks to implement these settings. The tougher part of Utah’s law for tech companies to implement will be a provision requiring social apps to ensure they’re not designed to addict kids.
Utah’s Republican-dominated Legislature passed a number of laws that reflect politicians’ changing perception of technology companies.
Other red states, like Arkansas, Texas, Ohio and Louisiana, are working on similar proposals. California, meanwhile, enacted a law last year requiring tech companies to put kids’ safety first by barring them from profiling children or using personal information in ways that could harm children physically or mentally.
It is about time. Children in the United States are at risk if they use social networks, which makes it very difficult for parents to protect them. While Cox is correct that these measures won’t be “foolproof,” and what implementing them actually looks like remains an open question, one thing is clear: Congress should follow Utah’s lead and enact a similar law to protect every child in this country.
He pointed to similar legislation in the works in California and New Jersey — and said the safety and mental well-being of kids and teens depend on legislation like this to hold big tech accountable for creating safer and healthier experiences online.
The laws are the latest effort by Utah lawmakers focused on children and the information they can access online. Two years ago, Cox signed a bill requiring cell phones and tablets sold in the state to automatically block pornography. Amid concerns about enforcement, lawmakers in the deeply religious state revised the bill so it would not take effect unless five other states passed similar laws.
“Utah will soon require online services to collect sensitive information about teens and families, not only to verify ages, but to verify parental relationships, like government-issued IDs and birth certificates, putting their private data at risk of breach,” said Nicole Saad Bembridge, an associate director at NetChoice, a tech lobby group.
Editorial: Social Media and Children’s Mental Health: Why Protecting Kids’ Accounts Matters
Editor’s Note: Kara Alaimo, an associate professor of communication at Fairleigh Dickinson University, writes about issues affecting women and social media. Her forthcoming book, about why social media is toxic for women and girls and how we can reclaim it, will be published in 2024. The opinions expressed in this commentary are her own. Read more opinion on CNN.
Parents will also be allowed to access their kids’ accounts without the kids’ consent, and children will not be able to use their accounts between 10:30 p.m. and 6:30 a.m. without parental permission.
One of the key components of this legislation is allowing parents access to their kids’ accounts. This will help the law address one of the biggest dangers kids face online: harmful content like the posts about suicide and self-harm viewed in the six months before the death of 14-year-old Molly Russell, who took her own life.
The “pass-out” or “choking” challenge has also circulated on social networks. In 2021, four children ages 12 or younger, in four different states, died after trying it.
I hope groups that serve children who are questioning their gender and sexual identities and those that work with other vulnerable youth will adapt their online presences to try to serve as resources for educating parents about inclusivity and tolerance, too. This is also a reminder that vulnerable children need better access to mental health services like therapy — they’re way too young to be left to their own devices to seek out the support they need online.
Source: https://www.cnn.com/2023/03/27/opinions/utah-social-media-laws-protect-kids-alaimo/index.html
Screen Time, Sleep and the Suicide &amp; Crisis Lifeline in the U.S.
The Utah law also helps parents manage the amount of time their children spend on social media. One survey found that kids ages 8 to 12 spend hours a day on screens, and that teens ages 13 to 18 spend even more: more time than a full-time job.
The American Academy of Pediatrics warns that lack of sleep is associated with serious harms in children, everything from injuries to depression, obesity and diabetes. So parents in the US need a way to make sure their kids aren’t up on TikTok all night. (Parents in China don’t have to worry about this, because the Chinese version of TikTok doesn’t allow kids to stay on for more than 40 minutes a day and isn’t usable overnight.)
If you or a loved one needs support, you can call or text the Suicide &amp; Crisis Lifeline, which provides free and confidential help from trained counselors in the United States. Help is also available in Spanish through the Línea de Prevención del Suicidio y Crisis.