U.S. Supreme Court Weighs Social Media's Legal Shield: The Case of Gonzalez v. Google

The US Supreme Court is turning its attention to a federal law that has long served as a legal shield for online platforms.

On February 21, the nine justices heard oral arguments in Gonzalez v. Google, a case brought by the father of a young woman killed in the 2015 terrorist attacks in Paris. He argues that YouTube's recommendations aided the attackers. The outcome of the case could shape the future of social media platforms worldwide.

The law includes a provision that says websites cannot be treated as the publishers or speakers of other people's content. In plain English, that means any legal responsibility attached to publishing a given piece of content ends with the person or entity that created it, not with the platforms on which the content is shared or the users who re-share it.

For years, much of the criticism of Section 230 has come from conservatives, who say the law lets social media platforms suppress right-leaning views for political reasons.

The Trump administration tried to turn some of those criticisms into concrete policy that would have had significant consequences. In 2020, the Justice Department released a legislative proposal for changes to Section 230 that would have created an eligibility test for websites seeking the law's protections. That same year, the White House issued an executive order calling on the Federal Communications Commission to interpret Section 230 more narrowly.

The executive order faced a number of legal and procedural problems, not least of which was that the FCC is not part of the judicial branch, does not regulate social media or content moderation decisions, and is an independent agency that, by law, does not take direction from the White House.

A Bipartisan Backlash: Critics, Tech Giants, and Terrorism Victims Weigh In on Section 230

Even though the Trump-era efforts to curtail Section 230 never bore fruit, conservatives are still looking for opportunities to do so. They are not alone. Since 2016, when social media platforms’ role in spreading Russian election disinformation broke open a national dialogue about the companies’ handling of toxic content, Democrats have increasingly railed against Section 230.

The result is a bipartisan hatred for Section 230, even if the two parties cannot agree on why Section 230 is flawed or what policies might appropriately take its place.

The outcome of the cases this week could lead to significant changes to the internet, for better or for worse.

Tech critics have called for added legal exposure and accountability. “The massive social media industry has grown up largely shielded from the courts and the normal development of a body of law. It is highly irregular for a global industry that wields staggering influence to be protected from judicial inquiry,” wrote the Anti-Defamation League in a Supreme Court brief.

For the tech giants, that would be a bad thing: they argue that scaling back Section 230 would undermine what has allowed the internet to flourish. They say it would put many websites and users in legal jeopardy and force some websites to change how they operate to avoid liability.

“‘Recommendations’ are the very thing that make Reddit a vibrant place,” wrote the company and several volunteer Reddit moderators. “It is users who upvote and downvote content, and thereby determine which posts gain prominence and which fade into obscurity.”

The brief argued that without those protections, users and moderators would face a serious risk of being sued for defamation over the content they recommend, and that people might stop using Reddit, and moderation might stop, as a result.

However the court rules, its decision could be a gamechanger for American law, American society, and social media platforms.

The lawyer for the terrorism victims is going to tell the Supreme Court this week that the economic model of social media companies has changed since Section 230 was enacted.

He said that suggesting related material is one way to keep users online longer, adding that social media companies make more money the longer users stay on their platforms.

Social media companies can't have their cake and eat it too. They must be held to account for failing to quickly deploy the resources and technology needed to prevent extremist content from inciting violence, even as they earn a financial bonanza across their platforms.

The Supreme Court at a Crossroads: Will the Courts Now Regulate the Internet?

According to the victims' lawyer, the companies were warned that extremists were exploiting their platforms. Government officials, including the director of national intelligence, the director of the FBI, the attorney general, and the White House chief of staff, "told them exactly that," he says.

Halimah DeLaine Prado, Google's general counsel, says the company doesn't think there is a place for hate speech on its products and platforms, and that it has invested in human review and smart detection technology to keep such content off.

Prado acknowledges that social media companies today are nothing like the social media companies of 1996, when the interactive internet was an infant industry. But she doesn't think the courts should be the ones to change the law.

"Congress had a really clear choice in its mind," he says. The internet could be regulated the way heavily regulated broadcast media was, or it could be treated like "the town square or the printing press." Congress chose the town square and the printing press, he says. But, he adds, that approach is now at risk: "The Supreme Court now really is in a moment where it could dramatically limit the diversity of speech that the internet enables."

The tech companies' allies make for strange bedfellows. Groups ranging from the conservative Chamber of Commerce to the libertarian ACLU have filed an astonishing 48 briefs urging the court to leave the status quo in place.

But the Biden administration stakes out a narrower position. "It is one thing to be more passive, presenting or even organizing information, but when you cross the line into recommending content, you leave behind the protections of 230." That is how Columbia law professor Timothy Wu summarizes the administration's position.

Hyperlinks, grouping certain content together, and sorting through billions of pieces of data for search engines would all be fine under that view; actually recommending content that shows or encourages illegal conduct would not.

If the Supreme Court adopted that position, it would threaten the economic model of social media companies. And the tech industry says there is no easy way to distinguish between aggregating and recommending content.

Even then, the companies would likely end up defending their conduct in court. But filing suit is not the same as getting over the hurdle of showing enough evidence to justify a trial, and the Supreme Court has made that hurdle more difficult to clear. The second case the court hears this week, on Wednesday, deals with just that problem.

During the February 21 arguments, Justice Elena Kagan acknowledged the limits of the justices' expertise: "We're a court. We really don't know about these things. These are not, like, the nine greatest experts on the internet."

One prominent example cited by conservatives who allege "biased" enforcement is Facebook's 2018 decision to ban Alex Jones, host of the right-wing Infowars website, who was later ordered to pay nearly $1.5 billion in damages for defaming the families of the victims of a mass shooting.

Editor's Note: Former Amb. Marc Ginsberg is the founder and president of the Coalition for a Safer Web, a non-profit organization whose mission is dedicated to developing technologies and policies to expedite the permanent de-platforming of hate and extremist incitement on social media platforms. The views expressed in this commentary are his own. View more opinion on CNN.

The court also heard a case about whether social media companies aid and abet terrorism by hosting content that supports a group engaged in violence. Twitter's attorney, Seth Waxman, argued that Twitter could have been liable if the company had been warned that specific accounts were planning an attack, but that in the absence of such warnings, it was not.

In other words, the platforms are treated as benign providers of digital space, with limited liability exposure for whatever customers decide to upload onto that space. The idea was that the newly formed internet service companies would face financial ruin if they were exposed to a flood of lawsuits.

But things have changed since those early days of the internet. Antisemitic extremists and far-right terrorists have weaponized the platforms, using them to incite attacks in the US. And Americans are not the only ones paying the price: lives have been lost in other countries as well.

Advertisers must either rely on the platforms' assurances that they will delete offensive content or hope that watchdog groups or community members will flag the extremist content they surely would not want their brands associated with. Days, even months, can go by before a platform removes offensive accounts. And because the social media companies offer no real assurances, advertisers cannot know whether their ads will end up sponsoring extremist accounts.

The Chasing Life podcast explores how ordinary citizens are trying to hold Big Tech accountable for the content on its platforms.

TikTok CEO Shou Chew faced a grilling from lawmakers Thursday when he appeared before the House Energy and Commerce Committee. As he testified, some legislators renewed calls for TikTok to be banned in the US over its ties to China through its parent company, ByteDance, and raised questions about TikTok's data collection practices and its impact on children.

While these events play out on the national stage, the tug-of-war over content consumption is also taking place on a smaller scale in homes and schools across the country, affecting the lives and mental health of some users, especially young people.

Lembke knew she needed to leave. But she didn't want to just quit social media; she wanted to do something so that other teens wouldn't have to go through what she did.

Lembke remembers hearing her phone buzz and immediately grabbing for it. "And it was in that response – in the millisecond between that buzz and my grab – that I finally hit my breaking point. And I wondered how I was allowing these apps to have so much control over me."

Lembke was the last among her friends to be allowed on social media. She thought that world must be "mystical and magical and golden," given her friends' newfound and intense focus on it.

"… I remember seeing all of my friends' attention get pulled away from me … having their eyes looking up at me, having conversations and (then) getting pulled straight down," she said. It felt like a drop-off, she said: each person spent more time with their phones and screens than they did with her.

“I first got my social media accounts at the age of 12, in the sixth grade, starting with Instagram and making my way over the years to other apps and platforms like Snapchat,” she said. “But as I, a 12-year-old girl, began to spend more time on these apps, my mental and physical health really suffered.”

She said she found herself scrolling and trying to calculate her worth through likes, comments, and followers. "And that quantification really deepened my social anxiety and deepened my depressive spirals."

The opaque algorithms of these platforms also pushed her down a dark road of unrealistic body standards, she said, leading her into harmful eating habits.

"I think the most dishonest aspect to it all is that it isn't overt. It's not going to say, 'Get an eating disorder, feel bad about your body, go home and, like, don't eat anything for a day,'" she said. The algorithm won't be that blunt, she said. Instead, it will ease you into content that continually reinforces those standards and practices.

What was she up against? A societal norm that tells young people to get on social media.

A College Student's Social Media Breaking Point: Taking On Big Tech and Section 230

Today, Lembke is urging that Big Tech and social media companies be held accountable; in February, she testified before the Senate Judiciary Committee about the effect of these platforms on her life.

You can hear more about Lembke and her crusade on behalf of the mental health and privacy of young people on this week's Chasing Life podcast, where CNN technology reporter Brian Fung unpacks what Section 230 is all about and what any Supreme Court ruling could mean.