The Correspondence between Alphabet, Amazon, Apple, Meta, and Microsoft and the Biden Administration: Murthy v. Missouri
Two former Biden administration officials referenced in the documents appeared before the select subcommittee at a hearing on Wednesday. In their prepared statements, both emphasized that their communications with social media companies were meant to understand how the companies were implementing their own policies around misinformation. Andy Slavitt, the former senior advisor to the Biden covid response team, said the administration's interactions with tech companies were consistent with the First Amendment and that officials did not intend to make social media companies take any action. He said he never received any indication that the dialogue was interpreted that way.
That the communications happened in the first place is not illegal, though if they rose to the level of coercion — the question now before the Supreme Court in Murthy v. Missouri — it would be.
The report includes previously private communications between top executives at Meta, including an exchange showing how they handled criticism from the president.
The report comes after House Judiciary Chair Jim Jordan (R-OH) subpoenaed Google-parent Alphabet, Amazon, Apple, Meta, and Microsoft last year for their communications with the federal government, saying at the time he wished to “understand how and to what extent the Executive Branch coerced and colluded with companies and other intermediaries to censor speech.” Jordan also chairs a subcommittee established for this purpose — the House Judiciary Committee’s Select Subcommittee on the Weaponization of the Federal Government — and nearly held a vote to hold Meta CEO Mark Zuckerberg in contempt of Congress for failing to produce documents, relenting after he said the company had begun to cooperate more.
The Supreme Court is currently considering whether social media platforms were coerced into certain content moderation decisions. The line between government persuasion and illegal influence is the central question in Murthy v. Missouri, and the report's shift in language brings it closer to the case's core arguments.
At Wednesday’s hearing, Democratic Select Subcommittee Ranking Member Stacey Plaskett, who represents the US Virgin Islands, accused Republicans of holding the hearing now to influence the Supreme Court's opinion in Murthy v. Missouri.
Committee staff gathered hundreds of hours of testimony indicating that the companies only took action when content was found to violate their internal policies.
Republicans have refused to allow Democrats to view hundreds of hours of video taken during those investigations and have refused to make the testimony public. Plaskett asked to enter several transcripts of interviews with tech executives into the record, but Republicans objected. Jordan said the committee plans to release the remaining transcripts once it has spoken with every witness and their counsel.
The Supreme Court will answer this question in the coming months, when it decides whether the White House coerced the platforms' moderation decisions.
The exchange also appears to show that, rather than making Facebook’s top executives feel beholden to Biden’s will, the incident pushed them to want to engage less with the federal government.
One executive asked, “Can we include that the WH tried to discourage us from censoring the lab leak theory?” In other words, according to him, the pressure applied to that theory was always generic “do more” pressure.
“If they’re more interested in criticizing us than actually solving the problems, then I’m not sure how it’s helping the cause to engage with them further,” Zuckerberg wrote
The company's COO added another thought: hadn't Trump said things that were not right? “If Trump blamed a private company not himself and his govt, everyone would have gone nuts.”
In June 2021, a Facebook trust and safety executive sent an internal email saying that the third-party fact-checks the company relied on had either been withdrawn or had acknowledged uncertainty about the lab leak theory. The executive said that in February 2021 the company had removed posts making any of five claims rated false by its fact-checking network, including that the disease was man-made or engineered by a government or country. That decision, the email said, was made in response to continued public pressure and tense conversations with the new administration, which by then was the Biden administration.
The trust and safety executive said the team was asked in February to review the decision and determine whether to go back to “reduce and inform” rather than remove the posts.
Source: Republicans release tech executives’ internal communications
The Report on YouTube: Vaccine Safety Policy, White House Pressure, and the Supreme Court
Conservatives have zeroed in on the phrase “compromise our standards due to pressure from an administration,” but they could just as well focus on the clause that follows: “we’ll often regret it later.” Buyer’s remorse is only possible when you’re free to make (or not make) the purchase.
The committee claims to have reviewed tens of thousands of emails and other nonpublic documents that it says show the White House coerced companies into suppressing speech.
The outcome of the Supreme Court case will have significant ramifications for the federal government's ability to communicate with social media firms.
According to the report, YouTube shared a proposed new policy for vaccine safety content in September 2021. Back in July of that year, YouTube’s public policy team had declined to commit to any new policies when asked by a Biden administration official, responding to a question about what it calls “borderline content” with stats about the low reach that content already receives. On September 21st, a member of the YouTube policy team asked White House official Rob Flaherty about dates to preview and seek feedback on its “new policy to remove content that could mislead people on the safety and efficacy of vaccines.” On September 29th, after the policy was released, Flaherty apologized for failing to respond to the previous message but said he “saw the news” and that the policy, “at first blush, seems like a great step.”
During arguments in Murthy v. Missouri, Justice Elena Kagan was skeptical of a monthslong gap between the Biden administration asking Facebook not to distribute a post about vaccine hesitancy and the platform allegedly blocking a health group as a result.
We know Jordan — who chairs the committee and subcommittee that released this report — is invested in the outcome of the Supreme Court case because he attended the oral arguments in person. An opinion is expected by the end of June. House Republicans are reportedly working on new legislation that would allow individuals to file lawsuits against executive branch officials for censoring their speech.
Extremist groups are using Facebook to organize ahead of the US presidential election. After laying low for several years following the January 6 Capitol riot, militia extremists have been quietly reorganizing, ramping up recruitment and rhetoric on Facebook — with apparently little concern that Meta will enforce its own ban against them, according to new research by the Tech Transparency Project shared exclusively with WIRED. These locally organized groups encourage members to engage in combat training and recruitment. Today on WIRED Politics Lab, we discuss Facebook’s culpability and what this means as we head into November.
On this episode of WIRED Politics Lab: Leah Feiger, David Gilbert, and Tess Owen on extremism and misinformation on Facebook.
Leah Feiger is @LeahFeiger on Twitter. Tess Owen is @misstessowen, and David Gilbert can also be found online. Write to us at [email protected]. Be sure to subscribe to the WIRED Politics Lab newsletter.
You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:
If you’re on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts, and search for WIRED Politics Lab. We’re on Spotify too.
Note: The transcript may contain errors.
David Gilbert: Hey there. I’m David Gilbert, senior reporter on the WIRED Politics team. Before we start the show, I want to tell you that we’re putting together an episode about misinformation. Think of it as WIRED’s guide to disinfo, and we need you to be a part of it. What questions do you have about misinformation on the internet? We want to hear it all. We’ll be reading through the mailbag and answering your questions on the show. Send your questions to [email protected]. That’s [email protected]. And if you haven’t checked out our newsletter yet, please sign up. There’s a link in our show notes. Thanks.