Why Like-Minded News Is Not to Blame for the Problems of American Democracy: An Empirical Study of Meta’s Platforms
These findings challenge popular narratives blaming social media for the problems of American democracy. Algorithmic changes that decrease exposure to like-minded sources do not seem to offer a simple solution to those problems. It may be that the information people see on social media is a reflection of their views rather than a source of them.
The researchers weren’t paid by Meta, but the company seemed pleased with the results. Nick Clegg, Meta’s president of global affairs, said in a statement that “the experimental findings add to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization” or have “meaningful effects on” political views and behavior.
It’s a sweeping conclusion. But the studies are actually much narrower. Even though the researchers were given unusual insight into Meta’s platforms, the studies released today leave open as many questions as they answer, not least because much of the underlying data is too sensitive to make public.
“We don’t know what would have happened had we been able to do these studies over a period of a year or two years,” Guess said at a press briefing earlier this week. Nor, he added, is there any accounting for the fact that many people have had their accounts for at least a decade: the world would not be the same had social media not existed for the past 10 or 15 years.
Researchers stress that the results do not let social media off the hook, because many factors might have undercut the interventions designed to reduce polarization. The experiments were performed at the tail end of the 2020 US presidential election, when polarization was already running high.
There are unanswered questions about whether the effects would hold outside an election environment, and whether they would hold if Donald Trump were not one of the candidates.
Second, although surveys find associations between holding polarized attitudes and reported consumption of like-minded news26,27, few studies provide causal evidence that consuming like-minded content leads to lasting polarization. The correlations may be spurious because people with extreme political views are more likely to consume like-minded content. In addition, although like-minded information can polarize30,31,32, most experimental tests of theories about potential echo-chamber effects are brief and use simulated content, making it difficult to know whether these findings generalize to real-world environments. Questions remain about whether such polarizing effects are common, how quickly they decay, and whether they are concentrated among people who avoid news and political content.
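To see why the correlation alone is not causal evidence, consider a minimal simulation, purely illustrative and not drawn from any of the studies: if ideological extremity drives both like-minded news consumption and polarized attitudes, the two will correlate strongly even when consumption has no causal effect at all.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Confounder: ideological extremity, which drives both behaviours.
extremity = rng.normal(size=n)

# Like-minded news consumption depends on extremity, not vice versa.
consumption = 0.7 * extremity + rng.normal(size=n)

# Polarized attitudes also depend on extremity; consumption has
# zero causal effect on them by construction.
polarization = 0.7 * extremity + rng.normal(size=n)

# The naive correlation is nonetheless substantial...
print(np.corrcoef(consumption, polarization)[0, 1])  # ~0.33

# ...but vanishes once the confounder is adjusted for
# (partial correlation via residualization on the known coefficient).
resid_c = consumption - 0.7 * extremity
resid_p = polarization - 0.7 * extremity
print(np.corrcoef(resid_c, resid_p)[0, 1])  # ~0.0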
The researchers themselves weren’t as definitive. One of the studies published in Science found that resharing elevates “content from untrustworthy sources.” The same study showed that most of the misinformation caught by the platform’s third-party fact-checkers is concentrated among, and consumed almost exclusively by, conservative users, and has no equivalent on the liberal side of the political aisle, according to an analysis of about 208 million users.
Our analysis of platform exposure and behaviour considers the population of US adult Facebook users (aged 18 years and over). We focus primarily on those who use the platform at least once per month, whom we call monthly active users. The data are calculated for the subset of US adults who accessed Facebook at least once in the 30 days preceding 17 August 2020. During the third and fourth quarters of 2020, which encompass this interval, 231 million users in the US accessed Facebook each month.
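As a rough illustration of how such a population could be constructed from access logs (a hypothetical sketch: the file, column names, and filters here are assumptions, not Meta’s actual pipeline):

```python
import pandas as pd

# Hypothetical access log: one row per (user_id, date) on which the
# user accessed Facebook. Columns are illustrative assumptions.
logs = pd.read_parquet("access_log.parquet")  # user_id, date, country, age

reference_date = pd.Timestamp("2020-08-17")
window_start = reference_date - pd.Timedelta(days=30)

# Monthly active users: US adults with at least one access in the
# 30 days preceding the reference date.
mau = (
    logs.query("country == 'US' and age >= 18")
        .loc[lambda d: (d["date"] >= window_start) & (d["date"] < reference_date)]
        ["user_id"]
        .nunique()
)
print(f"Monthly active US adult users: {mau:,}")
```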
Why Users Want a Chronological Feed, and Why Meta May Not: A Social Scientist’s View
Instagram ditched its chronological option in 2016 over vocal user objections but reintroduced it last year, as did Facebook. Some users prefer a chronological feed to keep up with live events, and some lawmakers have promoted it as a way to counter ranking systems that can steer people into information bubbles or toward harmful content.
Threads added a reverse-chronological feed option last week, the same week that the new data on Meta users was released. The update may appease some Twitter exiles and live-news addicts who have been loudly demanding it, but Meta will surely be watching closely for signs of disengagement.
Dean Eckles, a social scientist and statistician at MIT who previously worked for Meta and has testified to US senators about feed design, says the ranked feed is designed to maximize viewers’ consumption and engagement. Companies such as Meta and Twitter train their ranking systems to promote content similar to what users have clicked on, dwelled on, liked, or commented on in the past. The approach has proved effective at holding attention, which is why almost any intervention that overrides it is likely to reduce engagement.
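A minimal sketch of that general approach, with hypothetical weights and field names rather than Meta’s or Twitter’s actual ranking code: each candidate post is scored by a weighted sum of predicted engagement probabilities, and the feed is sorted by that score.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    # Predicted probabilities that this user will click, dwell on,
    # like, or comment on the post, produced by upstream models
    # trained on the user's past behaviour (assumed here).
    p_click: float
    p_dwell: float
    p_like: float
    p_comment: float

# Hypothetical weights: costlier actions get larger weights because
# they are stronger engagement signals.
WEIGHTS = {"p_click": 1.0, "p_dwell": 0.5, "p_like": 2.0, "p_comment": 4.0}

def engagement_score(post: Post) -> float:
    """Weighted sum of predicted engagement probabilities."""
    return sum(w * getattr(post, field) for field, w in WEIGHTS.items())

def rank_feed(candidates: list[Post]) -> list[Post]:
    """Order candidate posts by predicted engagement, highest first."""
    return sorted(candidates, key=engagement_score, reverse=True)
```

A chronological feed simply replaces the engagement score with recency as the sort key, which is why any such intervention tends to reduce measured engagement.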
A Meta spokesperson did not reply to a request for comment. The company says it continually makes changes and improvements to its services.
Responding to calls for regulations requiring access to data, Meta said in a statement to Nature that the company is committed to “further transparency”, but privacy obligations to its users prevent the company from making raw data available to external researchers. Meta said that it hopes that the results of the research will “help policymakers as they shape the rules of the road for the internet — for the benefit of our democracy, and society as a whole”.
Tucker hopes the project will inspire further research, but warns that the decision to allow it still rests with the social-media platforms; it is up to society, he says, to make sure that this kind of research continues in the future.
The research is important, but scientists still have some way to go to fully understand Meta’s platforms, according to Michael Wagner, a political scientist at the University of Wisconsin–Madison who worked on the project. Wagner notes that much of the individual-level data was off-limits, and even the data the scientists were able to access came pre-packaged by Meta. He wants a system that would allow access to the raw data and give incentives for researchers to collaborate.
The two other interventions, published in Science3 and Nature4, did not have much effect. One limited reshared content: posts that originated outside a user’s network but were reposted by a connection or group to which the user belonged.
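In schematic terms, that intervention can be thought of as a feed filter along the following lines (a simplified sketch with assumed field names, not the study’s actual implementation):

```python
def is_reshare_from_outside(post, user_network: set[str]) -> bool:
    """True if the post originated outside the user's network but was
    reshared into their feed by a connection or group they belong to."""
    return post.is_reshare and post.original_author_id not in user_network

def remove_reshares(feed, user_network):
    # The intervention suppresses such reshared content;
    # everything else passes through unchanged.
    return [p for p in feed if not is_reshare_from_outside(p, user_network)]
```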
Some of the proposed solutions for reducing online echo chambers and improving social discourse would likely not have had much of an impact during the 2020 election. But Tucker acknowledges that there are limits to the inferences that can be drawn from the research.
Lewandowsky, S. et al. Technology and Democracy: Understanding the Influence of Online Technologies on Political Behaviour and Decision-Making (EU, 2020).
However, when participants in the treatment group did see content from like-minded sources in their Feed, their rate of engagement with it was higher. Figure 3c shows that, conditional on exposure, passive and active engagement with content from like-minded sources increased by 0.04 s.d. (95% confidence interval: 0.02, 0.06) and 0.13 s.d. (95% confidence interval: 0.08, 0.17), respectively. Although treated participants saw more content from cross-cutting sources overall, they were less likely to engage with it (a difference of 0.04 s.d.). The number of content views per day active on the platform also decreased slightly (−0.05 s.d., 95% confidence interval: −0.08, −0.02).
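For readers unfamiliar with the convention, estimates such as “0.13 s.d. (95% confidence interval: 0.08, 0.17)” are standardized mean differences. The sketch below shows one common way to compute such an estimate with a normal-approximation interval; it is illustrative only and not necessarily the paper’s exact estimator, which may adjust for covariates.

```python
import numpy as np
from scipy import stats

def standardized_effect(treated: np.ndarray, control: np.ndarray):
    """Difference in means scaled by the control-group s.d., with a
    normal-approximation 95% confidence interval (illustrative)."""
    scale = control.std(ddof=1)            # one common standardization choice
    diff = (treated.mean() - control.mean()) / scale
    se = np.sqrt(treated.var(ddof=1) / len(treated)
                 + control.var(ddof=1) / len(control)) / scale
    z = stats.norm.ppf(0.975)              # ~1.96 for a 95% interval
    return diff, (diff - z * se, diff + z * se)
```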
Figure 3a shows the effect of the treatment on exposure to different types of content.
How Facebook Users Respond to Cross-Cutting Content: The Effects of the Treatment on Exposure to Like-Minded Sources
Facebook users are comparatively unlikely to be exposed to content from cross-cutting sources: fewer than one in four of their Facebook Feed exposures come from such sources, with civic and news content being the exception.
The observed effects of the treatment on exposure to content from like-minded sources among participants are plotted in Fig. 2. The treatment reduced exposure to like-minded content relative to the pre-treatment period: exposure to content from like-minded sources in the treatment group declined to 36.2%, while remaining stable at 54.3% in the control group. Exposure levels changed little for both groups during the treatment period, except for a brief increase on 2 and 3 November caused by a technical glitch in the production server that implemented the treatment.
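A sketch of how daily exposure shares of the kind plotted in Fig. 2 could be tabulated from impression-level data; the file name and columns here are assumptions for illustration, not the study’s actual schema:

```python
import pandas as pd

# Hypothetical impression log: one row per Feed exposure, flagged by
# whether the source is like-minded and by experimental group.
impressions = pd.read_parquet("impressions.parquet")
# columns: user_id, date, group ('treatment'/'control'), like_minded (bool)

daily_share = (
    impressions
    .groupby(["group", "date"])["like_minded"]
    .mean()        # share of exposures from like-minded sources
    .mul(100)      # express as a percentage
    .unstack("group")
)
# A one-off spike around 2-3 November (the reported glitch) would show
# up here as an outlier in the treatment series.
print(daily_share.loc["2020-11-01":"2020-11-05"])
```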
We next consider the effects of the treatment (reducing exposure to content from like-minded sources) on how participants engage with content on Facebook. We consider two measures of content engagement: total engagement and the engagement rate. Figure 3b presents the effects of the treatment on total engagement with content—the total number of actions taken that we define as ‘passive’ (clicks, reactions and likes) or ‘active’ (comments and reshares) forms of engagement. Figure 3c presents effects of the treatment on the engagement rate, which is the probability of engaging with the content that participants did see (that is, engagement conditional on exposure). These two measures do not necessarily move in tandem: as we report below, participants in the treatment group have less total engagement with content from like-minded sources (since they are by design seeing much less of it), but their rate of engagement is higher than that of the control group, indicating that they interacted more frequently with the content from like-minded sources to which they were exposed.
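The distinction can be made concrete with a short sketch (column names and data layout are assumptions for illustration, not the study’s actual schema):

```python
import pandas as pd

PASSIVE = ["clicks", "reactions", "likes"]  # 'passive' engagement actions
ACTIVE = ["comments", "reshares"]           # 'active' engagement actions

def engagement_measures(exposures: pd.DataFrame) -> pd.Series:
    """exposures: one row per piece of content a participant saw,
    with per-item counts for each action type."""
    actions = exposures[PASSIVE + ACTIVE].sum(axis=1)
    return pd.Series({
        # Total engagement: all actions taken, regardless of how much
        # content was seen; falls mechanically when exposure falls.
        "total_engagement": actions.sum(),
        # Engagement rate: share of seen items engaged with, that is,
        # engagement conditional on exposure.
        "engagement_rate": (actions > 0).mean(),
    })
```

Because the treatment mechanically shrinks the pool of like-minded content seen, total engagement with it can fall even as the engagement rate rises, which is exactly the pattern reported above.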