Facebook’s labeling of “state-controlled media” has a significant impact on user engagement, according to a recent study. When content from authoritarian nations such as China and Russia was labeled as state-controlled, users reduced their engagement with it. Interestingly, the study found that the labels actually increased users’ favorability toward posts from Canadian state media. The research, conducted by Carnegie Mellon University, Indiana University, and the University of Texas at Austin, aimed to understand how these labels affect user behavior on Facebook.
Before delving into the effects, it’s essential to understand what these labels entail. Facebook’s “state-controlled media” labels are part of a broader effort to give users context about the sources of news articles they encounter on the platform. When applied, these labels indicate that a news source receives substantial government backing or is under government control. The objective is to help users make informed judgments about the news they consume, particularly when it comes from sources that may be subject to political or governmental influence.
The first experiment showed 1,200 US-based Facebook users posts with and without state-controlled media labels. The results showed that engagement with posts from Russia and China decreased, but only when users actively noticed the labels. The second test, involving 2,000 US Facebook users, revealed that user behavior was shaped by public sentiment toward the country named on the label: users responded positively to Canadian state-controlled media but negatively to Chinese and Russian government-run content.
In a third experiment, researchers analyzed how Facebook users interacted with state-controlled media before and after the labels were introduced. They found that the labels had a significant effect: sharing of labeled posts fell by 34% and likes by 46%. Additionally, the study showed that training users on the labels significantly increased the likelihood that they would notice them.
Impact on User Engagement
The introduction of these labels has raised questions about their impact on user engagement with labeled content. Some studies and anecdotal evidence suggest that posts from labeled sources may experience a reduction in engagement, such as likes, shares, and comments.
This phenomenon can be attributed to various factors:
- Trust Concerns: Users may be less likely to engage with content from labeled sources due to concerns about bias or government influence, regardless of the article’s actual quality or accuracy.
- Algorithmic Effects: Facebook’s algorithms may deprioritize content from labeled sources, making it less visible in users’ feeds. This can indirectly affect engagement.
- User Preconceptions: Labeled content may be perceived as less credible or trustworthy, leading users to hesitate before engaging with it.
- Reduced Reach: Labeled sources may face limitations in promoting their content through paid advertising on the platform.
While the study concluded that state-controlled media labels reduced the spread of misinformation and propaganda on Facebook, it was limited in establishing a precise cause-and-effect relationship. The authors note that Facebook’s opaque newsfeed algorithms could have influenced the results. Nevertheless, they recommend that social media companies clearly communicate labeling policy changes to users, explain what the labels mean, and ensure that users notice them.
As the fight against online misinformation and propaganda continues, the study’s authors call on Facebook and other social platforms to take more action. They emphasize the importance of informing users about labeling policy changes and displaying the labels prominently. The researchers believe that Facebook’s quiet rollout of the labels, without notifying users, significantly reduced their effectiveness.