A major new research effort examines political behavior on two of the internet's largest platforms: Facebook and Instagram. The study, led by an interdisciplinary team of researchers collaborating with internal groups at Meta (formerly Facebook), comprises four papers published in Science and Nature. The research focuses on user behavior during the 2020 U.S. election, offering a comprehensive analysis of political engagement and expression on both platforms.
The Unprecedented 2020 Facebook and Instagram Election Study (FIES)
The study, known as the 2020 Facebook and Instagram Election Study (FIES), marks a unique collaboration between Meta and the scientific research community. Led by Professor Talia Jomini Stroud of the University of Texas at Austin's Center for Media Engagement and Professor Joshua A. Tucker of NYU's Center for Social Media and Politics, the FIES project sets the stage for additional papers in the coming months.
Examining Ideological Echo Chambers on Facebook
One of the papers investigates Facebook's ideological echo chambers, aiming to measure the extent of users' exposure to politically aligned content. The researchers found that Facebook exhibits substantial ideological segregation, exceeding levels reported in earlier studies of internet news consumption based on browsing behavior.
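To make the notion of ideological segregation concrete, here is a minimal sketch of an isolation-style audience measure. The exposure data and the index are illustrative assumptions, not the study's actual metric or datasets:

```python
from collections import defaultdict

# Hypothetical exposure log: (viewer_ideology, item_id) pairs.
# Purely illustrative; the study works with real platform data.
exposures = [
    ("conservative", "story_a"), ("conservative", "story_a"),
    ("liberal", "story_a"),
    ("liberal", "story_b"), ("liberal", "story_b"),
    ("conservative", "story_c"),
]

# Tally each item's audience by ideology.
audience = defaultdict(lambda: {"liberal": 0, "conservative": 0})
for ideology, item in exposures:
    audience[item][ideology] += 1

def conservative_share(item):
    """Fraction of an item's audience that is conservative."""
    counts = audience[item]
    return counts["conservative"] / (counts["liberal"] + counts["conservative"])

def mean_share_seen_by(group):
    """Average conservative-audience share of the items a group sees."""
    items = [item for ideo, item in exposures if ideo == group]
    return sum(conservative_share(i) for i in items) / len(items)

# Isolation-style index: 0 means both groups see the same content mix,
# 1 means the two groups see entirely separate content.
segregation = mean_share_seen_by("conservative") - mean_share_seen_by("liberal")
print(f"illustrative segregation index: {segregation:.2f}")
```

On this toy data the index comes out around 0.56, meaning conservative users' content diet skews substantially more conservative than liberal users' diet does.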
Pages and Groups Drive Ideological Segregation
The study found that Facebook Pages and Groups play a significant role in fostering ideological segregation: content posted there was far more ideologically segregated than content shared by users' friends. These settings have historically been fertile ground for misinformation and extremism, including QAnon, extremist groups such as the Proud Boys, and dangerous health conspiracies.
Asymmetry in Political Misinformation
Another major finding was an asymmetry in political content on Facebook: a "far larger" portion of conservative political news on the platform was flagged as false by Meta's third-party fact-checking system. This suggests that conservative Facebook users encounter more online political misinformation than their liberal counterparts.
Experiment with Algorithmic Feeds on Facebook and Instagram
In a separate experiment, participants on Facebook and Instagram had their algorithmic feeds replaced with a reverse-chronological feed. The change did not measurably affect users' political attitudes, offline political engagement, or political knowledge, but it did reveal one notable shift: users in the chronological-feed group spent dramatically less time on the platforms, suggesting that Meta's engagement-driven ranking is a key driver of how long people stay.
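As a rough illustration of what the feed swap changes, the sketch below contrasts ranking by a predicted-engagement score with plain reverse-chronological ordering. The Post fields and the scoring are assumptions made for the example; Meta's actual ranking signals are proprietary and far more complex:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Post:
    author: str
    created_at: datetime
    predicted_engagement: float  # stand-in for a ranking model's score

now = datetime(2020, 11, 1)
posts = [
    Post("alice", now - timedelta(hours=5), predicted_engagement=0.9),
    Post("bob",   now - timedelta(hours=1), predicted_engagement=0.2),
    Post("carol", now - timedelta(hours=3), predicted_engagement=0.6),
]

# Default-style feed: highest predicted engagement first.
algorithmic_feed = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

# The experiment's treatment condition: newest first, scores ignored.
chronological_feed = sorted(posts, key=lambda p: p.created_at, reverse=True)

print([p.author for p in algorithmic_feed])    # ['alice', 'carol', 'bob']
print([p.author for p in chronological_feed])  # ['bob', 'carol', 'alice']
```

The same posts appear in both conditions; only their ordering differs, which is what lets the researchers attribute behavioral changes to the ranking itself.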
Future Implications
These findings offer only a first glimpse of the results still to come in future papers. Meta has framed the results as a vindication of its platforms, while some critics dismiss the collaboration as a publicity exercise. Either way, the data provide a crucial foundation for future research on social media's impact on political behavior and engagement. As more papers emerge, a deeper understanding of online political expression and its implications will come to light.