Do Echo Chambers Exist in Facebook?

Introduction

Social media is a powerful tool that allows users to easily interact with one another, share and create content, find information, and much more. However, it can just as easily be used negatively. One such way is through the formation of echo chambers: staying within communities that only provide information you want to hear can make you unwilling to consider differing perspectives and opinions on certain topics, which in turn enables the spread of misinformation. In this blog post, we will explore whether Facebook users are part of such echo chambers.

How Facebook Users Interact with Posts

To test whether echo chambers exist on Facebook, a group of three researchers conducted an experiment to observe how users reacted to two differing narratives—science and conspiracy. They compiled a directory of validated Facebook pages devoted to one of these two narratives, drawn from both Italian and U.S. Facebook. Examples of such pages are ScienceNOW, a science page, and TheConspiracyArchives, a conspiracy theory page. They then observed the reactions of users who interacted with these pages.

For FB Italy, they recorded reactions to posts that ridiculed conspiracies.
For FB U.S.A., they recorded reactions to posts that fact-checked conspiracies.

They then analyzed how polarized each user was: if at least 95% of a user’s likes belonged to either the science or the conspiracy narrative, that user was considered polarized. They discovered that a majority of the Facebook users they studied were polarized, with roughly 250,000 polarized users interacting with science pages and 790,000 polarized users interacting with conspiracy pages.
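The 95% rule above can be expressed as a simple classifier. This is an illustrative sketch, not the authors' code; the function names and like counts are hypothetical:

```python
def polarization(science_likes: int, conspiracy_likes: int) -> float:
    """Score in [-1, 1]: -1.0 = all science likes, 1.0 = all conspiracy likes."""
    total = science_likes + conspiracy_likes
    if total == 0:
        raise ValueError("user has no likes on either narrative")
    return (conspiracy_likes - science_likes) / total

def is_polarized(science_likes: int, conspiracy_likes: int,
                 threshold: float = 0.95) -> bool:
    """True when at least `threshold` of the user's likes fall on one side."""
    total = science_likes + conspiracy_likes
    return max(science_likes, conspiracy_likes) / total >= threshold

# Hypothetical user with 1 science like and 19 conspiracy likes:
print(polarization(1, 19))   # 0.9 (leans conspiracy)
print(is_polarized(1, 19))   # True  (19/20 = 95%)
print(is_polarized(3, 17))   # False (17/20 = 85%)
```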

Figure: probability density function of user polarization, where p = -1.0 represents science and p = 1.0 represents conspiracy

From this, they concluded that users primarily focus on only one side of a given topic. They also discovered that the more highly polarized a user was, the more friends they had who were also highly polarized toward the same narrative.

From this experiment, we can see how echo chambers exist and thrive on Facebook: users focus on a single narrative, leaving no room for middle ground, as we can see from the figure above.

The Spread of Information

So how exactly do these narratives spread and cause users to become highly polarized toward them? To answer this, the researchers demonstrated how informational cascades are created by homophily, and how these cascades are trapped within the aforementioned echo chambers.

To show the effects of informational cascades, the researchers measured the time between the first and most recent user to share a certain post. 
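That measurement can be expressed in a few lines. This is a minimal sketch assuming we already have a post's share timestamps; the timestamps below are invented for illustration:

```python
from datetime import datetime

def cascade_lifetime(share_times):
    """Lifetime = most recent share minus first share."""
    return max(share_times) - min(share_times)

# Hypothetical share timestamps for one post:
shares = [
    datetime(2016, 5, 1, 9, 0),    # first share
    datetime(2016, 5, 1, 9, 40),
    datetime(2016, 5, 1, 10, 15),  # most recent share
]
print(cascade_lifetime(shares))    # 1:15:00
```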

Probability density function of the lifetime of the effects of cascades

As we can see from the graph above, cascade effects typically last about an hour, though some persist for up to twenty hours. They also found that most posts lose traction quite quickly, with only 27% of science-related and 18% of conspiracy-related information lasting more than a day.

Probability density function of mean edge homogeneity

Furthermore, shared Facebook posts mainly reached users whose polarization was similar to that of the user sharing the post. They concluded from the mean edge homogeneity of the information cascades that shared content rarely reaches users who consume information from the opposing narrative. This traps information within certain groups, causing echo chambers to grow within these communities.
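As a rough sketch of the idea (assuming edge homogeneity for a sharing edge is the product of the two users' polarization scores; the user names and scores here are invented):

```python
def mean_edge_homogeneity(edges, polarization):
    """edges: (sharer, resharer) pairs; polarization: user -> score in [-1, 1].
    A positive mean means shares mostly connect like-minded users."""
    products = [polarization[i] * polarization[j] for i, j in edges]
    return sum(products) / len(products)

# Hypothetical cascade: "b" reshares from "a", then "c" and "d" from "b".
pol = {"a": 0.9, "b": 0.8, "c": 1.0, "d": -0.7}
cascade = [("a", "b"), ("b", "c"), ("b", "d")]
print(round(mean_edge_homogeneity(cascade, pol), 2))   # 0.32
```

Even with one cross-narrative edge ("b" to "d"), the mean stays positive, illustrating how a cascade dominated by like-minded pairs registers as homogeneous overall.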

Conclusion

From this, we can conclude that echo chambers exist on Facebook, particularly among communities whose views strongly oppose one another, such as the science and conspiracy communities we have seen above.

Highly polarized users end up surrounding themselves with like-minded users, eventually becoming distrustful of others whose views do not align with their own. Furthermore, content primarily spreads from one highly polarized user to another, rarely reaching users on opposing sides. This behavior is worsened further by users' ability to curate their feeds to their own desires, for example by hiding posts that they disagree with or dislike.

Overall, the presence of echo chambers on Facebook is quite dangerous, as plenty of people use the social media platform as their main source of news. If these echo chambers grow rampant, so will the misinformation that is spread amongst them, potentially influencing events on a global scale.

Sources

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2795110
