Citation: Bessi A, Zollo F, Del Vicario M, Puliga M, Scala A, Caldarelli G, et al. (2016) Users Polarization on Facebook and Youtube.

Received: Ap; Accepted: J; Published: August 23, 2016

Copyright: © 2016 Bessi et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: The entire data collection process has been carried out exclusively through the Facebook Graph API and the YouTube Data API. The collection methods are relayed in the Data Collection section.

Funding: Funding for this work was provided by EU FET project MULTIPLEX nr. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

The diffusion of social media caused a paradigm shift in the creation and consumption of information: we passed from a mediated (e.g., by journalists) to a more disintermediated selection process. Such disintermediation elicits two tendencies in users: a) selecting information that adheres to their system of beliefs, i.e., confirmation bias; and b) forming groups of like-minded people in which they polarize their views, i.e., echo chambers. Polarized communities emerge around diverse and heterogeneous narratives, often reflecting extreme disagreement with mainstream news and recommended practices. The emergence of polarization in online environments might reduce viewpoint heterogeneity, which has long been viewed as an important component of democratic societies. Indeed, confirmation bias has been shown to play a pivotal role in the diffusion of rumors online.

However, on online social media, different algorithms foster personalized content according to user tastes, i.e., they show users viewpoints they already agree with. The role of these algorithms in the emergence of echo chambers is still a matter of debate, and little is known about the factors affecting their outcomes. Facebook promotes posts through the News Feed algorithm, which helps users see more stories from the friends they interact with the most; the number of comments and likes a post receives, and the kind of story it is (e.g., photo, video, status update), can also make a post more likely to appear. Conversely, YouTube promotes videos through Watch Time, which prioritizes videos that lead to a longer overall viewing session over those that merely receive more clicks. Still, not much is known about the role of cognitive factors in driving users to aggregate in echo chambers supporting their preferred narrative. Recent studies suggest confirmation bias as one of the driving forces of content selection, which eventually leads to the emergence of polarized communities where users acquire confirmatory information and ignore dissenting content.

To shed light on the role of content-promotion algorithms in the emergence of echo chambers, we analyze the behavior of users exposed to the same content on different platforms, i.e., Facebook and YouTube. We focus on Facebook posts linking YouTube videos reported on Science and Conspiracy pages, and we then compare users' interaction with these videos on both platforms. We limit our analysis to Science and Conspiracy for two main reasons: a) scientific news and conspiracy-like news are two very distinct and conflicting narratives; and b) scientific pages share the mission of diffusing scientific knowledge and rational thinking, while the alternative ones resort to unsubstantiated rumors. Indeed, conspiracy-like pages disseminate myth narratives and controversial information, usually lacking supporting evidence and most often contradicting official news. Moreover, the spreading of misinformation on online social media has become so widespread that the World Economic Forum listed massive digital misinformation among the main threats to modern society. In spite of different debunking strategies, unsubstantiated rumors, e.g. those supporting anti-vaccine claims, climate change denial, and alternative medicine myths, keep proliferating in polarized communities emerging in online environments, leading to a climate of disengagement from mainstream society and recommended practices. A recent study pointed out the inefficacy of debunking and the concrete risk of a backfire effect among the usual and most committed consumers of conspiracy-like narratives. We believe that additional insights into the cognitive factors and behavioral patterns driving the emergence of polarized environments are crucial to understand, and to develop strategies that mitigate, the spreading of online misinformation.
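The study design hinges on matching Facebook posts to the YouTube videos they link, so that interactions with the same video can be compared across the two platforms. The paper does not publish its code; the following is a minimal sketch of that matching step, with function names and the post-dictionary layout of our own invention, not the authors' actual pipeline:

```python
from urllib.parse import urlparse, parse_qs

def youtube_video_id(url):
    """Extract the YouTube video ID from a link URL, or return None.

    Handles the two common link forms: youtube.com/watch?v=<id>
    and the youtu.be/<id> short link.
    """
    parsed = urlparse(url)
    host = parsed.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host in ("youtube.com", "m.youtube.com") and parsed.path == "/watch":
        ids = parse_qs(parsed.query).get("v", [])
        return ids[0] if ids else None
    if host == "youtu.be":
        vid = parsed.path.lstrip("/")
        return vid or None
    return None

def group_posts_by_video(posts):
    """Group Facebook post IDs by the YouTube video they link.

    `posts` is assumed to be a list of dicts with "id" and "link" keys,
    as one might assemble from Graph API responses.
    """
    by_video = {}
    for post in posts:
        vid = youtube_video_id(post.get("link", ""))
        if vid:
            by_video.setdefault(vid, []).append(post["id"])
    return by_video
```

Grouping by video ID rather than by raw URL matters because the same video circulates under several link forms (long and short URLs, with or without extra query parameters).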
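The contrast drawn above between click-driven promotion and Watch Time can be made concrete with a toy ranking. The field names and numbers below are invented for illustration; the platforms' actual scoring functions are proprietary and far more complex:

```python
# Two hypothetical videos: one attracts many clicks but is quickly
# abandoned, the other attracts fewer clicks but is watched at length.
videos = [
    {"title": "clickbait", "clicks": 900, "avg_seconds_watched": 15},
    {"title": "documentary", "clicks": 200, "avg_seconds_watched": 480},
]

def rank_by_clicks(items):
    # Click-based promotion: order purely by how often a video is opened.
    return sorted(items, key=lambda v: v["clicks"], reverse=True)

def rank_by_watch_time(items):
    # Watch-Time-style promotion: order by total viewing time the video
    # contributes to the session (clicks x average seconds watched).
    return sorted(items, key=lambda v: v["clicks"] * v["avg_seconds_watched"],
                  reverse=True)

print([v["title"] for v in rank_by_clicks(videos)])      # ['clickbait', 'documentary']
print([v["title"] for v in rank_by_watch_time(videos)])  # ['documentary', 'clickbait']
```

The two criteria invert the ranking of the same pair of videos, which is the point of the distinction made in the text: what a platform optimizes for determines which content gets promoted.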