Misinformation: Strategic sharing, homophily, and endogenous echo chambers

Misinformation spreads rapidly on social media platforms. This column uses a model of online content-sharing to show that a social media platform that wishes to maximise content engagement will propagate extreme articles amongst its most extremist users. ‘Filter bubbles’ prevent the content from spreading beyond its extremist demographic, creating ‘echo chambers’ in which misinformation circulates. The threat of censorship and a corresponding loss in engagement could pressure platforms to fact-check themselves, while regulating their algorithms could mitigate the consequences of filter bubbles. 
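The column's underlying model is game-theoretic, but the filter-bubble mechanism it describes can be illustrated with a back-of-the-envelope simulation. The sketch below is a hypothetical toy cascade, not the authors' model: every parameter (the ideology scale, the homophily radius, the sharing rule, the extremism cutoff) is an illustrative assumption. An engagement-maximising platform seeds an extreme article with its likeliest sharers and surfaces each share only to ideologically similar users, so the article circulates among extremists without reaching moderates.

```python
import random

random.seed(0)

N = 10_000
# Each user has an ideology in [-1, 1]; |ideology| > 0.7 counts as "extremist".
# (Both the scale and the cutoff are assumptions made for this toy example.)
users = [random.uniform(-1, 1) for _ in range(N)]

ARTICLE_POSITION = 0.9   # an extreme article (assumed position)
HOMOPHILY_RADIUS = 0.2   # platform recommends shares only to similar users (assumed)
SHARE_SCALE = 0.5        # how fast sharing propensity decays with disagreement (assumed)

def share_prob(ideology, article):
    """Users are likelier to share content close to their own views."""
    return max(0.0, 1.0 - abs(ideology - article) / SHARE_SCALE)

# Seed the cascade with the users closest to the article's position,
# mimicking an engagement-maximising platform targeting its likeliest sharers.
seeds = sorted(range(N), key=lambda i: abs(users[i] - ARTICLE_POSITION))[:50]

exposed = set(seeds)
frontier = list(seeds)
while frontier:
    nxt = []
    for i in frontier:
        if random.random() < share_prob(users[i], ARTICLE_POSITION):
            # The platform surfaces the share only to ideological neighbours,
            # i.e. the "filter bubble" confines each hop of the cascade.
            for j in random.sample(range(N), 20):
                if j not in exposed and abs(users[j] - users[i]) < HOMOPHILY_RADIUS:
                    exposed.add(j)
                    nxt.append(j)
    frontier = nxt

extremists = [i for i in exposed if abs(users[i]) > 0.7]
print(f"reached {len(exposed)} users; {len(extremists)/len(exposed):.0%} extremist")
```

In this toy, the sharing rule stops the cascade once it drifts too far from the article's position, so exposure stays concentrated among extreme users; widening HOMOPHILY_RADIUS lets the article escape that echo chamber, mirroring the column's point that regulating recommendation algorithms could mitigate the consequences of filter bubbles.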

“Virginia is eliminating advanced high-school math courses.” “Donald Trump tried to impeach Mike Pence.” “President Biden is passing a bill forcing all Americans to cut out red meat.” 

These headlines were among the many circulating on social media over the last few months. Each of the articles was found to contain misinformation – i.e. misleading information or arguments, often aiming to influence (a subset of) the public. Articles containing misinformation were also among the most viral content, with “falsehoods diffusing significantly farther, faster, deeper, and more broadly than the truth in all categories of information” (Vosoughi et al. 2018). There are increasing concerns that misinformation propagated on social media is further polarising the electorate and undermining democratic discourse.
