Why User-Led Moderation Could Be The Answer To The Content Moderation Problem

Social networks have a problem with content moderation. It’s widespread, broadly acknowledged and, for a growing number of people, is making social media inhospitable. With 44% of users saying they faced abuse on social media last year, a quarter of British adults admitting to having experienced online bullying and celebrities regularly deleting their accounts across platforms (Thierry Henry and Chrissy Teigen are just the latest high-profile cases), polarisation and toxicity have almost become par for the course. Even new social networks, like Clubhouse, are struggling with the issue.

Despite coming in for significant criticism for being slow to respond to the issues surrounding content moderation, social media platforms are undeniably taking it seriously now. Facebook, for instance, spends $3.7 billion annually and employs 30,000 moderators to work on the problem. Yet progress has been limited, for a multitude of reasons.

Social media platforms were designed for engagement and, therefore, revenue. Unfortunately, the most engaging content is usually controversial and polarising, which means a platform’s own algorithms often end up contributing to the problem. Facebook is working on this: in 2018 the company announced an overhaul of its algorithm to optimise for "meaningful social interactions," but it is difficult to change algorithms, features and user culture overnight.
