Mozilla Explains: Why Does YouTube Recommend Conspiracy Theory Videos?


Most of the videos you’d find on YouTube are free for you and me to watch. How, then, did YouTube rake in $6 billion in just three months this year? How is the site on its way to rivaling the earnings of Netflix — a service that charges all of its subscribers a monthly fee?

YouTube makes those billions through advertising. The more videos a viewer watches, the more ad money Google (YouTube’s parent company) rakes in. Simply put, the longer YouTube can keep you on its site, the better for its pocketbook. That’s why, as we’ve mentioned before, YouTube’s recommendation algorithm is designed to keep you watching videos for as long as possible.
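
To see why a watch-time objective can reward sensational content, here is a minimal, hypothetical sketch. It is not YouTube’s actual system, which is proprietary; the Video type and the predicted_watch_minutes score are assumptions made purely for illustration of what happens when a ranker optimizes for time spent watching and nothing else.

```python
# Minimal illustrative sketch of a watch-time-maximizing ranker.
# NOT YouTube's actual system; the Video type and predicted_watch_minutes
# field are hypothetical, included only to show why optimizing for watch
# time favors whatever keeps people watching.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical model output: expected minutes watched


def recommend(candidates: list[Video], k: int = 5) -> list[Video]:
    """Rank candidates purely by expected watch time, ignoring accuracy or safety."""
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:k]


if __name__ == "__main__":
    candidates = [
        Video("Calm documentary", 4.0),
        Video("Outrage-bait conspiracy clip", 11.5),
        Video("How-to tutorial", 6.0),
    ]
    for video in recommend(candidates, k=2):
        print(video.title)
    # The conspiracy clip ranks first: nothing in this objective penalizes
    # misleading content, only content people stop watching.
```

In a sketch like this, the only signal is predicted watch time, so content that provokes strong reactions rises to the top by default.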

This isn’t always a bad thing, but it often can be. Content that is entertaining isn’t always factually correct or safe. “Hatred is useful for clickbait,” says Guillaume Chaslot, a current Mozilla Fellow and former Google engineer, in our latest Mozilla Explains. That hatred can, in some cases, lead to radicalization: a viewer falls down a rabbit hole of misleading and violent information, sometimes urged on by YouTube’s recommendation algorithm.

Users can quickly fall prey to a domino effect, where one conspiracy video leads to another. In fact, our YouTube Regrets series studied exactly this. We asked Mozilla supporters about the times they felt the algorithm had suggested extreme content, and thousands responded to tell us about the strange places they were led: searches for simple dance videos that led to videos about bodily harm, and drag queen self-esteem videos that eventually gave way to anti-LGBTQ content. Users were taken to some eerie corners of the site, all thanks to YouTube’s recommendation algorithm. Zoom out, and this sequence of events, repeated over and over, is profitable for YouTube but dangerous for society.
