“Our goal is to start with a list of internet conspiracies listed on the internet where there is a lot of active discussion on YouTube,” Wojcicki said at SXSW.
The decision to include links to other websites represents a dramatic shift for YouTube, which has historically operated as a mostly contained ecosystem. It’s also notable that YouTube chose to link out to text-based sites rather than rearrange its own search algorithm to further favor content from truthful creators and video journalists. One reason for the decision might be that YouTube wants to avoid the perception that it’s rigging its platform to favor certain creators, a criticism it has faced in the past. Linking out also spares YouTube from having to censor content outright or act as the ultimate arbiter of truth.
“People can still watch the videos, but then they have access to additional information,” said Wojcicki.
Merely placing links to factual information alongside videos won’t solve the company’s moderation problems wholesale. For one, as Zeynep Tufekci at The New York Times and others have pointed out, YouTube’s recommendation algorithm is often how users end up seeing conspiracy theories in the first place. Wikipedia, for its part, can be edited by anyone and has had its own reliability issues in the past.
Take, for example, what happens when you search for conspiracy theorist Alex Jones’ videos about the Parkland shooting. After watching one, YouTube recommends another of Jones’ videos, this time about how the Sandy Hook shooting was a hoax. It doesn’t suggest a factual clip about Parkland or Sandy Hook at all. YouTube’s recommendation system serves to radicalize users, and until that’s fixed, the company will likely continue to suffer from scandals related to misinformation.
YouTube has also yet to define and implement clear rules for when uploading conspiracy theory content violates its Community Guidelines. Nothing in the rules explicitly prevents creators from publishing videos featuring conspiracy theories or misleading information, but lately YouTube has been cracking down on accounts that spread hoaxes anyway.
In the wake of the Parkland shooting, for example, YouTube reportedly issued a “strike” against Jones for uploading a video accusing Parkland survivor David Hogg of being an actor (this video was separate from the one that trended on the platform). But Jones and his organization InfoWars have been uploading videos promoting lies, hate speech, and false conspiracy theories to YouTube for years, leaving the platform’s users and creators to guess what’s actually permitted. Often the platform seems to react primarily to public outcry, which makes its moderation decisions inconsistent. Until YouTube outlines a clear policy for how it wants to regulate misinformation, its new effort to introduce text-based links won’t be entirely effective.
Merely serving up factual information has not been a cure-all for other platforms dogged by misinformation scandals, either. Both Google News (run by YouTube’s parent company Alphabet) and Facebook’s Trending section have surfaced conspiracy theories during breaking news events in the past, despite having plenty of links to more reputable news sites on their platforms. It’s remarkable, too, that an enormous platform flush with advertising cash has chosen to address its misinformation problem primarily with the work of a donation-funded volunteer encyclopedia.
Another obvious question is whether Wikipedia and YouTube will be able to keep up with breaking news events that quickly fall prey to conspiracy theories. The Parkland shooting survivors, for example, were accused of being actors within hours of the tragedy. It’s unclear how quickly YouTube will be able to add links to the thousands of misinformation videos uploaded every time a major news event occurs.
Still, YouTube deserves credit for doing something to fight conspiracy theories, especially since adding links elsewhere does nothing to immediately aid its bottom line.
- After the Parkland shooting, YouTube let a conspiracy video hit the top of its trending section
- Previously, YouTube failed to take down a video by Logan Paul that clearly violated both its terms of service and norms of decency
- YouTube’s moderation efforts, in short, have been an inconsistent mess