YouTube Debuts Plan to Promote and Fund 'Authoritative' News
Following a year in which YouTube repeatedly promoted conspiracy-theory videos during breaking news events like the shootings in Parkland, Florida, and Las Vegas, the company announced on Monday a slew of new features it hopes will make news on the platform more reliable and less susceptible to manipulation. The company is also investing $25 million in grants to news organizations looking to expand their video operations, as part of a larger, $300 million program sponsored by YouTube's parent company, Google.
According to YouTube executives, the goal is to identify authoritative news sources, bring those videos to the top of users' feeds, and support quality journalism with tools and funding that will help news organizations more effectively reach their audiences. The challenge is deciding what constitutes authority when the public seems more divided than ever on which news sources to trust—or whether to trust the traditional news industry at all.
Among the many changes YouTube announced Monday are substantive tweaks to the tools it uses to recommend news-related videos. In the coming weeks, YouTube will start to display an information panel above videos about developing stories, which will include a link to the article that Google News deems most relevant and authoritative on the subject. The move is meant to help prevent hastily recorded hoax videos from rising to the top of YouTube’s recommendations. And yet, Google News hardly has a spotless record when it comes to promoting authoritative content. Following the 2016 election, the tool surfaced, as one of the top results for the term “final election results,” a WordPress blog falsely claiming that Donald Trump had won the popular vote.
YouTube is also expanding a feature, currently available in 17 countries, that adds a breaking-news section to the homepage during major events. That section will surface only videos from sources YouTube considers authoritative. The same goes for the videos that YouTube recommends viewers watch next.
These changes attempt to address the problem of misinformation online without adding more human moderators. With some 450 hours of video going up on YouTube every minute, “human curation isn’t really a viable solution,” Neal Mohan, YouTube's chief product officer, told reporters Monday.
Traditionally, YouTube's algorithm has prioritized a user's personal viewing history, as well as the context of the video that user is currently watching, when deciding what videos to surface next. That can be problematic because, as researchers have found, once you watch one conspiracy-theory video claiming that the student survivors of the Parkland shooting are crisis actors, YouTube may recommend you watch even more. With this change, the company is trying to interrupt that downward spiral. It's important to note, though, that YouTube is applying that standard only to breaking news and developing stories. For all other videos that users find on YouTube, the recommendation engine will work the old-fashioned way, which, YouTube executives acknowledge, may well turn up content that people find objectionable.
"There are going to be counter points of view, and there’s going to be [videos] where people who have a conspiratorial opinion are going to express them," Mohan says. "What I think we can do is, instead of telling users what to think, give them as much information as possible, so that they can make those decisions themselves."
To that end, YouTube is also beginning to implement its previously announced partnerships with Wikipedia and Encyclopaedia Britannica, which it will use to fact-check more evergreen conspiracy theories about, say, the moon landing or the Bermuda Triangle. Those videos will now feature an information panel with context from either Encyclopaedia Britannica or Wikipedia. For the moment, though, these panels are being applied only to a small subset of videos that, Mohan says, "tend to be accompanied by misinformation,” meaning they’re hardly a cure-all for the vast quantities of new and less predictable misinformation uploaded to YouTube every day.
Eradicating that content isn’t the goal for YouTube, anyway. After all, merely spreading falsehoods isn’t against the platform’s policies, unless those falsehoods amount to hate speech or harassment. That’s one reason known propagandists like Alex Jones of Infowars have managed to build wildly successful channels on the back of conspiracy theories that carefully adhere to YouTube’s terms. As it walks the fine line between openness, profitability, and its responsibility to the public, YouTube is less focused on getting rid of the hoaxers than on trying to elevate journalism it considers valuable.
Hence the $25 million in grants to newsrooms that are investing in online video. It’s a small amount for the multibillion-dollar company, but YouTube’s executives say it could grow over time. The funding is part of the Google News Initiative, a three-year, $300 million fund aimed at strengthening quality journalism, which Google announced in March. The hope is that this funding can help news organizations build video operations robust enough to compete with the amateurs who might like to mislead their audiences. YouTube has also formed a working group of newsrooms that will help the company develop new products for journalists. “We’re doing this because, while we see the news industry changing, the importance of news is not,” says Robert Kyncl, YouTube’s chief business officer.
Still, questions remain about how this experiment will play out in practice. Identifying which news outlets are authoritative is hard enough in the United States, where people can subsist on completely different media diets according to their politics. Among the news organizations that YouTube highlighted as authoritative in its announcement were CNN and Fox News; the former is routinely dismissed by President Trump as “fake news,” while the latter is one of the sources Democratic voters trust least. This bifurcation of the media poses a challenge for all tech platforms that resist taking a stand on what constitutes truth, not just YouTube. In attempting to satisfy people all across the political spectrum, and to do it on a global scale, they risk landing smack in the center of the same ideological battles they helped foment.