Facebook Pushes News Literacy to Combat a Crisis of Trust

Facebook joins with the founder of Craigslist to fix a crisis of faith in the news, but questions remain about who should shoulder the blame.

Trust in the press has cratered, along with trust in so many other institutions of late. That crisis of faith in the facts compelled a group of concerned citizens and foundations---oh, and Facebook---to pour $14 million this week into an effort to win that trust back.

Craigslist founder Craig Newmark conceived of the News Integrity Initiative as part of his recent crusade against fake news. Future-of-news provocateur Jeff Jarvis will run the project out of the City University of New York's prestigious journalism school. But by far the most intriguing donor is Facebook, the elephant lurking in every American newsroom.

Facebook's unprecedented power in connecting content to audiences has undermined the industry's entrenched business models, while giving the social network enormous influence over the kind of news journalism produces. This isn't Facebook's first move against fake news. It has partnered with fact-checking sites to flag intentionally misleading content, and it has hired former CNN anchor Campbell Brown to help it navigate its changing relationship with journalism and to helm the Facebook Journalism Project, specifically to fight fake news. At the same time, the framing of the News Integrity Initiative takes a lot of the blame for the trust crisis off Facebook's shoulders and spreads it around to everyone, from ad networks to journalism itself.

"I do not think that the platforms are media companies, and I do not want them to be editors or censors of the world," Jarvis says. Rather, he hopes to bring together journalists, ad networks, and the platforms themselves to figure out how everyone can help each other diminish the perception that the news is rigged, in one way or another. There's a clear counterargument though, given both Facebook's role in the media landscape and its recent interest in producing original content. But while the News Integrity Initiative may not act as Facebook's tacit acknowledgment that it is a media company, it does show that Facebook feels at least some responsibility for the ecosystem.

Facebook's Evolution

It used to be easy to define Facebook: It was a social networking site to connect with your friends and family. The most confusing thing about it was figuring out who exactly it was appropriate to "poke." Now any attempt to define its essential nature feels like contemplating the vagaries of life itself. There's what Facebook thinks it is; what its users think it is; what its advertisers think it is; and how people actually use it.

"It’s been an evolution," says Brown, "Over time the Facebook community started using Facebook to share information and news. With that recognition that we are part of the news ecosystem comes a responsibility to make sure that there is authentic, accurate information on our platform."

Six in 10 Americans get their news from social media, according to recent surveys. Your News Feed lately probably looks like some combination of baby photo, snarky joke, news article about Donald Trump, rant, video of an animal being adorable, news article about Donald Trump. The shift has tugged on journalism's bottom line, and compelled media companies to rethink the kinds of "content" they create. It's also invited opportunists who cater less to any journalistic ideal of truth-seeking than to Facebook's algorithms. After the 2016 presidential election results, which shocked pollsters and pundits, data scientists pointed out that Facebook was the battleground for spreading misinformation and propaganda from fake news factories and partisan bots.

Facebook critics have argued that the platform didn't fully understand how important an arbiter of information it had become, and therefore let fake news spread like a disease on its site, infecting the electorate. Journalism itself takes plenty of flak as well, as the industry becomes ever more fractured and atomized---how can anyone know what's good reporting and what's hype? In reality, the two share the blame, along with the click-centric advertising model that undergirds the online publishing business, and human psychology itself, which resists facts that don't comport with a pre-existing worldview.

More Questions Than Answers

The News Integrity Initiative aspires to address all of these problems, but the trick will be figuring out how. Jarvis hasn't hired a manager for the program yet, and is still actively looking to add collaborators. Newmark says the initiative will focus on research into how to garner support for "good-faith journalism," and raise awareness about bad journalism to help readers---and Facebook users---make informed decisions. That's admirable, but the search for those answers isn't unique. What's most needed now are ways to turn insights into action.

Facebook itself has some ideas. "There are a lot of things that we are doing and can do on the technology side of this," says Brown. "We are finding that we can disrupt economic incentives, because most false news is financially motivated." On top of policing its own site, Facebook also hopes to foster great journalism before it ever hits News Feeds in the first place.

Brown doesn't know exactly what will come out of Facebook's involvement. One hope she has for the research? A curriculum for high schools that emphasizes news literacy. Newmark, too, hopes that the initiative will come up with ways to advance news literacy for young people, and to help news sites and platforms make it easier for readers to judge the quality of sourcing. Jarvis, among other goals, hopes to come up with recommendations for how journalism can learn from memes to deliver solid, well-reported information in bite-sized chunks that readers want to share. Why let the fake news purveyors have a monopoly on virality?

All the parties involved have grand aspirations for this collaboration. Of course, good intentions are rarely enough to produce meaningful action. But Facebook's pledge is meaningful in itself: it signals that Facebook is ready to admit it affects every aspect of the news industry, and therefore bears enormous responsibility. The first step, after all, is admitting you have a problem.