
Tech and Consequences

Are we just going to see the stories that are generating the most statistically measurable buzz?
JuliusKielaitis / Shutterstock.com

TOWARD THE END of August this year, more than 100 million potential U.S. voters were exposed to a fake story about the presidential election that was disguised as hard news. The story, which claimed that Fox News anchor Megyn Kelly had endorsed Hillary Clinton, began on an ultra-Right website called endingthefed.com, but a link to it quickly appeared in the “Trending” box at the top of the Facebook screen. Not only did the fraudulent link slip through Facebook’s legendary screening software, but it stayed there for a full eight hours.

A couple of weeks later, the opposite problem struck when the Facebook robo-censor kicked out posts containing the Pulitzer Prize-winning 1972 photograph of a young naked Vietnamese girl fleeing a U.S. napalm attack. The Facebook Machine didn’t see a gut-wrenching statement about the cruelty of war. It only saw a naked little girl. After an entire day of protests, Facebook finally announced that it would reprogram the software to allow that photo of a naked girl.

Facebook has been cajoled and scolded over the past year by various German officials for the company's failure to preemptively remove racist material, as German law requires. But CEO Mark Zuckerberg insists Facebook is "a tech company, not a media company." "We build the tools," he said. "We do not produce any content."

The through line in all of these controversies is a persistent question about the role of human decisions versus that of computer algorithms in determining what material appears on Facebook or other digital media intermediaries, including the Google News search engine. Are we just going to see the stories that are generating the most statistically measurable buzz? Or will trained professionals take a hand in guaranteeing that what we see is actually true? The answer has enormous legal consequences for companies such as Facebook. If their human staffs are making choices about the veracity and relative importance of news stories, then digital media platforms may be exposed to lawsuits over the content of those stories. But the stakes are even higher for the future of journalism and the functioning of democracy.

This question first surfaced back in May, when an anonymous employee on Facebook's "Trending" team claimed that stories with a conservative spin were being systematically excluded. A statistical analysis found no evidence of such bias; still, three months later Facebook fired its human "Trending" team. From that day forward, the company proclaimed, "Trending" decisions would be made solely by the Machine.

A week later, the Megyn Kelly debacle occurred.

So what, you may be asking. We’re talking about Facebook. What does it matter which stories are mixed in next to that endless stream of vacation pictures and random ads?

Well, it matters. According to the Pew Research Center, 44 percent of adult Americans say they get news from Facebook. By providing the public with access to the news, and prioritizing some news stories and sources over others, Facebook is performing some of the historic functions of journalism. But the company is showing none of the commitment to truth and the public interest that lies at the core of journalistic ethics.

Historically, the Federal Communications Commission has played a role in ensuring Americans' access to diverse news sources and the accountability of news outlets to the public interest. It did this mostly through regulation of news media ownership. In the digital world, we often hear, our information choices are nearly infinite, so we don't need media regulation anymore. And this is true if you want to spend your life hunting for your own information sources. But in practice, most information today comes to most people through a tiny handful of digital conduits. So it may be time for the FCC to shine some light on the practices of those organizations.

This appears in the December 2016 issue of Sojourners