
Facebook Trending story: The Wizard of Oz algorithm

Story highlights
  • Facebook's Trending Topics aren't just the product of a computer algorithm; journalists shape them
  • Ed Finn says Facebook didn't want to admit it, but the reality is that human judgment inevitably enters the picture

Editor's Note: Ed Finn is the founding director of the Center for Science and the Imagination at Arizona State University, where he is an assistant professor with a joint appointment in the School of Arts, Media and Engineering and the Department of English. A former journalist for Time, Slate and Popular Science, he is the co-editor of "Hieroglyph: Stories and Visions for a Better Future" (William Morrow, September 2014) and author of "Culture Machines," a book about the present and future of algorithms (MIT Press, forthcoming). The opinions expressed in this commentary are his.

(CNN) The recent scandal over Facebook's Trending Topics news module goes deeper than the revelation that it was humans all along hiding behind the algorithm. It should come as no surprise that Facebook has bias -- every organization does. It's what you do about the bias, how you disclose it and manage it, that makes a difference. News organizations have been grappling with that question for a long time, creating formal and informal codes of conduct, oversight systems and transparency rules.

But of course, Facebook doesn't want to be a news organization (or be seen as taking a political stance). As Will Oremus pointed out in Slate, that would be bad for business: people think much more favorably of technology companies than they do of the Fourth Estate. Small wonder, then, that in reacting to the scandal, Facebook seems desperate to avoid looking like a news agency. It stands, according to VP Justin Osofsky, for "a free flow of ideas and culture across nations."


This is a lovely sentiment, and I'm sure many people who work at Facebook and use the platform believe in it. But it's not what Facebook does. We know this for two reasons. First, imagine if the company took that credo at face value.

A truly democratic network where the most popular content wins would be filled with cute pet videos, ice bucket challenges and, one presumes, vast troves of porn. The company got a wake-up call about this problem during the Ferguson tragedy in 2014, when its algorithms did a poor job of surfacing news coverage of the story.

Its response? Hire a bunch of contractors, most of them just starting out in journalism, and stick them in a temporary conference room that might as well have been labeled "Black Box."

The recently leaked guidelines those contractors followed (published by The Guardian) suggest just how deeply the company's algorithmic logic permeated even this effort to put humans back in the loop: the detailed instructions read as if they began life as a flowchart that someone translated into memo form.

Ironically, those guidelines reveal just how circular the online news business has become. The memo instructed contractors to track just 10 news sites, including CNN, the BBC, Fox News and The Guardian, when evaluating whether a story rose to "national" importance. Meanwhile, editors at all of those outlets closely monitor Facebook (which, remember, is not a news organization) to see what news is getting traction with the digital public. News directors trying to decipher the Trending Topics algorithm were really peering into an editorial funhouse mirror.

This brings us to the second reason the free-flow dream is not happening anytime soon. Facebook really is a company built around a set of algorithms, just like Google, Amazon and many others. The news feed algorithm decides what posts and news items Facebook's 1.6 billion monthly users see -- and then there are the algorithms serving up some $17 billion worth of ads a year.

What's really interesting about the Facebook controversy is how it highlights the quiet sea change that algorithms have caused in all sorts of digital contexts. I like to think of this as the façade of computation: we're all so desperate to believe in the magic of the algorithm that we bend over backward to make it seem real.

Sometimes it seems like people are more patient with their smartphones than they are with their children. We all use a special voice talking to the airline reservation system. And when the magic works, we're delighted even if we had to cheat a little to make it happen.

So when the façade crumbles and we awkwardly confront the humans toiling away inside the machine, everyone is disappointed and sometimes even angry. But what we need to realize is that algorithms are always implemented in the world -- there are engineers, workarounds, bugs and a thousand other course corrections that have to happen before an elegant piece of code can really start doing work in the messy, ambiguous space of culture.

The interesting question becomes why we're so desperate to outsource fairness and objectivity to our algorithms -- why it could possibly be easier to program them deep inside some black box when we so rarely achieve them in full sunlight.

