Editor's Note: Brian Hughes is a professor of media studies at Queens College, City University of New York. He writes and conducts research on extremism, new media and the Middle East. The opinions expressed in this commentary are his.
(CNN) Can tall tales swing an election? That's the question being asked today in conference rooms across the digital media industry.
After Donald Trump's startling presidential victory, many are wondering whether an abundance of misleading or simply made-up news stories gave President-elect Trump the edge he needed to win. Critics are calling on media companies to regulate the spread of phony news reports. Already, platforms like Facebook and Google have announced plans to curtail fake news by revising their algorithms and user policies. But will such changes really improve the trustworthiness of online news?
First, some perspective: The 2016 election was a stunningly low-turnout race, which ended in an ambiguous, electoral-but-not-popular victory for President-elect Trump. Fake news may have been, at most, a necessary but insufficient cause of that outcome. Take away one FBI press conference, or add a last-minute Clinton rally in Wisconsin, and things might have gone very differently. If fake news did move the needle in Trump's favor, it was only one of many factors in a race between two of the least popular candidates of all time.
Make no mistake. Fake news is a problem. The next Democratic challenger will have to contend with fake news, as will subsequent Republicans and independents. And when even Macedonian teenagers are getting in on this racket, you can be sure it's not going away. Fake news presents a profound challenge that transcends ideology, striking at the core of representative democracy: a sober, informed electorate.
Something does need to be done. But it would be a mistake to pressure Facebook and Google into acting as censors. We've already seen how such an approach can backfire. Manipulation, or even the appearance of it, as in the allegations that Facebook suppressed conservative sources in its Trending Topics section, only breeds greater public distrust. That in turn empowers purveyors of fake news. If users feel they are being denied relevant (or juicy) content, they will simply seek it out from sites with no such editorial standards.
The solution to this problem isn't less content; it's better curation. Beginning in 1949, the FCC regulated broadcasters under a policy called the "Fairness Doctrine." The thinking went like this: With only three networks to choose from, viewers needed reliably balanced news and opinion. So, if a television station aired one perspective on a controversial topic, it was obliged to air an opposing view.
As a country, we should look at the possibility of adopting a digital equivalent to the Fairness Doctrine, because the old rationale of scarcity still applies. Social media platforms like Facebook work best when they're effectively monopolies. So do companies like Google, which depend on collecting user data to target search results and advertising. It's called the network effect: The more people who use a social network, the more indispensable it becomes.
Big data analytics like Facebook's social graph are notorious for their ability to identify consumer niches. The same profiling that tells advertisers what we are likely to buy could just as easily tell a platform when a user's news diet has become one-sided or riddled with dubious sources. It should therefore be possible to tune each individual news feed for balance and accuracy. If services like Facebook and Google are allowed to become news-aggregating monopolies, it's only reasonable to expect them to serve the public good as well as the bottom line.
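To make that idea concrete, here is a minimal, purely hypothetical sketch in Python of what "tuning a feed for balance" might mean. Nothing in it reflects how Facebook or Google actually rank content; the Story fields, the lean labels, the credibility scores and the weights are all invented for illustration.

```
# Purely illustrative sketch: a toy re-ranker that nudges a hypothetical news
# feed toward source diversity and credibility. The fields, labels and weights
# are invented for illustration; they are not Facebook's or Google's systems.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    source_lean: str    # e.g. "left", "right", "center" (hypothetical label)
    credibility: float  # 0.0 (unverified) to 1.0 (well-sourced), hypothetical score
    engagement: float   # predicted clicks/shares, the signal feeds optimize today

def rebalance(feed: list[Story], top_n: int = 5) -> list[Story]:
    """Re-rank stories so no single lean dominates and low-credibility items sink."""
    ranked: list[Story] = []
    lean_counts: dict[str, int] = {}
    candidates = list(feed)
    while candidates and len(ranked) < top_n:
        def score(s: Story) -> float:
            # Reward engagement and credibility, penalize over-represented leans.
            repetition_penalty = lean_counts.get(s.source_lean, 0)
            return s.engagement + 2.0 * s.credibility - 1.5 * repetition_penalty
        best = max(candidates, key=score)
        candidates.remove(best)
        ranked.append(best)
        lean_counts[best.source_lean] = lean_counts.get(best.source_lean, 0) + 1
    return ranked

if __name__ == "__main__":
    demo_feed = [
        Story("Candidate A scandal!", "right", 0.2, 0.9),
        Story("Candidate B scandal!", "left", 0.2, 0.9),
        Story("Fact-check: what both candidates actually said", "center", 0.9, 0.4),
        Story("Economy adds 200,000 jobs", "center", 0.8, 0.5),
        Story("Outrage du jour", "right", 0.3, 0.8),
    ]
    for story in rebalance(demo_feed, top_n=3):
        print(story.title)
```

Even this toy version makes the trade-off visible: the re-ranker gives up a little raw engagement in exchange for a mix of sources and a penalty on low-credibility items, which is exactly the kind of judgment a digital Fairness Doctrine would ask platforms to encode.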
After an election this contentious, some partisan second-guessing is to be expected. But the problem of fake news goes beyond Democrats wondering what might have been. Fake news pours gasoline on a country already inflamed with political and even ethnic and racial hostilities. The last thing we need going into the next four years is more confusion and misunderstanding. It might not be Facebook or Google's fault that we're in this mess, but they are uniquely positioned to help us get out of it.