Opinion: Amazon's recent film decision reveals a dangerous trend

Editor's Note: Julian Zelizer, a CNN political analyst, is a professor of history and public affairs at Princeton University. He is the author and editor of 24 books, including his forthcoming co-edited work, "Myth America: Historians Take on the Biggest Lies and Legends About Our Past" (Basic Books). Follow him on Twitter @julianzelizer. The views expressed in this commentary are his own. View more opinion on CNN.

(CNN) It's time for tech and social media companies to take some responsibility for the content they are willing to house on their platforms.

At this week's DealBook summit, Andy Jassy, Amazon's chief executive, said that the company would continue to allow the film -- "Hebrews to Negroes: Wake Up Black America!" -- to be sold on its site. The film has generated huge controversy for its antisemitic content, which promotes false claims about the reach of Jewish power and the number of Jews who died in the Holocaust. Jassy justified the decision by saying, "we have to allow access to those viewpoints, even if they are objectionable."

Brooklyn Nets player Kyrie Irving, however, paid the price for tweeting a link to the film last month. Though Irving eventually issued a full-throated apology, his failure to do so immediately and unequivocally led to a temporary suspension by the Nets and the suspension of his Nike sponsorship.

If the distribution and promotion of this hateful film were a one-time problem, we would not have cause for major concern. But it is not.

We have seen a number of platforms provide a home for bigotry over the past few months. Twitter's new owner, Elon Musk, has been removing controls that had kept bigotry, rage and provocation from spreading across the site.

In the name of free speech, he has decided to pump out a daily dose of vile commentary on his personal account, reactivated the accounts of several controversial figures -- including former President Donald Trump and conservative Canadian podcaster Jordan Peterson -- and allowed bigotry and hate to proliferate across the platform. Invoking a false antisemitic trope about Jewish power, Musk recently tweeted that retired Lt. Colonel Alexander Vindman, whose testimony was at the heart of Trump's first impeachment, was "both puppet & puppeteer. Question is who pulls his strings..."

As Oren Segal, vice president of the Anti-Defamation League, told Vice News, "As soon as he took over Twitter, we saw extremists trying to exploit the platform, we've seen hate of all kinds increase... it's becoming a hellscape for antisemitism and racism and bigotry."

While it is certainly good for tech companies to promote free speech and civic debate, they also need to reexamine what content they deem permissible on their platforms. And there are real-world consequences when these companies fail to set clear boundaries. University of Washington historian Margaret O'Mara says we need look no further than January 6, 2021, when social media played a critical role in activating violent protesters and helping them organize.

It's worth noting that companies already make choices about what they will -- and won't -- support all the time. Amazon has the power to determine what goes on its site, and, as Jassy's comments show, it exercises that power. On Twitter, Musk recently said he will permanently ban accounts that impersonate other people. Meanwhile, Meta relies on algorithms to determine what kinds of material appear in our feeds.

In general, we don't want a system in which the owners of major tech companies have too heavy a hand in determining what content gets placement. And for good reason: many fear that Musk's control over Twitter, for example, could turn the social media platform into a massive bully pulpit for spreading incendiary and dangerous material.

On the other hand, guardrails are necessary -- and there's historical precedent for them. From 1949 to 1987, the Fairness Doctrine required American radio and television stations to devote airtime to controversial issues of public importance and to present contrasting viewpoints when they did. The Federal Communications Commission revoked the doctrine in 1987, but several years later the television parental guidelines -- similar to movie ratings -- were introduced to give parents a clear sense of the kinds of visuals and themes present in shows.

And beyond trying to present multiple sides of an issue or adding warnings or disclaimers, media executives routinely reject material that either will not be profitable or could be dangerous to air. After all, it's not as if any major TV network has a history of showing Holocaust denial films.

Reputable news organizations similarly adhere to strict editorial and production controls before publishing or broadcasting stories. Even physical and mail-order stores have made their own set of tough choices about what they will or will not stock on their shelves.

Have these systems been perfect? No -- far from it. Yet they have been important in establishing some parameters.

Unfortunately, we cannot simply rely on public sentiment or consumer choice to dictate parameters -- particularly given that many people are eager to devour the most poisonous ideas out there. All the "isms" that afflict society can be extremely profitable. In an era defined by likes, retweets and shares, the most provocative statements often attract the most eyeballs.

If CEOs of major companies don't do something, much of our online world will likely continue to devolve into a cesspool of toxic ideas -- fueling division, extremism and potentially harmful activity.

The leaders of these online companies must act now. If they don't, responsible users concerned about the proliferation of hate should consider using other platforms that offer some degree of moderation.

While executives like Jassy may not be able to filter all the content that finds a home on the internet, they should think a little harder about what they serve as a platform for -- and why -- if they hope to rein in some of the most dangerous films, shows and posts.
