Facebook's alarming plan for news feeds

Editor's Note: (Kara Alaimo, an associate professor of public relations at Hofstra University, is the author of "Pitch, Tweet, or Engage on the Street: How to Practice Global Public Relations and Strategic Communication." She was spokeswoman for international affairs in the Treasury Department during the Obama administration. Follow her on Twitter @karaalaimo. The opinions expressed in this commentary are solely those of the author. View more opinion at CNN.)

(CNN) Worried about recent news reports that Facebook knows that Instagram, which it owns, hurts the mental health and body images of teenagers? You can thank strong reporting from the media for raising that concern.

But you should also be worried that Facebook has developed a way to lessen the chance that you'll hear about such criticisms of the platform in the future. The company now plans to show users more positive articles about itself in news feeds, through what is known as Project Amplify, according to The New York Times, which cited three people with knowledge of the effort.

If certain favorable stories are given priority, that by definition leaves less space for more objective reporting to be shared in news feeds, since users can only read a finite number of stories each time they log on.

The plan is downright alarming and underscores the importance of seeking out news outside of social platforms.

According to the Times, Facebook plans to show users more articles that make it look good -- including some written by its own employees. The Times reports the initiative was personally approved in August by chief executive Mark Zuckerberg. The company told the Times it hadn't changed its approach. But a spokesman also seemed to suggest it was making a change, telling the Times, "people deserve to know the steps we're taking to address the different issues facing our company -- and we're going to share those steps widely."

This plan is so disturbing because Facebook is one of the primary places Americans get their news. According to a just-released Pew survey, 48% of adults sometimes or often get their news from social media.

If Facebook prioritizes articles that it perceives as favorable to its image, users may, of course, see fewer critical stories since they likely will not be given priority and amplification in the same ways as the pieces the company likes. It would also be a short step from here to censoring articles attacking the company altogether. That could limit an urgently needed national debate about what Facebook and other social networks are doing to the country.

It's never been more important to have an honest reckoning about the problems technology companies are creating and how to fix them.

For example, online misinformation poses a direct threat to America's democratic system. According to Facebook's own figures, a campaign by Russia to spread disinformation around the 2016 presidential election reached 126 million Facebook users and 20 million Instagram users.

Misinformation originating from the conspiracy group QAnon around the 2020 election helped spread the false perception that the election was stolen from former President Donald Trump and drove participation in the deadly January 6 attack on the Capitol, according to Mia Bloom and Sophia Moskalenko's newly published book "Pastels and Pedophiles: Inside the Mind of QAnon."

Bloom and Moskalenko write that "Facebook determined that QAnon was dangerous relatively late in the game" -- two years after the group was banned by Reddit. And, of course, Facebook only suspended Trump -- who had a history of using bellicose language -- from its platform after the January 6 attack. The company's oversight board later found that Trump's "words of support for those involved in the riots legitimized their violent actions."

If Facebook users miss out on critical coverage of how social media is affecting elections and only or primarily see content the company perceives as favorable, they will be less likely to have the national debate the nation urgently needs about how to safeguard future elections from violence, foreign interference and misinformation.

Americans will also be less likely to fully contend with what social media is doing to us as people.

While it was chilling to read a recent Wall Street Journal report that Facebook is well aware that Instagram makes teen girls feel bad about their bodies, it would have been even more surprising if Facebook were not aware of these problems.

Research has long shown that users post self-promotional content on social media, and this provokes envy in their friends. Over time, envy has been documented to damage a person's mental health, diminish their sense of well-being and self-worth, cause them to become dissatisfied with and withdraw from groups and even cause depression. (Instagram's head of public policy, Karina Newton, wrote in a post in response to the Wall Street Journal article that the piece "focuses on a limited set of findings." She said research shows that social media has both positive and negative effects -- making people feel both more connected and more lonely -- and that the company recognizes that its "job is to make sure people feel good about the experience they have on Instagram.")

These aren't even the only problems with social media. As I've said before, Facebook also tracks users' activities online and can use and share this information in ways that seriously compromise their privacy. But you get the idea.

The dangers posed by Facebook and other social media platforms aren't abstract. They affect users' ability to make informed decisions about whom to vote for, how to ensure the peaceful transfer of power between leaders, how to protect the mental health of children and more. We can't afford to have tough reporting on these issues replaced with puff pieces that populate Americans' news feeds and are pleasing to Facebook's public relations team.

Of course, even without this policy, relying on Facebook for news was never a good idea.

As early Facebook investor Roger McNamee wrote in "Zucked: Waking Up to the Facebook Catastrophe," the company's platform is designed to show users extreme stories, since they're what keep them online longer (and therefore, of course, generate more ad revenue for the company).

Those kinds of stories aren't necessarily the ones that keep us most educated and informed about the issues our communities and country are facing. That's why it's so critical for Americans to seek their news outside of tech platforms, directly from trustworthy media outlets.

There's absolutely nothing to like about Facebook's chilling new policy. It will almost certainly cut down on users' knowledge of the dangers posed by social media. The only way to avoid this filter is to seek news directly from legitimate sources that aren't motivated by the desire to make Facebook look good.
