Editor’s Note: Kara Alaimo, an associate professor of communication at Fairleigh Dickinson University, writes about issues affecting women and social media. Her book, “This Feed Is on Fire: Why Social Media Is Toxic for Women and Girls — And How We Can Reclaim It,” will be published by Alcove Press in 2024. The opinions expressed in this commentary are her own. Read more opinion on CNN.

CNN —

Under a proposed amendment to an online safety bill in Britain, tech executives could face time behind bars if they willfully ignore rules designed to protect children online.

As it’s currently written, the bill would require social media companies to identify and remove content promoting self-harm, including content that glorifies suicide, and to bar children under the age of 13 from using their platforms. In a written statement to Parliament, Secretary of State for Digital, Culture, Media and Sport Michelle Donelan said tech leaders who act in “good faith” would not be affected, but those who “consent or connive” in flouting the new rules could face jail time.

Let’s hope this bill passes. For far too long, tech leaders have evaded responsibility for the harmful effects their products can have on the people who use them. And while it’s unlikely that a law similar to this amendment would ever pass in the US, given the country’s fiercely pro-business climate, its broad constitutional protection of free speech and its laws shielding internet platforms from liability for what users post online, other countries should consider similar penalties for tech executives.

The tech industry, of course, disagrees. TechUK, a trade association representing the UK technology sector, said the prospect of jail time would not make social networks safer for children but would discourage investment in the country. But I think this law would do just the opposite: serve as a wake-up call to tech leaders that they are accountable for what their products do.

Part of the reason tech executives have evaded personal responsibility for their impact on society for so long is the way we think about social media. We talk about what happens in real life to distinguish it from what happens online. But the effects that social networks have on users, especially children, are often very much felt in “real” life.

For instance, in September, a British coroner ruled that “negative effects of online content” were partly to blame for the suicide of 14-year-old Molly Russell. The Guardian reports that in the six months before she took her life in 2017, data from Meta revealed that Molly viewed 2,100 pieces of content related to self-harm, depression and suicide on Instagram.

Meta, Instagram’s parent company, has admitted that Molly saw content that violated its community standards, and in 2019 it added new policies against graphic images depicting self-harm. It also started offering resource links to users viewing depressive content.

But, in 2021, US Sen. Richard Blumenthal’s staff set up an account pretending to be that of a 13-year-old girl and followed accounts that promoted eating disorders. Instagram then promoted disordered eating accounts with names like “eternally starved.” Instagram told CNN it removed the accounts and that they should not have been allowed in the first place since they violated the platform’s rules against content promoting eating disorders.

And a terrifying report released last month by the Center for Countering Digital Hate explains what happened when researchers set up TikTok accounts purporting to be those of 13-year-olds and quickly paused on and liked mental health and body image content. Within 2.6 minutes, TikTok was showing suicide content.

Within eight minutes, the platform was recommending content about eating disorders. When an account used a name suggesting the user was vulnerable to an eating disorder, TikTok served up even more of this kind of appalling content. TikTok has said that, because of the study’s limited sample size and time constraints, the content the researchers saw doesn’t reflect what other users see, and that it removes content that violates its standards and provides resources to those who need them.

And former Facebook staffer-turned-whistleblower Frances Haugen revealed in 2021 that Meta is well aware of the harmful effects Instagram has on some younger users. But Haugen said the company chooses to prioritize making money over protecting children. Meta has said it is developing parental supervision controls and features to help teens regulate their Instagram use, and CEO Mark Zuckerberg dismissed Haugen’s characterization of the company as untrue.

In the United States, members of Congress have passed just two laws regulating how companies interact with children online in the last 25 years: one requiring parental consent for sites to collect data about children under the age of 13 and one that holds sites accountable for facilitating human trafficking and prostitution.

There is no reason why tech leaders should be exempt from liability for what their products can do to users. This amendment in the UK should also be a wake-up call to parents and other social media users about the dangers that we and our children could face online.

If jail sounds draconian, it’s nothing compared to the price Molly Russell and her family have paid. But, five years after her suicide, social platforms are still serving up the same kind of toxic content to vulnerable young people. This must stop, even if it takes putting tech executives behind bars.