Tech companies have a major 'live' problem

New York (CNN) Big tech companies have spent the better part of two years telling us how they're trying to fix their misinformation problem.

But their efforts to increase interest in live content are adding to it.

On Monday, as the Notre Dame Cathedral in Paris burned, YouTube suggested the fire was linked to the September 11 terrorist attacks.

Those watching the live video feeds posted by reputable news outlets, including NBC News and France 24, were shown an information box with facts about the 2001 attacks. YouTube normally places the boxes next to videos that include topics often subject to misinformation and conspiracy theories, such as the moon landing.

A YouTube spokesperson said on Monday that the feature was "triggered algorithmically and our systems sometimes make the wrong call."

YouTube fixed the error after an hour or so, but it was yet another example of a live content misstep made by tech companies this month.

The companies' enthusiasm for users to engage in live experiences, whether it's a Facebook Live broadcast or comments on real-time YouTube videos, is creating more opportunity for misinformation, hate and propaganda to flourish -- the very issues the companies are trying to tackle.

Hany Farid, a professor at Dartmouth College and digital forensics expert, told CNN Business that live content is much harder for social media companies to police.

"As if they didn't have enough problems already, it increases the level of complexity," Farid said.

After the suspect in last month's terrorist attack at two New Zealand mosques streamed the massacre live on Facebook, the company said it would consider limiting who could broadcast live in the future -- perhaps by preventing people who had broken Facebook's rules in the past from going live. The company failed to stop the livestream of the video as it unfolded, even though it has hired thousands of human moderators and invested in artificial intelligence systems to weed out content that violates its content rules.

Last week, when the US House of Representatives streamed a congressional hearing on hate and social media live on YouTube, the company was forced to shut down its live comments feature due to an influx of hateful posts. The irony was not lost on Representative Jerry Nadler, the chair of the committee. "This just illustrates part of the problem we are dealing with," he told the committee at the time.

That YouTube's latest mistake on Monday was caused by a feature the company designed specifically to fight misinformation only compounds the problem.

But the fact that cracking down on live content is presenting new challenges shouldn't be a surprise, Farid said. "There's an immediacy [with live video] that is going to create problems."

After the New Zealand attack, some critics suggested Facebook put a delay on live videos. But Guy Rosen, Facebook's VP of product management, said in a blog post last month that a delay would not solve the problem.

"There are millions of Live broadcasts daily, which means a delay would not help address the problem due to the sheer number of videos," Rosen wrote.

He also noted the benefits to live streaming, such as helping first responders get alerts in real time.

In a recent interview with ABC News, Facebook CEO Mark Zuckerberg argued livestreaming still offers a net positive. He said the experience of connecting people to others in this way is "magical."

When asked about the idea of delaying livestreams, Zuckerberg told ABC that would "fundamentally break what livestreaming is for people."

"Most people are livestreaming, you know, a birthday party or hanging out with friends when they can't be together," he added.

YouTube on Monday did not provide any information about the Notre Dame mistake other than to blame its algorithms. But learning these details could provide important insight into the scale of the challenges the platforms face.

When Facebook received widespread criticism for failing to stop the livestream of the New Zealand attack, it later revealed that it had prevented the video from being uploaded again more than 1.2 million times in the first 24 hours after the massacre.

The staggering figure only underlines the breadth of the challenge faced by Facebook, YouTube and any other company not actively curating content.

"I think for a long time people thought, 'This digital world might be bad but doesn't have real-world consequences,'" Farid said. "But we are seeing now that isn't true."
