Should Facebook Live be REMOVED After Ohio Murder?

When the moment came to publicly address the horrifying video of a 74-year-old man’s murder, Facebook CEO Mark Zuckerberg gave it what some are calling a cursory mention.

Zuckerberg used his keynote at Facebook’s biggest press event of the year to offer his condolences to the family of Robert Godwin Sr. and make a passing vow to work harder to keep communities safe.

“We have a lot more to do here,” he said about developing new technologies to “keep our communities safe.”

“We’re reminded of this this week by the tragedy in Cleveland. And our hearts go out to the family and friends of Robert Godwin Sr.,” Zuckerberg said at his F8 developer conference in San Jose, Calif.

“We have a lot of work, and we will keep doing all we can to prevent tragedies like this from happening,” he added.

Zuckerberg then moved on to a discussion of new technologies such as augmented reality.

There was no explanation for why it took Facebook more than two hours to remove the grisly video Sunday after Steve Stephens posted it at 2:11 p.m. Eastern.

In a written statement Monday, Facebook explained the timeline behind the video’s removal.

The company said Stephens uploaded a video at 2:09 p.m. Sunday voicing his intent to murder someone. No users flagged it, so it went unnoticed by the service.

Stephens uploaded the murder video a few minutes later and then confessed to killing Godwin in a Facebook Live stream that started at 2:22 p.m.

“We know we need to do better,” Justin Osofsky, the company’s vice president of global operations, said Monday of the two-hour, 11-minute lag.

“It was a horrific crime — one that has no place on Facebook,” Osofsky said, adding that the company was reviewing the way it handles complaints and looking at new technologies such as artificial intelligence to improve its reaction time.

“Nearly three hours on the internet is an eternity,” Hany Farid, a digital forensics expert at Dartmouth College, told the Daily News. “That absolutely should be lower.”


He went on to say that Facebook has largely “outsourced” the policing of its content to users and third-party moderators. So if users don’t catch something, chances are Facebook will take its time getting to it.

Farid helped develop a technology called PhotoDNA to stop the spread of child pornography online.

“They rely on the public to flag the content, (saying) ‘As soon as you tell us, we’ll go take a look.’ I find that a little inexcusable. And Facebook has outsourced its moderation. I think it’s terribly understaffed — just enough to give lip service but not enough to do the job right. It’s a well-kept secret.”

He said he recognized that the job of inspecting social media videos is huge; Facebook handles more than a billion uploads a day.

Compare that to YouTube. In late 2015, its users were uploading 300 hours of video every minute.

Farid said that YouTube would need 1,500 full-time employees to review every video as it’s uploaded.

With regard to Facebook, he said, “Maybe it’s not practical, but it’s not ridiculous. It’s not like you need 100,000 employees.”

Osofsky defended Facebook, saying the company has “thousands of people around the world review the millions of items that are reported to us every week in more than 40 languages.”

It is unclear how many hours are dedicated to the effort each week.

Zuckerberg said in February that he hopes Facebook will eventually use artificial intelligence to vet content for things like terrorist propaganda.

Farid wasn’t impressed by this:

“We heard him say AI will fix our problems, but we have to be patient, it will take five or 10 years. I think that’s spectacularly naïve.”

Farid believes that a combination of manpower and AI tools would be useful in vetting content.

“It was big news when Google came up with an algorithm that could identify cats in videos. There’s a lot of hype, but we are not even close to a tool that can operate at internet scale and make fairly fine-grained distinctions between a guy killing someone in Cleveland and kids wrestling after school.”

But with the growing number of crimes being aired live on Facebook, would it be simpler to just get rid of it altogether?

The Cleveland murder wasn’t the first crime to air on the site. Suicides, sexual assaults and other disturbing crimes have been streamed in real time, leading many to call for the suspension of Facebook Live.

Emily Bell, the director of the Tow Center for Digital Journalism at Columbia Journalism School, tweeted:

“Facebook Live worst case has happened. FB has to decide if it wants to continue with it knowing risk to reputation/public safety.”

What do you think?
