Facebook adding 3,000 more employees to prevent violent videos


INDIANAPOLIS, Ind. -- Facebook is ramping up its efforts to keep videos of violence or crime from spreading.

"Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community," CEO Mark Zuckerberg wrote in a Facebook post.

He announced the company is adding 3,000 more employees to its community operations team, in addition to the 4,500 it already has, to review the millions of reports the site gets each week.

"We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down," Zuckerberg wrote.

Indianapolis police said it takes tech companies cooperating with law enforcement to get people help as soon as possible.

"We are aware that there are incidents that are happening across the nation that have garnered a lot of public attention and we do look out for those things here in Indianapolis," Officer Jim Gillespie said.

Police said they're often tipped off by the public to what's on Facebook. Generally, it's fighting.

"Typically with that we see a lot of school fights, you know, kids being kids, and they'll start broadcasting a live fist fight between a couple of students, and then that will be reported to us or school police," Gillespie said.

He and social media experts called Facebook's move a step in the right direction.

"My initial reaction is that's a great start. The issue is there are almost 2 billion Facebook users, and every one of them has the ability to go live. It's still gonna be difficult to catch all those incidents of suicide, murder, abuse," said social media expert Dom Caristi, a professor of telecommunications at Ball State University.

"What's needed is artificial intelligence that can detect violent videos earlier than human beings can and that technology's coming, but it's not quite here yet," he said.

In the meantime, police remind the public of this:

"We need folks to step up and instead of going ahead and going live on Facebook we need them to call 911 so we can get there before somebody gets seriously injured," Gillespie said.

Facebook said the additional employees will also help in removing things not allowed on the site, like child exploitation and hate speech.