Facebook, which has grappled with a string of violent crimes and activities broadcast on its live video platform in recent months, is tackling the issue with another 3,000 human moderators, company CEO Mark Zuckerberg announced Wednesday.
In a post written on the social network, the company chief highlighted the urgent need to scrub the site of horrific viral content, such as the murder of a 74-year-old man in Cleveland that unfolded on Facebook Live last month. The company gets “millions of reports” of inappropriate live videos a week, according to Zuckerberg, and the hiring spree will add to Facebook’s existing team of 4,500 moderators over the course of the year.
“If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down,” the CEO wrote.
The company will also implement more tools to make it easier for users to report violent or potentially harmful incidents. “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” Zuckerberg noted.
Facebook Live, while immensely popular with publishers, public figures, and brands, clearly has a dark side. Even though it’s still in its infancy -- Facebook Live debuted for all users in April 2016 -- the medium has captured brutal Chicago gang activity, suicide, and child abuse.
Zuckerberg, for obvious reasons, wants to crack down on all of that, describing a grim episode the company was recently able to prevent: “Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate.”
To read Zuckerberg’s full statement, see below: