Facebook, which has grappled with a string of violent crimes and other disturbing acts broadcast on its live video platform in recent months, is tackling the issue by hiring another 3,000 human moderators, CEO Mark Zuckerberg announced Wednesday.
In a post on the social network, the company chief highlighted the urgent need to scrub the site of horrific viral content, such as the murder of a 74-year-old man in Cleveland that unfolded on Facebook Live last month. The company gets “millions of reports” of inappropriate live videos a week, according to Zuckerberg, and the hiring spree will add to Facebook’s existing team of 4,500 moderators over the course of the next year.
“If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down,” the CEO wrote.
The company will also implement more tools to make it easier for users to report violent or potentially harmful incidents. “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” Zuckerberg noted.