Facebook founder and CEO Mark Zuckerberg has released a statement explaining how the social network will tackle the increasing number of murders recorded via Facebook Live, saying that the company will hire an extra 3,000 people to review reports made on the site.
In the statement, Zuckerberg noted that “we’ve seen people hurting themselves and others on Facebook,” adding that he’d been “reflecting on how we can do better for our community.” His solution is to hire more people to deal with reports made on the network, with these new reviewers also helping to remove hate speech and child exploitation that appear on the site. He added that the company would continue to work with local community groups and law enforcement in order to notify the correct authorities if they believe a user is in danger or “about to harm themselves.”
A number of disturbing incidents have been captured on Facebook Live since the feature was implemented. In February, a pregnant woman recorded herself being shot in Chicago in a shooting that claimed the lives of two people, including a two-year-old boy. In April, Steve Stephens recorded himself killing Robert Godwin Sr, while later that month a man in Thailand recorded himself murdering his 11-month old daughter.
Facebook Live allows any user to live stream themselves on the fly, a feature the social network pushed heavily following its launch in 2016. However, since its introduction it has frequently been in the news for the disturbing footage broadcast using the tool, prompting calls for Zuckerberg to crack down on those abusing the service.
Read Zuckerberg’s statement in full below:
“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.
If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.
Over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly.
These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else.
In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.
This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren’t so fortunate.
No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.”