Facebook to hire 3,000 to review suicide, crime videos
Facebook plans to hire another 3,000 people to review videos and other posts after being criticised for not responding quickly enough to murders shown on its platform.
The hires over the next year will be on top of the 4,500 people Facebook already employs to identify crime and other questionable content for removal. CEO Mark Zuckerberg wrote on Wednesday that the company is "working to make these videos easier to report so we can take the right action sooner - whether that's responding quickly when someone needs help or taking a post down."
Videos and posts that glorify violence are against Facebook's rules, but Facebook has been criticised for being slow in responding to such content, including videos of a murder in Cleveland and the killing of a baby in Thailand that was live-streamed. The Thailand video was up for 24 hours before it was removed.
In most cases, content is reviewed and possibly removed only after users complain. News reports and posts that condemn violence are allowed, which makes for a tricky balancing act for the company. Facebook does not want to act as a censor, since videos of violence, such as those documenting police brutality or the horrors of war, can serve an important purpose.
Policing live video streams is especially difficult, as viewers do not know what will happen. This rawness is part of their appeal.
While the negative videos make headlines, they are just a tiny fraction of what users post every day. The good? Families documenting a toddler's first steps for faraway relatives, journalists covering news events, musicians performing for their fans, and people raising money for charities.
"We don't want to get rid of the positive aspects and benefits of live streaming," said Benjamin Burroughs, professor of emerging media at the University of Nevada in Las Vegas.
IMPLICATIONS FOR HARM
Burroughs said that Facebook clearly knew live streams would help the company make money, as they keep users on Facebook longer, which pleases advertisers. If Facebook had not also considered that live streams of crime or violence would inevitably appear alongside the positive content, "they weren't doing a good enough job researching implications for societal harm", Burroughs said.
With a quarter of the world's population on it, Facebook can serve as a mirror for humanity, amplifying both the good and the bad - the local fundraiser for a needy family and the murder-suicide in a faraway corner of the planet. But lately, it has got outsize attention for its role in the latter, whether that means allowing the spread of false news and government propaganda or videos of horrific crimes.
Videos live streaming murder or depicting kidnapping and torture have made international headlines even when the crimes themselves would not have, simply because they were on Facebook, visible to people who would not have seen them otherwise.
As the company introduces more new features, it will keep having to grapple with the reality that they will not always be used for positive, or even mundane, purposes. From his interviews and Facebook posts, Zuckerberg appears aware of this, even if he is not always as quick to respond as some would hope.
"It's heartbreaking, and I've been reflecting on how we can do better for our community," Zuckerberg wrote on Wednesday about the recent videos.