A live-streamed murder in Cleveland is the latest in a series of incidents that have trained a spotlight on Facebook’s growing role in global affairs.
On Easter Sunday, Steve Stephens drove around downtown Cleveland with the intent to commit murder. Within minutes he had an audience of millions as he pulled up and shot a bystander, Robert Godwin Sr. The killing was broadcast live on Facebook, where it remained visible for nearly two hours.
Mr. Stephens was later found dead following a nationwide manhunt, but the incident has opened an inquest of a different kind. In recent months a slew of killings, sexual assaults and other crimes have reached millions of people via social media. Last summer, the aftermath of the police shooting of Philando Castile in Minnesota was broadcast live on Facebook by his girlfriend. Three men in Sweden were arrested in January after raping a woman and streaming the assault live. In February, the fatal shooting of two radio journalists in the Dominican Republic was captured on Facebook Live.
The recent spike in so-called real-time crime might not be surprising, given that most of Facebook’s 1.79 billion users now carry broadcasting and recording devices in their pockets. Nonetheless, Facebook is facing a severe backlash, raising questions about how actively it should police content on its platforms and whether it is possible to create a safe space in real time.
‘The world’s most powerful editor’
In a public post on Monday, Justin Osofsky, a vice-president of Facebook, admitted “we need to do better”.
That may prove difficult. As Kelsey Hamlin, a Seattle-based journalist, told The World Weekly, it is extremely unlikely that Facebook would abandon its live-streaming services, “simply because of the competitive market and because of profit”.
Additionally, the company has been criticised for overzealous moderation in the past. Last September, Facebook removed an iconic Pulitzer-winning photograph from the Vietnam War, depicting a naked nine-year-old girl fleeing napalm bombs. Norway’s largest newspaper subsequently labelled Facebook CEO Mark Zuckerberg “the world’s most powerful editor” and accused him of “abusing his power”.
Hemanshu Nigam, founder of SSP Blue, an advisory company for online safety, security and privacy, argues that Facebook should take a more assertive role in policing its content. “Facebook is a private company,” he told TWW. “It has a responsibility to decide what kind of content it wants to allow.”
Across the tech industry, Mr. Nigam said, companies must “seriously think about not just relying on users to report issues, but work proactively to prevent them happening in the first place, and build mechanisms to respond quickly when they have happened.” Facebook could look for anomalous behaviour, such as videos whose audiences jump from five or ten viewers into triple figures within a matter of minutes, he suggested.
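The kind of check Mr. Nigam describes can be sketched in a few lines. The sketch below is purely illustrative: the thresholds, time window and data shape are assumptions for the example, not a description of any system Facebook actually runs.

```python
def is_anomalous_spike(samples, window_minutes=5, low=10, high=100):
    """Flag a live stream whose concurrent-viewer count jumps from
    single or double digits (<= low) into triple figures (>= high)
    within a few minutes.

    samples: list of (minute, viewer_count) tuples, sorted by minute.
    All parameter values are illustrative assumptions.
    """
    for i, (t0, v0) in enumerate(samples):
        if v0 > low:
            continue  # only start measuring from a small audience
        for t1, v1 in samples[i + 1:]:
            if t1 - t0 > window_minutes:
                break  # outside the time window; try a later start point
            if v1 >= high:
                return True  # small audience became large very quickly
    return False

# A stream climbing from 5 viewers to 400 within three minutes is flagged;
# a slow climb over half an hour is not.
print(is_anomalous_spike([(0, 5), (1, 40), (3, 400)]))   # True
print(is_anomalous_spike([(0, 5), (10, 60), (30, 90)]))  # False
```

In practice a flag like this would only prioritise a stream for human review, not remove it automatically, since sudden popularity is far more often benign than criminal.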
Facebook Live relies on its user community to flag inappropriate or graphic content. Once content is flagged, it is reviewed by a global team of moderators, typically within 24 hours. That time-frame, critics say, is too long: videos can be viewed millions of times, or reposted on other social media platforms, before they are removed.
Pressure from advertisers could force a change of tack. YouTube discovered as much last month, when some 250 brands - including McDonald’s and the UK government - pulled their advertising from the platform after it emerged that their ads were being placed alongside videos from notorious white nationalists and jihadist propagandists.
The criticism that Facebook has faced over the past year, especially after the US elections, suggests that the company is struggling to grapple with its increasingly influential role in global affairs. The tech industry, Mr. Nigam said, is in the midst of an “identity crisis”.
Fake news is a case in point. Blatantly false stories, like a report that the Pope had endorsed Donald Trump for president, spread across Facebook like wildfire before the election. A recent BuzzFeed analysis found that the top fake news stories generated more engagement than the top stories from major news outlets over the final months of the election campaign.
According to the Pew Research Center, 44% of American adults get their news from Facebook, and 62% from social media in general.
“I’d say [Facebook] has a greater concern with the dissemination of fake news than they do with live videos,” Ms. Hamlin said. “Considering that Facebook has decided to uptake the role of disseminating information, it would be wise for them to create their own code of ethics in carrying matters out responsibly.”
Mr. Zuckerberg has acknowledged that Facebook is more than a neutral technology platform, admitting “we’re an important part of the public discourse.” He has since outlined plans to combat fake news, but significant challenges lie ahead. Facebook’s sheer scale means that incidents like Mr. Godwin’s murder, or indeed the spread of fake news, will be impossible to eradicate completely.