Outcry prompts Facebook to reverse censorship of iconic Vietnam war image
San Francisco/Washington — After Facebook’s removal of an iconic Vietnam war photo stirred international uproar last month, the social network’s executives backtracked and cleared it for publication.
Yet the image, which shows a naked Vietnamese girl burned by napalm, had been used in company training sessions as an example of a post that should be removed, two former Facebook employees told Reuters.
They said trainers told content-monitoring staffers the photo transgressed Facebook policy, despite its historical significance, as it depicted a naked child, in distress, photographed without her consent.
The social network has taken great pains to craft rules to be applied uniformly with minimal discretion. But the reversal on the war photo shows how Facebook’s top executives sometimes overrule company policy and its legions of low- and mid-level content monitors.
Facebook often insists it is a technology company, not a media company, but an elite group of at least five senior executives directs content policy and makes editorial calls, particularly in high-profile controversies, eight current and former Facebook executives told Reuters.
One of those key decision-makers, Justin Osofsky, who runs the community operations division, wrote a Facebook post admitting that removing the war photo was a "mistake". He said: "Sometimes, the global and historical significance of a photo like ‘Terror of War’ outweighs the importance of keeping nudity off Facebook."
Facebook spokeswoman Christine Chen would not comment on the group’s use of the photo in training sessions. Facebook has long resisted calls to publicly detail its policies and practices on censoring postings. That approach has drawn criticism from users who have had content removed and from free-speech advocates, who cite a lack of transparency and the absence of an appeals process for many content decisions.
At the same time, some governments and antiterror groups are pressuring the company to remove more posts they consider offensive or dangerous.
The current and former Facebook executives, most of them speaking on condition of anonymity, told Reuters in detail how complaints moved through the company’s content-policing apparatus. The toughest calls, they said, rose to an elite group of executives.
Another of the key decision-makers is Global Policy chief Monika Bickert, who helped rule on the fracas over the war photo. "That was one we took a hard look at, and we decided it definitely belonged on the site," said Bickert, a former federal prosecutor. She would not elaborate on the decision-making process.
Facebook chief operating officer Sheryl Sandberg followed up with an apology to Norwegian Prime Minister Erna Solberg, who posted the photo on her own account after Facebook removed it from others in her country.
In addition to Sandberg, Osofsky and Bickert, executives involved in sensitive content issues include Joel Kaplan, Facebook’s Washington-based government relations chief; and Elliot Schrage, vice-president for public policy and communications.
All five studied at Harvard, and four have undergraduate and graduate degrees from the elite institution. All but Sandberg hold law degrees. Three have longstanding personal ties to Sandberg.
CEO Mark Zuckerberg, a Harvard dropout, occasionally got involved in content controversies, Bickert said.
These executives also weigh in on content policy changes meant to reflect shifting social context and political sensitivities around the world, current and former executives said.
Facebook officials said the five people identified by Reuters were not the only ones involved in high-level content decisions. "Facebook has a broad, diverse and global network involved in content policy and enforcement with different managers and senior executives being pulled in depending on the region and the issue at hand," Chen said.
Chen declined to name other executives involved in content policy.
War over free expression
The company’s reticence to explain censorship decisions has been criticised in many countries. Last month, Facebook disabled the accounts of editors at two of the most widely read Palestinian online publications, Shehab News Agency and Quds. In line with company practice, Facebook didn’t publicly explain the action or pinpoint content it considered inappropriate.
The company told Reuters the removal was simply an error. Some Palestinian advocacy groups and media outlets condemned the shutdowns as censorship stemming from what they called Facebook’s improper alliance with the Israeli government.
Israel’s government has pushed Facebook to block hundreds of pages it believes incite violence against Jews, said Noam Sela, spokesman for Israeli cabinet minister Gilad Erdan. Sela said the Israeli government "had a connection" at Facebook to handle complaints, but would not elaborate.
"It’s not working as well as we would like," Sela said. "We have more work to do to get Facebook to remove these pages."
Ezz al-Din al-Akhras, a Quds supervisor, said that Facebook’s head of policy in the Middle East got in touch after the uproar over the shutdowns, and three of the four suspended accounts were restored.
"We hope the Facebook campaign of suspending and removing Palestinian accounts will stop," he said. "We do not practise incitement; we are only conveying news from Palestine to the world." Facebook said the restoration of the accounts was not a response to complaints. It declined to comment on whether top executives were involved.
The company has cited technological glitches in other recent cases where content was removed, then restored, including the takedown of a video showing the aftermath of a Minneapolis police shooting.
Chen declined to explain the glitch. She said the company was reviewing its appeals process in response to public feedback. Facebook currently allows appeals of company actions involving entire profiles set up by people or institutions, or full pages on those profiles, but not for individual posts.
Thick rule book
To manage the huge volume of content complaints — more than a million a day — the company employs a multilayered system. It starts with automated routing of complaints to content-policing teams in Dublin, Hyderabad, Austin and Menlo Park, who make initial rulings, current and former executives said.
These low-level staffers and contractors consult a thick rule book that interprets the comparatively spare "community standards" that Facebook customers are asked to follow. The company trains frontline monitors to follow rules and use as little discretion as possible.
When a removal sparks more complaints, regional managers function as a mid-level appeals court. Continuing controversy can then push the issue to top US executives.
Senior executives also weigh in on policy updates. Osofsky and Kaplan, for instance, wrote a blog post last week, in response to "continued feedback" on content removals, explaining that the company would start weighing news value more heavily in deciding whether to block content.
In an earlier post, responding to the napalm-girl controversy, Osofsky said Facebook’s policies usually work well, but not always.
"In many cases, there’s no clear line between an image of nudity or violence that carries global and historic significance and one that doesn’t," Osofsky wrote.
The Vietnam war photo — depicting horrors suffered by a girl named Phan Thi Kim Phuc — was first removed from an account in Norway by a frontline monitor.
In protest, the Norwegian newspaper Aftenposten printed the image on its front page and posted it on Facebook, which removed it. That prompted the prime minister to post the photo, only for Facebook to remove it again.
Facebook issued a statement defending the action, saying it was "difficult to create a distinction between allowing a photograph of a nude child in one instance and not others". The next day, executives reversed the call, with Sandberg telling the prime minister: "Even with clear standards, screening millions of posts on a case-by-case basis every week is challenging."