Facebook plans to reduce censorship, show more offensive but newsworthy content

Facebook will soon display more graphic content, including violence and nudity, that would normally violate its policies, as long as the imagery is newsworthy or important enough.

Joel Kaplan, Facebook’s VP of global policy, writes: “Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them.”

The move comes after criticism of Facebook’s temporary censorship of the famous “Napalm Girl” photo of a nude child from the Vietnam War, which was shared by a Norwegian journalist and later by the newspaper he works for. Eventually, the company relented and restored shares of the photo after heavy media and public backlash.

Facebook also temporarily took down the video showing Philando Castile’s final moments after he was shot by police. Facebook at the time claimed a “technical glitch” caused the disappearance of the newsworthy video. But Facebook’s VP of News Feed Adam Mosseri last month admitted that Facebook tries to “automatically detect content that violates our standards. And we actually had a sort of miscategorization.” That implies the takedown stemmed from a wrongly applied censorship algorithm rather than some mundane server outage.

And just last night, Facebook censored a Swedish breast cancer awareness video, which it has since apologized for and restored. While Facebook detailed its censorship policy to TechCrunch in July, it clearly needs a better-defined process for applying that policy, which is apparently what’s coming.

Recently, Facebook solicited feedback from its community about what people did and didn’t want to see. The result: “In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards,” Kaplan said.

One possible way to do this would be to age-gate content that might be offensive to minors. As for adults, TechCrunch has suggested Facebook employ interstitial warnings about graphic content that users would have to click through to watch. To aid this, we’ve suggested that Facebook add a content flagging option for marking something as “graphic but newsworthy,” which would let users notify the company about content that needs a warning but shouldn’t be removed.
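To make the idea concrete, here is a minimal sketch of how such a decision flow could combine a community “graphic but newsworthy” flag with age-gating and interstitials. This is purely illustrative and not Facebook’s actual system; every name and rule below is a hypothetical assumption, not something Facebook has described.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Treatment(Enum):
    REMOVE = auto()        # violates standards with no newsworthiness claim
    AGE_GATE = auto()      # hidden entirely from this viewer
    INTERSTITIAL = auto()  # shown behind a click-through warning
    SHOW = auto()          # no restriction

@dataclass
class Post:
    violates_standards: bool   # e.g. graphic violence or nudity
    flagged_newsworthy: bool   # hypothetical "graphic but newsworthy" flag
    viewer_is_minor: bool
    viewer_opted_out: bool     # viewer preference: never show graphic media

def choose_treatment(post: Post) -> Treatment:
    """Decide how to serve a post under this hypothetical scheme."""
    if not post.violates_standards:
        return Treatment.SHOW
    if not post.flagged_newsworthy:
        return Treatment.REMOVE
    # Graphic but newsworthy: never surface it to minors,
    # and respect an explicit opt-out like a hard gate.
    if post.viewer_is_minor or post.viewer_opted_out:
        return Treatment.AGE_GATE
    return Treatment.INTERSTITIAL
```

Under this sketch, the same post can be removed for one claim, gated for a minor, and shown behind a warning to a consenting adult, which is exactly the kind of per-viewer flexibility an interstitial system would buy.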

Facebook has repeatedly stated that it’s not a media company, implying it doesn’t bear a media company’s editorial responsibility to resist censoring newsworthy content, even if that content offends some viewers. Facebook insists it operates as a technology platform that gives users what they want.

On stage at TechCrunch Disrupt, Mosseri said, “We think of ourselves as a technology company. We know we play a meaningful role in media,” yet added that “our responsibility is to make sure we’re a platform for all ideas. We’re not in the business of deciding which ideas people should read about.”

But by relaxing its community standards today, Facebook is directly contradicting that statement. It’s making a judgment call about what’s newsworthy and what people should read about, even if it might offend them or other users. This makes it clear that unless Facebook wants to run an unrestrained free-for-all content site or apply an inflexible censorship policy, it must accept its responsibilities as a media company.