Off With Their Heads! Graphic Content On Facebook Is Judged By A Disturbingly Uneven System


I've never seen a video of someone being decapitated. I don't think I could handle it, frankly. Whatever morbid curiosity I possess, there are limits to the lengths I'll go to satisfy it. But if your curiosity is harder to tame and you want to watch such a video, you probably won't have to look very far. A few days ago, Facebook lifted a six-month-old ban on decapitation videos (the ban originated over a user-posted video that showed a Mexican woman being beheaded for committing adultery). Facebook now allows users to share graphic videos of decapitations because, according to a Facebook rep:

When people share this type of graphic content, it is often to condemn it. If it is being shared for sadistic pleasure or to celebrate violence, Facebook removes it.

Condemnation or not, Facebook backpedaled today and removed the video that started the whole mess after a public outcry that included Facebook users and British Prime Minister David Cameron. Facebook insists, however, that it didn't change any of its policies, nor will it categorically prevent other violent videos from being posted in the future. Each video will be reviewed on a case-by-case basis. Turns out that public pressure was a good tool to use in this case because there really are no legal mechanisms that prevent Facebook from giving users full rein to post whatever content they want. Here's why...

1. Facebook doesn't owe a contractual duty to protect its users from any kind of harm. In fact, Facebook states pretty clearly in its terms and policies that it does not

control or direct users' actions on Facebook and are not responsible for the content or information users transmit or share on Facebook. We are not responsible for any offensive, inappropriate, obscene, unlawful or otherwise objectionable content or information you may encounter on Facebook. We are not responsible for the conduct, whether online or offline, of any user of Facebook.

2. Even if Facebook didn't have contractual protection through the above disclaimer, any tort-based lawsuit against the social network would fail because federal law absolves providers of interactive computer services like Facebook from legal responsibility for obscene content posted by their users. The Communications Decency Act (CDA), which was originally passed in 1996 to regulate pornography on the internet, states that

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

It's worth noting that the CDA also prevents users from suing Facebook if Facebook removes content it deems to be obscene or violent. This means that the CDA is a Teflon-coated, Kevlar-shielded brick wall sprayed in bullet repellent; Facebook is essentially lawsuit-proof.

So if Facebook can't be sued for letting users post the videos, why did it lift the ban after six months only to backtrack when the public freaked out? My guess is that in the absence of litigation, public opinion is all Facebook can rely on to drive its policies. And, until recently, the public has been largely silent on the issue of graphic, violent content. In other words, Facebook assumed that people didn't care about violent content, so it let users upload the videos until the outcry became impossible to ignore.

But this raises a question that I actually find more interesting. Why has Facebook's handling of violent content been so much more uneven than its handling of sexual content? (For those who don't know, Facebook has a blanket policy to remove all nude media from user accounts, including breastfeeding pictures.) Call me crazy, but I have a hard time understanding why a photo of a mother breastfeeding her child, even when her breast is fully exposed, is more offensive than a video showing some poor fellow having his head sawed off, even when the reason for posting that video is to criticize and condemn the act. And it's not like the public has been silent on this issue either. When I googled "Facebook bans nude pictures," I got 38 million results.

I personally don't have a problem with Facebook censoring any user content (the First Amendment, remember, only applies to government censorship... Facebook, as a private party, can censor as much as it wants), but I'd like for its censorship policies to at least have some semblance of uniformity, especially if it won't explain why a photo of a boob is somehow more objectionable than a severed head, or why decapitation videos get individual reviews by the Facebook team while nude pictures get the banhammer. I hope that we can convince Facebook that sexual content deserves at least the same case-by-case scrutiny that it gives to decapitation porn. If not, I fear the puritanical society we may one day become.