Recently I wrote a post about the strict censorship regime the Chinese government enforces. The Great Firewall blocks information left, right and centre in order to ‘protect’ its people. How would you feel if I told you that one of our – and in particular, my personal favourite – social media platforms adheres to similarly strict censorship rules and regulations? Well, get ready for it…
It has been noted that “more than 1.4 billion people around the world use Facebook every month” (Al Jazeera, 2015), and those users span countless legal systems, cultures and accepted social norms and customs. Given an audience of that scale, each member capable of contributing to a global news platform, Facebook censors the content its users upload on a daily basis.
The rule book that Facebook moderators follow when deciding the fate of uploaded user content has been leaked online, and its definitions of ‘offensive material’ and ‘unacceptable content’ prove to be intriguing. According to Facebook’s Community Standards, the social networking conglomerate aims to keep individual users safe by “removing content, disabling accounts and working with law enforcement”, and moreover urges respectful behaviour online (Facebook Community Standards, 2015). The Abuse Standards Violations handbook utilised by Facebook moderators covers, but is not limited to, censoring sexual content and nudity, hate speech, graphic content and self-harm. Whilst it is the job of moderators to control the content accessible on the social network, Facebook also “relies on users to identify breaches of the community standards”. “If you see something on Facebook that you believe violates our terms,” the site implores, “you should report it to us” (The Economist, 2014). But does this warrant the strong regulative censorship that Facebook engages in?
In 2011, Facebook actively censored an image shared by a user portraying a homosexual couple kissing. Whilst the content was originally removed on the grounds that it infringed Facebook’s Statement of Rights and Responsibilities, Facebook later addressed the issue and apologised for its mistake (Chen, 2011).
This is not the only case where Facebook’s confusing censorship guidelines have caused controversy in the social networking blogosphere. Ryan Tate reports that “technically, Facebook allows breastfeeding pictures. But such pictures still seem to get regularly banned on the social network” (Tate, 2012).
More recently, in 2014, German photographer Peter Kaaden went so far as to blur out the genitalia on the Louvre’s sculptures after the images he shared on the social networking site were removed for their exposure of nudity. Keep in mind that whilst Facebook’s content is strictly monitored, it is okay to share images of blood, gore and crushed skulls “as long as no insides are showing” (oDesk, Abuse Standards 6.2).
So what does this suggest about Facebook’s methods of censorship? The process by which Facebook declares an article, post or image to be offensive or unacceptable has proven to be a confusing and highly contentious one. The extent to which Facebook sanitises its content is remarkable, and there is no doubt that this form of control changes the way we perceive the online global media forum.
Much like Facebook, Instagram has recently been under fire for censoring and consequently removing images depicting menstruation. To follow the story of Rupi Kaur’s censorship struggle with the popular networking platform, I suggest you take a look at this.
- Al Jazeera, 2015, ‘Is Facebook Right to Censor User Content?’, Al Jazeera, 17 March, viewed 15 April 2015, <http://www.aljazeera.com/programmes/insidestory/2015/03/facebook-censor-user-content-150317021301434.html>
- Chen, A., 2011, ‘Facebook Apologises For Censoring Gay Kiss Picture’, Gawker, 19 April, viewed 15 April 2015, <http://gawker.com/5793536/facebook-apologizes-for-censoring-gay-kiss-picture>
- Chen, A., 2012, ‘Inside Facebook’s Outsourced Anti-Porn and Gore Brigade, Where ‘Camel Toes’ are More Offensive than ‘Crushed Heads’’, Gawker, 16 February, viewed 15 April 2015, <http://gawker.com/5885714/inside-facebooks-outsourced-anti-porn-and-gore-brigade-where-camel-toes-are-more-offensive-than-crushed-heads/all>
- Facebook, 2015, ‘Community Standards’, Facebook, viewed 15 April 2015, <https://www.facebook.com/communitystandards>
- Facebook, 2015, ‘Statement of Rights and Responsibilities’, Facebook, 30 January, viewed 15 April 2015, <https://www.facebook.com/legal/terms>
- Frank, P., 2014, ‘Artist Hilariously Censors The Louvre’s Nude Statues for Facebook (SFW!)’, The Huffington Post, 17 July, viewed 15 April 2015, <http://www.huffingtonpost.com/2014/07/17/peter-kaaden_n_5592038.html>
- oDesk, ‘Abuse Standards 6.2: Operation Manual for Live Content Moderators’, oDesk Proprietary and Confidential, viewed 15 April 2015, <https://moodle.uowplatform.edu.au/pluginfile.php/363742/mod_resource/content/1/manualfacebookmoderators.pdf>
- Tate, R., 2012, ‘Why Moms Are Breastfeeding in Facebook’s Face’, Gawker, 7 February, viewed 15 April 2015, <http://gawker.com/5883135/why-moms-are-breastfeeding-in-facebooks-face>
- The Economist, 2014, ‘Facebook Censorship: Arbitrary and Capricious’, The Economist, New York, 28 August, viewed 15 April 2015, <http://www.economist.com/blogs/democracyinamerica/2014/08/facebook-censorship>