Are Instagram, Pinterest and Tumblr Leading the Way in a Content Censorship Wave?
Over the weekend, Instagram updated its community guidelines to prevent the sharing of content that promotes self-harm, coming on the heels of similar updates at both Pinterest and Tumblr. The changes bring to the forefront questions about the responsibilities of visual curation sites around images that contain “questionable content.”
Do image-oriented and visual curation sites have unique responsibilities, given that photos can serve as very specific maps for how to carry out certain harmful activities, such as self-mutilation? Does the discussion extend to other social sites such as Twitter and Facebook, where I can link to such a picture? What about the role of the general content arbiters, the search engines? Should our ability to search for and connect with potentially objectionable content be controlled at all?
In an excerpt from the Instagram blog, the staff explains the site’s new rules:
Don’t promote or glorify self-harm: While Instagram is a place where people can share their lives with others through photographs, any account found encouraging or urging users to embrace anorexia, bulimia, or other eating disorders; or to cut, harm themselves, or commit suicide will result in a disabled account without warning. We believe that communication regarding these behaviors in order to create awareness, come together for support and to facilitate recovery is important, but that Instagram is not the place for active promotion or glorification of self-harm.
Instagram defines self-harm as content promoting eating disorders, self-mutilation behaviors such as cutting, and suicide. More than 30,000 images have been removed from Instagram, and search tags such as #anorexia, #bulimia, #thinspo and #thinspiration, which are connected with pro-anorexia communities and discussions, have been banned.
These sites either have plans in the works, or features already in place, to censor searches for these terms: warning users that their accounts can be terminated if they try to tag content with a prohibited term, or displaying public service messages designed to deter harmful behaviors. Several accounts suggest that these features aren't working properly, or are already being circumvented by members of the pro-ana community.