It is a sensitive matter because the work of content moderation itself consists of the grim job of erasing all the dirt, the crime, the outrageous, the terrifying, and the abominable that humans are capable of putting on the web. Commercial content moderation is a service through which commercial sites have user content "cleaned," usually by outsourcing the task to specialized companies. Employees view each piece of content, assess it against the agreed moderation rules, and delete what violates them. In the process, they often suffer psychological damage.
Active moderation filters all content shared on the network, in real time. It is done by both humans and machines and is significantly more labor-intensive. Nevertheless, it now seems to be becoming an imperative in order to keep up with users' rising expectations.
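To make the idea concrete, here is a minimal sketch of what such a real-time, machine-plus-human pipeline might look like: a classifier scores each post before publication, clear cases are decided automatically, and the uncertain middle band is escalated to human moderators. Everything in it (the thresholds, the toy risk score, the names) is an illustrative assumption, not a description of any actual platform's system.

```python
# Hypothetical sketch of an "active" (pre-publication) moderation pipeline.
# All names, thresholds, and the scoring function are illustrative only.

from dataclasses import dataclass
from enum import Enum


class Verdict(Enum):
    APPROVED = "approved"       # published immediately
    REJECTED = "rejected"       # blocked before anyone sees it
    HUMAN_REVIEW = "review"     # queued for a human moderator


@dataclass
class Post:
    author: str
    text: str


BLOCKLIST = {"slur_a", "slur_b"}  # placeholder terms


def risk_score(post: Post) -> float:
    """Toy stand-in for an ML classifier: returns 0.0 (safe) to 1.0 (harmful)."""
    words = post.text.lower().split()
    hits = sum(1 for w in words if w in BLOCKLIST)
    return min(1.0, hits / max(len(words), 1) * 10)


def moderate(post: Post, low: float = 0.2, high: float = 0.8) -> Verdict:
    score = risk_score(post)
    if score >= high:
        return Verdict.REJECTED
    if score <= low:
        return Verdict.APPROVED
    return Verdict.HUMAN_REVIEW  # the labor-intensive part: humans decide


if __name__ == "__main__":
    print(moderate(Post("alice", "lovely sunset over the bay")))  # APPROVED
```

The human-review band in the middle is exactly where the labor intensity mentioned above comes from: the clearer the machine's two thresholds, the fewer posts humans must see, but the more mistakes slip through at the edges.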
5. NONDISCLOSURE: Content sweepers are bound by very strict confidentiality rules and contracts and are not even allowed to speak openly with other company employees. When they deal with other departments, such as legal, privacy and security, or brand and product managers, they do not show the actual piece of content; they are asked to describe it instead. In other words, "smelly" things need to be contained. There is also a tier of self-regulation created by most social media giants: Safety Advisory Boards/Councils that gather independent experts, civil society organizations, or even think tanks like Google's Jigsaw to discuss and decide upon sensitive issues. Nevertheless, those discussions remain private. The most open, as industry insiders point out, seems to be Pinterest; being candid about its content moderation policies and striving for transparency has apparently earned the platform a good reputation.
7. CONTENT: The actual content of some of the deleted imagery is hard to describe. It includes cruelty, horror, death, and terror directed at humans, children, and sacred artifacts and symbols. Most shocking of all, testimonials say, is seeing these dark images through the eyes of the perpetrator: it is not like watching a news report, but like watching it happen live, complete with the criminal's often hyper-elated reactions.
8. RELATIVITY: Whether we admit it or not, content moderation is, to some extent, the same thing as censorship. To censor a given public's access to a piece of content, you must understand that public's deepest and most profound cultural, religious, and historical traits. What is outrageous for one public may be mundane for another. This makes it difficult both to build automated content moderation mechanisms and to outsource the task to specialized businesses: there will always be a misunderstanding, an outcry, an accident. Meanwhile, social media companies have to walk a very thin line.
9. CONTEXT: The context in which a piece of content is posted on a social network also matters, and it can produce traumatizing situations even when the trauma is not readily apparent. Moreover, a moderator's decision to delete or keep a piece of content, such as footage of the shooting of a civilian girl during protests, can alter an entire country's political life.
10. ABUSES: Many social networks still struggle to protect victims of abuse through their own content moderation systems. It is quite common for so-called internet trolls, savvy about the network's automated systems, to weaponize those systems to silence people with opposing points of view. This is especially frequent in political disputes, but it also happens in business, sports, and other passion-stirring areas.
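As a rough illustration of the mechanism trolls exploit, here is a sketch of a naive report-count rule that auto-hides content, and one possible mitigation. The threshold and the reputation weights are purely hypothetical assumptions; real platforms do not publish these details.

```python
# Hypothetical illustration of why naive automated moderation is abusable:
# if content is auto-hidden once it crosses a raw report-count threshold,
# a coordinated group of trolls can silence any opinion they dislike.
# Threshold and reputation values below are assumptions for illustration.

from collections import defaultdict

REPORT_THRESHOLD = 5

# Naive rule: every report counts the same.
naive_reports: dict[str, int] = defaultdict(int)


def naive_report(post_id: str) -> bool:
    """Returns True once the post gets auto-hidden."""
    naive_reports[post_id] += 1
    return naive_reports[post_id] >= REPORT_THRESHOLD


# A brigade of five accounts is enough to take down a legitimate post.
for _ in range(5):
    hidden = naive_report("activist-post-1")
print("naive rule hides post:", hidden)  # True

# One possible mitigation: weight reports by reporter reputation, so a
# burst of reports from low-trust accounts no longer clears the bar.
weighted_reports: dict[str, float] = defaultdict(float)
reputation = {"troll": 0.1, "trusted_user": 1.0}  # illustrative values


def weighted_report(post_id: str, reporter_kind: str) -> bool:
    weighted_reports[post_id] += reputation[reporter_kind]
    return weighted_reports[post_id] >= REPORT_THRESHOLD


for _ in range(5):
    hidden = weighted_report("activist-post-2", "troll")
print("weighted rule hides post:", hidden)  # False: 5 * 0.1 < 5
```

Even the mitigation only shifts the problem: trolls then work on building reputation first, which is why automated defenses alone keep failing the victims described above.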
There is plenty more to say and grasp about this topic; these are just my first conclusions after a few days' worth of research. I will leave below the most valuable links I could find for further reference.
http://www.wired.com/2014/10/content-moderation/
http://hyperallergic.com/271703/the-stories-of-content-moderators-hosted-on-the-darknet/
