Content Moderation – The Undignified Work in Social Media

Today I am touching on a sensitive subject – content moderation in social media.

It is a sensitive matter because the work of content moderation itself is the grim job of erasing all the dirt, the crime, the outrageous, the terrifying and the abominable that humans are capable of putting on the web. Commercial content moderation is a service in which commercial sites have user content "cleaned", usually by outsourcing the task to specialized companies. Employees view each piece of content, assess it against the agreed moderation rules, and delete it if it violates them. In the process they often suffer psychological damage.

Here are the 10 things that you probably don’t know about content moderation in social media:
1. NUMBERS: Reportedly, social networks rely on armies of content moderators. One estimate, which I came across quite often in my research on the subject, puts the figure at 100,000 content moderators across the entire industry, with big companies employing significantly more content sweepers than engineers and other specialized staff.
2. TYPES: There are three main types of content moderation: automated, passive and active moderation.
Automated moderation, as you can probably guess, is content blocking based on algorithms developed within each social network. It is constantly evolving but cannot substitute for human moderation. There are "crazy" innovations happening in this area as we speak, such as adaptive listening technologies, 3D modelling or PhotoDNA.
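As a toy illustration of the principle behind hash-matching tools such as PhotoDNA, the sketch below checks an upload against a hypothetical database of known-banned hashes. Real systems use perceptual hashes that survive resizing and re-encoding; an exact SHA-1 digest is used here purely for simplicity, and the database entry is a placeholder, not real data.

```python
import hashlib

# Hypothetical database of hashes of known-banned images.
# (This entry is just the SHA-1 of the bytes b"abc", standing in
# for a real hash list maintained by the platform.)
BANNED_HASHES = {
    "a9993e364706816aba3e25717850c26c9cd0d89d",
}

def is_banned(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known-banned hash."""
    digest = hashlib.sha1(image_bytes).hexdigest()
    return digest in BANNED_HASHES

print(is_banned(b"abc"))    # matches the placeholder entry -> True
print(is_banned(b"other"))  # unknown content -> False
```

The point of the sketch is that automated blocking of *known* material is cheap and instant; the hard part, which still falls to humans, is everything the database has never seen.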
Passive content moderation is done by humans and is based on a flagging system driven by the users themselves. Once a piece of content is reported as offensive, it goes to a content moderator's screen, where, based on the platform's internal set of rules, a decision is made on whether that content stays up or comes down.
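The flagging workflow above can be sketched as a simple review queue. Everything here is hypothetical – the category names, the rule set and the function names are illustrative, not any platform's actual policy:

```python
from collections import deque
from dataclasses import dataclass

# Hypothetical internal rule set: reported categories that must come down.
REMOVAL_CATEGORIES = {"violence", "hate", "harassment"}

@dataclass
class FlaggedItem:
    content_id: str
    reported_category: str

review_queue: deque[FlaggedItem] = deque()

def user_flags(content_id: str, category: str) -> None:
    """A user report pushes the item onto the moderators' queue."""
    review_queue.append(FlaggedItem(content_id, category))

def moderator_reviews() -> tuple[str, str]:
    """A moderator takes the next item and applies the internal rules."""
    item = review_queue.popleft()
    decision = "remove" if item.reported_category in REMOVAL_CATEGORIES else "keep"
    return item.content_id, decision

user_flags("post-42", "violence")
user_flags("post-43", "satire")
print(moderator_reviews())  # ('post-42', 'remove')
print(moderator_reviews())  # ('post-43', 'keep')
```

Note that nothing is reviewed until a user complains, which is exactly why passive moderation scales cheaply but lets unreported material stay up indefinitely.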
Active moderation filters the entire stream of content shared on the network, in real time. It is done by both humans and machines and is significantly more labor-intensive. Nevertheless, it increasingly seems to be an imperative in order to keep up with users' rising expectations.
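A minimal sketch of the machine side of active moderation, assuming a made-up pattern list: every upload is screened before it is published, with clear violations blocked outright and ambiguous cases routed to a human moderator.

```python
import re

# Hypothetical screening rules; real platforms use far richer classifiers.
BLOCKED_PATTERNS = [re.compile(r"\bgraphic violence\b", re.IGNORECASE)]
NEEDS_HUMAN = [re.compile(r"\bweapon\b", re.IGNORECASE)]

def screen_upload(text: str) -> str:
    """Return 'blocked', 'human_review' or 'published' for a new post."""
    if any(p.search(text) for p in BLOCKED_PATTERNS):
        return "blocked"       # machine removes it outright
    if any(p.search(text) for p in NEEDS_HUMAN):
        return "human_review"  # held for a moderator before publishing
    return "published"         # goes live immediately

print(screen_upload("Selling a weapon"))   # human_review
print(screen_upload("Nice sunset photo"))  # published
```

Because every single post passes through this gate, the human-review branch is what makes active moderation so much more labor-intensive than the passive, flag-driven model.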
3. LOCATION: Most content moderation is done far away from the fancy Silicon Valley campuses, with many companies subcontracting the work to developing countries such as the Philippines. According to a Wired article, the Philippines is preferred because of its cultural ties with the US, as a former colony, and content moderation, as one could imagine, requires a lot of cultural familiarity. Even when the work is outsourced, headquarters will always keep an in-house content moderation team to deal with the most sensitive cases. The employees are generally entry-level, predominantly women, and their salaries are significantly lower than the industry average. While one can only guess why this is, we can safely conclude that this is a side of the industry that receives very little recognition.
4. PSYCHOLOGICAL SIDE EFFECTS: Most workers don't last in these jobs for more than six months, and they often develop psychological difficulties a few weeks in, such as depression, anxiety, paranoia, low self-esteem, low sociability and substance abuse. Although they receive professional counseling, it rarely proves effective.
5. NONDISCLOSURE: The content sweepers are bound by very strict confidentiality rules and contracts and are not even allowed to speak openly to other company employees. When they deal with other departments such as legal, privacy and security, or brand and product managers, they don't show the actual piece of content but are asked to describe it instead. In other words, "smelly" things need to be contained.
6. POLICY & LEGISLATION: For the US, the notorious Section 230 of the 1996 Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider". In other words, social media companies are exempt from any legal obligation to moderate their content. Nevertheless, they choose to do it because it makes business sense.

The situation is more or less the same in Europe, where again the social platforms (or hosting companies in general) are not responsible for content posted by their clients unless that content is reported as abusive. Europe also has the "Right to Be Forgotten" legal principle, which is rather more restrictive than US legislation, separating issues of freedom of speech from those of the right to privacy.

There is also a tier of self-regulation created by most social media giants, with Safety Advisory Boards/Councils that gather independent experts, civil society organisations or even think tanks like Google's Jigsaw to discuss and decide upon sensitive issues. Nevertheless, the discussions remain private. The most open, as industry insiders point out, seems to be Pinterest. Apparently, being open with its content moderation policies and trying to be more transparent has earned the platform a good reputation.

7. CONTENT: The actual content of some of the deleted imagery is hard to describe. It includes cruelty, horror, death and terror directed at humans, children, sacred artifacts and symbols. Most shocking, testimonials say, is seeing these dark images through the eyes of the perpetrator. It is not like watching a news report but like watching it happen in real life, complete with the criminal's often hyper-elated reactions.

8. RELATIVITY: Whether we say it or not, content moderation is, to some extent, the same thing as censorship. To censor a certain public's access to a piece of content, you must understand that public's deepest and most profound cultural, religious and historical traits. What is outrageous for one public can be mundane for another. This makes it difficult both to build automated moderation mechanisms and to outsource the task to specialized businesses. There will always be a misunderstanding, an outcry, an accident. Meanwhile, social media companies have to walk a very thin line.

9. CONTEXT: The context in which a piece of content is posted on a social network also matters, and can create traumatizing situations even when the trauma is not readily apparent. Moreover, moderators' decisions to delete or keep a piece of content, such as footage of the shooting of a civilian girl during protests, can alter an entire country's political life.

10. ABUSES: Many social networks still have problems protecting victims of abuse via their own content moderation systems. It is quite common for so-called internet trolls, familiar with the network's automated systems, to exploit them to silence those with opposing points of view. This is especially common in political disputes, but also in business, sports and other passion-stirring areas.

There is plenty more to say and grasp about this topic. I have just highlighted my first conclusions after a few days' worth of research. I will leave below the most valuable links I could find for further reference.

