Abuse Standards Violations, 2016, 2018, 2021
Wall-mounted Plexiglas panels with content moderation guidelines
UV print on Plexiglas, various insulation materials, spacers, screws
100 × 100 cm / 150 × 100 cm
Courtesy the artists and Apalazzo Gallery
The presentation of The Bots is complemented by the nine-part work Abuse Standards Violations, which marks the beginning of Eva & Franco Mattes’ research into content moderation. Its theme is the morality of social media and of the tech giants behind it.
Nine wall-mounted Plexiglas panels, backed with insulation materials, display corporate guidelines, for example excerpts from the Facebook Community Standards, which were never intended for public view but for internal use only. The companies that produced these guidelines are almost all unknown, as they wish to remain anonymous. Most of the time even the moderators themselves do not know who their employer is; one of them told Eva & Franco Mattes: ‘I’m pretty sure I work for Google’.
The guidelines against abuse standards violations set moral boundaries for what the companies consider questionable content on social media, laying down what counts as racist, hateful, controversial, terroristic, pornographic or violent and must therefore be removed. ‘Clean’ or ‘OK to show’ refers to images considered acceptable and thus free to circulate, such as ‘Shirtless but wearing pants or shirts (and not more than the top band of their underwear is visible)’; ‘inappropriate’ images may touch on politics and controversial social issues and are to be filtered. ‘Safe’ content includes fine art and celebrity gossip. Despite these set guidelines, confusion remains over when content should be removed and who gets to decide what to remove. At that point human interpretation is required; an algorithm-based assessment alone is insufficient.
The policies of the large social media platforms change daily, adapting to current social and political events. Since most of these tech companies are based in California, they largely follow US law and US ‘morality’, while striving to be sensitive to local and culturally specific morals. The difficulty lies in moderating content across all cultural contexts without letting cultural bias harden into political interpretation.