False Positives: Overview A – Timeframe: 04’ 26”, 2015-16
False Positives: Overview H – Timeframe: 02’ 13”, 2015-16
False Positives: Overview I – Timeframe: 05’ 42”, 2015-16
C-Prints, each 130 × 97.51 cm
Courtesy of the artist
Esther Hovers’ work False Positives thematises the systems underlying so-called smart cameras. These cameras not only record their environments, but also interpret the images in real time. Human behaviour, our movement through space, is analysed and compared with learned patterns. When a system detects anomalous movements, it categorises the behaviour as suspicious. At the same time, the pattern recognition remains open enough to allow a certain leeway, so that most anomalies turn out to be false alarms, or ‘false positives’.
Hovers worked with security experts to determine eight suspicious movement patterns: standing still, fast movements, lonely objects, placement on a corner, clusters breaking apart, synchronised movements, repeatedly looking back, and deviant directions. Hovers’ images come from an administrative district in Brussels, where she photographed both random pedestrians and staged movements through public space. In digital post-production, she assembles up to twenty shots into a single image, condensing events from an expanded stretch of time into a single constructed moment. This synthesis mirrors the methods of intelligent surveillance systems, which likewise process behaviour across larger temporal correlations.
Algorithms control an ever-increasing number of systems, from managing complex processes and infrastructures to rating individuals. The population is also increasingly aware of these applications, which have begun to transform people’s behaviour. Systems and people reciprocally determine each other, with each adjusting to the other. What is considered normal or abnormal is no longer determined solely by a social collective, but also by data and statistics. These mathematical models of the world are applied via algorithms. The question remains, however, exactly which calculations and statistics underlie the measurements made by the government authorities and private firms that commission these systems. Which classifications eventually lead to ‘false positives’? Which individual traits are regarded as suspicious and possibly criminal? Do we want to live in a society where algorithmically calculated estimations lead to the quantification and classification of individuals?
Dutch artist Esther Hovers (b. 1991) graduated in photography from the Royal Academy of Art, The Hague (NL). Since graduating, her work has been featured in numerous international exhibitions, including C/O Berlin Foundation (DE), Alan Gallery, Istanbul (TR), Festival Circulation(s) in Paris (FR), Foam Photography Museum, Amsterdam (NL) and the National Gallery in Prague (CZ).