Decoded records: the amateur present

Images produced by the general public are interpreted by a set of artificial intelligences: Amazon Rekognition, Darknet YOLO, Facebook Detectron, Google Cloud Vision, Microsoft Azure, IBM Watson, and the APIs of the services Deep AI and Clarifai. The latter has officially been used by the United States Department of Defense.

As these image-reading technologies are not fully trained to understand the art system, many of the results offer new ways of understanding what was recorded there, expanding or flattening its meanings.

Any visitor to the Biennial can send images to our decoding album by using this form.


Records Methodology

The results presented here for the images of the Biennial (this edition and past ones) were arrived at through a “methodological” approach with technological/scientific bases, always with the aim of highlighting unexpected results. We are interested in the responses of the artificial intelligences that expand the meanings of the images and/or reveal the structuring categories of these platforms.

If you are interested in consulting the complete results, without our filter, visit the Biennial Art Decoder, an internal platform built exclusively for the project, which presents all the results for the images produced by the different artificial intelligences.

In the “Content” category there is a grading of the results under the descriptors “adult,” “spoof,” “medical,” “racy,” and “violence,” on the following scale: very likely > likely > possible > unlikely > very unlikely. The term “Safe for Work,” in turn, is an Internet expression for content appropriate for viewing in public places or at work.
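As a rough illustration of how such a grading can be read programmatically, the sketch below maps the five-step likelihood scale to an ordered index and treats an image as “Safe for Work” when every risky descriptor is graded “unlikely” or lower. The category and likelihood names follow the convention described above; the function name and the threshold chosen here are illustrative assumptions, not part of any actual Biennial Art Decoder API.

```python
# Illustrative sketch: reading a "Content" grading.
# The likelihood scale is ordered from least to most likely,
# matching the scale described in the text.
LIKELIHOOD_SCALE = [
    "very unlikely", "unlikely", "possible", "likely", "very likely",
]

def is_safe_for_work(grading: dict) -> bool:
    """Treat an image as Safe for Work when every risky descriptor
    is graded 'unlikely' or 'very unlikely' (a hypothetical rule)."""
    risky = ("adult", "racy", "violence")
    return all(
        LIKELIHOOD_SCALE.index(grading.get(cat, "very unlikely")) <= 1
        for cat in risky
    )

# Example grading as it might appear for one decoded image.
grading = {
    "adult": "very unlikely",
    "spoof": "possible",
    "medical": "unlikely",
    "racy": "unlikely",
    "violence": "very unlikely",
}
print(is_safe_for_work(grading))  # True under this sketch
```

The threshold (index ≤ 1) is one plausible reading of “appropriate for viewing in public places”; a stricter platform might require “very unlikely” across all descriptors.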