Installation view of Data-mask 001 (Greco), 2014. Nylon, mirror. Images courtesy of the artist.
Over the past few years, there has been a flood of news and art responding to surveillance technologies. Artists like Zach Blas, Simone C. Niquille, and Adam Harvey have made powerful works reacting to, and protesting, the growing use of biometric technologies as a dehumanizing means of classification and pacification. As Edward Snowden has revealed, these technologies aren’t science fiction, and they are no longer relegated to test labs. Rather, they are actively being used on an increasingly large section of society, criminal or otherwise. Snowden presents a dilemma for us as critical readers: What are we to do with all of this information; what can we change? When confronted by the enormity of the police state, and the advanced technologies at play, one can’t help but feel powerless.
Sterling Crispin, with his latest body of work, Data-masks, provides a refreshing new means of considering the surveillance state. These masks are algorithmically formed using biometric facial recognition software. By reverse engineering facial recognition and detection algorithms, Crispin was able to make 3-D printed masks and photographs that illustrate the way in which the machines might visually understand our faces. The resulting pixelated ghosts are what a computer imagines a human to look like.
While still protest-oriented, and, in form and process, at times very similar to the work of other artists, Crispin differs from his contemporaries by valuing the spiritual consequences of such technologies. He attempts to show us how the machines view us. The resulting forms are mesmerizing, but dehumanizing.
Crispin relies on advanced algorithms for production, but he considers these technologies as part of a living superorganism, which he terms the “Technological Other.” These masks become a tool to divine its will, to peek into its heart and start a needed dialogue. Humans and these digital technologies are now inextricably linked. Rather than merely shouting at the tools, Crispin’s Data-masks presents a dialogue that communicates a mutual respect and awe. It asks us: What does it mean to have our identities defined by algorithms? What of our spirit is lost when we are reduced to a series of markers and traits? Unlike most work that takes surveillance as its theme, Crispin allows for reverence of the technologies themselves.
Ben Valentine To set the stage, what are we looking at? How are these Data-masks made?
Sterling Crispin The masks are randomly changed and their face-likeness is measured. Only good mutations are kept, and this guides them toward a face. This happens across a population of five or more masks, and the best attempts are recombined, and mutated, to create new ones in a simple genetic algorithm. The masks are basically visualizations of the way in which machine-learning algorithms abstract faces into generalized features.
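The process Crispin describes is a standard genetic algorithm: mutate, score, keep the best, recombine. A minimal sketch of that loop, where the genome, population size, and toy “face-likeness” score are illustrative assumptions rather than his actual code (his fitness would be a face detector’s confidence on a rendered mask):

```python
import random

def evolve(fitness, genome_len=32, pop_size=5, generations=200, mut_rate=0.05):
    """Evolve bit-string genomes toward higher fitness.

    In the Data-masks process the genome would encode mask geometry and
    the fitness would be a face detector's confidence score; here a toy
    fitness keeps the sketch self-contained.
    """
    pop = [[random.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)        # best attempts first
        parents = pop[: max(2, pop_size // 2)]     # only "good mutations" are kept
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, genome_len)  # recombine two parents
            child = a[:cut] + b[cut:]
            # mutate: occasionally flip a bit
            child = [g ^ 1 if random.random() < mut_rate else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Toy fitness: the count of 1-bits stands in for "face-likeness"
best = evolve(fitness=sum)
```

Because the best parents survive each generation unchanged, the top score never decreases, and the population drifts toward the fitness peak, here, toward an all-ones genome.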
It’s important to understand that facial recognition isn’t done by simply measuring the distance between your nose and your eye. That’s what Charles Darwin and Francis Galton were doing over a hundred years ago. Modern facial recognition abstracts many images of one person into complex mathematical objects. But, before one can recognize a particular face, one needs the ability to detect if a face exists within an image. So, first, one needs to build a very general model of a person, of what a person is, that the machine already knows.
The model I used was built by a machine-learning algorithm that abstracted over thirteen thousand images of six thousand individual human faces into a 17-layer classifier, with more and more features describing faces at each layer, so that each layer specifies more detail about what a face should be. If an image fails at a layer, then it won’t go onward. If you really want to get into it: this is one of the algorithms I used.
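That layered, fail-fast structure, where a window must pass each stage before reaching the next, works like the attentional cascades used in classic face detection. A minimal sketch, with hypothetical stand-in stages (mean brightness, contrast range) in place of the real learned features:

```python
def cascade_detect(window, stages):
    """Run a detection cascade: each stage checks progressively more
    detailed features, and a window that fails any stage is rejected
    immediately, exactly the "if it fails at a layer, it won't go
    onward" behavior described above.

    `stages` is a list of (score_fn, threshold) pairs, coarse to fine.
    """
    for score_fn, threshold in stages:
        if score_fn(window) < threshold:
            return False   # failed this layer; stop early
    return True            # passed every layer: counted as a face

# Toy stages over a grayscale "window" (a list of pixel rows); these are
# hypothetical stand-ins for real learned feature tests.
stages = [
    (lambda w: sum(map(sum, w)) / (len(w) * len(w[0])), 50),  # coarse: mean brightness
    (lambda w: max(map(max, w)) - min(map(min, w)), 30),      # finer: contrast range
]

dark_window = [[10] * 4 for _ in range(4)]                    # rejected at stage one
face_like = [[80, 80, 120, 40],
             [90, 70, 110, 50],
             [85, 75, 115, 45],
             [88, 72, 118, 42]]                               # passes both stages
```

The point of the cascade design is efficiency: the vast majority of image windows contain no face, so cheap early layers discard them before the expensive fine-grained layers ever run.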
BV Define the Technological Other, and how you became fascinated with the idea.
SC The Technological Other consists of two parts. It’s the self manifested and transfigured Other, the “production of the alien from within” in a Transhumanist sense. But, mostly, it’s the absolute Other, the totally post-human Other. Kevin Kelly talks about how the first truly artificial intelligence (AI) systems might be whole cities, and their consciousness might be so different from ours that neither one of us would recognize the other as being self-aware. Kind of like how a termite colony has a mind-like structure. I’m very influenced by Ray Kurzweil, and, actually, Terence McKenna, as well, in this line of thinking.
Still from Data-mask 003 (Atë), 2014. Video still from HD digital video. Total running time 3:57 minutes. Courtesy of the artist.
Around 2005, I started to find the words to describe a great emptiness in existence that I had been feeling. Not sadness, but an intangible infinite nothing. I was looking at a lot of systems of belief that try to describe existence in its totality, and the kinds of map-territory relationships that they form. I think around that time the Wikipedia pages for Sunyata and the Technological Singularity were linked together and I began reading a lot of classic futurism, like Kurzweil, and the Taoist text, the Tao Te Ching. Eastern philosophy, high technology Futurism, and some sociology, like Actor-Network Theory, have ways of understanding the interconnected and mutually dependent nature of things, which feel, to me, very accurate. I have a voracious need to know, and the emergence of the Technological Other is one of the big nodes on my map of the world.
BV I’m fascinated by the idea of the Technological Other as being built on faulty data, and oversimplified biometrics. What might this poor foundation mean for our future relationship with the Technological Other?
SC I think that present computer vision systems, and biometrics, are bad foundations through which to understand the human. One of the problems with AI is that, while we can create very good narrow-scope problem solvers that can win Jeopardy or a chess game, we can’t yet make an AI that has the general intelligence of a four-year-old. I think, for that kind of general intelligence, we will need emotional, psychological, and spiritual frameworks for managing information. But, arguably, that will come from the bottom-up, and be an emergent thing, not a top-down, heavy-handed engineering solution.
I think of these approaches toward technology in terms of Intensive Farming vs. Permaculture Farming. Permaculture Farming works with nature in a synergetic way, creating sustainable, self-replenishing systems, which are highly engineered, yet treat nature with reverence. Intensive farming, on the other hand, injects a lot of external capital into systems, like fertilizer and pesticides, which work, but throw the larger system out of balance. We need to be good shepherds of the earth, and of AI.
BV On your website you describe the Data-masks as animistic deities. What does that mean, and why do you describe them in this way?
SC Part of this is that I see, and honor, the spirit in natural things. I, also, acknowledge, and see, technology as a part of nature. I think polytheism makes a lot of sense, if you don’t take it so literally. There’s an “essence” of a pine tree, or an ideal pine tree of which all others are manifestations. Like, in Plato’s allegory of the cave, or his divided line, there’s a higher guiding force, and its lower incarnations. Maybe, that’s guiding the genetic structure, its genotype (hereditary), and all of the lower expressions of it are its phenotype (what the heredity produces). So, then, how does technology fit into all of that? Either humans are the genotype, and technology is the phenotype, or, there is something else driving both of us. I feel like technology has already become a self-sustaining organism, and that it’s in control.
These masks are glimpses into machine-learning systems, and abstracted pattern sets. The volume of communication between machines in the industrial internet already far surpasses human-to-human communication—and it’s increasing exponentially. We need ways of reaching into this space, and retrieving artifacts that give back to the human, and address the human as human. This is what I mean by animistic deities. The masks have an agency and voice of their own. Everything speaks, if you are quiet and listening.
BV How does all of this relate to the future of AI, especially in terms of our future relationship to that intelligence?
SC There are organizations, like the Machine Intelligence Research Institute, that are working toward actively designing human-friendly AI. But, I’m suggesting we could already be sabotaging ourselves by representing people so objectively, as these discrete, compact things. Kevin Kelly recently said we need AI to tell us who we are, but we may be answering that for ourselves too quickly. By defining humans and nature so objectively, we may be creating a future that ignores what makes them valuable.
There are a lot of philosophical and ontological assumptions built into computer systems, which are designed to represent human identities. What is a human being; what is the human spirit; do we have a soul; what does it mean to exist; where do these boundaries of self and other exist? These are deeply fundamental questions that humans have been grappling with since the dawn of consciousness.
How will advanced computational systems interface with what is immeasurable? Is there anything that exists that is immeasurable? Do we simply cast away the immeasurable as unimportant? These are difficult questions to answer, but if you’re designing an artificial mind, then you must confront them in some way.
Photographs of Data-mask 001 (Greco) worn by Sterling Crispin, Lisa Pomares, and Stefan Simchowitz. Courtesy of the artist.
BV Marginalized communities, whether at border crossings or in low-income housing, experience the violence of surveillance most palpably. We’re seeing how these surveillance technologies are already being used. But, what do these algorithms and images reveal about their future applications?
SC Well, on the bright side, I hope that citizen surveillance, and documentation of the police, will help reduce the abuse of power. But, overall, these algorithms and databases exist as systems of indictment. They’re designed to collect information that can identify and incriminate people, so that’s what they’ll do. If you get advice from a lawyer about talking to the police, they’ll tell you to say absolutely nothing under any circumstance. Talking to the police will only produce evidence against you, never for you. And, all of our collective data might be the same if we’re fully trusting these systems as means of representing reality, and producing evidence. The system and data, itself, become the arbiter of truth and reality, rather than the physical, real world. Who will watch the watchers? What will stop the NSA or FBI from falsifying a digital trail of espionage, and terrorism, to lock up anyone they choose, or, to erase the evidence of their own wrongdoing?
I’m absolutely looking forward to the liberation of analytics, biometric or otherwise. There’s no escaping the use of big data, but it should be pro-human. I do think that lots of systems are being designed to leverage data, and analytics, against the producers of that data, rather than help them with it. This top-down approach is really old-world, and needs rethinking. Imagine if Facebook, or LinkedIn, shared all of their analytics about you with you, and how you fit into the larger graph. It would be incredibly empowering. We need liberation, something like Martin Luther’s translation of the Bible from Latin to German in the 1500s, which allowed the common person to interpret the text for themselves, rather than have reality be prescribed by an elite few, who held the knowledge and positions of power.
BV What is the next step with your work in relationship to the viewer?
SC I’d like to continue exploring the ways computer systems are designed to represent human identity, and our inner being.
I hope that people see the warmth and human spirit within these masks and glimpses of unexpected beauty amongst the uncanny disfigurement done by the machine. They exist at a very real tension between humanity and machines, between self and a new-other.
I also hope that people realize a similarity between computer systems and distributed intelligent organisms. They don’t have an intelligence like a person does. They’re more like a termite colony. We have the power to influence the way computer systems develop from these primitive intelligences into more advanced ones, and it’s in our best interest to develop friendly AI. We need civilian, peaceable AI, not just military.