Thoughts on Inclusive Data Systems

Notes from a panel discussion on Artificial Intelligence, Machine Learning, and data privacy at DEEP 2017, by Colin Clark


This topic of artificial intelligence is so filled with formalisms and engineering-speak, so strangely tangled up in fear and promise and the rhetoric of inevitability. But it's not inevitable. We just need to dream a little bit, to imagine other kinds of possibilities. 

To dream an inclusive artificial intelligence, or a machine learning that is reflective of the diversity of the ways we live and learn in the world, we have to start from scratch. First, we need to recognize that what's at stake here are the very metaphors that define our understanding of what these systems are and can do. This means that we need to question the words and names we use, because they ultimately shape and structure our ideas about what is possible. Words like "intelligence," "learning," "prediction," even "algorithm." The risk is that these words, when left unquestioned, obscure a framework of automation, normalization, and erasure of difference—the true "artificials" of artificial intelligence—which are the amplifying agents of precarity and alienation in our work and social lives.

So, we have the admittedly daunting task of imagining alternatives to these vague and technocratic redefinitions of learning and intelligence, to dream up alternatives that are reciprocal, humane, and open to diversity. Can we imagine a personal, poetic, and situated model in which our data systems are not pre-trained with biases originating from the monocultures of Silicon Valley, but which are open to our environment and our needs, and which can be influenced and understood by us? Can we also imagine systems of data that are not structured solely by the industrial values of automation and efficiency, but which can also encompass the productive inefficiencies, impossible-to-valuate social exchanges, and serendipitous connections that are essential to the quality, not just the quantity, of our lives and work?

Consider, for example, the Dear Data project, in which designers Giorgia Lupi and Stefanie Posavec sent each other weekly postcards that tracked and visualized, beautifully and idiosyncratically, data from their personal lives. We need systems that give us insight into what they do and the inferences they can help us make. Systems that allow us to choose what gets tracked and who it gets shared with, and which afford us the creativity to personally decide how our data, and the decisions made with it, are represented to us and the world.