Farewell to Artificial Humanity: Deep Learning in an Anthropological Context
Public discourse often prematurely equates new developments in Artificial Intelligence (AI) with a search for a synthetic subject. Understanding AI as a technology embedded in an anthropological-cultural context confronts this imagery with a counternarrative that helps clarify our ethical priorities.
We may interpret this chasm between expert activity and public perception as a reaction to the opacity of the technology at hand. Algorithms that adapt their own weights in response to sample solutions (a structure common to all Machine Learning, of which Deep Learning is a subfield) seem to fire our imagination with their autonomy. The dynamic nature of such an algorithm also means that its precise state is not always known to the researcher.
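What "adapting its own weights in response to sample solutions" amounts to can be made concrete in a minimal, purely illustrative sketch in plain Python; the toy data, the single weight and the learning rate are assumptions made only for this example:

```python
# Minimal sketch: a model that adjusts its own weight in response to
# sample solutions (labelled examples) via plain gradient descent.
# All names and data here are illustrative, not taken from any real system.
samples = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, known solution)

weight = 0.0           # the internal state the algorithm adapts itself
learning_rate = 0.05

for epoch in range(100):
    for x, target in samples:
        prediction = weight * x
        error = prediction - target
        # The weight is updated from the error signal, not set by hand:
        weight -= learning_rate * error * x

print(weight)  # converges toward 2.0 on this toy data
```

Even in this trivial case, the "learned" value is a product of the training run rather than something the programmer wrote down, which is the seed of the opacity described above.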
Deep Learning distinguishes itself in that features are extracted from the input through several layers. In image recognition, for instance, an image is first decomposed into its color values at each pixel coordinate; these are used to identify edges, which in turn are used to identify contours and then shapes. All of this takes place in the software's internals; the process is opaque to the user. The precise state of the algorithm and the deep nature of the architecture are two intransparencies that leave us plenty of space for speculation.
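Read as code, such a layered pipeline might look roughly like the following sketch (assuming the PyTorch library; the layer sizes and the 28-by-28 grayscale input are illustrative assumptions, not details from the text):

```python
# A minimal sketch of a layered image classifier. In a trained network,
# early convolutional layers typically respond to local patterns such as
# edges, while deeper layers combine them into contours and shapes.
import torch
from torch import nn

classifier = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # operates on raw pixel values
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 8 feature maps, 14x14
    nn.Conv2d(8, 16, kernel_size=3, padding=1),  # combines lower-level features
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 16 feature maps, 7x7
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),                   # final class decision
)

image = torch.randn(1, 1, 28, 28)   # a stand-in for a real grayscale image
scores = classifier(image)          # only the final scores surface
```

The intermediate feature maps are computed but never presented to the user; only the final scores come back, which is precisely the opacity at issue.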
Using neural networks that are loosely modeled after biological neurons may suggest that researchers are primarily interested in understanding the human brain. On closer inspection, this is a non sequitur: the architecture of software designed to solve cognitively demanding problems does not necessarily teach us anything about how we solve them.
These unfortunate conflations throw us back to the big philosophical questions - What is consciousness? Mind? A subject? - questions whose meaningfulness I would doubt even if they did not pertain to research in artificial intelligence. To move on from these problems, I will briefly test the productivity of a counter-narrative. Its starting point is the historical fact that the learning machine, even if it were to become an autonomous subject, would still have to be made by us. It is a narrative of artificial intelligence as a technology embedded in culture.
These ideas may not seem particularly trendy right now, but they can be heard in the statements of many practitioners critical of the public discourse surrounding AI: when Joanna Bryson suggests that the history of Artificial Intelligence begins with the invention of writing, she is thinking of AI as cognitive exteriorisation.
Cognitive exteriorisation is thus the offloading of cognitive operations to increase our individual and collective capacity. By taking tasks off our hands, technologies like writing (a mnemotechnic), the pocket calculator or Google Maps create space in our heads for other things. For Bryson, all of this is AI to a degree. If machine learning translates languages or interprets X-rays for us, we delegate some of our burden, regardless of whether we can use this technology to build a subject.
Exteriorisation as a concept embeds technology in its culturally determined context of use, without which our analysis would be incomplete. It reveals continuities between AI and regular old technology. Classical problems in the ethics of technology thus come into focus, while the question of the moral status of the machine, which dominates discussions of the artificial subject, is marginalised.
One example: understanding AI as an attempt to ameliorate people's material conditions forces us to consider distributive justice. From the minor point of how much time is freed up for other things when riding in an autonomous vehicle to the potentially life-altering effects of personalised medicine – Deep Learning (like most new technology) appears to widen the wealth gap: those who can afford these things in the first place will become more economically productive and be compensated accordingly, while the rest are left behind. It is crucial that these kinds of problems are kept afloat in the rip current around artificial people.