Abstract
Responding to long-standing warnings that robots and AI will enslave humans, I argue that the main problem we face is not that automation might turn us into slaves but, rather, that we remain masters. First I construct an argument concerning what I call ‘the tragedy of the master’: using the master–slave dialectic, I argue that automation technologies threaten to make us vulnerable, alienated, and automated masters. I elaborate the implications for power, knowledge, and experience. Then I critically discuss and question this argument, as well as the very thinking in terms of masters and slaves that fuels both arguments. I question the discourse about slavery and object to the assumptions made about human–technology relations. However, I also show that the discussion about masters and slaves draws our attention to issues concerning human–human relations, in particular to the social consequences of automation, such as issues of power and the relation between automation and (un)employment. Finally, I reflect on how we can respond to our predicament, to ‘the tragedy of the master’.
Notes
Martin Rees, “Meet your replacement”, The Daily Telegraph, 16 May 2015.
Note that “master” and “servant” may be a better translation of the German Herr and Knecht. However, here I will continue to use the terms “master” and “slave” in order to remain in line with the popular discussion.
For instance, in my work on technology and distance, I have argued that electronic information and communication technologies (ICTs) tend to increase distance—literally, by enabling communication over long distances, but also metaphorically, by creating social and moral distance [see for instance my claims about financial technologies and distance in Money Machines (Coeckelbergh 2015a) and about nature and ICTs in Coeckelbergh (2015b)]. However, here I will focus on the issue of automation.
There is also use of automation technology that does not fully delegate the action to the machine. In such cases, the machine plays the role of mediator between humans and their environment. For instance, a surgeon may use a robot to remotely operate on a patient. In such cases, there is at least less distance to nature than in cases of full delegation of agency: there is technological mediation, but the action remains in the hands of the surgeon to an important degree. It could even be argued that at least some teleoperated systems decrease or bridge the distance between us and nature, in particular when they enable us to remotely manipulate and experience what could not be manipulated or experienced without the system. Consider for instance exploration of another planet by means of a tele-operated robot. However, in spite of this bridging of physical and experiential distance, there may remain some feeling of alienation as it is still an indirect way of relating to nature. The humans operating the machine and experiencing nature in this way are also highly dependent on the technology (see also the first point again).
See again my claims about “the automation of the social” in my recent keynote addresses at the Robophilosophy conference and at AISB 2015, 21 April 2015 (Coeckelbergh 2014).
One may object that the master still has the work of ensuring that the automation continues to work. However, in so far as this means that the master functions as an operator who monitors and supervises, this does not count as “work” according to the definition ‘mixing one’s labour with nature’. It only counts as “work” when the master repairs the machine. This is indeed a less alienated relation and involves a different kind of knowledge: know-how. I will say more about this below.
References
Aristotle. (1984). Nicomachean ethics. In J. Barnes (Ed.), The complete works of Aristotle (Vol. 2). Princeton, NJ: Princeton University Press.
Bainbridge, L. (1983). Ironies of automation. Automatica, 19(6), 775–779.
Bryson, J. J. (2010). Robots should be slaves. In Y. Wilks (Ed.), Close engagements with artificial companions: Key social, psychological, ethical and design issues (pp. 63–74). Amsterdam and Philadelphia, PA: John Benjamins.
Coeckelbergh, M. (2013a). Human being @ risk: Enhancement, technology, and the evaluation of vulnerability transformations. Berlin: Springer.
Coeckelbergh, M. (2013b). Drones, information technology, and distance: Mapping the moral epistemology of remote fighting. Ethics and Information Technology, 15(2), 87–98.
Coeckelbergh, M. (2013c). E-care as craftsmanship: Virtuous work, skilled engagement, and information technology in health care. Medicine, Healthcare and Philosophy, 16(4), 807–816.
Coeckelbergh, M. (2014). The automation of the social? In J. Seibt, et al. (Eds.), Sociable robots and the future of social relations (pp. 7–8). Amsterdam: IOS Press.
Coeckelbergh, M. (2015a). Money machines: Electronic financial technologies, distancing, and responsibility in global finance. Farnham: Ashgate.
Coeckelbergh, M. (2015b). Environmental skill: Motivation, knowledge, and the possibility of a non-romantic environmental ethics. New York: Routledge.
Hazeltine, B., & Bull, C. (1999). Appropriate technology: Tools, choices, and implications. San Diego, CA: Academic Press.
Hegel, G. W. F. (1807/1977). Phenomenology of spirit (A. V. Miller, Trans.). Oxford: Oxford University Press.
Heidegger, M. (1927/1996). Being and time (J. Stambaugh, Trans.). Albany: State University of New York Press.
Heidegger, M. (1977). The question concerning technology and other essays (W. Lovitt, Trans.). New York: Harper & Row.
Illich, I. (1973). Tools for conviviality. Berkeley: Heyday Books.
Marx, K. (1844/1959). Economic and philosophic manuscripts of 1844. Moscow: Progress.
Marx, K. (1867/1976). Capital (Vol. I). London: Penguin Books (1990 reprint).
Mumford, L. (1934/2010). Technics and civilization. Chicago: University of Chicago Press.
Schumacher, E. F. (1973). Small is beautiful: A study of economics as if people mattered. London: Vintage Books.
Wallach, W. (2015). A dangerous master: How to keep technology from slipping beyond our control. New York: Basic Books.
Cite this article
Coeckelbergh, M. The tragedy of the master: automation, vulnerability, and distance. Ethics Inf Technol 17, 219–229 (2015). https://doi.org/10.1007/s10676-015-9377-6