ABSTRACT
The dialogue that follows was written to express some of our ideas and remaining questions about IT systems, moral agency, and responsibility. We seem to have made progress on some of these issues, but we have not come anywhere close to agreement on several important points. While the issues are becoming more clearly drawn, what we have discovered so far is closer to a web of connecting ideas than to formal claims or final conclusions.
Index Terms
- A dialogue on responsibility, moral agency, and IT systems