
Design Requirements for a Moral Machine for Autonomous Weapons

  • Conference paper

Computer Safety, Reliability, and Security (SAFECOMP 2018)

Part of the book series: Lecture Notes in Computer Science (LNPSE, volume 11094)

Abstract

Autonomous Weapon Systems (AWS) are said to be the third revolution in warfare. These systems raise many questions and concerns that demand in-depth research on ethical and moral responsibility. Ethical decision-making has been studied in related fields such as autonomous vehicles and human-operated drones, but it has not yet been fully extended to the deployment of AWS, and research on moral judgement is lacking. In this paper, we propose design requirements for a Moral Machine (similar to http://moralmachine.mit.edu/) for Autonomous Weapons, intended to support a large-scale study of people's moral judgement regarding the deployment of this type of weapon. We ran an online survey to get a first impression of the importance of six variables that will be implemented in a proof-of-concept of a Moral Machine for Autonomous Weapons, and we describe a scenario containing these six variables. The platform will enable large-scale randomized controlled experiments and generate knowledge about people's feelings concerning this type of weapon. The next steps of our study include development and testing of the design before the prototype is scaled up to a Massive Online Experiment.
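As a rough sketch of how a randomized controlled experiment over six scenario variables might be structured, the snippet below enumerates a full factorial of conditions and randomly assigns one to a respondent. The variable names and levels are placeholders of our own choosing, since this abstract does not name the paper's actual six variables; the design is a generic full-factorial assignment, not the authors' implementation.

```python
import itertools
import random

# Six placeholder variables with binary levels. These names are illustrative
# only; the paper's actual variable set is not listed in this excerpt.
VARIABLES = {
    "variable_1": ["low", "high"],
    "variable_2": ["low", "high"],
    "variable_3": ["low", "high"],
    "variable_4": ["low", "high"],
    "variable_5": ["low", "high"],
    "variable_6": ["low", "high"],
}


def all_conditions(variables):
    """Full factorial design: one dict per combination of variable levels."""
    names = list(variables)
    return [
        dict(zip(names, levels))
        for levels in itertools.product(*(variables[n] for n in names))
    ]


def assign_condition(variables, rng=random):
    """Randomly assign a respondent to one condition (randomized control)."""
    return {name: rng.choice(levels) for name, levels in variables.items()}


conditions = all_conditions(VARIABLES)
print(len(conditions))  # 64 conditions for six binary variables
```

With six binary variables the design space is small enough (2^6 = 64 cells) to cover exhaustively, which is one reason a Massive Online Experiment can estimate the relative importance of each variable from many randomized presentations.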




Author information

Corresponding author: Ilse Verdiesen

Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

Verdiesen, I., Dignum, V., Rahwan, I. (2018). Design Requirements for a Moral Machine for Autonomous Weapons. In: Gallina, B., Skavhaug, A., Schoitsch, E., Bitsch, F. (eds) Computer Safety, Reliability, and Security. SAFECOMP 2018. Lecture Notes in Computer Science, vol 11094. Springer, Cham. https://doi.org/10.1007/978-3-319-99229-7_44


  • DOI: https://doi.org/10.1007/978-3-319-99229-7_44

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-99228-0

  • Online ISBN: 978-3-319-99229-7

