Abstract
This paper examines the oncoming socio-economic impact of the Technological Revolution and the "AI Ecosystem", focusing on the legal community and its processes as a practical example with which both liberal artists and scientists might identify. In particular, the tension between the economically and socially driven thrust of modern society and traditional human value systems is discussed at some length. The paper surveys the effects on societal values and structures that have already resulted and are likely to continue well into the foreseeable future, and suggests possible remedies for these problems while avoiding the deep jargon of both the law and technology.
Notes
An error bar shows the error or uncertainty in a reported measurement, sometimes expressed as a "plus or minus" figure. It compensates for mechanical variations in the capabilities of machines and allows the analysis to be "generally accurate." It is sometimes presented graphically.
This is particularly true in the case of tests to determine intoxication while operating motor vehicles.
The concept of “thinking machines” was initially put forward by Turing (1950).
The Turing test was originally called the imitation game by Alan Turing, who introduced it in his 1950 paper "Computing Machinery and Intelligence" while working at the University of Manchester (Turing 1950, p. 460). The test results do not depend on the machine's ability to give correct answers to questions, only on how closely its answers resemble those a human would give.
See Chollet (2019) for a discussion of the two principal intelligence-defining constructs in the twenty-first century.
That said, the inability of the observer to grasp what the machine may be "thinking" is illustrated in the motion picture Ex Machina. The ultimate union of man and machine in an AI scenario was depicted in Star Trek: The Motion Picture, which, of course, saved the Earth from destruction. Cf. Lady Ada Lovelace's Objection, as noted in her memoir on Babbage's Analytical Engine, 1842.
Nin (1959).
This is known as "Wigner's paradox." See Musser (2020).
This concept was originally articulated by the author in the paper "The Terminator Missed a Chip!: Cyberethics", presented at the International Astronautical Congress of 1995, Oslo, and originally published by the American Institute of Aeronautics and Astronautics, Inc. with permission. Released to IAF/AIAA to publish in all forms. The corollary is the ability of technology to drive alterations in those conventions without regard to human input, in a societal "default" to the machines.
Boorstin (1994).
Engebretson (2018).
The abstract of this presentation can be found at: https://aaas.confex.com/aaas/2015/webprogram/Paper14064.html. The entire text is not available.
Nadin (1997).
See Marshall (2019) for an extended discussion of the problems of data retention.
See Aquinas, Commentum in Quatuor Libros Sententiarum, I, VII, 1,1. Parma Edition.
See Reynolds (2020). See also, the Star Trek episode "Court Martial" for the problem faced by a court when presented with manipulated computer-generated evidence.
Agricola (98), Book 1, paragraph 21.
For a discussion of the practical aspects of the application of AI in a conventional legal environment focusing on the search capabilities and electronic discovery pitfalls that are presented, see Miller, "Benefits of Artificial Intelligence: What Have You Done For Me Lately?", available at https://legal.thomsonreuters.com/en/insights/articles.
See the discussion in Pascal (2020).
Daubert (1993).
See Allen (2020) for a discussion of both the immediate and potential impacts of AI on the judicial process, leading to the conclusion that, while the potential impact is enormous, the problems of contact between the judicial process and the "AI Ecosystem" have yet to be measured.
See Hernández-Orallo (2017); see also Chollet, supra.
References
Allen J (ed) (2020) Artificial intelligence in our legal system. 59 Judges J (ABA) 1
Boorstin DJ (1994) Cleopatra’s nose: essays on the unexpected. Random House, New York
Chollet F (2019) On the measure of intelligence. Cornell University. arXiv:1911.01547v2
Daubert v. Merrell Dow Pharmaceuticals, Inc. (1993) 509 U.S. 579, 113 S. Ct. 2786, 125 L. Ed. 2d 469
Engebretson J (2018) Data, data, everywhere. Baylor Arts and Sciences (Fall):24
Guardans R, Czeglédy N (2009) Oriented flows: the molecular biology and political economy of the stew. Leonardo 42(2):145–150
Hernández-Orallo J (2017) Evaluation in artificial intelligence: from task-oriented to ability-oriented measurement. Artif Intell Rev 48:397–447
Holmes O (1881) The common law. Little Brown and Co, Boston
Marshall JM (2019) The modern memory hole: cyberethics unchained. 3 Athenaeum Review 94–101
Marshall JM (2020) Examining judicial decision-making: an axiological analytical tool. 29 Studia Iuridica Lublinensia 3(Summer):55–65
Musser G (2020) Paradox puts objectivity on shaky footing. Science 369(6506):889–890. https://doi.org/10.1126/science.369.6506.889
Nadin M (1997) The civilization of illiteracy. Dresden University Press, Dresden
Nin A (1959) Seduction of the minotaur. Vol 5 of Cities of the Interior
Pascal S (2020) Who to blame in an autonomous vehicle crash? Mensa Bulletin
Reynolds M (2020) Courts and lawyers struggle with growing prevalence of deepfakes. ABA J (Trial Litig)
Turing AM (1950) Computing Machinery and Intelligence. Mind LIX(236):433–460. https://doi.org/10.1093/mind/LIX.236.433
Marshall, J.M. Technoevidence: the "Turing limit" 2020. AI & Soc 36, 1021–1028 (2021). https://doi.org/10.1007/s00146-020-01139-z