Abstract
Attempts to employ discourse ethics for assessing communication and information technologies have tended to focus on managerial and policy-oriented contexts. These initiatives presuppose institutional resources for organizing sophisticated consultation processes that elicit stakeholder input. Drawing on Jürgen Habermas’s discourse ethics, this paper supplements those initiatives by developing a more widely usable framework for moral inquiry and reflection on problematic cyberpractices. Given the highly idealized character of discourse ethics, a usable framework must answer two questions: (1) How should those who lack organizational power (e.g., concerned citizens, students) conduct their moral inquiry under non-ideal conditions of discourse? (2) How ought they to understand the moral force of the judgments they reach under such conditions? In response, I develop the heuristic implications of Habermas’s principle of universalization. To render that principle usable for non-ideal discourse, I propose a modification that yields a scalar measure of “dialogically robust” judgments that are responsive to the actual state of discussion. To illustrate the use of these principles, I sketch two case studies, which examine the moral acceptability of violent video gaming and government cyber-surveillance.
Notes
This paraphrases Habermas’s “Discourse Principle” (D), which he now understands as a broad principle of normative justification, applicable to the full range of types of norms: moral, legal, group-centric, and so on; see Habermas (1996, 107–11).
However, Birrer favors Albrecht Wellmer’s version of discourse ethics (Wellmer 1991) over Habermas’s.
Habermas (1993, chap. 1); he also distinguishes “pragmatic” issues, which concern the selection of means, but this has been less contentious.
Moor assumes that “some policies will be regarded as unjust by all rational, impartial people, some policies will be regarded as just by all rational, impartial people, and some will be in dispute” (1999, 67); but he does not say how to resolve disputed cases.
This idea harkens back to Kant’s Kingdom of Ends formulation of the Categorical Imperative: “A rational being must always regard himself as a legislator in a kingdom of ends” (Kant 1994, Ak. 434). For an analysis and defense of the deeper value commitments of Habermas’s broader discourse theory, see Rehg (1994, 134–149).
Habermas’s terminology reflects his distinction between particular interests, cultural values, and generally binding norms; notice however that each of these may be regarded as something a person does or ought to “value” in some way.
A typical defense of discourse ethics points to the vast area of moral agreement in everyday life. But that defense does not suffice for cyberethics, with its focus on unsettled, even muddled and opaque, moral problems.
Debates beginning at the end of the sixteenth century between probabilists and their opponents turned on the level of expert support necessary for considering a judgment morally acceptable; see Harty (1911).
I draw to some extent on cases that were collectively developed in a team-taught computer ethics course; I thank the co-instructor (and computer scientist) Erin Chambers and the students of this course (Computer Ethics, Saint Louis University, Spring 2014).
Some evidence suggests the lax policy is followed by the overwhelming majority of parents of teenage gamers; see Anderson and Bushman (2001, 354).
In 2011, the Supreme Court cited First Amendment grounds when it invalidated legal prohibitions against the sale of violent games to minors (Brown v. Entertainment Merchants Association). I assume that ruling does not bear on parental authority, but only on legitimate government intervention.
However, it is not clear how much weight their agreement or disagreement should have in assessing the level of reasonable consensus. In general, the status of less-than-fully-autonomous stakeholders remains somewhat undeveloped in discourse ethics; see Rehg (2003, 86–87, 93–96).
See, for example, the extended list of stakeholders in Gotterbarn et al. (2008, 436).
According to Sicart (2009), violent games can actually foster virtue when played intelligently.
The programs did not require any significant judicial oversight; they were designed to identify unknown terrorists, and thus potentially swept in every citizen as a potential link; surveillance of foreigners was almost entirely unregulated. The constitutionality of the surveillance programs was dubious from the start, and in any case, actual agency practices significantly violated even the lax rules that were eventually put in place; see Kirk (2014), Clarke et al. (2013), Gorban and Barrett (2013).
Students can easily miss this point when they assume that de facto empirical inconsistencies between a practice and values render the practice morally unacceptable.
A practice can be morally wrong even apart from its effects. A recent example is the social experiment that Facebook conducted on its users without their consent. Although this experiment might have had negative effects on some users, the lack of consent makes it inherently wrong, whether it had such effects or not (Lanier 2014).
Clarke et al. (2013, 46–49) identify three further risks: to individual liberty, international relations, and trade. The negative effects of surveillance on these values, as well as its tendency to weaken public trust, flow from the undermining of privacy. On the risk to liberty and freedom of expression, see also Greenwald (2014, 172–86).
One might object that effective surveillance contradicts Moor’s publicity requirements. In reply, critics could cite NSA director Michael S. Rogers, who recently allayed fears of lasting damage from Snowden’s revelations (Sanger 2014).
Citizen metadata surveillance programs were authorized by sec. 215 of the amended Patriot Act, whereas foreign surveillance programs, which targeted data content, rested on sec. 702 of the amended Foreign Intelligence Surveillance Act. A more detailed analysis would have to consider the different effects of these programs; for example, the fact that US citizens’ data were easily swept into foreign surveillance methods (Rollins and Liu 2014; Clarke et al. 2013, 80–81).
References
Administration White Paper. (2013). Bulk collection of telephony metadata under section 215 of the USA Patriot Act. http://perma.cc/8RJN-EDB7?type=pdf. Accessed 28 June 2014.
Alexy, R. (1990). A theory of practical discourse. In S. Benhabib & F. Dallmayr (Eds.), The communicative ethics controversy (pp. 151–190). Cambridge: MIT Press.
Anderson, C. A., & Bushman, B. J. (2001). Effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, and prosocial behavior: A meta-analytic review of the scientific literature. Psychological Science, 12(5), 353–359.
Apel, K.-O. (1980). Towards a transformation of philosophy (G. Adey & D. Frisby, Trans.). London: Routledge and Kegan Paul.
Benhabib, S. (1992). Situating the self. New York: Routledge.
Birrer, F. A. J. (2001). Applying ethical and moral concepts and theories to IT contexts: Some key problems and challenges. In R. A. Spinello & H. T. Tavani (Eds.), Readings in cyberethics (pp. 91–97). Boston: Jones and Bartlett.
Bushman, B. J. (2013). Don’t buy your kid Grand Theft Auto V for Christmas. Huff Post Tech, 13 Dec 2013. http://www.huffingtonpost.com/brad-j-bushman/dont-buy-your-kid-grand-theft-auto-v-for-christmas_b_4440477.html. Accessed 19 Feb 2014.
Clarke, R. A., Morell, M. J., Stone, G. R., Sunstein, C. R., & Swire, P. (2013). Liberty and security in a changing world: Report and recommendations of the president’s review group on intelligence and communications technologies. http://www.whitehouse.gov/sites/default/files/docs/2013-12-12_rg_final_report.pdf. Accessed 21 June 2014.
ESRB. (2014a). ESRB ratings guide. http://www.esrb.org/ratings/ratings_guide.jsp. Accessed 5 August 2014.
ESRB. (2014b). Principles and guidelines for responsible advertising practices. http://www.esrb.org/ratings/principles_guidelines.jsp. Accessed 5 August 2014.
Floridi, L. (2008). Foundations of information ethics. In K. E. Himma & H. T. Tavani (Eds.), The handbook of information and computer ethics (pp. 1–23). Hoboken, NJ: Wiley.
Gabbiadini, A., Riva, P., Andrighetto, L., Volpato, C., & Bushman, B. J. (2014). Interactive effect of moral disengagement and violent video games on self-control, cheating, and aggression. Social Psychological and Personality Science, 5(4), 451–458.
Gellman, B., & Poitras, L. (2013). U.S., British intelligence mining data from nine U.S. Internet companies in broad secret program, Washington Post, 7 June 2013. www.washingtonpost.com/investigations/us-intelligence-mining-data-from-nine-us-internet-companies-in-broad-secret-program/2013/06/06/3a0c0da8-cebf-11e2-8845-d970ccb04497_story.html. Accessed 20 June 2014.
Gorban, S., & Barrett, D. (2013). NSA violated privacy protections, official says. Wall Street Journal, 10 Sept 2013. http://online.wsj.com/news/articles/SB10001424127887324094704579067422990999360. Accessed 21 June 2014.
Gotterbarn, D., Clear, T., & Kwan, C.-T. (2008). A practical mechanism for ethical risk assessment—A SoDIS inspection. In K. E. Himma & H. T. Tavani (Eds.), The handbook of information and computer ethics (pp. 429–471). Hoboken, NJ: Wiley.
Greenwald, G. (2014). No place to hide. New York: Metropolitan-Henry Holt.
Gunter, W. D., & Daly, K. (2012). Causal or spurious: Using propensity score matching to disentangle the relationship between violent video games and violent behavior. Computers in Human Behavior, 28, 1348–1355.
Habermas, J. (1990). Moral consciousness and communicative action (C. Lenhardt & S. W. Nicholsen, Trans.). Cambridge: MIT Press.
Habermas, J. (1993). Justification and application (C. Cronin, Trans.). Cambridge: MIT Press.
Habermas, J. (1996). Between facts and norms (W. Rehg, Trans.). Cambridge: MIT Press.
Habermas, J. (1998). The inclusion of the other (C. Cronin & P. De Greiff, Eds.). Cambridge: MIT Press.
Harty, J. M. (1911). Probabilism. In Catholic encyclopedia (Vol. 12, pp. 441–446). New York: Robert Appleton.
Heath, J. (1995). The problem of foundationalism in Habermas’s discourse ethics. Philosophy and Social Criticism, 21(1), 77–100.
Heng, M. S. H., & de Moor, A. (2003). From Habermas’s communicative theory to practice on the internet. Information Systems Journal, 13, 331–352.
HLR. (2014). Recent administration white paper. Harvard Law Review, 127, 1871–1878.
Kant, I. (1994). Grounding for the metaphysics of morals. In Ethical philosophy (2nd ed.) (J. W. Ellington, Trans.). Indianapolis: Hackett.
Kirk, M. (Director). (2014). The United States of secrets. Frontline, 2 parts, 15 May and 20 May 2014.
Lanier, J. (2014). Should Facebook manipulate its users? New York Times, 1 July 2014, A17.
List, C., & Pettit, P. (2011). Group agency. Oxford: Oxford University Press.
McCarthy, T. (1998). Legitimacy and diversity: Dialectical reflections on analytic distinctions. In M. Rosenfeld & A. Arato (Eds.), Habermas on law and democracy (pp. 115–153). Berkeley: University of California Press.
McCormick, M. (2001). Is it wrong to play violent video games? Ethics and Information Technology, 3, 277–287.
McMahon, C. (2000). Discourse and morality. Ethics, 110, 514–536.
Medine, D., Brand, R., Cook, E. C., Dempsey, J., & Wald, P. (2014). Report on the surveillance program operated pursuant to section 702 of the foreign intelligence surveillance act. Privacy and Civil Liberties Oversight Board. 2 July 2014. http://www.pclob.gov/All%20Documents/Report%20on%20the%20Section%20702%20Program/PCLOB-Section-702-Report.pdf. Accessed 4 July 2014.
Mingers, J., & Walsham, G. (2010). Towards ethical information systems: The contribution of discourse ethics. MIS Quarterly, 34(4), 833–854.
Moor, J. (1997). Towards a theory of privacy in the information age. Computers and Society, 27(3), 27–32.
Moor, J. (1999). Just consequentialism and computing. Ethics and Information Technology, 1, 65–69.
Olson, C. K., Kutner, L. A., Baer, L., Beresin, E. V., Warner, D. E., & Nicholi, A. M., II. (2009). M-rated video games and aggressive or problem behavior among young adolescents. Applied Developmental Science, 13(4), 188–198.
Rehg, W. (1994). Insight and solidarity. Berkeley: University of California Press.
Rehg, W. (2003). Discourse ethics. In E. Wyschogrod & G. P. McKenny (Eds.), The ethical (pp. 83–100). Malden, MA: Blackwell.
Rollins, J. W., & Liu, E. C. (2014). NSA surveillance leaks: Background and issues for Congress. In S. Cepeda (Ed.), NSA intelligence collection, leaks, and the protection of classified information (pp. 1–27). NY: Nova.
Sanger, D. E. (2014). Sky isn’t falling after Snowden, N.S.A. chief says. New York Times, 30 June 2014, A1, A13.
Schiesel, S. (2011). Supreme court has ruled; now games have a duty. New York Times, 28 June 2011. http://www.nytimes.com/2011/06/29/arts/video-games/what-supreme-court-ruling-on-video-games-means.html. Accessed 11 July 2014.
Sicart, M. (2009). The ethics of computer games. Cambridge: MIT Press.
Stahl, B. C. (2004). Responsible management of information systems. Hershey, PA: Idea Group Publishing.
Stahl, B. C. (2008). Discourses on information ethics: the claim to universality. Ethics and Information Technology, 10, 97–108.
Stansbury, J. (2009). Reasoned moral agreement: Applying discourse ethics within organizations. Business Ethics Quarterly, 19(1), 33–56.
Tavani, H. T. (2008). Informational privacy: Concepts, theories, and controversies. In K. E. Himma & H. T. Tavani (Eds.), The handbook of information and computer ethics (pp. 131–164). Hoboken, NJ: Wiley.
Tavani, H. T. (2011). Ethics and technology. Hoboken: Wiley.
Wellmer, A. (1991). Ethics and dialogue: Elements of moral judgment in Kant and discourse ethics. In D. Midgley (Ed.), Persistence of modernity (pp. 113–231). Cambridge: MIT Press.
Wong, P.-H. (2012). A Walzerian approach to ICTs and the good life. Journal of Information, Communication and Ethics in Society, 10(1), 19–35.
Wright, D. (2011). A framework for the ethical impact assessment of information technology. Ethics and Information Technology, 13, 199–226.
Yetim, F. (2006). Acting with genres: Discursive-ethical concepts for reflecting on and legitimating genres. European Journal of Information Systems, 15, 54–69.
Yetim, F. (2011). Bringing discourse ethics to value sensitive design: Pathways toward a deliberative future. AIS Transactions on Human-Computer Interaction, 3(2), 133–155.
Acknowledgments
I thank Garth Hallett and Erin Chambers for their feedback on earlier versions of this paper. I also thank Professor Chambers and the Computer Ethics class (Saint Louis University, Spring 2014), whose discussion of issues and case studies in this paper has shaped my analysis.
Cite this article
Rehg, W. Discourse ethics for computer ethics: a heuristic for engaged dialogical reflection. Ethics Inf Technol 17, 27–39 (2015). https://doi.org/10.1007/s10676-014-9359-0