Phenomenon of Dual Use Technologies: Introduction

The term dual use technology entered the scientific lexicon not long ago. It was first used to characterize the intangible transfer of technology related to weapons of mass destruction, and was then associated with the conversion of the military-industrial complex to civilian purposes and the demilitarization of American society. Any military technology was characterized as dual use if it could be used for both peaceful and military aims.

Later, the line between military and civilian use of technologies blurred. In many cases, there were no clear differences between civilian and military technologies, as many could be used in military and civilian spheres without the need for conversion, such as ordinary computers in the defense industry [1]. Finally, people recognized that technologies may be used for a variety of purposes. Recently, the term dual use technology has been used to imply an opportunity for the wider exploitation of research and manufacturing efforts beyond their initial (civilian or military) goals.

As Todor Galev writes, there is no consensus on the notion of dual use technology, but the way we define it is important because it focuses our attention on specific aspects of the problem under investigation [2].

Two main approaches to dual use technologies emerged during discussions at the conference “The Advancement of Science and the Dilemma of Dual Use: Why We Can't Afford to Fail,” held in Warsaw in November 2007: the Anglo-American pragmatic approach and the continental metaphysical approach [3].

These approaches also seem to rest on different understandings of technology. The first is associated with a notion of technology as a sequence of processes and operations aimed at producing products with properties that are necessary and useful to people. The second rests on a more neutral definition: technology as the use of organized knowledge to achieve practical goals through ordered systems of machines and people.

As a result, in the American scientific ethos the term dual use is associated with concrete technologies that may present a significant threat to society and thus cause special concern. The phrase dual use technologies of concern refers to research and technology with the potential both to yield valuable scientific knowledge and to be used for nefarious purposes with serious consequences for public health or the environment [4]. On this narrow understanding, the potential duality is reflected in the ways a technology's products, used in one human domain, can be adapted and applied in others.

We suppose that in the American social and scientific context the term dual use technologies has a largely pragmatic sense and lends itself to describing concrete things: dual use experiments, dual use review, dual use materials, dual use information, dual use cases, dual use discoveries, dual use risk, etc. [4]. The notion of dual use often arises in the area of biological research. In this context, the term usually refers to the manipulation of biological agents in so-called experiments of concern, which enhance the virulence of a pathogen, increase its transmissibility, alter its host range, and so on [5]. Experiments of concern create a dilemma for researchers that stems from the potential actions of others. This type of dilemma is discussed in the context of biosecurity, biosafety, the creation of weapons of mass destruction, terrorism prevention, the development of governmental control systems, and the establishment of authorities independent of both research institutions and government [6].

In contrast to their American counterparts, European scientists usually consider the duality of technology not to be inherent in the technology itself. Technologies are not a priori military, civilian, or both. Their character depends on human mental structures and on the social networks in which they are developed or used. The duality is perceived as a characteristic of human attitudes toward other people and the environment. As such, it can periodically appear or disappear in different forms, mirroring human nature and social relations.

The broad European definition implies that all technologies (not only the narrow category of technologies or experiments of concern) may be potentially dangerous when applied by second- or third-party users, or even before their transfer. The distinction between the American and European definitions of dual use, and the dichotomy in perspectives that it reveals, is a focal point of the following discussion.

Many scientists, including M. Bunge, V. R. Potter, and E. Agazzi, believe that moral philosophy and technology have much in common, and that the problems of technology cannot be resolved without considering their metaphysical grounding. M. Bunge explores the idea that ethics is a philosophical technology, and that there are many commonalities between this practical branch of philosophy and praxiology, or action theory [7]. He concludes that only the union of philosophy and praxiology can address the moral problems that accompany new technological practices and products that alter the everyday lives of many people.

In his famous book “Bioethics: Bridge to the Future,” V. R. Potter stresses that the ethics of science and technology cannot be considered separately from philosophical wisdom. Potter cites I. Kant: “… wisdom—which consists more in doing and letting be than in knowing - needs a science, not to learn about it, but to afford its prescript admission and endurance” (emphasis in original) [8]. Potter claims that Kant's “prescript admission and endurance” means the application of wisdom as an action policy, based on reason and duty, for either doing or letting be.

According to the typical Anglo-Saxon approach, introduced by J. Locke, D. Hume, G. E. Moore, B. Russell, and L. Wittgenstein, the aim of any authentic philosophy is the analysis and clarification of empirical pictures of the world. Ken Wilber writes that “this point of view has always amazed European philosophers by its simplicity … They think that philosophy is not only a question of creating objective pictures of the world, but an investigation of the structures in the subject that make the creation of such pictures possible. Frankly speaking, every map bears the fingerprints of its cartographer” [9].

In continental philosophy, beginning with I. Kant and continuing in the works of F. Schelling, G. Hegel, F. Nietzsche, A. Schopenhauer, and M. Heidegger, the dominant conception is that the empirical world (in a number of senses) is grasped not through simple apperception but through interpretation. The empirical world “does not wait to be discovered”; it is built up within subjective and intersubjective contexts and preconditions, which determine whatever can be seen in it.

The philosopher Seumas Miller has discussed both pragmatic and metaphysical ways of viewing the dual use dilemma. Illustrating the pragmatic point of view, he noted that secondary users may implement technologies for design-purposes other than those initially intended. Illustrating the metaphysical point of view, he noted that the outcomes of science and technology (S and T) may be: (1) intended; (2) unintended but foreseen; (3) unintended, unforeseen, and perhaps unable to be foreseen [10].

In our opinion it is important not to confuse a technology’s purposes (design purposes) with its outcomes or with the goals and intentions of society, professionals, and researchers in developing and applying the technology.

There is a difference between an activity's consequences and an activity's goal. An activity's goal is something apparent when the action is being planned; in the case of human actions, goals are conscious and reflect clear intentions. However, there may be outcomes that are not intended. These are an activity's consequences. Even when consequences are not intended, moral responsibility remains with the agent. In the legal system, such consequences are described as “excusable crimes” or “errors out of ignorance,” and they may or may not be punished.

People are responsible for the consequences of their actions, even if they are not expected or intended. An outcome might be intended, unintended or unforeseen by scientists, but it does not free people from responsibility for their actions. They are responsible for researching the outcomes of their actions, and for changing their initial design-purposes accordingly. For example, scientists who preserve smallpox samples for research purposes in the context of a policy of mandatory destruction of samples are responsible if those samples are weaponised.

Our position is that technology’s developers, as well as original and secondary users, are responsible for what they are doing, and for the outcomes of the technological inventions they are dealing with. Design-purposes, methods, goals, outcomes (foreseen and unforeseen) of the technologies need to be considered prior to any scientific or technological decision and activity. S and T should reflect human values at all stages—from design and early development to realization.

Our aim is to analyze dual use phenomena as a result of intentionality—humans’ ability, by means of rational reflection and feelings, to create the reality of moral values. We understand intentionality and intention phenomenologically, as not only a reasonable function of our minds, but as a phenomenon expressing human wholeness: the unity of human rational abilities, feelings, psychic intuitions, etc. This would have been impossible without the renaissance of the holistic approach. The holistic approach reflects an understanding of a human, not only as an intellectual and rational creature, but as a symbolic and moral person, a unification of body, soul and intentions.

The Dual Nature of Man

The phenomenon of dual use technologies reflects the particularities of postmodern knowledge about the human being as one “making a choice” between the natural and the artificial, life and death, development and stagnation. It also reflects knowledge of the human being as a personality who seeks out and broadens the frameworks of his own existence, establishing a new order on top of already existing orders. According to the American scholar David Heyd, we can better understand the anthropogenic consequences of biotechnologies (and technologies in general) by answering three key questions concerning: (1) the existence of human beings; (2) the essence of human beings; (3) the number of people involved [11]. He notes that technologies appear dangerous in that they may deplete the humanistic sense of life, or threaten the existence, essence, or number of mankind.

Nowadays there is a diversity of life-worlds (Lebenswelt) which cross and interact, and yet remain themselves. The existence of numerous life-worlds accounts for the plurality of positions and points of view, but may create difficulties when we have to decide on the use of a technology. Our choice is limited by the logic of progress and the speed of change. In such cases an individual's choice is neither based upon a great number of possibilities nor subject to probabilities. It creates benefits that address life's necessities on the one hand, and carries a cost, entailing good or bad “payments” for those benefits, on the other [12].

The dual use technologies concept is based on many oppositions: man and nature, artificial and natural, subject and object, general and individual, theoretical and practical, relative and absolute. Innovations in science and their technological realizations have raised new kinds of problems, for example in distinguishing development from destruction, peace from war, international collaboration from opposition, and security from hazard.

A new paradigm adopts the concept of technology both as artifact and as a whole system of social relations. Technology reflects numerous human abilities: to recognize technical problems, to develop new concepts and tangible solutions to them, to explore those concepts, to act for progress and welfare, to guard life and health, to recognize danger, to create security, and to build alliances.

As Jürgen Habermas has pointed out, the main dangers come from changing our attitude towards ourselves, from our ability to obliterate the boundary between external and internal nature, and from the instrumentalization of human life according to the preferences and value orientations of an anonymous third party [13]. Technologies modify the image of the human being as a cultural species and, correspondingly, our ethical self-consciousness. We are no longer able to understand ourselves as ethically free and morally equal creatures.

A new moral self-consciousness of natural and artificial phenomena arises, defined not by man himself but by technological means (i.e. the interference of an external “third party,” as seemingly unlimited possibilities expand from new ideas into new technologies). The choice is made not between our nature and something alien to it, but between different technological approaches that modify the essence of our cultural species to a greater or lesser degree.

Generally speaking, we deal with dual use technologies and the dichotomies they present everywhere that the dual nature of man (symbolic and biological) interacts with the cultural landscape of mankind. The main goal of philosophical reflection on dual use technologies is to clarify “how to transfer” or “transform” technologies into ones more humanistic and friendly to human beings.

How is it that a given technology ends up being evil, good, or both? There are several explanations: (1) the dual nature of the human being; (2) cultural ethos; (3) the absence of adequate control over the technology; (4) the impossibility of correctly determining the boundaries of risks; (5) man “as a subjective factor”; (6) God's intervention.

According to Erich Fromm, individuals manifest one of two main tendencies in society: biophilia or necrophilia (love of life and love of death) [14]. Each individual can become either a necrophile or a biophile. These possibilities are the result of God's intention that man be an open and uncompleted project.

Whether a person becomes a necrophile or a biophile depends on the peculiarities of the individual's biography and the impact of society. Biophilia is a deep, vital orientation and an important element of human existence. It is oriented toward a holistic perception of the world, and reflects love and a sacred attitude toward all living creations. The necrophile is the antipode of the biophile. He is attracted by everything mechanical and not alive. He would like to transform all vital processes and feelings into objects of his manipulation, into things. Individuals and professional groups of scientists who are biophiles would not, in theory or in practice, use technology for bad purposes or for the destruction of the world.

Necrophilic tendencies are followed not only by individuals but by whole communities. They have been inculcated into the inner layers of industrial and postindustrial culture. It is no mere chance that necrophilic tendencies have manifested themselves not only in attacks on the vital interests of man, but also in mass slaughter. Science only enhances man's force and power, thereby frequently multiplying the evil or good inside him. The problem of dual use technologies does not discredit the objectivity of science. Instead, it focuses our attention on the necessity of increasing the responsibility of scientific collectives, of individual scientists, and of other people dealing with the development and use of technologies dangerous to human beings.

Philosophical reflection allows us to reach a better understanding of the dual-use phenomenon in human activity. This understanding is closely related to the double character of the human body as both subject and object, as person and thing. The phenomenon is rooted in the rational intentions of human consciousness, and it invites a dual self-understanding of human beings as individuals whose rational choices can seemingly be separated from their bodies and souls. Yet the body is the foundation of experience in the most intimate way in human existence, even though a body may also be experienced from the outside as an object distinct from consciousness. This also means that the presence of the body as a physical object is not the only criterion of personhood. The subject is better characterized by embodied, living personhood.

The kind of effect that one technology or another has is conditioned by many factors related to cultural differences. Each country defines its own vision of science and technology in the light of its history and traditions. Indeed, the concept of dual use technology itself expresses a tradition. It reflects the fact that the human mind allows for “distrust and disclosure,” enabling the essence of S and T to be evaluated as good or evil across different social and cultural contexts.

A particular technology usually takes root in a particular country and culture, which makes its use less dangerous in its mother country. For example, the healing arts reflected in the technology of using remedies from plants and animals have existed in China for a long time. The country accumulated extensive experience related to herborization and the use of appropriate animal parts in health improvement. This technology was then transferred to the USA and modified into the manufacture of biologically active drug (BAD) tablets. However, cultural differences and the lack of skills related to the application of BADs in the American population resulted in harm to the public (including thousands of lethal outcomes). When we deal with the transfer of a technology from one society to another, and moreover with its modification, we need to pay special attention to the importance of proper education for the products' producers (manufacturing companies) and users (in this case, Americans).

It is necessary to admit that technologies influence not only the environment of human beings, but also their thoughts, habits, lifestyles, ideals, and value systems. Furthermore, science is becoming more prevalent, and technology more scientific. As a result, the power and status of S and T in society are greatly enhanced, while man's ability to exert adequate control over S and T decreases. Technology escapes man's control and begins to live by its own laws.

The determination of an appropriate risk framework for technologies is complicated by several issues. Firstly, there exists not one single technology, but many technologies which interact and potentiate one another; their combined effect is therefore not merely the sum of their individual actions, but a qualitatively new interaction. Secondly, there is the problem of how to evaluate the risk. It is not sufficient simply to observe the technology; special equipment and professional experience are essential for many undertakings (e.g. the detection of genetically modified products in food). Thirdly, a human being can consciously reorder a set of risks to reach existential aims: to justify playing extreme sports, for example, or to stay alive and healthy by using risky therapies, as in the treatment of terminally ill patients.
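
A minimal formalization of the first point (the notation is illustrative, our own rather than anything proposed in the sources cited here): if $E(T)$ denotes the effect of a technology $T$, potentiation means that the joint effect of interacting technologies is superadditive,

$$E(T_1 \cup T_2) > E(T_1) + E(T_2),$$

so risk estimates made for each technology in isolation systematically understate the combined risk.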

“Man,” with his desires and feelings as a subjective factor, is of great importance in this matter. This subjective human factor can lead to absurd situations in which standard operating procedures conflict with man's decision to improve the operations, or in which he avoids making a decision when faced with contradictory input. For example, on July 1, 2002, a Bashkirian Airlines Tu-154 and a DHL Boeing cargo aircraft collided in the sky above Lake Constance when the aircraft's onboard collision-avoidance instructions and the air traffic controller's commands contradicted each other. The crash caused 71 casualties.

The human factor plays a great role in enhancing risk when, under stress, man, with his chaotic intentions and conflicting decisions, attempts to act according to the algorithms of machines and technologies. Another example may be found in the August 12, 2000, sinking of the Russian nuclear submarine “Kursk,” which resulted in 118 victims. One of the hypothetical explanations invoked the human factor, conjecturing that one of the torpedoes had been dropped during transport.

If we cannot explain the presence of evil in the world using common reasonable arguments, we turn to the concept of God's intervention. From this point of view, the Creator looks at the world as his own gigantic experiment, expressing immeasurable curiosity. God is eager to learn about, investigate, and experience his own potential as a Creator. Thus, the human drama is for God only a game, while for mankind it is an essential experience. The dilemma of good and evil, and the dilemma of how to use a technology, is here presumed to occur at the decree of God and by divine intervention [15]. This idea was reflected in the 2005 report of the EU High-Level Expert Group (HLEG) on synthetic biology: “it seems likely that the notion of creating entirely new life forms will also stimulate debates about the proper ethical boundaries of science” and, to some, “this is sure to seem like ‘playing God’” [16].

Technology Poisoning

We suppose that the association between the effects of technologies and poisons is a substantive one. What is a “poison”? It may be defined not as a type of substance, but rather as a measure of a substance. Not all poisons are dangerous to living creatures because they can end life in very small doses (e.g. warfare agents such as yperite); conversely, ordinary table salt can be a poison if enough is ingested.
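
This “measure, not type” definition can be put schematically (an illustrative formalization in our own notation, not a standard toxicological formula): a substance $s$ taken at dose $d$ acts as a poison only relative to a substance-specific threshold $\theta_s$,

$$\text{poison}(s, d) \iff d > \theta_s,$$

where $\theta_s$ is minute for warfare agents such as yperite and very large for table salt.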

Not all poisoning comes from ingesting toxins. Another type of poisoning can occur in the form of social patterns and beliefs. John Naisbitt describes the symptoms of poisoning by high technologies as follows: (1) a preference for quick solutions (from religion to a healthy diet); (2) fear of and worship of technology; (3) the blurring of distinctions between reality and fantasy; (4) the acceptance of violence as a normal part of life; (5) the love of technology as a toy; (6) alienation as a characteristic of human life [17].

We suppose there is a relationship between our understanding of technology poisoning and John Naisbitt's. An “overdose” of technologies usually results in a lack of reflection in decision-making, in infantilism, in alienation, and the like.

S and T tend to transform the nature of the ordinary man into a Nietzschean one, focused on obtaining power as quickly as possible. Thus, thanks to the newest biomedical technologies, mechanisms of biopower are becoming realizable (e.g. the control of birth and death rates). The will to power, and power itself, are usually enveloped in societies in a gloss of mystique and worship.

We can see the idolization and worship of wealth and success in the fact that our politicians, businessmen, and statesmen occupy the highest steps of the social staircase. We can find exactly the same idolization in our attitude towards modern technical innovations: in our fear of criticizing them, and in our view of ourselves as non-experts and laypeople. Meanwhile, the heuristics of fear may become the foundation of new ethics and morals, and of new principles, such as the principle of precautionary retribution.

On the platform of the sacralization of the “will to power” as an existential human characteristic, we more easily accept patriarchal norms and authoritarianism everywhere, including in our attitude towards technologies. Our natural rhythms begin to obey the rhythms and energies of technologies. Thus, computer technologies greatly intensify our work, but also keep us busy all the time, both at work and at home. Television and mass-media technologies subordinate viewers to the logic of TV pictures and rapidly changing information.

So technologies that create an illusion of reality, or simulacra, have become a part of life, affecting above all our vision of real events and experiences. Many phenomenologically oriented philosophers, for instance A. Whitehead and M. Mamardashvili, hold that the universe consists of numerous discontinuous moments of life, and that its basic element is not a substance but separate moments of experience and events [18]. As a result, there is always a gap in the causal world: a gap of free will between two moments of life, A and B. Thus, if somebody does good at moment A, it does not automatically mean he will do good at B. In other words, an event can be continued only through discontinuous forces in each moment of life. A transition from reality to the virtual world touches the foundations of Being and Life and, in turn, can be a source of metaphysical insights aimed at deep respect for all living creatures and responsibility for the world.

Homo ludens, or Homo gamer, is one of the main human images and ontological characteristics. But very often play may be transformed into escapism (the desire to escape from real life into a virtual world of puppets). Technologies that become toys have an awesome destructive potential both for a person and for his environment. It is obvious that the symptoms of poisoning and overdosing with technologies can be explained by the nature of human beings as Homo faber or Homo ludens, and are related to the search for a sense of life.

When we talk about dual use technologies, we mean goals that are measured and compared against human values, as well as the intentions of those who develop and use the technologies. Confusing goals with intentions has resulted in a situation in which the notion of goals was expelled from science and from modern rationality; goals were considered merely subjective. We distinguish goals from intentions in order to show that moral judgment about technology presupposes an analysis of goals as well as intentions, in both their objective and subjective aspects.

In both Russian and Ukrainian there are words, similar in meaning, that combine the two ideas mentioned above: measure and intention. The Ukrainian word is “namir”; the Russian is “namerenie.” Both mean a person's “intention” to do something. At the same time, these words contain the particles “mir” and “mer” (respectively), which take their roots from “mira” and “mera”: “measure.” Thus, in some languages we can find words that combine the concepts of measure and aim in their meanings. “Namir” or “namerenie” suggests a person's intention to measure something before a real action: to compare his feelings, desires, thoughts, and experiences with other existing things, and to put them into proper moral frameworks. This intention to construct moral frameworks for evaluating technologies must precede their use. The etymology of these Russian and Ukrainian words suggests that, if a goal is morally acceptable, then the intentions, means, instruments, and activities aimed at the goal are also acceptable.

With technology, we confront the possibility of a situation in which the artificial world created by humans retains nothing of their intrinsic nature or values. Moreover, many unintended, unexpected, and unforeseeable consequences may result from new technological realizations. If we want to generate an appropriate moral framework to order and regulate our technologies, it must be based on human values.

A human's behaviour is always undertaken with reference to his goals, which precede the activity aimed at them. Thus, one important human characteristic is the ability to imagine an ideal state of things, which may be achieved with the help of symbolic activity. Thanks to this ability, intentionally imagined ideal entities may generate idealistic models for our activities and their orientation. We call this kind of activity intentional [12]. Unlike an animal, a human can direct his consciousness to contemplate abstract, possible, or future situations, as well as principles, ideals, norms, and values. Human intentional activity opens the door for human values and ideals to govern technologies and to prevent their unexpected, unforeseeable consequences. As the contemporary world becomes more technically complicated, each new technology can first be measured against man's values, traditions, and societal norms, and against concepts of human nature not only as Homo faber and Homo sapiens, but as Homo ludens, Homo patiens, Homo aestheticus, etc. [19].

But there are no neutral or anonymous technologies today. They are all “civilian” or “military” arms, characterized as either doing harm or bringing benefit to a person. They either contribute to mankind's development and well-being, or do harm and shorten our life spans. In this sense they are products not only of man's intellect, but of his consciousness, his illusions, and his anticipation of joy. The “second nature” (techne), the idea that everything made by man deviates from real being (eidos) and turns into a simulacrum of real nature, is a child of illusion and of mistaken imaginings about being, the world, and man's nature. It thus seems dangerous for man.

The degree of technological danger is determined by the balance between risks and benefits. The principle of minimizing risks and balancing risks and benefits is one of the most important components of human protection (e.g. the protection of research subjects in biomedical research).
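
For illustration only (a standard expected-value sketch, not a formalism proposed in this paper), the balance of risks and benefits can be written as

$$B = \sum_j q_j b_j - \sum_i p_i h_i,$$

where $p_i$ and $q_j$ are the probabilities of harms $h_i$ and benefits $b_j$. On this crude reading a technology is acceptable only when $B > 0$, and the principle of minimizing risks asks us to shrink the harm term before deployment rather than after.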

Ethical dilemmas that arise in the use of biomedical and other modern technologies are often explained as “unpredictable bad outcomes” of the application of good ideas. In our view, the problem with dual use technologies does not come from their improper application (“improper place or time”). Instead, the problem depends upon humans' understanding of measures and norms. This is apparent in the example of antidepressant consumption. These drugs are used to help people overcome grief, sorrow, and psychological vulnerability. The risk of such use is that a healthy person may be improperly diagnosed as ill. We must all sometimes grieve; if we are all potentially ill, and this is the norm, then “the healthy person” no longer exists. Likewise, modern scientists in biomedicine consider death to be a contingent biological fact, which we may hope to overcome in the future, rather than an essential component of human nature. Death at the end of life is no longer perceived as a norm, but as a consequence to be postponed and avoided.

Science has an objective goal: gaining new knowledge about objective laws, and finding scientific truth. Abstract science is an activity with the internal and determinative aim of increasing knowledge. The situation is different in the case of applied science and technology. If gaining “effective knowledge and procedures” is their intentional aim, we will never arrive at a context for moral judgment, because the definition of “efficacy” refers to the objects investigated and not to our desired goals and tasks.

For a moral evaluation of technology, we need to investigate the concrete goals of every form of applied research or technical-scientific application. In the case of applied science and technology, only intention (the goal of a scientific application) is decisive for moral judgment [2]. Nevertheless, a technical activity is not itself morally indifferent to its internal goals. Two explanations for scientists' moral alienation from technology's goals are well known: the ever-greater structuring and specialization of scientific knowledge, and the collectivization of technical activity.

When we talk about means, we should consider not machines but instruments, which are objects (neither bad nor good). When we think about concrete steps and actions, we again enter the realm of the intentions behind these concrete actions. For example, the aim of artificial fertilization “in vitro” is definitely humanistic. The technique, though, is a subject of debate: according to Christian ethics and theology, fertilization by a husband's sperm is morally right, but fertilization by anonymous sperm destroys the bonds of matrimony.

Ontological Antidotes and Ethical “Remedies” for Technology Poisoning

In his “Critique of Cynical Reason,” the German philosopher Peter Sloterdijk emphasized a dual tendency in the investigation of human nature: the separation of knowledge from human consciousness, and the triumph of the logic of suspicion towards the human body. These were the main characteristics of the modernist transformation of the great Renaissance ideas [20]. The concept of dual use technology belongs to the postmodern period, in which duality is considered not as a manifestation of conspiracy, but as the result of a dialectical interaction between two opposing parties. Accordingly, we emphasize here not “opposition,” but technological “synergy” between two kinds of activity.

Human nature is deficient. There is a gap in it, a channel for the realization of all existing negative and destructive intentions. This channel is open not only toward God's image but also toward nothingness, toward the abyss. This is one reason for the dual nature of man and of everything made by man.

In the context of dual use technologies, we stress the importance of three main understandings of human nature: vulnerability, responsibility, and narrative identity. In our opinion, these can become a strong ontological “antidote” to technology's poisoning of modern man. All human intentions that concern the use of technologies need to be interpreted and considered in the context of the human being as vulnerable and earthborn, existing together with other living creatures in a common world (Mitwelt) and bearing responsibility for saving and developing life.

Emmanuel Levinas defines vulnerability as an intrinsic feature of human subjectivity. The experience of my personal vulnerability makes me an ethical subject by motivating my openness to the moral imperative in the other's face, and motivating me to take responsibility for the other. Being fundamental to the bodily receptivity of the individual's being-in-the-world as a being-together, vulnerability thus indicates something very basic and constructive in our lives. Many technologies make our lives more intensive and dynamic; in this way, they make us more vulnerable.

Responsibility is an ethical imperative for civilization. It is necessary in order to secure the survival of humanity when confronted with modern technologies, and to ensure the integrity and dignity of future human beings. Society should be aware of its responsibility not to destroy all life on earth. Responsible individuals and societies should set limits on social interventions in nature, animals, and the human body. The principle of responsibility is based on the close relationship between the human being and the whole living world. In the context of technology use, we hold that man bears responsibility not only toward other living creatures, but toward all existing things, including the technologies he has made.

Narrative identity is the concept of a self that resides in the unity of a continuous history linking birth, life, and death. In other words, the life of a person has a beginning, a middle, and an end, the structures found in any narrative. The narrative of a person has a certain “unity” about it, which accounts for its being the narrative of one person rather than another, just as a story has a unity that makes it one story and not another. The intelligibility and unity of a narrative depend upon values.

Using the systems-theoretic method, Evandro Agazzi has shown that, in order to introduce a moral dimension into the scientific-technological system, it is first necessary to change one's angle of view: one must consider the system not as some global mechanism, but as only one of numerous systems interacting with each other [12]. To function effectively, S and T need to cooperate with their environment and with human values. S and T aim to impose their norms on the moral sphere, but by virtue of a double feedback they are also influenced by the transformations of modern philosophy and ethics.

Postmodern knowledge is a kind of centaur knowledge. In the philosophy of science, centaur knowledge means the unity of qualitatively different elements of science, which are very often opposed to each other. Centaur knowledge exists as notions, theories, and statements. Bioethics is also a sort of postmodern centaur knowledge, combining not only elements of heterogeneous origin and the theoretical and praxeological components of particular spheres of knowledge, but also fragments, theories, and concepts taken from different segments of science and the arts.

It is obvious that S and T can produce unforeseen sequelae for our lives and can challenge our traditional ways of thinking and moral reflection. We cannot continue to rely solely upon professional skills and the after-the-fact judgments of science's critics; we need “preventive means.” We need the following bioethical principles for the regulation of S and T: the principle of good intentions; the principle of the correspondence of goals and means (complementarity); the principle of balance between risks and benefits; the principle of simplicity; and the principle of contextuality.

The problem with applying the principle of good intentions lies in their subjective character. It is impossible to determine whether intentions are good or bad unless they are actualized in a moral agent's actions. But even then it is not easy to evaluate such behaviour, because any individual pattern of public behaviour can be interpreted in different ways. The only way to resolve this problem is through a post factum evaluation of intentions [21].

The principle of the correspondence of goals and means (complementarity) holds that there should be a correlation between the two: it is not appropriate to reach peace by war, and each technology's means should be appropriate to its aim. The principle of balance between risks and benefits means that one should constantly evaluate and compare the amount of harm and benefit that comes with the use of technologies. The principle of simplicity points to the necessity of developing technologies and products that are easy to use. The contextuality principle means that we must consider technologies in their cultural, glocal (global plus local), collective, and individual contexts.

Structured by its principles, bioethics forms a new type of postrational knowledge that seems to reside between metaphysics and pragmatics, theory and practice, scientific experiment and reflection, intuition and normative principles. Despite all the critical passages concerning “principlism” as a methodological approach, its popularity is growing due to the openness of bioethical principles to interpretation and their ability to co-exist with human values such as good, happiness, health, safety, and beauty. Ethical principles are concepts in the interspace between scientific and everyday notions. These concepts have cognitive as well as heuristic functions.

Usually principles are presented as instruments and standards for human behaviour, as measures of intention at a given moment. Bioethical principles usually reflect the intrinsic value of the human being as such, and also a virtue, as, for example, the principle of integrity does. In the case of using one or another technology, bioethical principles act as a tuning fork: they suggest a standard of behaviour and, at the same time, a moral subject's intention.

From the point of view of scientific naturalism, the source of all value is human consciousness. However, it by no means follows that the locus of all value is consciousness itself, or a mode of consciousness like reason, pleasure, or knowledge. In other words, something may be valuable only because someone values it, but it may also be valued for itself, not for the sake of any subjective experience. Value may be subjective and affective, but it is intentional, and not self-referential.

Conclusion

Nowadays the problem of dual-use technologies can be solved only by joining two approaches to its understanding: the scientific and the philosophical, the pragmatic and the metaphysical. The pragmatic point of view is realized in lists of ethical principles, norms, specific guidance, and policies. These apply to researchers and technologists, as well as to everybody who is under the great pressure of modern technologies or is responsible for the results of his professional activities.

The necessity of a metaphysical approach to study the dual use technology phenomenon is explained by the difficulties in foreseeing the negative outcomes of S and T. This approach exhorts us to a deeper study of human nature, encompassing man’s intentions, goals, values, ideals and social relations.

By analyzing the social, cultural, and symbolic content of the relationships among man, nature, science, and technology, we can make considerable improvements through the work of social institutions, education, and politics, which aim at shaping a person's values.

The phenomenon of dual use technologies is associated with some new tendencies in philosophical reflection: (1) the importance of a moral “measure” and of ethical norms and standards in human life (e.g. Good Clinical Practice, Good Laboratory Practice, and Good Manufacturing Practice in biomedicine); (2) an emphasis on such notions as human “intentionality” and “values”; (3) approaches to the humanization of technologies; (4) the development of a bioethical outlook; (5) the consideration of the goals of technologies, with the aim of bringing simplicity to their use and increasing the benefits they bring; (6) the contemplation of how to achieve the best scenario (the future development of the science of risk).

Looking to the future, we must admit that technologies will contribute greatly to the development of the introverted features of modern man. Thus, to prevent possible harm from S and T, we must understand our intentions, our intentionality and values, and our moral ideals as a tangible part of the real world, and the social relations and ethos of S and T as something that can be measured.