

Volume 3, Number 4
Summer 1998

ETHICS AND THE SYSTEMIC CHARACTER OF MODERN TECHNOLOGY

S. Strijbos, Vrije Universiteit


A distinguishing feature of today's world is that technology has built the house in which humanity lives. More and more, our lives are lived within the confines of its walls. Yet this implies that technology entails far more than the material artifacts surrounding us. Technology is no longer simply a matter of objects in the hands of individuals; it has become a very complex system in which our everyday lives are embedded. The systemic character of modern technology confronts us with relatively new questions and dimensions of human responsibility. Hence this paper points out the need for exploring systems ethics as a new field of ethics essential for managing our technological world and for transforming it into a sane and healthy habitat for human life. Special attention is devoted to the introduction of information technology, which will continue unabated into coming decades and which is already changing our whole world of technology.

Key words: technological system, systems ethics, social constructivism, technological determinism, information technology.

  1. Introduction
    A burning question for our time concerns responsibility for technology. Through technological intervention, people have succeeded in virtually eliminating a variety of natural threats. At least that is true for the prosperous part of the world, although here too the forces of nature can unexpectedly smash through the safety barriers of a technological society, as in the case of the Kobe earthquake (1995). The greatest threat to technological societies appears to arise, however, from within. It is as if there is a devil in technology that stirs from time to time to terrify society with the threat of disasters to come. In 1996, England was in an uproar over mad cow disease. There were fears for the British economy, and public health was thought to be at risk. Consumers were warned about a possible link between bovine spongiform encephalopathy (BSE) in cows and the fatal Creutzfeldt-Jakob disease (CJD) in humans. At one time or another, headlines throughout the world have announced disasters resulting from: a fire in a nuclear reactor at Chernobyl (1986); the sinking of the ferry Herald of Free Enterprise outside the harbor of Zeebrugge (1987); and the crash of an airplane into the Bijlmermeer, a residential suburb of Amsterdam (1992). The phenomenon of iatrogenic diseases, diseases caused by medical treatment itself, has also been recognized for some time. One could go on. In a word, our society has aptly been called a risk society (Beck 1995). While in earlier times people feared mainly the capriciousness of nature as a threat to their existence, today they fear the risks attached to technological interventions and the sometimes disastrous consequences of unsuspected interconnections in the humanly constructed technological environment (Luhmann 1993).

    Thus the progress of technology confronts us with unexpected problems. Yet amidst all the uncertainties, one thing is sure: The question concerning human responsibility for technology has become more acute and at the same time more complex than ever. What is the normative framework for technology? Who is responsible for what? What are the norms for this responsibility, and how can this responsibility be invoked? The German sociologist Ulrich Beck (1995, 1997) has observed that modern society has become a laboratory in which no one is responsible for the outcome of the experiment. He even ventures to speak of organized irresponsibility. Politicians, for example, shrug off any responsibility, saying they do not produce the technology and can at best influence its development only indirectly. Meanwhile, scientists and technologists claim their only task is to engage in solid research and create new technological possibilities. They disclaim responsibility for what is done with their results. Business leaders, for their part, claim that it is not they who ultimately determine what happens and what does not; the market, they say, determines that. The consumer, they argue, has the final say in deciding what is preferred.

    In this article I contribute to current reflection on the ethics of technology. I propose a general approach to the subject and, as an illustration (Section 4), I devote special attention to the significance of information technology. In Section 2, I argue that the problems of this society can be grasped more effectively when the systemic character of modern technology is kept in view. However, the problems of our technological society have deeper roots than this. Technology and the technological infrastructure of modern society are the work of human beings and therefore evince normative moments of choice and motives in their development. These normative issues must not be neglected and relegated to a realm exempt from critical assessment. I consider them in Section 3, where I work out in more detail the idea of a systems ethics, which I have presented in earlier work (1994, 1996).

  2. The Systemic Character of Modern Technology
    What are we to understand by the concept of a technological society? To answer this question, especially insofar as the meaning of the adjective "technological" is concerned, we may begin with a definition closely connected to technology as many experience it in their everyday lives. Our society is called technological because, compared to life barely a generation ago, we have fantastic artifacts available to us. Thus this article is written on a personal computer with which, thanks to a built-in modem, I can also get out onto the internet and have daily e-mail contact with my colleagues around the world. To the right of the keyboard, within easy reach, is a telephone; above and slightly to the right is an electric lamp that casts a warm glow on my desk; and before me, just a few meters away, is a heater with a thermostat that makes it possible to be relatively comfortable in my study even in the winter. This is probably the picture of everyday life that most people associate with the technological society. What defines this picture is the fact that people are surrounded by countless useful appliances. This, then, is what many will mean by technology and the technological society.

    This picture, however, cannot carry us to a deeper interpretation of the modern world. In earlier ages the human habitat was also furnished with a great variety of technological artifacts. Even the most primitive societies were not without technology. Technological things have always been at humanity's service. What is specific to the modern, technological world thus cannot be sought in traditional things. An important characteristic of modern technology is that the entire human habitat has changed along with the technological artifacts. Upon closer examination, modern technology is found not to consist of a collection of isolated things and processes; rather, the very environment in which the artifacts function has been brought within the sphere of technology. Technology is therefore more than the sum of an array of technological components. Technology in modern society has become a complex system that can be viewed as the common house in which we all live together today. Here, then, we have a second approach to the phenomenon of technology. Technology today consists of more than a number of distinct and isolated things: Technology has become the habitat of modern humanity. With my computer I can log in to a worldwide computer network. The lights work thanks to an electricity grid to which my house is connected. The radiator in my study is part of the heating system of my house, which is part of a much larger complex, including a network for distributing gas. In general we do not pay much attention to these things, but occasionally we become conscious, usually when something goes wrong, that our existence, individually and societally, has become virtually entirely dependent on complex and, as a rule, vulnerable technological infrastructures.

    When we say that technology has become a system, we do not just mean that technological things have become components of a larger whole (an auto is nothing without an entire technological infrastructure of roads, traffic regulations, etc.); people too have become dependent on a technological environment. Someone who cannot use the transportation system can scarcely participate in the life of society. Technology as a system includes material artifacts, but it also includes organization, planning, procedures, and much more. The expansion of technology has gone hand in hand with the rise of various infrastructures that consist of much more than roads, railways, air traffic, and power generating stations. Also belonging to them are financial structures and tax regulations, social legislation, information and communication networks, a system of education, and government-sponsored scientific and technological research (Franklin 1990, 76). Not only has technology affected relationships in societal intercourse; it has radically altered our connections with nature as well. Through technology, nature has become more remote. In other words, even the original natural environment, including the natural (biological) nature of humans themselves, has been brought under the direction and control of technology. This is apparent today, insofar as nature apart from humans is concerned, in our speaking of "nature conservation" and "nature development." Again, technology consists not only of isolated artifacts but forms a complex system of widely diverse components. Modern life unfolds in a largely humanly constructed and controlled technological environment. Its complexity accounts, at least in part, for the tortuous progress of the process of integration in which the countries of Europe are presently engaged. It is an example of how politics in the technological society is busy with the search for solutions to all manner of complex systems questions, which is to say problems connected with the organization or reorganization of overarching infrastructures.

    The systemic character of the modern technological world is broadly accepted in current studies of technology. A wide variety of approaches to technology call attention to how much technological systems interconnect and mutually influence each other. A change in one sector of technology can have far-reaching consequences for (all) other sectors. Computer and information technologies are an obvious example. For a variety of divergent reasons, some writers prefer not to think in terms of systems but to speak of networks, of a "web of interactions" or of a "web of technology" (Franklin 1990, 58). What at first glance appear to be entirely contrary conceptions, namely, notions of technology in line with Ellul's versus what has come to be known as a social-constructivist view of technology (see Hughes 1983, 1986, 1987), turn out, upon closer inspection, to proceed on the basis of a systems approach to technology. The latter view stresses that technology is not autonomous with respect to society. Technological developments must be understood as the results of social construction processes in which many agents participate. This approach has as its points of departure: (1) technological development is not determined but is the result of a process of investigation; (2) a technological artifact forms part of a greater technological-societal system; and, finally, (3) a technological development is the result of the contributions of many agents (Lintsen 1992, 20). Technological innovations occur within a particular technological-societal framework. Innovation involves more than introducing into society a new technological discovery to which society must then adjust. Innovation means the transforming of technological-societal systems such that not only technology but also human beings, social relationships, the economic basis, norms, values, and so forth, change (Lintsen 1992, 15). Wiebe Bijker's study of the bicycle (1995b) is, by now, a much discussed and fascinating example of the social-constructivist view of technological development.

    In contrast to a social-constructivist approach, which is based on a detailed agent-analysis of concrete technological developments, Ellul has taken a more global approach to technology and the developmental dynamics of the technological society. By global, I mean that Ellul emphasizes the sociological reality of technological systems and not the actions of the agents within them. These agents do not appear in a global systems approach. It is similar to studying an object under a microscope at different levels of magnification. At the global level the system as a whole is brought into focus, and one therefore sees other things than in a social-constructivist investigation, where a more detailed examination is made of the social agents within the system and the relations between them. It is a confusion of these levels of analysis that allows Ellul to be criticized, from a social-constructivist standpoint, for a form of technological determinism. For Ellul, technology seems to develop in a way that is wholly indifferent to human choices and completely determined by the autonomous dynamics of an ever expanding technological system. Yet such a critique misses the point. At the global level there is no need to deny that continuing technological development demands new choices. How an innovation in the technological system will be received is not established beforehand. There is a certain freedom of choice. Ellul would not deny that. The point is that this freedom of choice unfolds within the boundaries and conditions that are given with the technological system. People can make choices, certainly, but these are secondary with respect to the technological system. Thus there can be freedom of choice within the system and at the same time something like determinism at the systems level (cf. Strijbos 1996, 151).

    Consider, for example, the freedom of action on an airplane that has been put on automatic pilot. The passengers are free to choose various activities during the flight, but these are totally insignificant at the global systems level. The choices that determine the aircraft's flight path are of an entirely different order. Here, then, we have the two perspectives of social constructivism and technological determinism side by side, and they need not exclude each other. The starting point of social constructivism, that technological development is the result of the inputs of diverse agents, says nothing about the freedom of action of those agents. This freedom is limited, or conditioned, by the environment in which action takes place. The environment creates the space of possibilities, the playing field, for human action. To say that conditioning may impose limits is not the same as to speak of determinism. The latter appears if the environment of action, in this case the technological environment of our society, is placed under the dictates of technological thinking. In that case freedom of choice is destroyed as the norms of technological-scientific control gain exclusive sway. The classical example from our industrial society is the factory and Taylor's scientific management.

    Bijker (1995a, 16) argues that if we do not subscribe to the social and constructed character of technology, it becomes impossible to speak in an integrated way about technology and society; technology in that case can be little more than an onward hurtling flywheel, an unguided projectile. Technology and society do not develop independently of each other; they do so, according to this writer, in a form of co-evolution. Now, I agree that the "social" and the "technological" of the technological society cannot, after all, be separated. To this extent, however, a social-constructivist approach fails to account for precisely the phenomenon to which it is critically opposed. Technological determinism is rejected, but the question remains why the development of the technological society seems to suggest, to put it mildly, the existence of such a determinism. How are we to explain (an at least ostensible) technological determinism as a phenomenon? The social-constructivist approach to technology puts all the emphasis on the social agents that together are the bearers of the development. But the systemic character of technology entails that developments can acquire "momentum" at a certain moment as the agents' powers are bundled together (Lintsen 1992, 21). This observation has little explanatory value. The question that must be asked is: What determines the remarkable moment of the confluence of powers? Are there deeper motives underlying the dynamics of technology and society than an analysis of agents brings to light?

    In their zeal to keep global, philosophical views of technology at a safe distance, social constructivists simply ignore questions of this sort. Yet in their effort to remain as close to praxis as possible in order to show that technological development is a product of complex social interactions, they lose touch with concrete reality after all. To get things into sharp focus, distance is sometimes needed, and a vantage point must be sought from which all the developments can be considered together as a whole. In the mind of the objective observer, questions arise at the global level that cannot simply be put aside. A telling example is the theoretical physicist Casimir, who for several decades shaped research policy at Philips Research Laboratories in Eindhoven, one of the greatest industrial laboratories in the world. Drawing upon a vast knowledge of practical details, he spoke and wrote time after time about the (ostensibly) autonomous spiral of technology and science, and he puzzled over the related questions of human responsibility (Casimir 1983; cf. Strijbos 1998). Social constructivists lose touch with societal reality in another respect too. Their projects, as Winner (1993, 375) correctly notes, are "carefully sanitized of any critical standpoint that might contribute to substantive debates about the political and environmental dimensions of technological choice." Thus they turn away from what are arguably the most interesting questions, those arising from social-political and philosophical standpoints. Compared with other approaches, such as those of Marx, Ellul, Heidegger, Habermas, Illich, and some of the younger philosophers of technology, social constructivism lacks a normative position regarding the social and technological questions that form its object of study. Marxists point, for example, to the key role of structural relationships between the social classes. Other thinkers have called attention to the influence of technological rationality on the shape of society. Here it is emphasized that human action does not take place in a spiritual vacuum (a point considered more broadly below). Individuals, but also societies and cultures, gain their bearings from a meaning-giving perspective on reality. One can also say that people allow themselves to be guided by a particular overall view of reality, a world view. Any ostensible technological determinism can then be interpreted as the outcome of a prevalent technical world view (Strijbos 1996).

  3. Systems Ethics
    What does an ethics of technology gain from insight into the altered character of technology, namely, into the systemic character that technology has acquired? Recognition of the systemic character of technology makes clear to us the need for a conceptually new approach to ethical questions. Once we understand that technology has become the new habitat within which we move, it follows immediately that traditional ethical approaches no longer measure up to the problems we face. That is why the last two decades have witnessed a great deal of discussion about the need for a new ethics. The underlying idea is that traditional morality, and reflection upon it, are tied to particular action situations. As a result of the development of science and technology, however, we have moved ever further from the "traditional" situation and the conditions for human action given with it. The shifts that have taken place are captured in the concept of the technological society, a type of society in which technology has become the habitat in which human actions ordinarily take place. Now, the crux of the matter is that both the particular context of action and actions within that context have become objects of ethical reflection. While the context was traditionally given and was, as such, a fixed point of departure for ethical reflection, that context itself has now become, as a result of the altered, systemic character of modern technology, an object of human action and manipulation. An ethics of technology therefore can no longer be only a matter of asking how a person should use a particular machine or artifact for some purpose or another. At least as important today are the normative questions concerning how we should shape, rearrange, and adapt the infrastructures of our technological environment to form a healthy and sane habitat for humans to live in.

    The crux of the matter, in other words, is whether we can discover ethical or normative criteria for the technological systems in which we live. Ellul (1989, 24) puts it as follows:

    If technique is a milieu and a system, the ethical problem can only be posed in terms of this global operation. Behavior and particular choices no longer have much significance. What is required is thus a global change in our habits or values, the rediscovery of either an existential ethics or a new ontology.

    In speaking of a new existential ethics and of a new ontology, Ellul's intention, if we understand him correctly, is to bring the technological environment once again into the sphere of human responsibility. He is interested in an ethics and an ontology whereby we can reclaim our freedom, this time in the face of the technological environment. Thus when he states that "behavior and particular choices no longer have much significance," we must understand that this behavior and the choices associated with it are of the second order, conditioned by the context within which action takes place. In Ellul's view, one could say, we must regain an "outside" that exceeds or transcends the technological systems within which our existence unfolds. Just as humanity once freed itself from the constraints of the natural environment, so today the challenge is to attain liberation from the technological environment.

    The philosopher of technology Jonas (1984, 6) has also referred expressly to the new character of modern technology. The scale and scope of human actions, so he argues, have altered drastically. While traditional ethics had to do with human actions in the here and now and was restricted to the interpersonal, individual sphere, a growing domain of collective action has been added. In other words, a spreading domain of collective action has overshadowed the traditional sphere with its neighborly focus. Increasingly there is a collective domain where "doer, deed, and effect [are] no longer the same as they were in the proximate sphere, and which by the enormity of its powers forces upon ethics a new dimension of responsibility never dreamed of before." In the broad discussion evoked by Jonas's work, especially in German philosophy of technology, various critics have asserted that Jonas, who has laid so much emphasis on the collective dimensions of responsibility in technology, has in fact not moved beyond the concepts of individual ethics. Because he adopts the relationship between parent and child as the model for political responsibility, the sociopolitical and institutional anchorage of his ethics of technology remains weak (cf. Hastedt 1991, 176 and 262).

    What can be of help here is a systems view of technology that clarifies the interweavement between human actions at the various systems levels and the responsibilities that belong to a variety of agents at these levels. This leads to a systems ethics for technology by which I mean the field of normative questions concerning the arrangement and regulation of the technological-societal systems or infrastructures within which our existence unfolds. At issue are such questions as: What societal agents are responsible for particular developments? How are the different responsibilities of the agents related to each other and how are they coordinated? What are the norms for action by the various agents? Systems ethics as a field of such questions must be distinguished from the normative approach or direction that may be taken with respect to this field. Just as there are various normative approaches to individual or personal ethics and the traditional interpersonal sphere, so there may be different normative perspectives in systems ethics and in the sphere of collective action.

    Paralleling the sense of the paired terms "field" and "direction," as I use them here, is my reference elsewhere to the difference between a structural perspective and a world view perspective (Strijbos 1996). If modern technology has become our habitat (structural perspective), then the question arises whether particular spiritual contents and directions have accompanied it (world view perspective). Does our technological world consist entirely of material components, or are particular mental states proper to it as well? Technology in its material form is by no means neutral; technology may not be viewed in isolation from the intellectual and spiritual history of our culture and from the varying views that people in it have adopted of the world around them. It has been argued that mental and spiritual structures have adapted to the technological era (cf. Berger, Freyer). The more that technology comes to function as a second habitat, the more people's view of the world is determined by the dominant technological categories. Thus the material structure of modern technology takes precedence, and its continuing development through the centuries produces, to borrow a phrase from Bolter (1984), ever new "defining technologies." These are attractive windows through which people view the world around them. Permanently engaged in drawing anew the line that separates nature from culture, people have also always been disposed to redefine their own role with respect to nature with the help of technology. In our times the computer, so Bolter avers, furnishes us with a new definition of the human being as an "information processor" and of nature as "information to be processed." Thus in Bolter the emphasis is on the influence of technology on the intellectual and spiritual side of a culture. Influence in the opposite direction is also noted. In that case a technological view of reality is held to precede the technological world. Thinking oriented to technology is a force in the rise and global expansion of the technological society (cf. Habermas, Heidegger).

    Despite my view that spiritual motives underlie the rise and development of the technological society, it still cannot be denied that everyday experience with modern technology gives rise to a certain mentality. The artificial environment in which we live inevitably forms a deposit in the structures of technological consciousness, a deposit one can divest oneself of only with the greatest difficulty. For an interpretation of the technological society and its continuing development one must, therefore, consider both a structural side and an intellectual-spiritual side, and the interaction between them (cf. Strijbos 1997). Moreover, with respect to the question of a systems ethics for our technological society, we must keep in mind that "field" and "direction" are not mutually isolated entities. The field of systems ethics, that is, the technological environment in which we live, is itself an outcome of human action and of the motives that guide human action. And if it is true, as stated above, that a technological view of reality forms the normative horizon of modern society, then this means that modern society generates pressure to adapt to its structures in matters of ethics. The technological environment has a tremendous influence on the intellectual framework and on the ideas about ethics inherent in it, and that influence, like the mind of the times or the spirit of the age itself, ought never to be underestimated.

  4. The Computerization of Society
    In this connection information technology and computer technology are of much greater importance than is ordinarily perceived. Insofar as the influence of the technological environment on people's thinking is concerned, the German informatics educator Haefner (1984) has observed that the autonomous, self-determining subject of traditional humanism has long since been eliminated in the strongly integrated and highly structured technological society. Since the rise of information technology, freedom of action has been tied more and more to various networks, and in many places there remains only a rudiment of responsibility and personal competence. Haefner (1984, 89) adduces many examples from everyday life. The ground stewardess no longer organizes the seating of passengers on board; reservations and bookings are done by an apparatus with a built-in calculating program. The pilot in the cockpit is dependent upon automated navigation systems. The soldier who wants to know about troop movements in enemy territory no longer looks at the situation through his own binoculars but must instead pore over computer-enhanced satellite images. Here two things are going on that affect the individual's position. First, specific human competencies pale and are transferred to integrated systems, entailing at the same time a devaluation of specific responsibilities. These are now located somewhere in the systems of which people have become components and over which they no longer have any individual control. Many people feel entirely lost in this situation, run aground, as it were, in a network of electronics. In this way modern information technology sets tottering one's belief in his or her own autonomous competence to act and to bear responsibility. In this situation many people become deeply uncertain of their own identity. Who is the person, really, who is being increasingly integrated into collective structures? Information technology radicalizes the situation of the technological society as we described it briefly above. It makes possible the full integration of human beings into technological infrastructures. People and things come to function in the same way, as elements within comprehensive systems; this degrades the being-as-subject of the individual into an object of manipulation. In this connection, Haefner (1984, 90) poses penetrating questions about morality and systems ethics:

    Shall humanity use modern technology to attain a new level of integration in which the personal ego is subordinated to a "collectivity"? Shall the picture of the autonomous person as the equal image of God be replaced by a collective-integrated "human system" as a counterpart of the divine creator? Can and may people resign from their individual responsibility and leave guidance and control to an integrated system?

    The questions Haefner poses arise from his broad knowledge of and experience with the computerization of society. Computer and information technologies reinforce certain tendencies that make people appear to be prisoners of the technological environment they themselves have created. Many see no alternative but to adapt to the technological systems in which they live. In this way the technological environment generates a systems ethics that we may call an ethics of adaptation. The normativity of human actions is derived from the rationale of collective infrastructures. Haefner illustrates what we observed earlier, namely, that the technological environment has an influence on the framework of ethical thinking that one ought not to underestimate. The field influences the direction of thought: people around the world begin to think in a manner consistent with what they experience in everyday life. However, the direction of thought may also influence the field. It is not difficult, especially where information technology is concerned, to find an illustration of how a technologically conditioned direction of thought, namely a technological world view, influences the field of the technological society, which is to say that the world begins to take the shape in which people see it. An example from the systems thinker Simon will help to make this clear. For this article a few observations will have to suffice.

    Within Simon's extensive scientific oeuvre, The Sciences of the Artificial (1996) is of more than purely scientific interest. In it Simon identifies the (philosophical) view of the person as a rational being on which he based his theories of artificial intelligence. Essential for a good understanding is his renowned comparison between the zigzag movements of an ant on the surface of a beach formed by wind and waves and the thinking behavior of people endeavoring to solve problems. Just as the ant adapts to the sandy surface over which it seeks its way, so a person adapts to the problem situation. To gain a theoretical grasp of human thinking behavior, Simon (1996, 53) advances the hypothesis that "human beings, viewed as behaving systems, are quite simple." It is the environment that determines the complexity of a person's thought. The thinking person is an adaptive system, so we are told, and the goals he sets "define the interface between his inner and outer environments. . . . To the extent that he is effectively adaptive, his behavior will reflect characteristics largely of the outer environment." One of the crucial questions that Simon overlooks is that of the ontological status of the human "task environment" or "problem environment." For humans, is there such a thing as a fixed environment? Is it not proper for the individual to have the capacity to define his or her task environment? For humans there is no such thing as a fixed, pre-given task environment. On the contrary, human beings seem to possess the puzzling capacity to apply certain standards to reality and to make judgments on that basis. That is the first thing. But Simon overlooks other unique aspects of human thought as well. For a person also has the capacity to visualize something new, to formulate norms or standards for a new, desired situation. In other words, one may design something new and subsequently try to realize it.
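
    Simon's hypothesis can be made concrete with a small sketch. The fragment below is a purely illustrative toy, not drawn from Simon's own work; the terrain, the rule, and all parameters are hypothetical assumptions. The agent's entire "inner environment" is a single greedy comparison, yet the path it traces is as irregular as the simulated beach it crawls over: whatever complexity appears in the behavior is contributed by the outer environment, not by the rule.

        import random

        random.seed(0)

        # A rugged "beach": a one-dimensional height profile built from accumulated noise.
        terrain = [0.0]
        for _ in range(200):
            terrain.append(terrain[-1] + random.uniform(-1.0, 1.0))

        def step(pos: int) -> int:
            """The agent's whole 'inner environment': move one cell toward lower ground."""
            left = terrain[max(pos - 1, 0)]
            right = terrain[min(pos + 1, len(terrain) - 1)]
            nxt = pos - 1 if left < right else pos + 1
            return max(0, min(len(terrain) - 1, nxt))

        # Trace the path; its zigzag reflects the terrain, not the trivially simple rule.
        pos = len(terrain) // 2
        path = [pos]
        for _ in range(60):
            pos = step(pos)
            path.append(pos)

        print(path)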

    It is not necessary here to delve further into designing as a part of human thought behavior. From the preceding it should be clear that, in my view, the new situation is not derivable from the existing one. If, on the basis of certain standards, the existing situation is identified as a problem situation, it does not follow that there is just one particular solution to it: A number of solutions may be conceivable for the same problem. Here we have returned to our problem of systems ethics. Consider Simon's thinking person, and place this person in the modern technological society in which solutions must be sought for the numerous problems of our times. How are these to be approached? What perspective would Simon's problem solver offer us? I believe that his view lacks critical potential with respect to the given technological society and its problems. Yet what, really, are the problems that we have to deal with today? And what ends must the human mind as an adaptive thinking system pursue in the problem environment of the technological society? Simon's thinking person has, in my opinion, been knocked for a loop in the technological world, and he has no other choice than to adapt to the given ends of the technological systems of which this society is comprised. This, as it turns out, is how Simon's (1996, 22) problem solver works: "adaptivity to an environment is their whole raison d'être." The systems ethics implicit in Simon's view of the human being is an ethics that does not permit one to adopt a critical stance towards the field of the technological society as it has now taken shape. It is also an ethics that reinforces the situation as Haefner sees it in the continuing computerization of society.

  5. Closing Remarks

Is there a perspective for our technological society? What possibilities for reorientation may be found in the present situation? Is it possible to become free from the spirit of technology? To answer these questions in the affirmative is not to engage in a romantic spiritual flight from our technological world. Exploring the field of systems ethics aims to discover the normative principles needed for the transformation of our technological society and the systems in which we live (cf. Strijbos 1996, Part II).

REFERENCES

Achterhuis, H. 1995. Natuur tussen mythe en techniek. Baarn: Ambo.

Beck, U. 1995. Ecological politics in an age of risk. Cambridge: Polity Press.

Beck, U. 1997. Wie is verantwoordelijk voor gekke koeien, varkenspest en Brent Spar? Trouw, 15 March 1997.

Berger, P. L. 1979. The heretical imperative: Contemporary possibilities of religious affirmation. Garden City, NY: Anchor Press.

Bijker, W. E. 1995a. Democratisering van de technologische cultuur. Inaugural address, Rijksuniversiteit Limburg te Maastricht, 24 March 1995.

Bijker, W. E. 1995b. Of bicycles, bakelites, and bulbs: Toward a theory of sociotechnical change. Cambridge, MA: MIT Press.

Bolter, J. D. 1984. Turing's man: Western culture in the computer age. Chapel Hill, NC: The University of North Carolina Press.

Casimir, H. B. G. 1983. Haphazard reality: Half a century of science. New York: Harper & Row, Publishers.

Ellul, J. 1980. The technological system. New York: The Continuum Publishing Corporation.

Ellul, J. 1989. The search for ethics in a technicist society. In Research in philosophy and technology, edited by F. Ferré and C. Mitcham. Vol. 9. London and Greenwich: JAI Press.

Franklin, U. 1990. The real world of technology. Toronto: CBC Enterprises.

Freyer, H. 1970. Über das Dominantwerden technischer Kategorien in der Lebenswelt der industriellen Gesellschaft. In Gedanken zur Industriegesellschaft. Mainz: v. Hase & Koehler Verlag.

Haefner, K. 1984. Mensch und Computer im Jahre 2000: Ökonomie und Politik für eine human computerisierte Gesellschaft. Basel, Boston, Stuttgart: Birkhäuser Verlag.

Hastedt, H. 1991. Aufklärung und Technik: Grundprobleme einer Ethik der Technik. Frankfurt am Main: Suhrkamp.

Hughes, T. P. 1983. Networks of power: Electrification in western society, 1880-1930. Baltimore: Johns Hopkins University Press.

Hughes, T. P. 1986. The seamless web: Technology, science, etcetera, etcetera. Social Studies of Science 16: 281-91.

Hughes, T. P. 1987. The evolution of large technological systems. In The social construction of technological systems, edited by W. E. Bijker, T. P. Hughes, and T. Pinch. Cambridge, MA: MIT Press.

Jonas, H. 1984. The imperative of responsibility: In search of an ethics for the technological age. Chicago and London: The University of Chicago Press.

Lintsen, H. W. 1992. Wat is techniek? Een geschiedenis van menselijke secreten en discrete technieken. Inaugural address, Technische Universiteit Eindhoven, 15 May 1992.

Luhmann, N. 1993. Risk: A sociological theory. Berlin and New York: Walter de Gruyter.

Ropohl, G. 1979. Eine Systemtheorie der Technik: Zur Grundlegung der Allgemeinen Technologie. Munich and Vienna: Carl Hanser Verlag.

Simon, H. A. 1996. The sciences of the artificial. Cambridge, MA: The MIT Press.

Strijbos, S. 1994. The individual and the collective in health care: A problem of systems ethics. Systems Research 11: 67-74.

Strijbos, S. 1996. Ethics for an age of social transformation. Part I. Framework for an interpretation. Part II. The idea of a systems ethics. World Futures 46: 133-44, 145-55.

Strijbos, S. 1997. The paradox of uniformity and plurality in the technological society. Technology in Society 19, no. 2: 177-95.

Strijbos, S. 1998. Science and the university in a cultureless age: The need and possibilities for ethics. World Futures 51: 269-86.

Strijbos, S., ed. 1985. Nieuwe medische ethiek. Amsterdam: Buijten & Schipperheijn.

Taylor, F. W. 1967. The principles of scientific management. 1911; reprint, New York and London: W.W. Norton and Company.

Winner, L. 1993. Upon opening the black box and finding it empty: Social constructivism and the philosophy of technology. Science, Technology and Human Values 18, no. 3: 362-78.