

Volume 2, Numbers 3-4
Spring-Summer 1997

PROGRESS, VALUES, AND RESPONSIBILITY

Hans Lenk, University of Karlsruhe

"In any case, progress implies that it looks much greater than it really is." This statement by the Austrian poet Johann Nestroy became famous when Ludwig Wittgenstein chose it as a text for the beginning of his Philosophical Investigations . Is it true, however? It seems to be true for the problems, methods, and methodologies of philosophy itself--for Wittgenstein's own investigations, which depend on Georg Lichtenberg, Arthur Schopenhauer, Fritz Mauthner, and other forerunners. But it does not seem to be true for technical or technological progress. Instead, we might say that, "Technical progress is much greater than seems to be the case."

But technical progress is not identical with cultural or moral progress. Discrepancies between the two have been highlighted time and again; they have also been used--e.g., by Albert Schweitzer, C. P. Snow, Daniel Bell, and Arthur Koestler--to explain untoward effects. Progress as a unique, undifferentiated general social phenomenon does not exist.

However, there are systems interlockings and systems effects which were discovered as early as the 1920s by the philosopher of economics, F. von Gottl-Ottlilienfeld. In his classic, Wirtschaft und Technik [economics and technology] (1923), he stressed the "interlocking of individual steps toward a total movement of technology": namely, creative analogies from different fields and combinations of results from earlier progress that are preconditions of genuinely innovative technical progress in general--which has to be relevant to the fulfillment of demands. He also said that technical progress has to be distinguished from technological progress, which is progress in knowledge about and of technologies even where no technical progress has occurred. Progress in technological knowledge, moreover, is extremely important for the mutual interlocking of subsystems. Here von Gottl-Ottlilienfeld mentions system continuities, technological and technical mutations (the jump to another basic problem solution), as well as systematic contexts for individual technical problems and solutions (his name for this is Eigenleben or "proper context") and derivative problems and solutions. All together, these eventually lead to a unitary system of technical questioning or probing, i.e., to a kind of methodological unity. The "self-advancement" (Selbststeigerung) of technological progress is here rendered in system terms (see Ropohl, 1979).

This led, later on (see Hübner, 1968), to the idea of modern technology "becoming self-aware," an idea that found applications in cybernetics.

What is characteristic of technological development, of ever-accelerating technical progress in its overall situation in our society today? Doubtless one characteristic is that moral and legal concepts, unless they have been modified, have not kept up with technological applications. Hans Sachsse, the late Nestor of German philosophy of technology, claimed, for instance, that concepts like "property," "theft," "just," "exchange," and even "consumption"--developed in view of the classical concept of possession of goods, of some underlying philosophical category of substance which cannot be multiplied--cannot simply be transferred to the new concept of information.

Moreover, rapidly changing circumstances in modern life in dynamic and pluralistic societies--as well as the accelerating dynamics of social, political, economic, and technological conditions--also require corresponding changes in the applicability conditions of ethical concepts and opinions. Measurable acceleration here also engenders a shift of accent, if not causal changes, within the systems themselves. Omissions have become, ethically speaking, much more important than in earlier times. Often, dramatic collisions of values--and even the generation of fundamentally new values--lead to a new kind of social harm. Think, for example, of the results of applying the products of recent scientific medicine, of the overpopulation problem, and of other "social traps" (as sociologists call them) which may lead to a new kind of "tragedy of the commons" (to use Garrett Hardin's term).

In any case, the most decisive point demanding a new interpretation of the ethics of technological progress is, beyond any doubt, the immensity of technological power, which has grown to an extent and with such intensity that some kind of systems backlash, a kind of overkill effect of a self-destroying dynamic, might occur or might already be occurring. This is particularly evident in ecology, in the imbalance of ecosystems in highly industrialized (often over-industrialized) regions, but also globally. We seem unaware of, and incapable of implementing, the responsibility required for the overall functioning of ecosystems in general.

We can--without being able to develop a whole theory of technical or technological progress here--start with some remarks about the concept of technical progress. It is, first, clear that the concept of "technical progress" has a normative character. It always involves a comparison between the present "is" state and a target state of the system, one which motivates calling the technical solution or operation "better" if it is achieved. This is the case whenever the same result can be achieved at less cost or effort, or when we succeed in getting a higher output or a better achievement with the same input or effort. The assessment criteria include: improvement of quality, longer product endurance, safety, freedom from liability, greater precision, feasibility, better control, higher speed, simpler calculability, and economic efficiency, particularly in terms of costs of production or maintenance as part of the input-output ratio.

Economists define technical progress simply as increased output in comparison to equal or less investment of capital or labor. This would imply that increases of production stemming from higher morale--as for instance in the famous Hawthorne experiments--or from improved organization without any new investment of capital or labor would count as progress; this, however, cannot be called technical progress in the proper sense. Friedrich Rapp (see Rapp, Jokisch, and Lindner, 1980)--taking up von Gottl-Ottlilienfeld's distinction between technological progress (in knowledge) and genuine technical progress in society--distinguishes potential from realized or materialized technical progress. Von Gottl-Ottlilienfeld had restricted his analysis to so-called "real technology" (Realtechnik), consisting in the production of material artifacts, applications to new operations, and "the totality of operations and means of actions needed to dominate nature" (1923, p. 9). If one adds social contexts, one has to extend the concept of progress to social, economic, and other factors. However, with the introduction of a concept of socio-technical progress, the specific traits of technical and technological progress in the narrower sense might easily lose precision. Therefore, this terminological shift is not generally to be recommended.
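The economists' definition can be stated compactly. As a gloss of my own (this notation is not Lenk's), technical progress in this sense is growth of the productivity term in a production function, i.e., more output from unchanged capital and labor:

\[ Y = A \cdot F(K, L), \qquad \text{technical progress} \iff \frac{d}{dt} A > 0, \]

where Y is output, K capital, L labor, and A total factor productivity (the "Solow residual"). On this measure, the Hawthorne-style morale gains mentioned above would also raise A; that is exactly why the economic definition fails to isolate technical progress in the proper sense.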

Generally speaking, the overall direction of technical development, if it is to be called "progressive," cannot be understood purely as an outcome of economic or technological factors only, but as a complex systemic interplay among different factors, with no mere linear causality involving one factor alone. Many authors stress that there are mutual dependencies to be taken into consideration in the explanation of technological development, and they can only be grasped in a multidimensional analysis. That means, generally speaking, that the concept of a general or an overriding and cumulative or ever-escalating progress with a kind of dynamics (Eigendynamik) of its own turns out to be an interpretive construct. And such a construct could occur in reality only if there were a permanent interplay among all the mentioned agents and fields of influence that engender the great complexity of contributions, interconnections, and factors. What can be called a societal state of progressiveness--or social progress, for short--is thus a complex integration of many detailed factors, processes, and subsystems of different kinds.

The fact that the probability of improvements or advancements must always be assessed in terms of dependence on former states of development in technology and science, as well as societal factors, also limits any alleged law of exponential growth for technical (including technological) progress. This is particularly true with respect to any alleged law of constant acceleration.

Turning now to moral assessments or judgments of responsibility, it is difficult, if not impossible, in view of these systems effects, to attribute the responsibility for detailed aspects of technical advances to just one individual technologist or researcher. If development depends on a multiplicity of mutually escalating interconnections and interactions, it is not possible to attribute responsibility to just one person.

However, in a wider perspective of responsibility for preventing accidents or catastrophes--or for an attitude of trust or stewardship for ecosystems, for example--certainly individual participants, namely technologists, engineers, and other members of the technological intelligentsia, do bear a certain co-responsibility. (This is a point stressed by Hans Jonas in his book The Imperative of Responsibility; see Jonas, 1979.) On the other hand, these individuals cannot be assigned total moral responsibility, especially if the harmful effects could not possibly have been foreseen in advance. (This is a general problem of the individual responsibility of engineers and scientists in applied research which cannot be dealt with here in detail.)

Generally speaking, the sorting out of individual responsibilities and the assignment of different kinds of responsibilities to different bearers (e.g., collectives, or persons as role takers or moral agents) poses real and difficult problems. These problems have not yet been analyzed in detail, let alone solved. It seems true, however, that there should be collective responsibility for technical operations, procedures, and enterprises, and that this responsibility must be borne by human beings--unless one is prepared to defend the thesis of a quasi-natural technological process with a dynamic of its own. Technology is an outcome of human activity and initiatives, and it must be dealt with in terms of responsibility or moral duties even if we must refer collective responsibilities to human decision-makers. It is human activity which promotes technology, even in complex networks and combinations of accomplishments. This certainly has consequences for the application of the concept of responsibility. No abstract deduction will suffice; assignments of responsibility depend on the actual contributions of individual participants in the complex processes. Such assignments are easiest where it is a matter of individual misuse, omission, or neglect.

Where industrialization and technical progress occur without social progress, the situation becomes problematic, as we all know very well. Can we say, then, that progress in general (whatever that might mean), or even technological progress, can be responsibly attributed to anyone in particular? Apparently not.

However, advancing scientific and technological knowledge certainly does pose a problem, and that problem grows with the increasing power of technology in many realms of our society. In an age of pervasive technology, responsibility for technological development does pose a more pressing problem than armchair science did in earlier times. With the increasing range of effects of technology, it is clear that problems of responsibility grow considerably. Cannot even advances in knowledge be misused? Can a neat separation between pure science and technical application still be defended? (Certainly not, in view of an ever growing technicalization of science and the simultaneous scientification of technology. Differences between basic research and applied research or technology are today mere differences of accent; the borderline between them has become fluid.) Has not every knowledge claim whatsoever, or every technological discovery, taken on a kind of ambivalence or Janus-like characteristic?

Generally speaking, the moral and ethical problems associated with technological procedures are certainly not new. Already in earlier times a knife could be misused. Only the range of effects and the magnitude of risks have grown to the point that there is a danger of the self-destruction of humankind. In addition, unforeseen side effects have grown with the broader range of technological activities and their effectiveness. Now, the traditional ethical regulations of behavior with which humans throughout their evolutionary history have constrained themselves seem to be overextended. The ethics of loving one's neighbor does not suffice in an age of pervasive global technology with remote effects and interconnections to other continents. If just pressing a button can kill hundreds of thousands of humans, or injure or harm millions, even humankind itself, then traditional regulations of actions and their respective motivating concepts--which were developed from face-to-face encounters in our sociobiological evolution--will certainly fail.

It seems necessary therefore to deal with problems of responsibility in a more specific way, with separate responsibilities for individual agents, for collective decision makers, for whole nations, as well as for humankind in general. It is not just the solutions of technical problems or the increases of technological power (even though we have become dependent on them) that determine our future, but the social and ethical problems associated with them that have to be solved. And these have been neglected in recent decades. Certainly, we are dependent on technical and technological advances, but we must implement them in a wise manner. Searching for wisdom is still the traditional burden of philosophy--but also its opportunity. Philosophers to the front! We must meet this down-to-earth challenge of developing a comprehensive ethics that includes responsibilities for both individual and collective agents in an age of pervasive and ever-accelerating technology.

I will here discuss two examples where philosophers can practically contribute to this endeavor--and have done so. As my first example, I will report on an initiative of the German Engineers' Association (Verein Deutscher Ingenieure), which has developed guidelines for handling value issues in technical activities, including technology assessment; they were designed by a subgroup of which I have been a member for more than a dozen years (see Lenk, 1992). As my second example, I will present briefly some conceptual typologies of responsibilities, at the same time relating them to nontechnical values in technical contexts; there I will paraphrase earlier work of my own on the typology and differentiation of different kinds and analytic models of responsibilities (Lenk, 1982, 1991, 1992; Lenk and Ropohl, 1987).

Before getting to these examples, I note that, within the course of history, humankind has never had at its disposal as much material abundance, as much power and energy as today. This is due to technology and its progress. And technology is no longer a mere instrument; it is a world-changing, a world-shaping, a world-making factor--on which it is, I think, very important to reflect.

In proportion to its powers, technologically multiplied to an extreme, humankind's responsibilities have grown, if not exploded. Today, much more than hitherto, large-scale ethical and moral problems have grown in step with the extended technological power humans have to disturb the non-human environment or nature--a power that allows us to manipulate and tamper with life, including human life. Because of tremendous technological power, because of the scope of technological activity, a new situation requiring a new ethical orientation seems to be evolving, and it obviously calls for new rules of behavior. Even if some basic definition of the good remains constant, the executive rules applying ethics to the conditions of today must be changed, must be adapted to new possibilities of behavior, to new activities with new effects; and this includes institutional arrangements and responsibilities as well. All of this is very difficult to tackle. Eventually we will discover, as well as suffer from, paradoxes, natural and social traps, limited resources, overburdened systems, and overextended capacities. Technology assessment--our means for dealing with such issues--thus becomes a matter of necessity, even when it requires us to take on the difficult but important task of anticipation. Technology assessment must be anticipatory.

In relation to ethical questions of responsibility, what is new about this situation? In brief, there are six factors.

First, the number of people affected by technological measures or their side effects has increased tremendously, as everybody knows.

Second, natural systems are now subject to human activity, at least in the negative way that humans can permanently disturb or destroy them by their technological impacts. Again, everyone has heard about the CO2 problem.

Third, humans as well are now subjected to technological manipulation in ways they were not before, not only by pharmacological means and in terms of mass suggestion or subconscious influences, but also, at least potentially, by genetic engineering.

Fourth, moreover, we seem to be observing a progressive trend which might be called a systems technocracy or an informational technocracy. By this I mean a technocratic trend of combining bureaucracy or red tape with the progressive development of microelectronics, data storage, retrieval, and processing, etc. This is taking place in public administration, in organizations handling computer-stored information, and so on. Personal privacy is likely to be endangered by all these developments if they are widely implemented. And then all the problems of data protection linked to the information and systems revolution will also emerge.

Fifth, technocracy in any guise brings other problems. Edward Teller, the so-called father of the hydrogen bomb, once stated in an interview that, "The scientist or technologist ought to apply everything he has understood and should not put limits on that. Whatever you understand, you should also apply." This, to my mind, represents an ideology of what some have called the "technocratic imperative." It turns Immanuel Kant's old dictum, "Ought implies can," into a reverse technological imperative, "Can implies ought." Whether humanity is allowed to, or ought to, initiate, apply, make, produce, or carry out everything that is possible certainly represents a serious ethical problem.

Sixth, with the increased possibilities of manipulation, in the biomedical as well as the ecological context, we face another dramatic problem: responsibility for unborn individuals, for future generations, for the future of humankind, and for natural (sometimes partially human-made) ecosystems, including natural species.

I want now to turn to my experience (mentioned earlier) as a member of the working group of the German Engineers' Association (Verein Deutscher Ingenieure). In a subgroup, "Humanity and Technology: the Engineering Profession and Society," we developed a set of guidelines, the "VDI-Richtlinie" (VDI 3780) of the Association. After twelve years of cooperative work and dedicated committee involvement, it was officially adopted in 1991 (see Lenk, 1992). The official title is "Technology Assessment: Concepts and Foundations" (Technikbewertung: Begriffe und Grundlagen). The guideline highlights and defines the function and significance, for technology, technical decision-making, and technological activity, of various socio-technical and social values, including non-technical values. The purpose was to instruct, to sensitize, and to help the practitioner in planning and technological decision-making; but it was also to help the reflective engineer on the job with regard to the role of values, notably social values, goals, aims, attitudes, needs, and specific interactions, when he or she is involved in technological planning, acting, or decision-making. We were not proposing ex cathedra decrees. We cannot relieve the engineer, the technological administrator or manager or entrepreneur, of his or her responsibility in deciding. But such reflections, instructions, and information may help sensitize decision-makers and practitioners.

Here, without comment, are some of the topics touched on:

Technology Assessment: List of Analyzed Values, Concepts and Foundations

Functional/Technical Efficiency

usefulness/practicability

feasibility

effectivity

perfection

- simplicity

- robustness

- exactness/precision

- reliability

- durability/service life

technical efficiency

- degree of efficiency

- full utilization of material(s)

- productivity

...

Economic Efficiency (private or corporate)

profitability

(cost minimization)

security of corporation

growth of corporation

...

(Public) Wealth (national and international economy)

supply of needs and wants

quantitative vs. qualitative growth

international competitive position

full employment

distributive justice

...

Safety

freedom from physical injury

protecting individual lives

protecting and safeguarding the existence of mankind

minimization of risks (extent and probability of harm)

- for production unit

- concerning failure

- concerning abuse/misuse

...

Health

physical well-being

psychic well-being

increasing life expectancy

minimization of direct and indirect health risks, strains and burdens

- at work

- in private life

- by pollution, polluting products and production processes

Quality of the Environment

protection of nature, landscapes and ecosystems

protection of natural species

conservation and saving of resources

minimizing emissions and polluting deposits

Personality Development and Quality of Social Life

liberty (freedom of action)

freedom of information and opinions

creativity

guaranteeing privacy and informational autonomy

participatory opportunities

control, survey and command

social contact and acknowledgement

solidarity and cooperation

cultural identity

sufficient minimal agreement on basic values

order, stability and rule-governed systems and processes

transparency and public availability

justice

...

We find here a wide range of values, ideas, goals, aims and general objectives which are related to technological activity and decision-making. As I said, I cannot comment in detail here. But the interrelationships and interactions between the different major types of values and their subspecies or kinds--for instance, between health and safety--are obvious. By contrast, there is likely to be conflict between some aims: e.g., between economic efficiency (at least for private firms) and the quality of the environment.

Next I want to proceed to my second example of a way philosophers can help out in our present situation. I will merely summarize work I have done before on the complicated web of technological responsibilities and priorities in dealing with them, particularly in technological activities and decision-making (see Lenk, 1991). Typically, codes of ethics or declarations or statements, fundamental principles, canons or guidelines by technological societies or engineering associations mention only the global responsibility of the engineer or manager, or of the employee or worker. There is no differentiation as to what kinds or types of responsibilities there are--including conflicts among them. This led me to formulate a theory of the types of responsibilities as a first step, so as to be able later on to analyze and possibly help resolve conflicts between the different kinds of responsibilities.

First of all, the concept of responsibility is relational; it is a theoretical or interpretive construct designating at least a five-place relation: a person is responsible for something, with respect to another person, in view of a standard, and with regard to some authority. And all of this must pertain to a specific level; for instance, the responsibility might be moral or legal or contractual. The entity or authority to which I may be held responsible may be a person (a parent, for instance), who may have a special role, or an institution, a corporation, an organization, or the law, the state, society, humankind or the ideal of humankind, or God. I shall not go into details but merely summarize my views schematically.
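Purely as an illustration, the relational structure can be made explicit by treating each place of the relation as a slot in a record. The following minimal sketch (the field names and the sample values are my own, not Lenk's) is written in Python:

from dataclasses import dataclass
from enum import Enum

class Level(Enum):
    """The level at which a responsibility claim is raised."""
    MORAL = "moral"
    LEGAL = "legal"
    CONTRACTUAL = "contractual"

@dataclass(frozen=True)
class Responsibility:
    """The (at least) five-place relation: someone is responsible
    for something, with respect to an addressee, in view of a standard,
    and with regard to some authority -- all at a specific level."""
    who: str           # the bearer: a person, group, or institution
    for_what: str      # the action, omission, or outcome in question
    toward_whom: str   # the person or party with respect to whom
    standard: str      # the criterion or norm in view of which
    authority: str     # the instance before which one answers
    level: Level       # moral, legal, contractual, ...

# A hypothetical instance, loosely anticipating the DC-10 case below:
r = Responsibility(
    who="plant inspector",
    for_what="approving an unmodified cargo-door locking system",
    toward_whom="the travelling public",
    standard="professional duty of care",
    authority="the law and one's own conscience",
    level=Level.MORAL,
)

The point of the formalization is only to show that dropping any one slot leaves a responsibility claim underspecified.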

The first classification is in terms of responsibilities for types of actions (including omissions or neglect):

_________________________________________________________________

individual responsibility--or joint, group, or collective responsibility--in cases of:

- positive causal responsibility for particular actions

- negative causal responsibility (e.g., for neglect or omission)

- responsibility for long-range patterns or activities

- responsibility for institutional or group actions

- preventive responsibility (in any of the first three)

- representative responsibility

Figure 1: Types of Action Responsibility

_________________________________________________________________

These are not self-explanatory, but I will assume that the thrust is generally clear.

The second set of distinctions has to do with role or task responsibilities:

________________________________________________________________

- role-execution responsibilities

- task responsibilities

- loyalty responsibilities: formal or informal; legal or representative (e.g., manager, board member)

- corporate responsibilities of institutions (to insiders or outsiders): organizational, moral, or legal

all of which may involve:

- caring responsibilities (individual or group)

- legal liability (individual or group)

Figure 2: Role and Task Responsibilities

________________________________________________________________

Again I will assume that the general thrust is clear, even though there are details that require explanation.

The third schema deals specifically with moral responsibilities (recognizing that some of the earlier categories would not generally be classified as moral):

_________________________________________________________________

- responsibilities to others: direct (situation-activated: effects on other individuals, human or non-human) or indirect (with respect to remote consequences of actions or omissions)

- responsibilities to self

- specific responsibility to meet contractual or formal obligations, including obeying a job-related code of ethics

- corporate responsibilities of institutions

all of which involve:

- responsibilities for the safety, health, and welfare of the public

- joint or group responsibilities of individuals, according to degree of influence

Figure 3: Moral Responsibilities

_________________________________________________________________

Here a comment is called for. I take all moral responsibilities to be universal, and they remain so even when they are linked to specific role or task responsibilities (not necessarily moral). Furthermore, I assume that moral responsibilities are the most important, and they cannot be diminished, divided up, or dissolved; certainly they do not vanish, no matter how many people are involved.

The ways that these schemes can be applied in assessing technical decisions can be exemplified in a famous case, that of the DC-10 airliner involved in a crash near Paris in 1974 (see Eddy, Potter, and Page, 1976). In 1972, three inspectors at the Douglas Long Beach plant in California had wrongly approved modifications of a fatally dangerous cargo door locking system, when no work on the cargo door had actually been done. This was one (admittedly only one) decisive factor in the crash near Paris. Usually, there is more than one factor, and that is the problem: how to follow the different chains linking the factors which lead to a catastrophe or a major accident.

Corporate moral responsibility on the part of institutions is also relevant--assuming, that is, that corporations have moral responsibilities. (And this is still highly controversial; see French, 1984.) Laws and the affected people may change, but the relationship of moral responsibility might still obtain. For instance, one of the best known cases in business ethics is that of the Johns-Manville firm and asbestos production during World War II. It was only in 1947 that engineering and other professional codes of ethics began to include responsibility for the safety, health, and welfare of the public among their guidelines--some even saying it is of "paramount" importance.

An obligation to abide by the ethics code of one's professional society, for instance, an engineering society, certainly is also a moral obligation--as obeying the law is also a moral obligation, though at a higher level of responsibility. Thus we must make different moral distinctions at different levels.

In considering different types of responsibility, we must also develop priority rules: for example, that moral responsibility takes precedence over role-responsibility. What follows is an attempt to provide a first sketch of such a set of rules:

1. Moral rights of individuals are predistributive rights, overriding utility considerations.

2. In a compromise, the interests of everyone should be taken into consideration on an equal basis; in cases of unresolvable conflict between equally relevant basic rights, this condition is especially important.

3. Only after considering the moral rights of all parties should one opt for a solution, and then it should be one that causes the least damage or that maximizes utility for all the involved parties.

4. After rules 1, 2, and 3 have been invoked, utility considerations should be weighed against potential harms. In general, nonalienable or predistributive moral rights are prior to considerations of avoiding harm, and these latter are prior to utility considerations. (For these first four rules, see Werhane, 1985, p. 72.)

5. When the weighing of harms or utilities generates unresolvable conflicts, one must seek a compromise that distributes them equally, or at least proportionally.

6. A general or higher-level moral responsibility is to be preferred over restricted or non-moral obligations, even if the latter are prima facie binding.

7. Universal moral responsibility, generally speaking, takes precedence over role and task responsibilities.

8. Direct responsibility is usually, but not always, prior to indirect responsibilities for remote consequences. However, this ranking must necessarily be modified if the consequences are especially serious or have very long-lasting impacts.

9. Primary, i.e., personal, moral responsibility precedes corporate responsibility.

10. The common weal or public good precedes all other particular or specialized interests.
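As a rough, purely illustrative encoding (the type labels are my own, and rule 8's proviso about especially serious or long-lasting consequences is deliberately omitted), rules 6 through 10 assert pairwise precedences rather than a complete ranking; in Python:

# Pairs (higher, lower) stating only what rules 6-10 explicitly rank.
PRECEDES = {
    ("moral responsibility", "restricted or non-moral obligation"),  # rule 6
    ("universal moral responsibility", "role/task responsibility"),  # rule 7
    ("direct responsibility", "indirect responsibility"),            # rule 8 (usually)
    ("personal moral responsibility", "corporate responsibility"),   # rule 9
    ("public good", "particular or specialized interests"),          # rule 10
}

def outranks(higher: str, lower: str) -> bool:
    """True only where the rules explicitly rank one type above another;
    unranked pairs remain genuine cases for judgment and compromise."""
    return (higher, lower) in PRECEDES

That the relation is partial, not total, mirrors the text: where no rule applies, rules 2 and 5 send us back to weighing and compromise.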

In 1990, discussions of responsibility and engineering education led to the drafting, by a German Commission of Engineering Education, of a declaration of the responsibilities of engineers and natural scientists. This declaration stresses the importance of reliable basic knowledge, but it also stresses the necessity of taking into account--beyond economic criteria--those ecological and social criteria that are conducive to conserving and securing the natural basis of life. "Engineers should be able to make statements about positive and negative consequences for natural and social contexts of the application or non-application of technology, and preferably at the appropriate time." They should try to foster impartial and knowledgeable discussion of large-scale technological developments. Professors in technical fields, according to the declaration, must, more than before, acknowledge and exercise their interdisciplinary responsibility, integrating social concerns within their teaching. They should, according to this guideline, organize case-study seminars, directly related to technical topics; they should also be willing to collaborate, providing technical examples, in interdisciplinary seminars. In addition, a need for continuing education in these issues was stressed.

Unfortunately, there is no mention in the document of responsibility types, of conflicts, of priority rules, of all the intriguing and difficult problems which I have just mentioned. And that is still the usual case.

Interestingly enough, neither the German Conference of University Rectors nor that of the more practice-oriented Technical Colleges (the Fachhochschulen) signed this declaration. However, most engineering and natural science professional societies did.

In conclusion, I want to give a short overview of the state of the debate regarding codes of ethics in technology. This declaration by the German Commission of Engineering Education amounted to a kind of code of ethics for technology and engineering education. In America, the Institute of Electrical and Electronics Engineers had earlier paved the way by providing a model ethics code, while also institutionalizing boards, procedures, and sanctions, as well as publications, discussion material, and conferences. These were added because codes by themselves are only appeals, and appeals do not have binding force; codes alone will not do the job. That is why the 1984 Guidelines of the American Association of Engineering Societies stressed professional conduct, that is, the incorporation of rules and norms into the actual behavior of engineers and other technological practitioners.

There is still a long way to go in the direction of sensitizing, of helping the "ethical engineer" (as he or she is sometimes called in the United States). The engineer is not obliged to become a martyr, losing his or her job; but neither may the engineer close his or her eyes and carry out unethical, or even unlawful, orders forced on him or her by supervisors or firms. Engineers are, and should be, responsible citizens both on the job and elsewhere. They are human beings, with personalities and moral responsibilities which cannot be deposited in the cloakroom, so to speak. Moral responsibilities cannot be diminished by delegating them to others or anything of that sort.

Engineers--and other experts and technical practitioners--usually do dutifully exercise their professional and humanistic responsibilities, but much can still be done to improve the situation, to deepen their consciousness of moral and social responsibilities, as well as of potential conflicts between them and their personal consciences. The humanities and social sciences can and should help make them aware of intricacies and conflicts, of the complicated interplay of values and norms with contracts and laws, etc. At the same time, none of this should detract from the engineer's professional responsibility or freedom of decision-making and acting.

Besides Mens agitat molem [mind moves matter], an ancient engineering slogan, I think that Humanitas praestat [humanity comes first] must be a necessary complementary slogan in engineering, in engineering education, and in continuing education.

It is an especially challenging task of the humanities, and of philosophy in particular, to make this imperative work, unobtrusively helping the engineer, the practitioner, the manager, the entrepreneur, as well as political decision-makers to know how to abide by social, moral, and humanistic values and norms. They should also help solve or at least mitigate conflicts among them. And they should fulfill the ancient obligation of the humanities to share in the work that is needed for the survival and progress of humankind in our complicated technological world.

REFERENCES

Eddy, P., E. Potter, and B. Page. Destination Disaster . New York: New York Times Books, 1976.

French, Peter A. Collective and Corporate Responsibility . New York: Columbia University Press, 1984.

Gottl-Ottlilienfeld, F. von. Wirtschaft und Technik [economics and technology]. Tübingen, 1923.

Hübner, Kurt. "Von der Intentionalität der modernen Technik" [on the intentionality of modern technology]. Sprache im technischen Zeitalter 25 (1968): 27-48.

Jonas, Hans. Das Prinzip Verantwortung: Versuch einer Ethik für die technologische Zivilisation. Frankfurt, 1979. English translation: The Imperative of Responsibility: In Search of an Ethics for the Technological Age. Chicago: University of Chicago Press, 1984.

Lenk, Hans. Zur Sozialphilosophie der Technik [on the social philosophy of technology]. Frankfurt, 1982.

Lenk, Hans, ed. Wissenschaft und Ethik [science and ethics]. Stuttgart, 1991.

Lenk, Hans. Zwischen Wissenschaft und Ethik [between science and ethics]. Frankfurt, 1992.

Lenk, Hans, and Günter Ropohl, eds. Technik und Ethik [technology and ethics]. Stuttgart, 1987.

Rapp, Friedrich, R. Jokisch, and H. Lindner. Determinanten der technischen Entwicklung [determinants of technical development]. Berlin, 1980.

Ropohl, Günter. Eine Systemtheorie der Technik [a systems theory of technology]. Munich and Vienna, 1979.

Werhane, Patricia H. Persons, Rights, and Corporations . Englewood Cliffs, N.J., 1985.