Techné: Research in Philosophy and Technology
Research in Ethics and Engineering
Delft University of Technology
Engineering ethics has only recently started to take off as a research discipline. Whereas in the US ethics textbooks for the education of engineers have appeared from the 1970s onwards, there have been very few research efforts in the field. This is all the more surprising if one compares engineering ethics with bioethics, which has developed into a booming field of research. It seems obvious that engineering and technology pose at least as many pressing and interesting ethical questions as medicine and biotechnology.
Fortunately, there have recently been more and more initiatives for advanced research in engineering ethics. In the Netherlands, all three universities of technology (Delft, Eindhoven and Twente) have in recent years developed substantial philosophy departments that do research in the philosophy of technology, with groups specialized in engineering ethics. In the spring of 2002, the ethics group of the Philosophy Department of Delft University of Technology organized a conference on "Research in Ethics and Engineering". Participants came from various parts of the world: Europe, the US and Asia. The conference was organized around three themes: risk, autonomy and engineering as a profession.
Some of the contributions to this conference can now be read in this special issue of Techné. In this editorial we will discuss some common themes that can be identified in the various contributions. We will also try to clarify where discussions in this field might be related to mainstream moral philosophical issues. Our aim is to highlight topics that might be the focus for future research in the field of engineering ethics.
Social Arrangements for Decision-Making about Technology
When thinking about how the benefits of technology to mankind may be maximized, one could try to evaluate particular actions or developments, but one might also take the view that what needs to be evaluated are the social arrangements for decision-making about technology. The latter view might be characterized as procedural: the idea is, roughly, that if decision procedures are morally sound, then the decisions that result from these procedures will generally be sound as well. This approach is found in the contributions of Richard Devon and Yannick Julliard.
Richard Devon's aim is to generate, from the context of technology, research questions aimed at improving (the systematic study of) social arrangements for decision-making. He argues that the quality of social arrangements for decision-making (about technology) to a large extent determines the ethical acceptability of the outcomes of such decision-making. Therefore such social arrangements are a worthwhile object of study for philosophical ethics. So, whereas the traditional focus of ethics is on the right action, the focus of social ethics is on the right social process. For technology, according to Devon, this means focusing on the design process and on project management. He emphasizes two central values in the social ethics of technology: cognizance and inclusion. Cognizance is to be understood as understanding, as well as possible, the implications of technology: "its possible uses and its social and environmental impacts in extraction, production, use, and disposal". Inclusion means: "making sure the right people are included in the decision making." As Devon remarks, inclusion also improves (but does not ensure) cognizance.
Devon argues rather convincingly that what he calls "social ethics" would be a fruitful field of study—by showing, e.g., that many famous engineering ethics case studies are more suitably conceptualized as problems in social ethics rather than individual ethics. He also speculates on what insights might be achieved by practicing a social ethics of technology. His contribution is mostly programmatic in character. It is to be hoped that researchers in this area acknowledge the importance of this approach, develop a methodology and formulate research questions. Some researchers have of course already started doing this (see Herkert 2003; Devon & van de Poel 2004).
The contribution of Yannick Julliard fits squarely within the approach proposed by Devon. Julliard advances a system of Ethics Quality Management (EQM), inspired by Total Quality Management as laid down in the ISO 9001 standard. Julliard's approach is intended to ensure ethical behavior of companies involved with technology—but presumably of any type of company—by creating procedures that require all individuals within a company to somehow act in line with the needs of society. The author intends his contribution as a first step toward giving ethics within companies a very concrete focus, by means of a system of norms that has proven successful in other areas.
In fact, what EQM is aiming to achieve is successful "inculturation" of technology: "the task of ethics quality management is to focus on the acceptability of technology and products as a central value" (Julliard, this issue, p. 6). However, there could be a gap between acceptability and acceptance. A technology or product could be, as a matter of fact, accepted by customers or even society at large, while nonetheless being unacceptable from a more Archimedean point of view, for example because of considerations related to sustainability. It is not hard to think of actual products where this is the case. EQM would appear to be at least in danger of emphasizing acceptance at the cost of acceptability considerations. This worry arises as a result of the emphasis on resolving conflicts between the involved parties, and the praise for EQM as a way to reduce the economic risks of companies. If ethics is conceived of as some kind of marketing strategy, do we thereby achieve the disinterestedness and long-term view that would seem to be the trademarks of ethics proper?
A general worry about the procedural approach, which can be found in discussions of the work of Jürgen Habermas and of John Rawls, remains: how can a procedural approach guarantee the quality of its outcomes? That is, can we trust that if the right procedures are in place, all ethical problems will be properly dealt with? One should probably conclude that a procedural approach in this sense is not sufficient on its own. Devon's point that social engineering ethics should be a topic of academic study alongside the traditional individual approach is of course not affected by this. However, it may be a legitimate worry in fleshing out Julliard's EQM approach.
In a rather different way, Hansson's contribution also addresses social arrangements for decision-making about technology. However, rather than reflecting on the procedure, he comes up with a substantial moral condition that collective decisions which lead to the imposition of (technological) risk would have to satisfy: "Nobody should be exposed to a risk unless it is part of an equitable social system for risk-taking that works to her advantage." It would be interesting to consider whether, and if so how, a decision-making procedure could be designed that would ensure fulfillment of this condition.
(Cultural) Context of Engineering Ethics, and Differences in Approach
To what extent does and should the context within which an ethics was developed determine the central norms, and also the focus on certain types of problems and their approach? Two contributions to this issue, in rather different ways, are concerned with this question.
Heinz C. Luegenbiehl considers the notion of autonomy, which is central to Western engineering ethics. Is an engineering ethics that is centered on this notion also applicable in other parts of the world? No, is the answer that Luegenbiehl gives in his article "Ethical Autonomy and Engineering in a Cross-Cultural Context." By discussing the case of Japan, he shows that in some cultures autonomy does not play a central role. Luegenbiehl contrasts American society with Japanese society. Japan is a culture with an emphasis on the group above the individual. Hence, there is little space for professional autonomy. Rather, in Japan we see the practice of collective responsibility. The head of a company resigns instead of the person who made a mistake. It is not the profession but the corporation that is responsible for the well-being of society. Nevertheless, in the Japanese context we can still make use of important insights from "standard" engineering ethics by distinguishing between the value of autonomy and the goal that autonomy has to achieve ("safety, health and welfare of the public"). This means that certain ideas will need to be rephrased and the emphasis has to shift from the individual to the group. So even though there are important cultural differences, as this case shows, a global engineering ethics is still possible according to Luegenbiehl.
Another type of difference in context that is relevant to how a body of ethics has developed has to do not with regional or cultural differences, but with differences between disciplines. Joe Herkert and Brian O'Connell, in their contribution "Engineering Ethics and Computer Ethics: Twins Separated at Birth?", observe that engineering ethics and computer ethics have developed along parallel, but separate paths. In part this appears to be due simply to the fact that different individuals contributed to the two topics; however, it may also be due to the fact that engineering and computing have developed in significantly different ways. Whereas the former is traditionally focused on "transformation of the physical world," the latter is in the first instance grounded in abstraction. The resulting difference between engineering ethics and computer ethics is that the former is much more grounded in a robust, everyday practice, whereas computer ethics is more abstract. O'Connell and Herkert argue that computer ethics should adopt this practical attitude as well. On the other hand, computer ethics can serve as a model of how to integrate micro-ethical and macro-ethical approaches. The former focus on individual agents, the latter on social institutions. Traditional engineering ethics has had a hard time integrating both approaches. Computer ethics is more advanced in this respect.
Apart from the fact that computer ethics and engineering ethics can learn important lessons from each other, the authors point out that there is another obvious reason why the two branches of ethics they consider should not remain separated. Many important moral issues in information technology have implications for other areas of engineering as well, since computers permeate many areas of engineering. A lot of engineering nowadays is simply unthinkable without the use of computers. Issues such as privacy and computer system reliability are therefore not only relevant to computer ethics but also to engineering ethics. O'Connell and Herkert mention the case of the Therac-25 as an example of this.
It is interesting to note that whereas Luegenbiehl is mostly cautious about how insights from American engineering ethics can be applied elsewhere, O'Connell and Herkert enthusiastically argue that it is time for computer ethics and engineering ethics to integrate more and start learning from each other. Perhaps discipline-related differences in context are easier to overcome than cultural ones. There is another difference between the two cases that may explain this: whereas the different disciplines have led to concern with very different types of issues, and therefore advances in different areas, the cultural difference shows that the norms that have been developed may need to be readjusted, by trying to discern a more universal underlying norm. However, in both cases the ethics originally developed in different contexts increasingly permeate each other, and therefore there is little choice but to attempt to find ways to integrate them.
Practical/Professional versus Abstract/Applied Approach
A common theme that can be identified among various authors is the question whether engineering ethics should start from general, abstract principles that are then applied to particular cases, or whether it should start from the concrete professional practice of engineers. This relates to a debate conducted in general moral philosophy. Philosophers who adopt a Kantian approach think that ethical reflection should start with general, abstract principles. Aristotelians instead emphasize the role of concrete experiences, practices and (moral) perception of particular cases. Utilitarians (at least act-utilitarians) occupy, as it were, a middle ground: the utilitarian principle is general and abstract, but action prescriptions depend on concrete circumstances. Among the authors who contributed to this special issue, we see various positions being taken: Whitbeck defends a practical, Aristotelian approach; Luegenbiehl and Herkert/O'Connell can be seen as authors who emphasize the role of particular contexts but still think that the formulation of general moral insights is possible; whereas Hansson discusses various general moral principles and defends one specific general moral principle.
In her paper "Investigating Professional Responsibility," Caroline Whitbeck distinguishes between two philosophical approaches to the topic of professional responsibility, namely "applied ethics" versus "practical ethics." "Applied ethics" refers to approaches that start from a general ethical theory that can be applied to concrete cases. However, as has been argued by intuitionists such as W.D. Ross, Aristotelians, Wittgensteinians and feminist philosophers, the moral landscape is too complex and diverse to allow for such a generalist top-down approach. These philosophers all advocate an alternative approach. Moral deliberation and reflection have to be bottom-up: starting with the particular facts of a concrete case and forming moral judgments based on these particular cases. This is the kind of approach that Whitbeck defends.
Whitbeck concludes that philosophers working in professional ethics should adopt the practical-ethics approach: on the one hand providing the professions with arguments, ideas and concepts from moral philosophy, on the other hand learning from the vast experience of the professions and interacting with social scientists and other scholars in the humanities.
Various other authors address the issue of an abstract versus a practical approach more or less explicitly. As noted above, according to O'Connell and Herkert, a major difference between engineering ethics and computer ethics is that the former is more practical, the latter more theoretical, and they think that both approaches can be fruitfully combined. Heinz Luegenbiehl's contribution can be read as a more practical approach to engineering ethics. However, Luegenbiehl argues that despite the enormous differences between the American and the Japanese approach, a global engineering ethics is still possible. As mentioned above, this can be done by focusing on the goals we want to achieve with technology instead of on how to achieve them, which can differ from culture to culture.
Sven Ove Hansson thinks that general moral theories can be challenged by cases involving technological risks. He argues that standard ethical theories are ill-suited to deal with indeterministic cases. He advocates a closer collaboration between moral philosophy and decision theory, especially concerning the ethical aspects of risk. Furthermore, standard approaches to risk analysis, which are based on utilitarian calculus, inherit all the well-known problems of utilitarianism, such as the possibility that minorities have to suffer in order that a majority gets certain advantages. This leads Hansson to formulate the general principle mentioned before: "Nobody should be exposed to a risk unless it is part of an equitable social system for risk-taking that works to her advantage." Hansson concludes that moral philosophy has a lot to contribute to the fields of risk analysis and risk management, but that at the same time the topic of risk raises some challenging issues that require new philosophical approaches.
These articles all put a different emphasis on abstract or practical approaches. However, rather than posing mutually exclusive alternatives between which we are forced to choose, it might be possible to learn something from all these various arguments. Note that even Hansson's general principle explicitly mentions particular circumstances, namely an "equitable social system...that works to her advantage." This leaves open various different social arrangements and does not prescribe in advance what kind of social system might fulfill these conditions. Still, Hansson's contribution shows why it is worthwhile to try to formulate the conditions for such a system in a general way: this way we can make some general comparisons between alternative guidelines for acceptable risks. Luegenbiehl thinks that the concept of autonomy might be too closely tied to a cultural context to be useful for a global engineering ethics, but he thinks that the goals of technology can be formulated in a general and universal way. O'Connell and Herkert think that computer ethics and engineering ethics can contribute a lot to each other, exactly because the former is more general and abstract and the latter more practical and concrete. Whitbeck emphasizes that professionals have not only technical experience but also moral experience inherent to their work, experience that cannot be adequately replaced by abstract, general philosophical ideas.
The conclusion we can draw from these interesting arguments is that the role of general reflection can be to discern general patterns and formulate criteria for comparability, while it can never replace concrete, practical moral judgments in particular circumstances. General reflection can only be an aid in reflection, not an absolute guide, since concrete moral reality is much too diverse and too complex to allow for this. People who work in practical contexts have an expertise and practical wisdom that can be assisted, but not replaced, by general moral reflection. Moral philosophers who work in engineering ethics should therefore also consider concrete, particular cases, listen to the experience of professionals, and include this in their normative assessments.
Professional Values Across Different Professions
If the approach taken in engineering ethics is predominantly one in which we try to distill values from professional practice, rather than applying abstract principles to that practice, then presumably we should expect differences between the ethics of different practices. Caroline Whitbeck gives several examples of this. For example, whereas in engineering there is a prohibition on taking work outside one's competence, there is no such prohibition in medicine. This is because medical education is generally such that trainees necessarily have to perform procedures on patients in real-life situations in order to acquire the skills needed for their future work, whereas engineers can acquire the required knowledge in a theoretical setting. In medicine, on the other hand, there is a strong obligation not to withdraw medical help from a patient, whereas in engineering there is generally no such rule. Probably this is connected with the fact that a medical patient finds himself in a much more vulnerable and dependent position than an engineer's client. Another example may be found in a comparison of the legal professions with engineering: solicitors should at all times avoid conflicts of interest, whereas engineers "merely" have to find a way of dealing with such conflicts openly and fairly.
But also when we take a more abstract approach to ethics, as outlined in the previous section, there may be interesting differences between professional values. Luegenbiehl observes that in the American approach to engineering ethics there is an increasing responsibility for the professional, in contrast to other professions, which are moving away from paternalism toward more individual responsibility of clients or users. In medicine, for example, the focus is much more on patient autonomy, as may be seen from the importance of the principle of informed consent. According to Luegenbiehl, this is because technology is increasingly complex. Engineers possess specialist knowledge that enables them to make responsible decisions, whereas the general public lacks the knowledge necessary to understand the technology.
Interestingly, Luegenbiehl appears to think of this move towards paternalism as a necessary feature of professionalization. This gives rise to the puzzling question whether technology is indeed so much more complex than medicine, where increased professionalization led to a move away from paternalism. The issue is in fact controversial within engineering ethics, for whereas Luegenbiehl is without a doubt giving a correct description of the status of client autonomy in the engineering profession, other authors (such as Martin & Schinzinger 1996 and Baum 1983) call this norm into question.
There are many interesting and pressing ethical topics to which engineering and technology give rise. This special issue features authors who all make an effort to identify problems and offer possible solutions. The authors have various backgrounds: moral philosophers, philosophers of science, and engineers who have devoted research to the ethical aspects of their work. The basis is there for an interdisciplinary, international research community. Hopefully this special issue will spark further discussions and research in this important and interesting field.
Baum, R. "The Limits of Professional Responsibility." In Schaub, J.H. & Pavlovic, K. (eds.), Engineering Professionalism and Ethics. New York: Wiley, 1983.
Devon, R. & van de Poel, I. "Design Ethics: The Social Ethics Paradigm." International Journal of Engineering Education 20:3 (2004): 461-469.
Herkert, J. "Professional Societies, Microethics, and Macroethics: Product Liability as an Ethical Issue in Engineering Design." International Journal of Engineering Education 19:1 (2003): 163-167.
Martin, M. & Schinzinger, R. Ethics in Engineering. New York: McGraw-Hill, 1996.