

Volume 8, Number 1, Fall 2004

Engineering Ethics and Computer Ethics: Twins Separated at Birth?

Brian M. O'Connell
Central Connecticut State University

Joseph R. Herkert
North Carolina State University


Over the past two decades, engineering ethics and computer ethics have emerged as identifiable fields of applied ethics. While some individuals have contributed to both fields, for the most part they have developed in the USA along parallel but separate paths. In previous presentations (O'Connell & Herkert 2001a; 2001b) we have argued that material drawn from computer ethics should be standard fare in all engineering ethics treatments, not just those aimed specifically at computer engineers and scientists. This conclusion emerges from the ever-expanding prominence of computer technology in engineering education, in engineering practice, and in the form of engineered products. As noted by William Wulf (1997), a University of Virginia professor and President of the National Academy of Engineering:

The pervasive use of information technology in both the products and process of engineering …has the potential to change the practice of engineering significantly, and hence the education required to be an engineer…As the power of computers…increases exponentially, more and more routine engineering functions will be codified and done by computers, simultaneously freeing the engineer from drudgery and demanding a higher level of creativity, knowledge, and skill. [emphasis added]

The importance of the social and ethical implications of computing for engineering practice and products should also not be ignored. For example, George Fisher (2000), Chairman of the Board of Eastman Kodak Company, compares the impact of "digital computing and communication" to that of the printing press and notes that "integrating human needs (with respect to information and communication technology) is engineering's biggest challenge and opportunity."

Despite its dominant role in all of contemporary engineering education and practice, computer technology is afforded little if any special consideration in standard treatments of engineering ethics (see for example Harris, Pritchard, & Rabins 2000; Martin & Schinzinger 1996; Whitbeck 1998) except when the target audience is explicitly computer engineers. In contrast, chapters on environmental ethics are typically found in general engineering ethics texts (see again Harris, Pritchard, & Rabins 2000; Martin & Schinzinger 1996; Whitbeck 1998)—indeed, some engineering ethics texts focus primarily on environmental issues (see for example, Gorman, Mehalik, & Werhane 2000).

Examples of computer-related ethical issues that are of importance to engineers of all disciplines are intellectual property in the digital age, privacy, and computer systems reliability. Issues relating to the ownership of digital material have become increasingly relevant to engineering for a variety of reasons. Computers have become integral elements of the design, manufacturing and control of even the most conventional devices. Questions affecting the ownership of instruction sets, firmware, interfaces, routines and applications are thus of great significance to a wide variety of actors, from design to implementation and beyond. Computing has also become a primary vehicle for the dissemination of information, in forms ranging from digitally mediated journals and books to networked communication by electronic mail. Within the United States, the constitutional mechanisms of copyright and patent law have been animated by a policy of limited protection of intellectual material. The doctrine of "fair use" and the time limits on patent protection are examples of provisions favorable to public access to scientific and technical information.

A number of recent legal initiatives have increased ownership controls over digital material beyond what traditional policies permitted. Universal City Studios v. Corley (2001) involved the reverse engineering of the Content Scramble System (CSS), a proprietary encryption scheme used by the movie industry to prevent the copying of DVD material. This effort produced the "DeCSS" program, which, among other things, allowed the copying of DVD material. In the ensuing suit to enjoin the use or communication of DeCSS, the defendants, operators of a Web site that had published the code, claimed, with the support of many in science, engineering and academic law, that a prohibition would prevent many "fair uses" of the technology, enabling producers of information to lock out access at their discretion and to the detriment of the public. The DeCSS case represents an unprecedented trend toward information restriction that threatens a wide variety of activities within the scientific and technical environment.

Privacy presents another area in which computing has brought new issues and new paradigms for analysis. Because of the data-centered nature of digital devices, many forms of data collection can easily be built into many types of computer-related applications. Because the actions of the computer frequently occur behind the "front end" of a device, it is often impossible for users to know that their data is being collected. Similarly, digital data collection opens the door to uses that may not have been contemplated or anticipated by designers. The ":CueCat" is an example of these wide-ranging effects. This hand-held instrument employs an optical scanner to read bar codes embedded in such media as conventional publications in order to directly access Web sites or search pages. This enables readers to avoid typing complex URLs into their browsers and affords instant connections to online information and services. The technology has been criticized for its less-publicized ability to track user actions through its assignment of unique identity codes to individual units (Olsen 2000). The :CueCat serves as an example of the need to recognize that digital devices present inherent potentials for unanticipated or undesirable uses of information, well beyond those of their analog counterparts.
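
The tracking mechanism at issue is easy to illustrate. The Python sketch below is a minimal, hypothetical model, not the actual :CueCat protocol; the device identifier, barcodes and URLs are invented for illustration. It shows how a hard-coded per-unit identifier sent with every scan lets a server quietly accumulate a profile of what each unit (and thus each owner) has scanned, entirely behind the device's "front end."

    # Hypothetical sketch of back-end scan tracking (not the actual :CueCat protocol).
    # Each request carries a unique, hard-coded device ID, so the server can link
    # every scanned item to a single unit and, by extension, to its owner.

    from collections import defaultdict
    from datetime import datetime, timezone

    DEVICE_ID = "UNIT-00042"  # assumed: burned into the device at manufacture

    # Server-side store: device ID -> list of (timestamp, barcode) events
    scan_log = defaultdict(list)

    def lookup_url(barcode: str) -> str:
        """The advertised service: map a printed barcode to a destination URL."""
        catalog = {"012345678905": "http://example.com/product/012345678905"}
        return catalog.get(barcode, "http://example.com/search?q=" + barcode)

    def handle_scan(device_id: str, barcode: str) -> str:
        """What the user sees is the URL; what the user does not see is the log."""
        scan_log[device_id].append((datetime.now(timezone.utc), barcode))
        return lookup_url(barcode)

    # A few scans from one unit are enough to begin a reading/interest profile.
    for code in ["012345678905", "036000291452", "012345678905"]:
        handle_scan(DEVICE_ID, code)

    print(len(scan_log[DEVICE_ID]), "scans recorded for device", DEVICE_ID)

Nothing in the visible behavior (barcode in, URL out) signals the logging step, which is precisely the point the paragraph above makes about collection occurring beyond the front end.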

Another area of concern for engineering involves the increasing role of computer-generated or computer-mediated data. Although engineers are well acquainted with the importance of measurement within their fields, it has been suggested that they are often less aware of the inherent problems associated with computer-related information. Computerized information is often derived from models created by programmers who are not versed in the relevant real-world dynamics. In real-world applications, investigators have noted an over-reliance upon software by engineers who are not familiar with its developmental processes and shortcomings (Leveson & Turner 1993). The disastrous failures of the Therac-25 radiation therapy machines, considered in more detail below, exemplify how engineering competence must extend to core aspects of computing.

In the rest of this paper, we expand on this theme and address the more general question of how engineering ethics and computer ethics stand to benefit further from one another, in both education and research.

Lessons from Engineering Ethics

In this section, we propose that the most valuable contribution engineering ethics offers to computing is its mature sense of identity. We submit that this identity is linked to broadly accepted, core professional practices, which are strongly materialist. Flowing from this recognition is an ethical posture informed by realistic and comprehensive understandings of purposes, effects and implications. We will contrast this with the state of computing, which, by tradition but not necessity, possesses a self-image that emphasizes theory and abstraction. As a consequence, there is minimal consideration of ethics as an intrinsic element of practice.

At first glance, contemporary computing and engineering ethics seem to be so similarly situated that neither pursuit would appear to have much to offer the other, except in the way of encouragement shared between two newly evolving disciplines. Both fields remain dynamic and unstable as they pursue substantive development and recognition within their respective communities. In this respect, engineering and computer ethics are similarly engaged as relatively new institutional actors, and their recency presents problems for a useful interdisciplinary exchange of ideas.

Engineering and the Ethics of Practice

Institutional developments, significant as they are, do not define the limits of ethical resources. While the birth of contemporary engineering ethics is placed in the 1970s (Lynch 1997/1998), concerns about the moral implications of its endeavors likely pre-date conventional history altogether. As Albert Jonsen (1998) stated within the context of medicine, modern conceptions of professional ethical behavior did not begin with a "Big Bang." Instead, early and fundamental working definitions of ethics are most clearly derived from the specifics of practice. Thus, Dr. Richard Cabot (1869-1939) essentially defined ethical medical behavior as competence as a practitioner. Significantly, this definition was not confined to purely technical skill, but involved a broader "appreciation of the personal and social needs of the patient" (Jonsen 1998, p. 9).

While modern engineering ethics has gone well beyond the realm of simple competence, the role of practice retains pre-eminence within its ethical analyses. The contemporary engineering ethicist Michael Davis (1999) affirms this when he resists separating the ethical from the practical aspects of engineering, stating that "engineering ethics is part of thinking like an engineer."

Ethics, according to this perspective, requires a substantial understanding of the actual activities involved within the profession. It is an epistemic process, which demands technical, material knowledge sufficient for the widest possible consideration of goals, implications and effects. Efforts, both scholarly and popular, continue to advance descriptions of core activities that define engineering as an activity.

In his study of this issue, Walter Vincenti (1990) initially employs the definition of G.F.C. Rogers:

Engineering refers to the practice of organizing the design and construction of any artifice which transforms the physical world around us to meet some recognized need (quoted in note 4b).

Extrapolating from this definition, Vincenti concludes that "(e)ngineering knowledge reflects the fact that design does not take place for its own sake and in isolation". Rather, it occurs as "a social activity directed at a practical set of goals intended to serve human beings in some direct way" ( Vincenti 1990 ). Davis (1998) also alludes to this attribute of engineering when he refers to it as "sociological knowledge, a knowledge of how people and tools work together, but it is nonetheless engineering knowledge."

A comprehensive analysis of the dynamics involved in the joining of practice to ethics is beyond the scope of this paper. It is, however, important to note two attributes of this condition. The first concerns focus. By associating its essential activities with human effects and interests, engineering has implicitly included issues of public accountability and responsibility within its framework. Thus, ethical reflection is, as Davis states, a natural aspect of thinking like an engineer. The second attribute involves relevance. The grounding of ethics in actual practice imparts an increased confidence that value judgments will be responsive to the issues encountered. The influence of this perspective upon the activities and pedagogy of engineering ethics is considered below. Of immediate significance is the contrast between this approach and that of computing.

Any comparison of computing with engineering must initially take into account significant developmental differences. Modern engineering has evolved from ancient roots rich in references to the practical "transformation of the physical world." Until relatively recently, it has been regarded, both internally and popularly, as a unitary profession (Davis 1998, p. 22). Even the advent of specialization has not erased a public and scholarly acknowledgment of commonality, or what Layton has termed a "professional nucleus" which is differentiated by individual professional societies (1986, p. 26). This status has doubtlessly supported a shared notion of purpose and has facilitated the consideration of common, practice-specific values.

Computing and Abstraction

In contrast, computing possesses a more disparate heritage. Its origins may be located within mathematics as well as philosophy, with more recent advances emerging from such diverse disciplines as electrical and electronic engineering, physics, economics, psychology and biology ( Davis 2000 ). Many founding (and still influential) actors migrated from their original fields to the new disciplines of computer science and computer engineering. These developments imparted a degree of professional identity, but for reasons examined below, they have also produced significant effects on the focus and nature of computing ethics. 1 Additionally, unlike engineering, computing has arisen mainly from academic settings. Consequently, while specific, tangible and commercial achievements such as mainframes or the personal computer are lauded, academically oriented subjects remain central to the field's identity, as is evidenced by the title of a popular text, Algorithmics: The Spirit of Computing ( Harel 1987 ). 2

An ethics of practice is generated by a substantially shared vision of primary activities. Whether by reference to "design", "organization", "construction", or similar terms, engineering possesses a core understanding of itself, which implicitly incorporates the idea of social responsibility. Layton (1986) and others have discussed how this understanding has been imperfectly applied and even avoided. Nevertheless, the presumption of its existence remains constant. In contrast, computing has largely evolved from mathematics and the theoretical sciences. Many founding members of computing faculties have been drawn from these disciplines and often retain a primary identification with their original fields. In these environments, competence is commonly defined as facility with such abstract subjects as algorithms, formal languages and logic. The consideration of material or social effects, while certainly possible, cannot be assumed as a natural outcome of these activities.

Abstraction is thus undeniably a critical component of computing. What can be questioned is how it is represented within the curriculum and the profession. Most frequently, it exists in a hermetic state, detached from real-world problems and effects. Consequently, the role of ethical study, though not totally incapacitated, arguably takes on a forced and almost intrusive quality—imported as an afterthought rather than an intrinsic consideration.

Assisted, arguably, by the lack of a "native" practice-centered ethic, curricular and scholarly work in the field has largely emerged from a collaborative effort between computing and such external disciplines as philosophy, law, the behavioral sciences and theology. This is not a unique or negative development, as the results of a similar evolution of modern biomedical ethics will attest. However, as addressed below, it does raise issues regarding the balance of disciplinary participation, including the question of which discipline exerts the most influence in setting agendas.

Based upon these circumstances, the current condition of computing exhibits two related and problematic ethical situations, both typified by disconnection. The first and most controversial submission is that computing as an activity has remained, due to its dominant self-definition, disconnected from reality.

While there was arguably a time when computing could be viewed as the pure activity of symbolic manipulation, that moment was shorter than is generally acknowledged. Almost immediately after their production and limited dissemination, computers became involved in human affairs, most conspicuously through the processing of personal data and the specter of automated decision-making. As early as 1971, the direct effects of computers on human relationships had been identified as a critical contemporary and future problem. Significantly, the threat was presented as a professional issue. Harold Sackman makes this clear when he comments that "universities are turning out the first generation of theoretically oriented computer scientists— interested in hardware and software, but not people—scientists who are too often temperamentally and technically unsuited for the vast work of building a computer-serviced society" (1972, p. 17).

Similar early concerns for the human effects of computing were addressed by Joseph Weizenbaum (1976) . In an interesting contrast with the engineer's "inextricable" concern for the material world, he states:

One would have to be astonished if Lord Acton's observation that power corrupts were not to apply in an environment in which omnipotence is so easily achieved. It does apply. And the corruption evoked by the computer programmer's omnipotence manifests itself in a form that is instructive in a domain far larger (than) the immediate environment of the computer (p. 115).

The "Eliza effect" ( Nelson 2001 ), is appropriately named after a program created by Joseph Weizenbaum (1976) to study aspects of text scripting, but which gained unanticipated fame for fostering illusions of intelligence in many who observed its operation. It is a term now used to describe the belief that digital output is inherently more "trustworthy" than that generated by the material world. Such an attitude was a primary ingredient in the incidents surrounding the THERAC-25 medical devices. Here, misplaced faith in software-mediated radiological measurements resulted in serious injury and death ( Leveson & Turner 1993 ). On a more metaphysical, but also ethical level, commentators have submitted that trust in the superiority of abstraction as represented by some advocates of artificial intelligence, virtual reality applications and cybernetics, significantly degrade valuation of the material, including human beings, at least as physical entities ( Heim 2000 ; Hayles 2000 ).

The point made throughout this commentary is that regardless of abstraction's epistemological dominance, computing is indeed powerfully connected to real-world effects. This has been true in the past and is even more so today with the ubiquitous use of "intelligent" devices in medicine, transportation, environmental processes and other safety-critical systems. The failure to engender a practice-centered ethical perspective in computing has resulted in the masking of such material issues in computing's self-identity, particularly as communicated through its basic teaching, research and internal dialogues. Evidence of this deficit appears in numerous forms, from the paucity of ethical content in "serious" technical papers to "hard" computer science courses that never mention the ethical implications of their subject.

This condition leads to the second major effect caused by computing's practice-centered void, a condition which might be termed "disciplinary drift". While wide collaboration is of unquestionable value, those in computing may be tempted to delegate choices of problems and analytical approaches to nonpractitioners. In such instances, there is significant risk that issues relating to practice will be missed.

Equally problematic are texts that broadly address policy issues but leave it to the reader to supply or even correct the technical details. When written by non-computing experts, there is a risk of incomplete integration and of creating the illusion that ethical issues emerge only in certain, often ethereal contexts. Indeed, ethics may be presented as literally requiring a "federal case." Authors unfamiliar with the practice of computing appear particularly susceptible to embracing analogies and terminology which, while popular in the pages of "e-zines," are entirely inappropriate to real practice scenarios. A common example is the ubiquitous use of the term "cyberspace" to represent a non-existent dimension envisioned largely by non-technically oriented commentators and "visionaries" (Koppell 2000).

Engineering ethics is not without abstraction, but in contrast with computing, it is animated by a robust and active movement concerned with the seamless identification of ethics with practice. Gorman, Hertz, Magpili, Mauss, & Mehalik ( 2000 , p. 463) point to the necessity of cultivating the "heterogeneous engineer" who is "adept in understanding the entire context of a problem." Through the use of "moral imagination" ( Werhane 1994 )—an ability to assume perspectives beyond that of the technical actor—these authors lay a theoretical groundwork for engineering as "reflective practice".

The blending of ethical considerations with practice issues is apparent in a number of projects undertaken within engineering education. Examples include design courses that present computational accuracy as an ethical issue ( Goddard 2001 ), case studies that combine technical problems with ethical scenarios ( Pritchard & Holtzapple 1997 ), and faculty education directed toward developing sensitivity to ethical problems encountered within industry ( Gorman et al. 2001 ). A particularly poignant example of this approach, described by Catalano et al. (2000) , involved a capstone engineering design experience that focused on the needs of an individual with advanced cerebral palsy. Follow-up interviews with the students, graduating members of the United States Military Academy, included reports of sensitization to the need for technical and financial resources directed toward the disabled, the achievement of growth "both as engineers and as people," and the accomplishment of their project as "a labor of love".

There is no feature of computing that would render it unable to engage in similar programs. The critical stumbling block has been a general failure to regard its most intrinsic aspects as directly relevant to the material, everyday world. There are a number of positive signs that computing is recognizing this necessity. Professional forums such as the ACM-sponsored Forum on Risks to the Public in Computers and Related Systems are particularly salient examples (Neumann), as are the commentaries generated in the evolution of software engineering (Pour, Griss, & Lutz 2000). A more general correction is also possible, but only if computing's practical dynamics are elevated to a level of prestige approaching that accorded to its theoretical dimensions.

Lessons from Computer Ethics

The strong grounding in practice of engineering ethics does not come without a cost. As noted above, the implicit commitment to social responsibility embedded in such an approach is often hard to realize in the actions of engineers and professional engineering societies. Ironically, although computing is, as we argued above, far less grounded in practice, the field of computer ethics has done a much better job to date of integrating "microethical" and "macroethical" perspectives in research and education.

Microethics and Macroethics in Engineering (Herkert 2001; 2003)

A number of authors have suggested that engineering ethics encompasses multiple domains. The ethicist John Ladd (1980) subdivides engineering ethics into "micro-ethics" and "macro-ethics," depending on whether the focus is on relationships between individual engineers and their clients, colleagues and employers, or on the collective social responsibility of the profession. In each case Ladd seems to be concerned with what might be called "professional ethics," with micro-ethics focusing on issues for the most part internal to the profession and macro-ethics referring to professional responsibility in a broader, societal context.

McLean (1993), an engineer, utilizes three categories in discussing engineering ethics: technical ethics, dealing with technical decisions by engineers; professional ethics, dealing with interactions among managers, engineers and employers; and social ethics, dealing with sociopolitical decisions concerning technology. McLean's notion of professional ethics is narrower than Ladd's, incorporating only those dimensions that Ladd describes as micro-ethics. At the same time, McLean has a broader overall notion than Ladd of the spheres of ethics that are relevant to engineering, for he includes both individual and societal dimensions. Another engineer, Vanderburg (1995), while employing terminology similar to Ladd's, seems to neglect professional ethics entirely, distinguishing instead between "microlevel" analysis of "individual technologies or practitioners" and "macrolevel" analysis of "technology as a whole," categories that track McLean's technical and social ethics.

De George, an ethicist, distinguishes between "ethics in engineering," and "ethics of engineering" ( Roddis 1993 ). The focus of the former is on actions of individuals while the latter is concerned with both relationships internal to the profession and the responsibilities of the engineering profession to society. De George's notion of "ethics of engineering" thus incorporates both Ladd's micro and macro dimensions. In addition, the "ethics of engineering" specifically includes professional engineering societies.

As shown in Table 1, when combining these various facets of engineering ethics, an interesting pattern emerges. Three frames of reference are apparent: individual, professional and social. Combining Ladd's and Vanderburg's terminology, "microethics" can be seen to include concern with individuals and the internal relations of the engineering profession, while "macroethics" applies both to the collective social responsibility of the engineering profession and to societal decisions about technology.

Heretofore, most research and teaching in engineering ethics has had a micro focus, either in the sense in which Vanderburg uses the term or the sense in which Ladd uses it. This state of affairs is lamented by Winner, who is critical of the overemphasis in engineering ethics on case studies of microethical dilemmas to the exclusion of larger issues relating to the development of technology:

Ethical responsibility...involves more than leading a decent, honest, truthful life, as important as such lives certainly remain. And it involves something much more than making wise choices when such choices suddenly, unexpectedly present themselves. Our moral obligations must...include a willingness to engage others in the difficult work of defining what the crucial choices are that confront technological society and how intelligently to confront them (1990, p. 62).

Recently, scholars have begun to address macroethical issues in connection with engineering ( Herkert 2000 ; Lynch & Kline 2000 ; Woodhouse 2001 ). Yet to be developed, however, is a comprehensive framework for integrating microethical and macroethical approaches in engineering ethics. Indeed, as suggested in the critiques of Ladd and Winner, many scholars and teachers of engineering ethics explicitly exclude macroethics as a fundamental focus in engineering ethics. A lack of appreciation of the role of macroethical perspectives is reflected in popular definitions of engineering ethics, such as the following passages from two of the leading engineering ethics texts:

Engineering ethics is (1) the study of moral issues and decisions confronting individuals and organizations engaged in engineering and (2) the study of related questions about the moral ideals, character, policies and relationships of people and corporations involved in technological activity ( Martin & Schinzinger 1996 , p. 2-3)

Engineering ethics is concerned with the question of what the standards in engineering ethics should be and how to apply these standards to particular situations. One of the values of studying engineering ethics is that it can serve the function of helping to promote responsible engineering practice. ( Harris, Pritchard, & Rabins 2000 , p.26)

The apparent disconnect between microethics and macroethics in engineering is problematic for a number of reasons (Herkert 2004). From a societal viewpoint, we need policies that are ethical and ethical viewpoints that are sensitive to social problems and issues. For example, we should question a product liability policy that might make it more difficult for engineers to perform their jobs in an ethical manner, thus compromising public safety (Herkert 2001; 2003). On the other hand, an ethical stance that all technology should be risk-free, on the grounds that engineers have a duty to avoid harm, would clearly run counter to societal needs and economic realities.

From the individual's viewpoint, engineers need ways of dealing in a consistent and holistic manner with ethical issues that arise in their various roles. In the absence of integration of ethical considerations from their personal and professional roles with issues that may arise in their public roles, engineers might become confused or complacent regarding the importance of ethics in all of these roles.

Microethics and Macroethics in Computing

In contrast to engineering ethics, computer ethics has often been broadly defined so as to include both microethical and macroethical aspects. This tendency dates at least as far back as James Moor's seminal 1985 article, "What is Computer Ethics," in which he argued that "…computer ethics includes consideration of both personal and social policies for the ethical use of computer technology."

Even when attention turns from research to pedagogy, the computer ethics community seems to take a broader view of its field than does the engineering ethics community. For example, in highlighting the goals of engineering ethics instruction, Davis's focus (1999) remains squarely on the microethical:

Teaching engineering ethics…can achieve at least four desirable outcomes: a) increased ethical sensitivity; b) increased knowledge of relevant standards of conduct; c) improved ethical judgment; and d) improved ethical will-power (that is, a greater ability to act ethically when one wants to).

In contrast, Johnson's ground-breaking classic text on computer ethics ( 1994 ) prominently includes understanding of the societal context of computer technology within the goals of computer ethics courses (note especially items 3 and 4):

(1) to make students (especially future computer professionals) aware of the ethical issues surrounding computers;

(2) to heighten their sensitivity to ethical issues in the use of computers and in the practice of computing professions;

(3) to give them more than a superficial understanding of the ways in which computers (do and don't) change society and the social environments in which they are used;

(4) to provide conceptual tools and develop analytical skills for sorting out what to do when in situations calling for ethical decision making or for sorting out what the likely impacts computer technology will have in this or that context (p. 6).

Indeed, in a review of the field of computer ethics, Mitcham notes that in Johnson's text "she commonly weaves together professional ethical, legal, governmental, and societal concerns" (1995, p. 119). Mitcham goes on to argue for the importance of societal concerns in reevaluating traditional approaches to ethics and integrating them with practice:

Such professional efforts to take into account general societal concerns about the right to privacy clearly constitute efforts not only to reevaluate the application of traditional ethical principles, but also to establish new agreements about both principles and practices in the presence of computers and other new electronic information technologies (p. 120).

The broader perspective of computer ethics also extends to accreditation of professional programs and to educational standards recommended by professional societies, suggesting that the profession's view of the scope of computer ethics is similar to, and perhaps influenced by, that of the computer ethics community. In engineering, the focal point of attention in Engineering Criteria 2000 (EC 2000) of the Accreditation Board for Engineering and Technology (ABET) has been Criterion 3, which specifies program outcomes and assessment. Among other outcomes, "engineering programs must demonstrate that their graduates have…an understanding of professional and ethical responsibility…[and] the broad education necessary to understand the impact of engineering solutions in a global and societal context" (ABET-EAC 2003). There is no suggestion in EC 2000, however, that these criteria necessarily have anything in common, or that they can or should be approached in an integrated fashion.

The current ABET criteria for accrediting computer science programs are equally vague, providing only that "[t]here must be sufficient coverage of social and ethical implications of computing to give students an understanding of a broad range of issues in this area" (ABET-CAC 2003). Professional groups, however, have gone far beyond this by suggesting detailed criteria for the integration of ethical and social issues in the computer science curriculum. For example, an integrated curriculum model for the ethical and social impacts of computing (ES) was developed in Project ImpactCS (ComputingCases.org), funded by the National Science Foundation. The fundamental knowledge units in ES recommended by the study included professional responsibility, basic elements and skills of ethical analysis, and basic elements and skills of social analysis.

The ImpactCS study was no doubt one important input to the design of the proposed Social and Professional Issues (SP) component of Computing Curricula 2001 of the Joint IEEE Computer Society/ACM Task Force on the "Model Curricula for Computing" (ACM & IEEE-CS 2001), which covers a range of ethical and social issues in computing:


SP1. History of computing
SP2. Social context of computing
SP3. Methods and tools of analysis
SP4. Professional and ethical responsibilities
SP5. Risks and liabilities of computer-based systems
SP6. Intellectual property
SP7. Privacy and civil liberties
SP8. Computer crime
SP9. Economic issues in computing
SP10. Philosophical frameworks

In defending the need to include these issues in the computing curriculum, the authors refer to arguments made ten years earlier in Computing Curricula 1991:

Undergraduates also need to understand the basic cultural, social, legal, and ethical issues inherent in the discipline of computing. They should understand where the discipline has been, where it is, and where it is heading…. Students also need to develop the ability to ask serious questions about the social impact of computing and to evaluate proposed answers to those questions. Future practitioners must be able to anticipate the impact of introducing a given product into a given environment ( Tucker et al. 1990 ).

The picture that emerges in computing is of an ethical posture willing to acknowledge the multiple roles of computing professionals (personal, professional, public) and the importance of confronting a broad range of microethical and macroethical issues in research and education. Engineering ethics, on the other hand, its core knowledge having developed from a strong grounding in engineering practice and professionalism, appears less willing and less able to integrate the broader social responsibilities of engineers and the engineering profession with considerations of individual behavior and the internal relationships of the profession.

Conclusions

Engineering ethics and computer ethics emerged as academic fields in the USA at about the same time (1980s) and for many of the same reasons. Practitioners in these fields became increasingly aware of the social and ethical implications of their work and philosophers began to see these fields as fertile ground for the scrutiny of applied ethics. Despite these similar origins, like twins separated at birth, engineering ethics and computer ethics have been "raised" in radically different environments and thus have developed with different strengths and weaknesses. The notable lack of emphasis on computing ethics in engineering ethics education, despite the predominant role of computing in engineering processes and products, is indicative of the degree of this separation.

The strength of engineering ethics lies in its strong grounding in professionalism and the practice of engineering. In contrast, computer ethics, like computer science, sometimes lacks the professional identity and sense of the practical necessary for the in-depth understanding of ethical problems in computing. On the other hand, the focus of engineering ethics on the personal and professional has resulted in an apparent reluctance to take very seriously the broader social responsibilities of the engineering profession and questions of technology policy in general, issues that most treatments of computer ethics regard as fundamental to the field.

There is thus a need for serious and ongoing dialogue between engineering ethicists and computing ethicists regarding education and research in their fields. Though the differences in the fields are significant, there already exist a number of mechanisms and models for facilitating such an interchange. For example:

  • The accreditation of computer science programs in the USA has recently been merged into ABET, the organization that accredits engineering programs.
  • IEEE and ACM recently collaborated in the establishment of a code of ethics for software engineers ( Pour, Griss, & Lutz 2000 ).
  • Online resources have become a subject of increasing interest in both engineering and computing ethics.
  • Organizations such as the Association of Practical and Professional Ethics are well situated to facilitate interchanges between engineering and computing ethicists.
  • Professional societies of engineers and computer scientists are in a position to conduct joint conferences on social and ethical issues of relevance to both fields (see for example, Herkert 2002 ).

While engineering ethics and computing ethics were not really born twins—the differences in their analytical perspectives having existed from the origins of each—the metaphor of twins separated at birth is nonetheless appropriate since there is enough commonality in their origins and current status to facilitate mutual learning to the benefit of both. Indeed a (re)union of the two is long overdue.


Table 1. Microethics and Macroethics in Engineering (Herkert 2003)

Ladd (1980)
  Microethics (individual and professional): "micro-ethics," concerning relationships between individual professionals and other individuals who are their clients, colleagues and employers
  Macroethics (social): "macro-ethics," concerning problems confronting members of a profession as a group in their relation to society (i.e., the social responsibility of professionals as a group)

McLean (1993)
  Microethics (individual): technical ethics, concerning technical decisions and judgments made by engineers
  Microethics (professional): professional ethics, concerning interactions between engineers and other groups (e.g., managers, employers)
  Macroethics (social): social ethics, concerning technology policy decisions at the societal level

Vanderburg (1995)
  Microethics (individual): microlevel analysis of individual technologies or practitioners
  Macroethics (social): macrolevel analysis of technology as a whole

De George, as reported by Roddis (1993)
  Microethics (individual): ethics in engineering, concerning the actions of individual engineers
  Microethics and macroethics (professional and social): ethics of engineering, concerning the role of engineers in industry and other organizations, professional engineering societies, and the responsibilities of the profession


Acknowledgement

Portions of this paper are drawn from the authors' prior work as indicated in the references.

References

ABET Computing Accreditation Commission (CAC). Criteria for Accrediting Computing Programs , available at http://www.abet.org/images/Criteria/C001%2004-05%20CAC%20Criteria%2011-18-03.pdf . 2003.

ABET Engineering Accreditation Commission (EAC). Criteria for Accrediting Engineering Programs , available at http://www.abet.org/images/Criteria/E001%2004-05%20EAC%20Criteria%2011-20-03.pdf . 2003.

Association for Computing Machinery (ACM) and the Computer Society of the Institute of Electrical and Electronics Engineers (IEEE-CS). Computing Curricula 2001 , available at http://www.acm.org/sigs/sigcse/cc2001/ . 2001.

Catalano, G. D. et al. "Compassion practicum: a capstone design experience at the United States Military Academy" Journal of Engineering Education 89 (4): 2000, 471-469.

ComputingCases.org. ImpactCS (Impact Computer Science) , available at http://www.computingcases.org/general_tools/curriculum/impactcs.html .

Davis, Ma. The Universal Computer: The Road from Leibniz to Turing . New York: Norton & Company, 2000.

Davis, Mi. "Teaching ethics across the engineering curriculum." available at htttp://onlineethics.org/essays/education/davis.html. 1999.

_____________. Thinking Like an Engineer: Studies in the Ethics of a Profession . Oxford: Oxford University Press, 1998.

Fisher, G. "A 21st Century Renaissance." The Bridge 30 (3/4): 2000, available at http://www.nae.edu/nae/naehome.nsf/weblinks/NAEW-4STLDS?OpenDocument .

Goddard, D. "Teaching why accuracy and clarity are ethics issues." Journal of Engineering Education 90 (1): 2001, 119-121.

Gorman, M., et al. "Transforming the engineering curriculum: lessons learned from a summer at Boeing." Journal of Engineering Education 90 (1): 2001, 143-149.

Gorman, M., Hertz, M., Magpili, L., Mauss, M. & Mehalik, M. "Integrating ethics & engineering: a graduate option in systems engineering, ethics and technology studies." Journal of Engineering Education. 89 (4): 2000, 461-469.

Gorman, M., Mehalik, M. & Werhane, P. Ethical and Environmental Challenges to Engineering . Englewood Cliffs, New Jersey: Prentice Hall, 2000.

Harel, D. Algorithmics: The Spirit of Computing. Addison-Wesley, 1987.

Harris, C., Pritchard, M. & Rabins, M. Engineering Ethics, 2nd ed ., Belmont, California: Wadsworth, 2000.

Hayles, N.K. "The condition of virtuality." In Peter Lunenfeld (Ed.), The Digital Dialectic: New Essays on New Media . Cambridge: MIT Press. 2000, 68-94.

Heim, M. "The cybernetic dialectic." In Peter Lunenfeld (ed.), The Digital Dialectic: New Essays on New Media. Cambridge: MIT Press. 2000, 24-45.

Herkert, J. "Integrating Engineering, Ethics, and Public Policy: Three Examples." In David Ollis, Katherine Neeley, and Heinz Luegenbiehl (eds.), Liberal Education in Twenty-First Century Engineering . New York: Peter Lang. 2004, 129-144.

______________. "Professional societies, microethics, and macroethics: product liability as an ethical issue in engineering design," The International Journal of Engineering Education 19 (1): 2003, 163-167.

______________. (ed.) Proceedings of the 2002 International Symposium on Technology and Society (ISTAS'02): Social Implications of Information and Communication Technology . Piscataway, New Jersey: IEEE, 2002.

______________. "Future directions in engineering ethics research: microethics, macroethics and the role of professional societies." Science and Engineering Ethics 7 (3): 2001, 403-414.

_____________. Social, Ethical and Policy Implications of Engineering . New York: Wiley-IEEE Press, 2000.

Johnson, D. Computer Ethics. 2nd ed . Englewood Cliffs, New Jersey: Prentice Hall, 1994.

Jonsen, A. The Birth of Bioethics . Oxford: Oxford University Press, 1998.

Koppell, J. "No 'there' there: why cyberspace isn't anyplace." Atlantic Monthly 286 (2): August 2000, 16-18.

Ladd, J. "The quest for a code of professional ethics: an intellectual and moral confusion." In Rosemary Chalk, Mark Frankel, and S. B. Chafer (eds.), AAAS Professional Ethics Project: Professional Ethics Activities in the Scientific and Engineering Societies . Washington, DC: AAAS. 1980, 154-159.

Layton, E. The Revolt of the Engineers: Social Responsibility and the American Engineering Profession. Baltimore: The Johns Hopkins University Press, 1986.

Leveson, N. & Turner, C. "An investigation of the Therac-25 accidents." Computer 26 (7): 1993, 18-41.

Lynch, W. "Teaching engineering ethics in the United States." IEEE Technology and Society Magazine 16 (4): 1997/98, 27-36.

Lynch, W. & Kline, R. "Engineering practice and engineering ethics," Science, Technology and Human Values 25 (2000): 195-225.

Martin, M. & Schinzinger, R. Ethics in Engineering, 3rd ed. New York: McGraw-Hill, 1996.

McLean, G. F. "Integrating ethics and design," IEEE Technology and Society Magazine 12 (3): 1993, 19-30.

Mitcham, C. "Computers, Information and Ethics: A Review of Issues and Literature." Science and Engineering Ethics 1 (2): 1995, 113-132.

Moor, J. "What is Computer Ethics?" Metaphilosophy 16 (4): 1985, 266-275.

Nelson, V. The Secret Life of Puppets. Cambridge: Harvard University Press, 2001.

Neumann, P. (Moderator). Forum on Risks to the Public in Computers and Related Systems, available at: http://catless.ncl.ac.uk/Risks .

O'Connell, B. & Herkert, J. "Engineers and Computing: Ethical and Legal Issues." Tenth Annual Meeting, Association for Practical and Professional Ethics, Cincinnati, Ohio, 2001a.

______________. "What Engineers Should Know About the Ethical and Legal Aspects of Computing." Liberal Education Division, American Society for Engineering Education Annual Conference, Albuquerque, New Mexico, 2001b.

Olsen, S. "Cat scanning device may track users online." C\Net News, available at: http://news.com.com/2100-1023-246008.html?legacy=cnet . 2000.

Pour, G., Griss, M. & Lutz, M. "The push to make software engineering respectable." Computer 33 (5): 2000, 35-43.

Pritchard, M. & Holtzapple, M. "Responsible engineering: Gilbane Gold revisited," Science & Engineering Ethics 3 (2): 1997, 217-230.

Roddis, W.M.K. "Structural failures and engineering ethics." Journal of Structural Engineering 119: 1993, 1539-1555.

Rogers, G. F. C. The Nature of Engineering: A Philosophy of Technology. London: Palgrave Macmillan, 1983.

Sackman, H. "Computers and human problems." In Harold Sackman and Harold Borko (eds.), Computers and the Problems of Society. Montville: AFIPS Press. 1972, 1-30.

Tucker, A., et al. Computing Curricula 1991, Association for Computing Machinery and the Computer Society of the Institute of Electrical and Electronics Engineers. 1990, available at http://www.computer.org/education/cc1991/ .

Universal City Studios, Inc. v. Corley. U.S. App. LEXIS 25330 (2nd Cir. 2001).

Vanderburg, W. "Preventive Engineering: Strategy for Dealing with Negative Social and Environmental Implications of Technology." Journal of Professional Issues in Engineering Education and Practice 121 (1995): 155-160.

Vincenti, W. What Engineers Know and How They Know It. Baltimore: The Johns Hopkins University Press, 1990.

Weizenbaum, J. Computer Power and Human Reason. New York: W.H. Freeman & Co., 1976.

Werhane, P. "A Note on Five Traditional Theories of Moral Reasoning." Darden Graduate School of Business Administration, University of Virginia, 1994.

Whitbeck, C. Ethics in Engineering Practice and Research. New York: Cambridge University Press, 1998.

Winner, L. "Engineering ethics and political imagination." In Paul Durbin (ed.), Broad and Narrow Interpretations of Philosophy of Technology: Philosophy and Technology, Vol. 7. Boston: Kluwer. 1990, 53-64.

Woodhouse, E. "Overconsumption as a challenge for ethically responsible engineering." IEEE Technology and Society Magazine 20 (3): 2001, 23-30.

Wulf, W. "Changing Nature of Engineering." The Bridge 27 (2): 1997, available at http://www.nae.edu/nae/naehome.nsf/weblinks/NAEW-4NHMBD?opendocument .

Notes

1 The term "computing ethics" is employed intentionally to avoid unnecessary disciplinary restriction. However, what follows is specifically directed to ethics as taught within or in conjunction with computer science and engineering departments, which is termed "traditional computing". This is due to reasons of economy. While management information science (MIS) and associated disciplines have generated significant ethical scholarship, they are excluded from present consideration as their orientation may reasonably be placed within the theoretically separate framework of business ethics. Software engineering is also developing a separate disciplinary status. While its criticisms of traditional computing closely track those presented here, the numerous political, ideological and theoretical issues affecting its development place its consideration beyond the limited scope of this paper.

2 The location of mainstream computing within the academic environments of computer science and engineering is not an incontrovertible submission. There are clearly many important initiatives emerging from professional and non-conventional sources. The statement is made for two reasons. First, for most professionals, the academy remains a common and primary source of formal indoctrination. Second, even for those without a formal background, it is a mediator of professional culture through its literature, research and graduates.