SPT v4n3 - Information and Systems Dimensions of Technological Artifacts
Guest Editors: Evandro Agazzi and Hans Lenk

INFORMATION AND SYSTEMS DIMENSIONS OF TECHNOLOGICAL ARTIFACTS

Ladislav Tondl, Czech Academy of Sciences
1. THE TECHNICAL WORLD AND ITS DIMENSIONS
From cradle to grave, from our birth till the end of our lives, we are surrounded not only by the world of human beings (i.e., the social world), and the world of nature (i.e., the geosphere and biosphere), but also by the world of artificial human constructs—i.e., the world of artifacts. In this world, sometimes figuratively called "our second nature," a key position is occupied by those artifacts that manifest our knowledge and, of course, also our values, and that are sometimes collectively referred to as "technology" or the "technical world." Stressing this knowledge- and value-related conditioning of technological constructs actually corresponds to the original meaning of the Greek word techne, i.e., abilities, skills, or knowledge for solving a certain problem situation by seeking, and especially creating, adequate means for such a solution. In this sense, each technical object or technical solution has its own information dimensions.
The technical world contains not only objects, i.e., different human-made means, tools, machines and—to take into consideration also the current level of this world—automation and artificial intelligence, but also events and processes that transform material, energy, and information aspects of our situation and that are disseminated and initiated by us, including processes of automatic regulation. At the same time, we should realize that this technical world leads its own specific "life," which continues to be enlivened by human knowledge and development of that knowledge, by accepted value structures, and also by changes in those structures. Naturally, by no means do we want to overestimate these rather metaphorical statements, even though, as with other metaphorical images, many philosophical conceptions of the current technical world have been based on them. Instead, we should aspire to emphasize the usefulness of taking a global approach to individual segments of the technical world, and the need to adopt a systems approach—one which takes into account both intellectual and value dimensions, as well as mutual interactions among intellectual, material, knowledge, and humanitarian aspects of those components.
Seen in this context, system dimensions mean that individual components, as well as parts of the technical world, mutually affect and condition one another. At the same time, these components have their own life, development, birth and extinction; they occur within a certain temporal rhythm and in a certain direction. When describing, characterizing, or explaining technology, we can, therefore, use certain biological metaphors. In different stages or phases of this life—which have their analogies in what we describe as ontogenesis and phylogenesis, i.e., the processes of the birth of an individual and the origin of a type, species, or class—the intellectual, material, energy, and information dimensions; the specification of possible applications and their limits; problems of short-term and long-term impacts; and the attendant circumstances and risks of those impacts are reflected with varying intensity. But the realm of technology, as was aptly pointed out by the French philosopher Jacques Ellul ([1954] 1964), also has its own life, i.e., the possibility of a relatively autonomous development with an independent duration which need not always correspond with natural ideas about its goals and anticipated functions. It cannot be ruled out that some components of this world may get out of human or social control, as the Czech writer and playwright Karel Capek warned in his vision several decades ago. (In this vision, Capek coined the word "robot" for an artificial technical being.) That such a vision was not so far removed from reality has been brought home to us by the tragedy at Chernobyl, by other accidents, and also by a number of irreversible changes in the natural environment.
We create a technical world in order better and more efficiently to achieve our goals, creating a system of means to harness nature's resources and capacities and to put those resources to a better use, while remaining an integral part of nature and striving for a more perfect exploitation of our resources and capacities. Moreover, we can hardly ignore the fact that the means of this world in many respects control us, affecting our value structures, objectives, ambitions, and directions. We are also controlled by many technical means because without them we cannot imagine our own self-assertion, the fulfilment of our ambitions and goals and the attainment of many values. As a result, we master many elements of the technical world—while still being controlled by them—subjecting ourselves to rhythms and dimensions dictated by technical artifacts, and doing so voluntarily and at our own discretion. We are also controlled by some elements of the technical world without being aware of such control, without considering such control a limiting factor. On the contrary, our mutual relations with the world of technology tend to reinforce our sense of power, thus also boosting our ego. Therefore, it is not always a case of rebellion, in that some technical means get out of human control or cease to meet required and anticipated expectations. What is also involved is that we ourselves transform these means into the goals of our own endeavors; we fully adjust ourselves to them; or we subject ourselves to their behavior. If it is beyond doubt that the technical world and most of its elements help us to gain a lot—expanding attainable horizons, including the horizons of knowledge, and helping us to move forward within these horizons according to invariably finite human and technical possibilities— then we can hardly escape the question as to what we would be deprived of, what we would lose or, at least, what we would have to pay for the loss of undeniably great achievements and benefits.
The creation, utilization, and development of the technical world, and the control of its impacts understood in the broadest sense of the term, make it necessary—and this is undoubtedly true of the current level of our world—to involve an adequate level of knowledge, as well as an accepted or acknowledged value structure. That is why the connections involving knowledge, human or, better, social behavior, and human-made constructs (i.e., artifacts) are characterized by a word made up of two Greek words, techne and logos: technology.
Viewed as a system, then, the technical world is made up of interactions involving three basic subsystems (a toy sketch in code follows the list), namely:
—a subsystem of technological knowledge;
—technical actions; and
—technological artifacts.
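This three-subsystem model can be given a deliberately simple rendering in code. The Python sketch below is only an illustrative assumption of how the interactions might be represented; the class names, fields, and the feedback step are mine, not the author's.

    # A toy rendering (not the author's formalism) of the technical world as three
    # interacting subsystems: knowledge guides technical actions, actions produce
    # artifacts, and the use of artifacts feeds experience back into knowledge.
    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class Knowledge:
        items: List[str] = field(default_factory=list)


    @dataclass
    class Artifact:
        name: str
        embodied_knowledge: List[str]


    @dataclass
    class TechnicalWorld:
        knowledge: Knowledge
        artifacts: List[Artifact] = field(default_factory=list)

        def act(self, goal: str) -> Artifact:
            """A technical action: apply current knowledge to a goal, producing an artifact."""
            artifact = Artifact(name=goal, embodied_knowledge=list(self.knowledge.items))
            self.artifacts.append(artifact)
            # Feedback: using the artifact yields new experience, enriching knowledge.
            self.knowledge.items.append("experience with " + goal)
            return artifact


    world = TechnicalWorld(Knowledge(items=["mechanics", "materials"]))
    mill = world.act("water mill")

The point of the toy model is only the loop it makes visible: knowledge guides technical action, action produces artifacts, and the use of artifacts feeds experience back into knowledge.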
The development of this system as well as its basic subsystems is neither simple nor linear even though we often associate the term "technological progress" with a good deal of one-sided admiration or even euphoria. Such a development pattern, certainly during this century, has not managed to steer clear of difficulties, pitfalls, and dangers, and this has raised many difficult questions—questions that are virtually impossible to answer or to find a universal pattern of answers for. I have attempted (Tondl, 1968) to paint a global picture showing the lights and shadows, the pros and cons, the achievements and risks brought about by the development of the technical world; I had been invited by Tadeusz Kotarbinski to write a paper on the double face of technology, using the analogy of the double-faced Roman god, Janus. An attempt at offering a global view of the complex fates of technical development can also be found in a study by J. J. Salomon (1992), which links studies of the complex issues associated with the ancient hero Prometheus to an insightful criticism of the illusions of so-called technological determinism or other ideas overemphasizing the impact of technological changes on social development.
2. CAN SPECIFIC FEATURES OF THE TECHNICAL WORLD BE ESTABLISHED?
When discussing the salient features (the substantial properties) of the elements of the technical world, we have no great difficulty in establishing them. They are those human-made or human-adapted objects, events, or processes which help us to meet our goals better and more efficiently; that is, tools and newly created conditions and circumstances of human activities; machinery; and complex systems of such instruments. That is why elements of the technical world are thought to include all that enhances the effectiveness or efficiency of purposefully-oriented activities, regardless of the fact that such elements, events, or processes are totally dissimilar with respect to one another, that they are in principle different, and that their functions have virtually nothing in common (perhaps with the exception of quite general and abstract features). This almost total absence of anything that might resemble the similarity shared by related species or genera has prompted some philosophers who have thought it necessary to talk about the technical world to come up with descriptions of a very universal or abstract nature. As an example, we find in Martin Heidegger (1962) the statements that technology is a "way of revealing" (eine Weise des Entbergens) and that it plays the role of an "enframing" (das Gestell). The aptness of such statements certainly need not be doubted; at the same time, we can hardly escape feelings of unease and hesitation when we are called upon to assess the explanatory function of such claims. The same holds true of the statement that the substance of technology lies in a "challenging" of nature (Herausfordern).
Other philosophers accompany metaphoric descriptions with critical remarks about the function of technology. For instance, Karl Jaspers (1931) said that technology separates man from history and from direct contact with nature. Efforts to find some kind of general formula or universal pattern to characterize the technical world have led some philosophers to attempt a global description of the basic traits of the technical world in its entirety (even though this world is extremely variegated, a truly heterogeneous whole). A critical analysis of these opinions may be found in a collection of studies edited by Hans Lenk and Simon Moser (1973) or in the book on the analytical philosophy of technology by Friedrich Rapp (1978).
Each of these attempts at capturing technology globally emphasizes partial aspects or dimensions, while—and this applies primarily to the current level of the technical world—that world is distinctly multidimensional, having a number of different characteristics whose significance or role varies considerably in different branches. Moreover, conflicting examples may easily be found in individual situations. If one claims that technology separates man from direct contact with nature, then one has to admit that some technologies—as well as measurement and experimental devices—actually help us to examine more profoundly many mysteries of nature, e.g., the micro world or distant outer space. It would probably not be difficult to find other counterexamples to such statements, which can therefore be described as global or one-dimensional characteristics or metaphors.
Speaking in defense of such characterizations, one may say that mankind has always sought global, all-encompassing patterns aimed at capturing all the aspects of the world and of life even though ours is always a highly varied and pluralistic world.
If we are reluctant to accept one-dimensional perceptions of the multifaceted and diversified world of technology, especially in its different historical stages, neither will we be prepared to accept off-hand condemnations of technology as a whole, as a world rife with dangers and risks, including even the possible destruction of mankind and the human species. We often encounter such one-sided and globally shattering criticisms of scientific and technological development, critiques called "cultural criticism" of technology, including condemnations of "technocracy" or of "technoscientists" expressed in terms of disdain and outright rejection. Naturally enough, such sweeping criticisms in no way contribute to solving the many problems which are—we must admit—posed by the development of contemporary technology. Such an approach rather seems to reflect faith in a fundamentalist solution, diverting attention from important issues: for instance, issues of the social assessment of new steps in the world of technology; thinking about variant solutions and the selection of optimum variants or alternative technical solutions; efforts to restrict the subordination to one-sided criteria in technical and technological decision-making and assessment; efforts to humanize technical and engineering education; and efforts to introduce an appropriate level of responsibility and other cultural, ethical, and value-related principles into contemporary technological thinking.
It is thus impossible to regard as successful any attempts at articulating simple definitions of the term "technology" in a single and all-encompassing formula. In these contexts, Rapp (1978, p. 31) quotes Friedrich Nietzsche as saying that only that which has no history can be defined. If we realize that artifacts which can be described by the term technology include the modest tools of primitive humans—for instance, flint or simple hammers—as well as the instrumentation in today's chemical or biological laboratory or the control room of a large power-generating facility, we must come to the conclusion that any search for a single all-encompassing description offers no hope of success. This naturally does not rid us of the duty to continue seeking generic and species similarities within that sphere of objects or processes that we are prepared to recognize as the "technical world."
3. TECHNOLOGICAL ACTIONS AND TECHNOLOGICAL ARTIFACTS AS DELEGATED INTELLIGENCE
The idea of delegating creative activity and mastering its results actually appears at the very beginning of the Book of Genesis in the Old Testament, and man as "homo faber" is not limited in his creativity. However, there are limits of two kinds:
—limitations imposed by feasibility, including elementary feasibility, caused by the force of natural laws; and relative feasibility imposed by available—and therefore limited—knowledge, means, resources, and capacities of all kinds; and
—limitations imposed by accepted and acknowledged value structures, including cultural and moral values, as well as individual and social responsibility—even in situations where such values are disregarded or violated.
Humans invest, in those activities which may be characterized as technological, elements of a certain intelligence. They specify the goals of such an activity, proposing—and later applying—known procedures, methods, and algorithms, making it possible to attain selected goals. They apply, assert, and practically utilize past scientific findings. All of this leads to the conclusion that technology is, indeed, an applied science. This concept was developed especially by Mario Bunge (1974).
Two limiting remarks should be added to his theory:
(a) It is beyond doubt that any attained level of technology (viewed as a system of interactions of appropriate technological knowledge, activities, and human-made artifacts) also constitutes an application of past scientific findings. These are, however, applied to a varying extent and with varying intensity in different sectors of the technical world. Furthermore, there are other factors, stimuli, and initiatives at play which are dependent on the global social, political, and values situation. The role of attained knowledge comes to the fore much more prominently during the start-up and development of radical technological innovations; it is involved to a much lesser degree in those innovations and technological changes that repeat already accepted and well-tested models or standards—i.e., in cases of so-called incremental innovations.
(b) The very term "application" is highly ambiguous. It does not always refer to direct use of scientific findings but very often to considerably mediated utilization—an indication of a spectrum of possibilities and limits to such possibilities—while the choice of realized alternatives depends on a number of circumstances, particularly on stipulated goals and priorities, and also on available resources, capacities, and time possibilities.
Given what we have characterized as "delegated intelligence" in the sphere of technological activities, we have to realize that this sphere is usually a system of parallel and sequentially arranged activities which link up to one another. It is a system of activities that are unthinkable without information and knowledge prerequisites, including in particular the following (a schematic sketch follows the list):
—delineation of intentions, goals, or requirements, together with a selection of means, procedures, and suitable methods;
—the process of designing and proposing proper procedures for a solution, including feasibility studies;
—implementation of proper procedures;
—processes of utilizing, mastering, or otherwise fructifying created artifacts; and
—termination of activities or liquidation of artifacts following the loss or decline of their utility.
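To make the sequence surveyable, it can be rendered as a toy lifecycle model. The Python sketch below is an illustrative assumption only: the phase names and the strictly linear ordering are mine, whereas the text stresses that such activities also run in parallel and link up with one another.

    # A toy rendering of the activity sequence listed above. The phase names and
    # the strictly linear ordering are illustrative assumptions; real technological
    # activities overlap, run in parallel, and feed back into earlier phases.
    from enum import Enum, auto
    from typing import Optional


    class Phase(Enum):
        GOAL_DELINEATION = auto()   # intentions, goals, requirements; choice of means and methods
        DESIGN = auto()             # designing solution procedures, feasibility studies
        IMPLEMENTATION = auto()     # carrying out the chosen procedures
        UTILIZATION = auto()        # using, mastering, and exploiting the created artifact
        TERMINATION = auto()        # liquidation once utility is lost or declines


    def next_phase(current: Phase) -> Optional[Phase]:
        """Return the phase that follows the current one, or None after termination."""
        order = list(Phase)
        i = order.index(current)
        return order[i + 1] if i + 1 < len(order) else None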
All of these and other activities associated with the conceptual or design preparation of a technological solution, its realization, and use of its results, call for information preparation and information equipment in two senses:
— a priori knowledge equipment on the part of the subjects of such activities, i.e., their sufficient competencies, adequate professional training, and everything usually provided by prior qualifications, including prior experience; and
—being equipped with a posteriori information prerequisites appropriate for the solution of the given task at a level which is recognized as acceptable or sufficient (usually according to accepted qualification requirements and other criteria).
All the types of technological activities, particularly those evolved at current engineering levels, call for the ability to provide a suitable combination, interaction, or interplay of a priori and a posteriori information prerequisites. Having said that, we naturally do not rule out the possibility that common sense or a feel for suitable solutions in a given situation, erudition acquired by long-standing experience, and naturally also the possibility of happening upon a solution suitable for the given situation may assert themselves at various levels of technological activities. It is likewise important that those engaged in technological activities be people of all types, with varying degrees of inventiveness and creativity, people who simply imitate what they have learnt or mastered earlier or who move within the framework of acknowledged stereotypes, people who are capable of combining such stereotypes with their own initiatives, and also trailblazers, people seeking new paths, new solutions, or even new goals for such solutions. The same holds true of different technological branches; this, after all, applies to any intellectual activity. Just as science is not only made up of personalities such as Newton and Einstein, so also technological activities need—besides the Einsteins—hundreds of thousands of skilled experts and millions of capable and erudite technicians.
Generally speaking, one can say that, the more demanding and complex the technological solution, the more important will be its information and design preparation. At the same time, and this applies especially to the initial stages of such preparation, it is ever more important to think in terms of variants of technological solutions and to assess these variants, their possible impacts or the risks associated with them. Preparation of technological solutions, planned technological solutions, and variant technological solutions—and, simultaneously, complex multicriterial assessment of these variants—are, therefore, special features of the higher levels of contemporary technological activity.
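The multicriterial assessment of variants mentioned here can be pictured with a deliberately simple weighted-scoring sketch. All criteria, weights, and variant scores below are invented for illustration; real assessments involve incommensurable criteria, uncertainty, and negotiation among those affected, and a single weighted sum is at best a first approximation of what the text calls complex multicriterial assessment.

    # A deliberately simple weighted-sum sketch of multicriterial assessment of
    # design variants. All criteria, weights, and scores are illustrative only.
    from typing import Dict

    # Hypothetical criteria with weights expressing accepted priorities.
    weights: Dict[str, float] = {"cost": 0.3, "safety": 0.4, "environmental_impact": 0.3}

    # Hypothetical variants scored 0..10 on each criterion (higher is better).
    variants: Dict[str, Dict[str, float]] = {
        "variant_A": {"cost": 7, "safety": 5, "environmental_impact": 6},
        "variant_B": {"cost": 4, "safety": 9, "environmental_impact": 7},
    }


    def weighted_score(scores: Dict[str, float]) -> float:
        """Aggregate criterion scores into a single weighted value."""
        return sum(weights[c] * s for c, s in scores.items())


    best = max(variants, key=lambda name: weighted_score(variants[name]))
    print(best, {v: round(weighted_score(s), 2) for v, s in variants.items()})

Running the sketch ranks variant_B above variant_A because of the heavier weight placed on safety; changing the weights, i.e., the accepted value structure, changes the ranking, which is precisely why such assessments cannot be a purely technical matter.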
Discussions of various forms of technological activities and the results of such activities—including projects, their development, experimental or final realizations as delegated intelligence—inevitably raise the question: what is it that is actually delegated; what is embodied in those activities and their results? What we call "delegated intelligence" (while in no way challenging the metaphoric nature of such a statement) does not constitute a homogeneous set of intellectual elements but rather a complex of interactions and feedbacks of the following three basic subsystems:
First, what is delegated represents knowledge. To be able to participate in calculating the statics of a building project, I must possess the knowledge of basic relations, suitable mathematical methods, appropriate input data, etc. To be able to handle a computer successfully, efficiently utilizing its capacities, I should master an appropriate programming system, i.e., the software of the given computer. As a rule, the sets of necessary knowledge cannot be confined solely to algorithms, instructions, and important guidelines; it is crucial also to know the goals of the given task, demands on the quality of its solution, etc.
Delegated intelligence in technological solutions, in follow-up activities and their results—i.e., in technological artifacts—also includes a second subsystem of intellectual elements, which may be described as value structures. These are recognized, accepted, or—in normative systems—fixed value criteria; views concerning what does or should have priority; what can be regarded under the given circumstances as acceptable or unacceptable, etc. The importance of different components of value structures may, at the same time, be considerably different in different situations. Some have a limited significance for the particular thematic or problem area (e.g., technological standards, safety limits, etc.), while others can be viewed as generally valid principles.
A third subsystem—of influences, factors, or stimulations—cannot be excluded from having an impact on some technological solutions or technologically important decisions. This third subsystem forms an integral part of the intellectual sphere but is definitely less rational or less grounded in justified or confirmed knowledge. This is the subsystem of global ideas concerning the world and life, an overall orientation which is sometimes characterized as an ideology or global attitude to the world and life. Some attitudes, and particularly some beliefs or ideas about the arrangement of the world and society, are manifest in artifacts which have survived as evidence of those attitudes and ideas—starting with Egyptian pyramids, temples, triumphal arches, or funeral buildings, and ending with huge palaces whose chief function was to serve, once every few years, as a venue for a congress of a totalitarian political party. The impact of such attitudes and ideas usually cannot be fully separated from those factors we have described as value structures, since they are projected into the areas of preferences, aesthetic values, and other factors that are part of technologically relevant decisions. In many cases, these ideological influences or global attitudes are somehow covered up or concealed by the semblance of being matter-of-course ideas, implying that in the given period or under the given situation they are generally acknowledged and accepted.
If what we have called delegated intelligence is, indeed, a variegated complex of intellectual knowledge and value factors that affect or stimulate technological decision-making and the origin and creation of technological solutions and technological artifacts, then the elements of the self-same complex are not without their impact on the sphere of utilization of technological artifacts as well. Until quite recently key attention in technological and engineering thinking and decision-making was focused on the genesis, the scientific or technical or economic justification or substantiation of technological solutions. Now, however, not even the spheres of application, utilization, and control of the social conditions for mastering the technical world and its components can remain unattended. Since virtually all the inhabitants of our planet are in direct or mediated contact with these components and their impacts, this particular problem area should be the subject of general studies and analyses. This problem area is sometimes called the "social control of technology," even though it also covers feedback systems, including what can be called the human control of the technological environment.
4. TECHNOLOGICAL ARTIFACTS AS AN INTERFACE
The idea of characterizing artifacts as an interface between an artifact's internal structure and its external environment—which also includes the human or social environment and thus encompasses the artifact's author and user as well—was actually introduced into the characterization of human constructs, and consequently also of elements of the technical world, by Herbert Simon (1969). This characterization presupposes that there is, between an artifact's internal structure (its basic character, composition, or inner organization) and the external environment surrounding that structure, an active network of links, relations, or interconnections of different kinds. As regards technological artifacts, it is important that these interlinks—whose overall designation was adopted by Simon from information technologies—should have not only the nature of information but also the character of a material, energy, and/or information exchange; and this exchange is initiated, disseminated, and controlled by humans, by a set of specific means-end rational technological activities.
The links between the internal structure of technological artifacts and the external environment—which includes both external material and energy sources, as well as humans initiating, managing, and performing other user functions—are naturally different in different technological branches. In the case of traditional mechanical technologies they are primarily material and energy sources. As for water mills and windmills, as we know them from the past century, their function was based on the gradient of the watercourse and on wind currents, so that such facilities could be installed on waterways and in coastal areas where regular wind currents were known to occur. But even these facilities could be regulated; there were means to hold back a stream or to build an appropriate channel to interrupt its course. But a water mill could be functional only when it was possible to bring in grain to be ground into flour. Even these technological systems had something analogous to what is known in information technology as an interface; to what Simon calls more generally a "meeting point."
For the operation of those facilities, certain knowledge—at least minimum instructions on how to start and stop the operation or how to secure its adequate efficiency, etc.—was also needed. Needless to add, perhaps, that in more complex and demanding technological systems a network of information links, one that makes it possible to control complex situations across a larger spectrum with larger sets of possible reactions to external stimuli, is of ever greater importance. Interestingly enough, when creating systems of such information links and control systems, people have come up against the finite possibilities of human capacities for efficient control. This may be aptly demonstrated by the development of aviation before and during World War II, when the proliferation of different indicators and measuring systems—i.e., the great number of gadgets essential for handling such a complex machine as an aircraft—reached the limits of the managing and control capacities of a human pilot. As a result, designers ran directly and empirically up against the problems of ergonomics: that is, the relations between a managing human subject and the complexity of a technological system; the limited nature of human managing capacities; the fact that some of these information links were absolutely vital for the functioning of that technological system and the safety of its human user, while others proved to be less substantial or did not require immediate and direct intervention. These problems resulted in two important trends in contemporary technical and engineering thinking:
—the selection of substantial indicators and criteria for safeguarding the functionality of a given technological system, the control of its reliability and security, and efforts to remove serious risks; and
—the automation of some managing and control functions, connected with warnings of approaching risks or accident-prone situations.
It seems important today, for the purpose of analyzing, creating, designing, and constructing the information links of a technological system and its human environment, to make a careful assessment of major as well as smaller accidents; to determine the actual causes of such accidents, starting with the tragedies of Bhopal and Chernobyl and ending with recurrent accidents of transport systems, mass communication systems, and the many breakdowns of technological facilities that have grown to be a common part of our everyday life. Such studies should establish the share and the role of human failure and of the underestimation of certain situations (when the likelihood of an accident was thought to be negligible), and should also provide statistical data concerning recurrent technological breakdowns and causal analyses of such failures.
The term "interface," introduced into the analysis of the nature and function of a technological artifact, basically indicates that isolated objects, processes, or events are never involved; that these are entities whose nature and functionality is delineated by sets of relations with the overall environment. This environment also includes the human user, his possibilities, capacities, knowledge, values, and goals, as well as the available resources and means, including material, energy, and information resources. This need to take a complex view of the elements of the world of technology is, to a considerable extent, stimulated by the experience of the highest levels of contemporary technology, particularly information technologies. However, also included are concerns relating to the social, cultural, and value environment in which technological artifacts are created, key functions being required and transformed, through some type of social behavior, from means into the goal of such behavior.
At the contemporary level of our technological knowledge, the problem complex which Simon has characterized as an interface consists of at least two major problem areas:
(1) Of considerable importance always is the set of problems concerning regulation and control of technological artifacts. In this respect, a key position is occupied by the relationship between human control and automatic self-regulation, which, naturally, at today's level, covers a wide-ranging set of means, starting with simple regulators (for instance, Watt's steam-engine governor) and ending with computer regulation and artificial intelligence.
(2) The external environment inevitably affects a technological artifact's inner structure. As a result, these influences should be controlled and the artifact maintained in a desirable state. Already decades ago a farmer knew he had to wipe his scythe after work to prevent it from corroding, and to hammer it for further use. Seen in this light, there persists the issue of preserving a functional and usable state, of maintaining adequate functionality within the framework of permissible intervals of such functionality. In biological systems, these procedures are usually described as homeostasis. The principles of homeostasis are equally important for most technological artifacts that are supposed to discharge their functions within an anticipated time interval with sufficient reliability and at a level of acceptable risk.
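The homeostasis analogy can be made concrete with a toy feedback loop: a monitored quantity is nudged toward a setpoint, kept inside a permissible interval, and flagged when it drifts toward the limits. The Python sketch below is a minimal illustration under assumptions of my own; the variable names, interval, and numbers are not taken from the text, and real regulation, from Watt's governor to computer control, is of course far richer.

    # A toy homeostasis-style control loop: keep a monitored quantity within a
    # permissible interval, warn as it approaches a limit, and apply a simple
    # proportional correction. All names and numbers are illustrative assumptions.
    from typing import Tuple

    PERMISSIBLE = (60.0, 90.0)   # permissible interval of functionality
    WARNING_MARGIN = 5.0         # distance from a limit that triggers a warning


    def regulate(value: float, setpoint: float = 75.0, gain: float = 0.2) -> Tuple[float, str]:
        """One regulation step: classify the current state and nudge the value toward the setpoint."""
        low, high = PERMISSIBLE
        if value < low or value > high:
            status = "outside permissible interval: intervene"
        elif value < low + WARNING_MARGIN or value > high - WARNING_MARGIN:
            status = "approaching a limit: warning"
        else:
            status = "within permissible interval"
        corrected = value + gain * (setpoint - value)   # proportional correction
        return corrected, status


    value = 88.0
    for _ in range(5):
        value, status = regulate(value)
        print(round(value, 1), status)

The same skeleton covers both trends noted earlier: the selection of which indicators to monitor, and the automation of routine corrections combined with warnings that call for human intervention.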
The introduction of these terms—particularly "interface," "control," "homeostasis," and some others that originated in the field of information technologies—into reasoning about the nature and functions of a wider set of technological artifacts invites us to take up other analogies or metaphorical expressions. It seems that, in many technological branches, analogies are made that relate to what is called, in computer technology, "hardware," "software," or "orgware." This is true even though, in the case of many technological artifacts, it is very difficult to distinguish a technological facility or process from the set of knowledge, instructions, and/or algorithms that make their efficient control and use possible. Mastering such an area of knowledge and intellectual prerequisites demands protracted preparation. Consequently, it is by no means accidental that in many technological branches the set of these competencies and qualifications must be demonstrated and documented by completing appropriate studies, passing examinations, etc. Together with the growing importance and weight of these prerequisites, the significance of social control, suitable legal regulations, and naturally also individual and social responsibility keeps on rising. Just as we take it as a matter of course that a challenging diagnosis should be made and subsequent therapy prescribed only by a highly qualified medical specialist, so also a complex technological system must be left only in the hands of a qualified and highly responsible engineer. Just as medical specialists always have to cope with charlatans in their midst, qualified experts in certain technological branches have to fight the danger of dilettantism, of taking irresponsible risks, or of underestimating possible danger signals. After all, it is precisely such an underestimation which constitutes one of the main causes of human failures and the accidents associated with them.
5. INTEGRATION OF KNOWLEDGE AND VALUES IN THE SPHERE OF TECHNOLOGY
I have emphasized that technological activities and technological artifacts may be characterized as a process of delegating intelligence. Such intelligence develops in historical terms, but it also attains, in each period, different and sometimes considerably conflicting forms, and it does not cover only hitherto-acquired knowledge, or even only the results attained by science at a given historical level. In this respect, the objections levelled against Bunge (1974) are undoubtedly justified—even though the development of technologies has always depended to a large degree on scientific findings and to a great extent reflected the level of such findings. But, as has already been emphasized, this development pattern has also been a way of manifesting accepted value structures, prevailing global attitudes to the world and life, and therefore also what is sometimes characterized as ideology, faith, worldviews, ideas concerning the nature and structure of a given society, etc.
Egyptian pyramids are justifiably thought to be manifestations of acquired geometric knowledge, as well as the knowledge of certain mechanical principles embodied in the construction methods of the time. But research projects carried out only in recent decades have expanded our horizons, adding to this picture some pieces of astronomical knowledge and certain stellar constellations that symbolized Egyptian gods. Of great importance, for instance, was the Egyptians' faith in the possibility of life after death, provided that certain conditions, including the preservation of the body, were met.
It is beyond doubt that—besides necessary knowledge—religious beliefs have been reflected in the conceptions, locations, and overall orientations of temple buildings, mausoleums, and cemeteries. That these influences need not be constant, even within the framework of a single religious belief, is corroborated by the differences in types and concepts of temples dating to different historical eras. But on many occasions these differences could not prevent ancient temples from the Roman period, or other buildings of that era, from changing their functions—serving, for instance, as Christian churches, that is, as buildings with totally new purposes.
This interconnection of faith or global attitudes and ideologies with technological artifacts is, after all, a characteristic trait that marks modern eras as well. Architecture, in particular, has been a very sensitive and ideologically affected sphere. The ideology of some totalitarian systems is reflected in the megalomania of triumphal arches (which imitate ancient buildings) but primarily in "palaces of culture," sports stadiums for hundreds of thousands of spectators, and also buildings that have changed and disturbed the existing natural balance. This applies to large artificial reservoirs; to river dams which have prevented the regular pattern of flooding that for millennia safeguarded the flourishing of the so-called potamic cultures; to the disturbance of the water balance in the rivers of Central Asia; etc. Many of these artifacts have been instrumental in unleashing an irreversible process which has permanently upset the original natural equilibrium, caused serious pollution of groundwater, or led to the drying up of large bodies of water, as illustrated by the Aral Sea. These artifacts were originally meant to demonstrate unlimited human power and mankind's unbridled control over nature, as well as some other ideological principles.
It would be wrong to suppose that the influence of intellectual or spiritual factors, as well as of other socially and historically conditioned priorities, affects all technological branches to the same degree. Furthermore, it is important to single out two more major circumstances:
(1) First and foremost, it is significant to note that these factors assert themselves most prominently in those decisions or in those technologically relevant steps characterized as intellectual, conceptual, or information preparation. A well-known explanation of the difference between a human builder and a bee, which is always noted for its perfect buildings, stresses that unlike a bee a human builder first creates in his head his required, intended, and planned artifact as a plan or a project, as an information model of a future work. (This image, as is well known, was also employed by Karl Marx in his explanation of the nature of human labor in Das Kapital, even though in his global concept of technical and economic development Marx did not appreciate the information aspect of intellectual, and especially knowledge-related, factors.) It is precisely in those phases—the stages of intellectual and conceptual preparation for a technological solution or a technological artifact—that value systems, global attitudes, ideological factors, religious concepts, etc., are engaged most prominently. Consequently, initiation and start-up of technological changes call for a sequence of interconnected activities involving important decision-making and evaluation procedures. Notably the following types of activities are involved:
—distinguishing, recognizing or identifying an ascertained or sufficiently recognized problem situation; recognition involves, among other aspects, the realization that the time has come for solving such a situation;
—emergence of wishes, needs, intentions, goals, and requirements related to the nature of an acceptable solution of the problematic situation— primarily proposals or initiatives for possible solutions;
—identifying, stipulating, or proposing adequate means, methods, procedures, or orientations to possible solutions in view of accepted or acknowledged goals;
—initial study, pre-project conceptual preparation, feasibility studies, launching the design process;
—solution of problems associated with the possible or anticipated impacts, risks, or difficulties facing implementation; also of functional procedures and of the problems of time horizons relating to anticipated liquidation, etc.
As regards all these types of activities, where intellectual or spiritual factors apply, naturally no less important are the problems associated with the intellectual and especially knowledge equipment of the subjects taking on such activities; their competence, viewed in the broadest sense of the term; their ability to convince others participating in the selection, evaluation, and consequently in basic decision-making about the choice of a particular solution, its start-up, chosen form, and delineation of roles and partial responsibilities.
(2) When considering the impact of value, spiritual, or ideological factors, we should not lose sight of the fact that in the formulation of primary intentions and subsequent decisions these factors appear very often in rather concealed forms, overshadowed by other considerations supporting the chosen solutions. Sometimes it is assumed that technological decisions form a matter-of-course part of an accepted value structure, or are fully compatible with the prevailing culture. In discussions about the significance, social conditions, and circumstances relating to the acceptability of technological and particularly innovation and investment decisions, the following types of questions must be raised:
—Is a qualified engineer or designer entitled to stand up, openly, against orders and commands if he or she is not convinced of their expediency, usefulness, and, notably, lack of risks?
—What are the limits of civic or democratic controls of technological decisions, planned technological changes, etc.? Is "vox populi," i.e., the usual voice of the majority of the citizenry, a guarantee that an optimum project will be selected or an optimum decision taken?
—Can one rely on the recommendation of independent teams of experts? How to proceed when conflicting or even contradictory views appear even among competent experts?
—Is it not expedient to analyze vested interests, values, and preferences which may be projected onto decision-making about technical changes when selecting and evaluating considered variants?
Questions of this type can hardly be answered by a simple formula. What is most important is that, if questions of this type are posed, a sufficient normative foundation should be prepared for them—notably in the legal and moral sense. Moreover, the 20th century has provided a number of tragic examples, showing just how wrong decisions and assessments in different areas of technological solutions and technological artifacts have resulted in accidents, irreparable damages, and losses. Some of those decisions were affected by ideological visions claiming that monopoly power and ideology could freely interfere with nature, that human control over nature is not subjected to any restrictions, etc. But also many other technical, investment, or innovation decisions—even in cases justified by expert analyses and seemingly competent steps—have failed to avoid the pitfalls of the hitherto prevailing illusions that we can go on exploiting supposedly infinite and inexhaustible natural resources without any adverse effects whatsoever.
The process of integrating knowledge, value-related, and other global, intellectual, or spiritual factors, and some religious ideas, should not be ignored when explaining the genesis and key motives and factors of technologically relevant initiatives, technological changes, and major innovations. The fact is that, due to their nature, these spheres of human decision-making and practical steps have been coming closer to what is characterized in models of human behavior as a "practical syllogism." A substantial feature, typical of this pattern of reasoning, is that the basic premises concern not only knowledge—in the shape of generalizations, laws, hypotheses, and otherwise specified rules, where the premises characterize the current state, initial data, or a specific situation—but also another circle of data that express intentions, plans, needs, goals, requirements, global attitudes, and other features traditionally described as "subjective." Difficulties occur in constructing acceptable patterns of inferential procedures here due to the fact that these other features are usually bound up with what logical semantics calls "intentional operators," i.e., operators spelling out convictions, attitudes, claims, demands, preferences, etc. One can hardly make do here with traditional deductive (or, to use C. G. Hempel's phrase, deductive-nomological) patterns or statistical explanations. In the patterns used to explain human practical activity, pride of place is always occupied by available feasibility conditions, i.e., concerns relating to usable means, resources, capacities, and other requirements of implementation.
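The inference pattern appealed to here can be set out schematically. The rendering below is a common textbook-style reconstruction of a practical syllogism, not a formula taken from the text: Int and Bel stand for assumed intention and belief operators attributed to an agent a, G is a goal, M a course of action, and Feasible(M) collects the feasibility conditions mentioned above.

    \[
    \frac{\;\mathrm{Int}_a(G) \qquad
          \mathrm{Bel}_a\bigl(\mathrm{Do}(a,M) \rightarrow G\bigr) \qquad
          \mathrm{Feasible}(M)\;}
         {\mathrm{Do}(a,M)}
    \]
    % Premises: an intention directed at G, a belief that doing M brings G about,
    % and a feasibility condition on M (available means, resources, capacities).
    % Conclusion: the agent sets out to do M -- an action, not a further statement of fact.

The schema makes the difficulty visible: two of the premises are governed by intentional operators rather than by laws or statistical generalizations, so the step to the conclusion is a practical inference constrained by feasibility, not a deductive-nomological or statistical explanation.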
Considering the procedures for integrating knowledge and values—or, rather, spheres of attitudes, global ideas about the world and life, preferences, and diverse forms of conviction—we also cannot ignore that technological, and consequently engineering, thinking and reasoning must at the same time integrate, to a certain degree, different types of knowledge. This involves natural science knowledge, knowledge of mathematics, and also the knowledge of possible impacts, effects, or possible risks posed by procedures. It also applies to the knowledge concerned with an area that may be generally characterized as the Promethean complex of contemporary technology, i.e., an awareness of the actual price we have to pay for having initiated technological changes. Since technological processes and technological artifacts always operate in a certain environment, in a milieu whose most important factor is the human user, we can hardly disregard the human and social dimensions of knowledge provided by some branches of the humanities. Seen in this light, contemporary technology, and its multidimensional character, offers one of the most important stimuli for the interdisciplinary approach, as a necessary and desirable integrator of different cultures.
REFERENCES
Bunge, Mario. 1974. "Technology as Applied Science." In F. Rapp, ed., Contributions to a Philosophy of Technology. Dordrecht and Boston: Reidel.
Ellul, Jacques. 1964. The Technological Society. New York: Vintage Books. French original, 1954.
Heidegger, Martin. 1962. Die Technik und die Kehre. Pfullingen: Verlag G. Neske.
Jaspers, Karl. 1931. Die geistige Situation der Zeit. Berlin and Leipzig: Walter de Gruyter. 5th printing, Berlin, 1947.
Lenk, Hans, and Simon Moser, eds. 1973. Techne, Technik, Technologie: Philosophische Perspektiven. Pullach: Verlag Dokumentation.
Rapp, Friedrich. 1978. Analytische Technikphilosophie. Freiburg and Munich: Verlag Karl Alber.
Salomon, Jean-Jacques. 1992. Le destin technologique. Paris: Balland.
Simon, Herbert A. 1969. The Sciences of the Artificial. Cambridge, Mass.: MIT Press.
Tondl, Ladislav. 1968. "Der Januskopf der Technik." In Akten d. XIV. Internat. Kongress für Philosophie, vol. 2. Vienna. Pp. 570-577.