
Volume 5, Number 1, Fall 2000

THINKING ABOUT TECHNOLOGY AND THE
TECHNOLOGY OF "THINKING ABOUT"

Douglas Allchin
University of Minnesota

Under Joseph Pitt's new definition of technology, philosophy counts as a technology: a tool for making sense of things (Pitt, 2000, pp. ix, 11, 30). He also views technology assessment as essential (p. 15). Here, then, honoring the spirit of Pitt's comments, I assess his own philosophy as a technology. I first consider the work his conceptualization of technology is designed to do, then address his effort to craft an epistemology of technology. Finally, I comment on the dynamics of social discourse that shape much of Pitt's discussion.

CONCEPTUALIZING TECHNOLOGY

What is the problem? Human problems, Pitt tells us, fundamentally drive all technology (p. 14). What irks Pitt and seems to motivate this book is soapbox condemnation of technology, what he refers to as social criticism (pp. vii, 70-82). In developing his negative thesis, Pitt targets two problems with thinking about technology.

The first is ideology. Politics, he opines, is trying to masquerade as philosophical argument (chapter 5). It is "quasi-pathological" (p. 77). Pitt's position is, crudely: bias in, bias out. And his ultimate complaint seems to be: how can you deal with such critics? How can you reach joint decisions about technology (pp. 75, 78)? Where is the reasoned discourse that informs true philosophy? Pitt wants epistemological and metaphysical suppositions made clear (p. 71). How like a pragmatist (p. xii).

As a careful philosopher, of course, Pitt distinguishes between ideology and conceptual schemes. Both reflect standpoints or conceptual orientations. The inherent bias does not itself pose problems. Even science, Pitt's standard for rationality, is theory-bound (p. 33). But a Marxist perspective, for example, may either inform or blame (p. 53). The difference is intent and use. We must decouple the morals from the concepts. Thus, at one level, Pitt merely makes an appeal for respecting the normative/descriptive distinction (p. 84) and, equally, for responsible use of reliable science in description. Worthy caveats.

At another level, Pitt seems to want to banish politics from discussions of technology altogether. Technology is neutral, he echoes repeatedly (pp. 72, 79, 82, 99). That may be true of a baseball bat: it may help celebrate athletic prowess, or it may express racial hatred. The bat itself, of course, is indifferent. What matters, Pitt would remind us, is the human agent. The human theme is important, and I return to it below. However, it is disingenuous nowadays to claim that Long Island parkway bridges (Winner 1986, pp. 22-23), as well as buildings with no handicapped access, right-handed scissors, or safety bicycles inconsistent with women's dresses (Bijker 1995), are not rife with political overtones, whether intentional or not. These permeate the instantiated meaning of many technologies. We may want to change them, surely. It may take human action to change them. But the very artifacts bias who can and cannot use them. One needs to be able to describe and assess these biases rationally; no advanced technical analysis is required. Similarly, a gun with a two-part trigger that requires adult-sized hands and mature manual coordination is inherently more resistant to use by children. "Guns don't kill people; people kill people." Still, it is harder for a child to kill with a "child-proofed" gun. Such a technology, one might argue, is rationally designed to shape who can use it. The technological artifact is not philosophically idle. One may wonder, then, why Pitt seems to peripheralize social concerns or make them secondary. Let us hope this is not some philosophical gerrymandering. Ultimately, we should welcome Pitt's appeal for discourse with explicit reasons and reliable knowledge. We should welcome an appeal for discourse itself. But not the preemptive boundaries that limit what we can consider relevant or primary.

The second element of soapbox criticism that Pitt finds problematic is the monumental notion of Technology, with a capital T. Technology reified (pp. 66, 69, 87). As in "Technology is taking over our lives!" (p. 71), or "Technology is to blame for the current crisis!" (or, presumably also, "Technology will save us!"). Those who anthropomorphize technology as something with intent, Pitt despairs, can easily succumb to "intellectual hysteria" (p. 88). This motivates, I think, Pitt's provocative definition of technology: "humanity at work" (pp. xi, 11). It could be an e-mail address: humanity@work. Pitt rejects the vernacular notion that technology is mechanical stuff, epitomized by physical artifacts such as nuclear reactors, cellular phones or stone tools. Rather, technology involves "the deliberate design and manufacture of the means to manipulate the environment to meet humanity's changing needs and goals" (pp. 30-31). Ostensibly agreeing with the social constructivists whom he repudiates elsewhere, Pitt views technology as a cultural activity. And he spares no effort reminding us frequently about human agency. And human purpose. Technology, perhaps, is our lived experience? Echoes of John Dewey. No wonder, then, that Pitt shudders at any suggestion that technology is autonomous (chapter 6).

Pitt develops the humanist theme to some effect. Technology is not just the canonical, or stereotypical, machines and tools. We knew that. Software, as well as hardware. But Pitt extends the concept even further to social institutions (pp. 13 note 1, 44). A legal and judicial system, for example, is also a tool: a tool for developing and administering justice in a culture. Science is a tool, presumably for developing reliable knowledge. It is thus important to recognize that science has an infrastructure of instruments, as well as institutions that support its purpose and shape its history (pp. 8-9, 51-65, chapter 8): Galileo's telescope, the Medici court, the Hubble telescope, rockets, NASA and its contractors. The expanded concept of technology is potentially liberating. It includes (if I have this right) governments, libraries, schools, recycling programs, city parks, vegetable gardens, refugee support networks, as well as the production of blue jeans, cups of coffee and Big Macs. This is exciting. It opens new vistas for thinking about technology. At the same time, one wonders if one can sustain fresh insights across such wide-ranging analysis. Will the value get lost in the generality? Is this just a plea for a self-reflective society, for philosophy in our daily lives? Or does the model offer tools of analysis in new domains? I hope that Pitt will elaborate in future contributions.

Thematically, then, Pitt reweds technology with humans and their daily lives. One senses that Pitt does so to challenge each individual to share responsibility in our collective technological decisions, in how we manufacture society itself (pp. 114-15). Pitt's model for technology (pp. 13-15), admittedly abstract and idealized, is intended to capture this human element. From some human need, or problem, we engage in a first-order deliberation, or design (pp. 35-38). Next comes more concrete, second-order transformation: the traditional machines, etc. But Pitt-as-philosopher asks: where is the epistemology? What gives rational foundation to this pursuit? Is it in the planning and the design? Perhaps some. But science is uncertain. Information is incomplete. Technology will inevitably fail. So, we resuscitate ourselves with CPR, the Commonsense Principle of Rationality; namely, "learn from experience" (p. 22). Pitt thus adds a third feature without which he would find technology (rationally) incomplete: ongoing technology assessment. This claim is central and broadly important: technology does not end with the creation of artifacts (see also Tenner 1996). We must monitor our creations. Notice the unexpected. Regulate their use. Redesign as we go. And foremost, we should learn from our mistakes when addressing similar problems in the future. Here, we may appreciate why Pitt derides much social criticism. Rather than localize any error and endeavor to repair it, the critics who annoy Pitt want to remove the artifacts or erase the process of technology completely. For Pitt, the criticism is grossly misplaced. Hence, we should take a lesson from the Hubble telescope (pp. 51-65). What a disaster! Well, no design process is flawless. But here the construction firm missed a key opportunity, perhaps a responsibility, to find and repair the reflection anomalies before the launch. Later, the error was identified and fixed. We did not scrap the enterprise. Pitt's moral for technology: it is no use crying over spilt milk, but somebody ought to clean up the mess. Rationality and human purpose are welded together in the critical process of technology assessment.

Pitt aims his criticism at technophobes and other critics of technology. But it applies symmetrically to sci-fi romantics and those awed by technology. Technophiles who blindly anticipate only progress, without considering lessons from the past, err as well. They, too, fail to learn from experience. They, too, violate the commonsense principle of rationality. Vigilance should be integral to technology.

Yes, we should learn from our mistakes. Still, most of us would rather not learn from such episodes as Minamata (Allchin 1999b), Chernobyl, Bhopal or the Valdez oil spill. If we indeed learn from history, we should acknowledge that we frequently encounter a tragic gap between engaging a technology and later remedying its unforeseen consequences (Tenner 1996). In the tradition of Greek drama, such tragedies should surely sensitize us to technological hubris. Beware technological ambition. Approach new inventions humbly. One might well interpret many social criticisms this way: where Technology with a capital T embodies our cumulative experience with particular technologies (with a lower-case t), history, perhaps, teaches us caution, sometimes through painful experience: adopt new technologies, especially large-scale or complex ones, carefully, thoughtfully, respectfully. In cases of uncertainty, we may learn to adopt the Precautionary Principle, or design from worst-case scenarios rather than the most likely ones (Shrader-Frechette 1991). Critics of Technology may thus have a rational argument. Appreciating it, though, may involve listening sympathetically to their fears and reconstructing their implicit reasons (pp. 99, 110; see also below).

Pitt's model of technology can seem frustratingly idealized and oversimplified. It can, nonetheless, help us identify where Pitt's own position is flawed. Consider his model as a generalized design for humanity at work. We educate and organize engineers to help us design and, by our own choosing, incorporate consumers and citizens in that process, or not. We choose the materials and make the tools, and establish the working conditions under which Nike shoes, oriental rugs, etc., are made. We assess the outcome and adjust accordingly downstream. Ultimately, then, we instantiate the model. But who is "we"? This is perhaps the first question in moving from the ideal norm to the pragmatic concrete. The process will undoubtedly be shaped by the persons participating in it. Who participates? Technology involves resources. Who controls those resources and how will they invest them? Especially, who adjudicates the technology and has the power to effect changes? What induces anyone to be deliberate or rational or democratic about this process in the real world? The very process of technology, as humanity at work, is politicized. So the social critics of technology are not mistaken, after all. Pitt's model, and ultimately any philosophy of technology that peripheralizes or eclipses politics, is incomplete and dangerously misleading if we want (p. ix, in the spirit of Wilfrid Sellars) to understand "how things . . . hang together." Though the utopian model (p. x) might clarify thinking at one level, it needs to be supplemented, paradoxically perhaps, by social criticism. One simply asks, "Whose technology?" (Harding 1991). Pragmatically speaking, what social structures do we need to instantiate the normative model? Pitt needs a politics of rational discourse (pp. 19-20; Longino 1990, Hull 1988, Allchin 1999a).

Ultimately, then, Pitt's argument against social critics of technology, while identifying important principles, is overdrawn. Where he does succeed, Pitt profiles deficits in rational discourse, rather than in how such critics think about technology (see last section below). Along the way we learn to conceptualize technologies expansively and pragmatically: as humanity at work, ever assessing its course.

AN EPISTEMOLOGY OF TECHNOLOGY

So much for Pitt's negative thesis. He also has a noteworthy positive agenda. Believing that we should decide about technologies rationally, he aims to articulate an epistemology of technology (pp. vii-viii, chapters 1, 3, 4). It concerns, first and foremost, questions about what we can know about a specific technology and its effects, and in what that knowledge consists (p. xiii).

Alternatively, what grounds the reliability of an assessment at any stage of technology? What substantiates the thinking in "thinking about technology"? Because technologies are conceived so broadly, as humanity at work, this task becomes quite challenging. Pitt's strategy is, broadly, to compare technology with science, a model of successful epistemic practice (pp. xii, 34, 52). Hence, one might profit from the vast literature in philosophy of science, while using it as a scaffolding for asking about philosophy of technology.

Pitt considers numerous counterparts (pp. 25-27): knowledge (pp. 28-38), explanation (pp. 41-51), structure and function of laws and theories (pp. 42-45), change (pp. 38-40, chapters 8, 9) and realism (determinism; chapter 6). Ultimately, he rejects most direct parallels. Technology is different, he contends. It is permeated with uncertainty (pp. 49, 88-90, 120), complexity (pp. 43-45), particular circumstances where multiple laws intersect (pp. 19, 46-49) and, true to his definition of technology, human agency (pp. 51-65, 103). His alternative epistemology centers instead on practical success, rather than truth (pp. xii, 5, 32, 40), and CPR, the commonsense principle of rationality (pp. 22, 50). He retreats to the more modest domain of engineering knowledge (pp. 35-38) and technical explanation (pp. 45-51), where he can comfortably locate humans making decisions (p. 65).

Pitt's project of drawing epistemic lessons from philosophy of science is well conceived. But it is not fully realized. First, Pitt rarely departs from the logical positivism that most philosophers of science abandoned long ago as woefully incomplete. This fetters him in an abstract, theoretical world at odds with his conception of technology as work. However, science is work, too. This theme has been richly explored by the New Experimentalists (Hacking, Franklin, Ackermann, Galison, among others) and various ethnographers and anthropologists of science (Lynch, Knorr-Cetina, Kohler, etc.), especially those who have struggled with the problem of agency and evidence (Latour, Pickering). Their discussions of laboratory practice, craft skill, the epistemology of instruments, the differentiation of fact and artifact, and the primacy of "effects" over explanation all offer fruitful comparisons. Pitt should surely find their focus on efficacy, rather than truth, congenial to his pragmatic view of technology. While briefly acknowledging this work (pp. 6-7, 125), Pitt seems not to capitalize on its many potential insights. Perhaps others will find inspiration here.

In addition, Pitt's horizon seems limited by cases from physics and by the Newtonian paradigm closely allied with logical positivism. Venturing into philosophy of biology, for example, one finds extensive discussion of function and adaptation (e.g., Sober 1984), apt topics for considering how technological solutions "fit" given problems. There are reflections on organisms and minds as complex systems. Naturalized epistemology, too, especially using evolutionary models (e.g., Bradie 1986, Callebaut 1993), suggests how one might address justification and historical change where context is important. Pitt thus frames an appropriate program for "thinking about" technological epistemology, but leaves it for others to complete.

One should note, in particular, that Pitt lists two elements of science for which he finds no counterparts in technology: justification and evidence (pp. 26-27, 40). How fascinating, given that Pitt conceives technology "as primarily an epistemic activity" (p. 103). How, then, does one justify a technological "argument" or explanation? What constitutes evidence for assessing solutions to human problems? Namely, what governs reliability in technology? Pitt tries to finesse the problem by admitting that technology errs. But science errs, too (Allchin 2000a, Darden 1998). This is no excuse for abandoning principles of reliability (Petroski 1994, Allchin 2000b). It does not obviate the need for evidence in CPR. We should expect an epistemology of technology, foremost, to set norms for evidence in justifying the adoption, rejection, development or revision of a given technology. Else we can jettison rationality. Here, Pitt's account is wanting.

Pitt ultimately demurs from the consequences of his own thinking. Namely, if one construes technology as humanity at work, then the problems are intimately human, not just physical. The engineering specifications, in the broad sense, will include values and judgments about "humanity's changing needs and goals," if indeed we ever agree how to frame the problem at the outset. The relevant knowledge, as Pitt himself acknowledges all too briefly (pp. 43-45), will come from the social sciences as much as from conventional engineering texts. It will involve ethical analysis, if justice, equal opportunity, and personal autonomy are goals towards which humanity works (pp. 82-86). (I assume, along with moral philosophers, that moral questions and principles can indeed be addressed apart from parochial ideological interests, and that ethical principles can be objective.) Mere technical explanations in the conventional sense (pp. 45-51) can hardly resolve these issues. Rather, justifying technology involves values. And so our epistemology returns, once again, to the social discourse where we articulate our values and define technological problems.

FRAMING A TECHNOLOGY OF RATIONAL DISCOURSE

Pitt proposes his model as a technology, or tool, for thinking about technology. Philosophy as engineering. Thus, his book works towards empowering readers to analyze and thereby shape the technology of their lives (pp. 101, 106-107, 137). I suggest, however, that if we want someone to understand technology fully, its human dimension, complexities and multiple meanings, we need to show them the ontology of technology in their daily lives. For example, blue jeans (Allchin 1997). Who thinks about growing the cotton, along with the effects of the pesticides and fertilizers? Who considers stone-washing the fabric and the water it pollutes? Who considers fueling the tractors, the gins, the sewing machines and the trucks that move everything all around, and the implications for global climate change? Or the oil spills related to producing the gasoline? Who thinks about the laborers, their working conditions or wages? Knowing all that, who thinks through the alternatives? Most persons do not know how to think about technology, I fear, because they are blinded to it. They need to learn how to see. Ultimately, I think Pitt, too, endorses this solution: asking questions about how technological systems are constructed, their histories and how best to assess their merits (p. 81). But this means we need, not a new model of technology, but simply better education and more good case studies. Clocks (Dohrn-van Rossum 1996). Lightbulbs (Bijker 1995). Auto paint (Allchin 1995). When well informed, I think, individuals already have the tools to make their own assessments. Thinking pragmatically, perhaps, we may find out.

I fret far more about the technology for merely "thinking about." Or, more precisely, for "thinking about" at the collective level. Pitt highlights humanity at work. That means humans working together. The process does not belong exclusively, as Pitt notes, to feminists or Marxists, Earth-Firsters or capitalists. The enterprise is (or ought to be) communal. The challenge, the technological problem, is how to construct a reliable consensus. Here, I believe, philosophers have a major and underappreciated role. That is, philosophers are the architects of effective discourse. The experts for designing a technology of "thinking about." But strategies of abstract logic for individuals will not suffice. Our problem is situated reasoning at a communal level (pp. 4, 19-20, 29), where perspectives vary. We need an effective technology for reflecting jointly, for building consensus, for rational discourse.

Philosophers currently promote, unfortunately, discourse based on an ineffective militaristic and competitive model. For example, they emphasize arguments designed (that is, engineered) to defeat one's opponents. Use strong evidence and powerful reasoning. Anticipate counterattacks. Buttress your argument. Defend your position. It is either-or, win-lose competition. We engineer our culture using the Super Bowl as a model of conflict resolution (Allchin 1994). We let competition in the marketplace substitute for public debate (pp. 16-17; Sagoff 1988). And it splinters the society. Witness the adversarial roles and intransigent attitudes Pitt reviles among certain critics of technology (p. 76). Time to enlist CPR. I suggest that we need a consilience model of discourse. One based on accommodating diverse principles, values and evidence. Here, the strategy is to win someone's agreement. One earns another's endorsement through appealing to relevant values that both share.

This involves listening more than arguing. Listening, here, involves two components. First, one must understand someone else's reasoning, especially if they disagree. Sympathetic interpretation is not just a charitable act (p. 67). It is a tool for framing one's own proposals effectively. Listening may entail probing and querying: "Let me be sure that I'm interpreting you correctly: did you mean X?" Reasoning should converge, not conquer.

Second, listening involves interpreting the affective subtext and values that motivate the reasoning. These allow one to interpret the reasoning and acceptability of prospective alternatives. (Here, philosophers might learn a bit about the psychology of why or how our brains reason.) This analysis is critical to responding to values. It identifies what matters (locally). (And here I hope I have interpreted and addressed Pitt's values.)

Listening frames the challenge of creative problem-solving. The aim is to accommodate different values. Where there are conflicts, one must invent new solutions. It requires imagination (Johnson 1993). Social discourse itself is an act of engineering. And re-engineering. Sometimes, one must search for deeper shared values and principles on which to base agreement. The discursive framework itself shapes the standards for objective evidence and justification. Again, the goal is to earn approval, not to eclipse the "other." One forges a consensus.

I think this model of consilience through reasoned discourse and creative problem-solving is missing in most philosophy. It is absent from most popularizations of philosophy and philosophy classrooms. So, I suggest, if we want better philosophy of technology, we should begin with better philosophy.

If the problem is indeed rational discourse, as suggested by Pitt's coupled themes of social criticism and epistemology, then the solution involves reshaping the nature of discourse, not blaming the critics. The solution will come, not from a new philosophy for thinking about technology, but from an effective philosophy, or technology, of "thinking about."

REFERENCES

Allchin, Douglas. 1994. "The Super-Bowl and the Ox-Phos Controversy: Winner-Take-All Competition in Philosophy of Science." PSA 1994 1:22-33.

Allchin, Douglas. 1995. "Ford Plant Case Study." SHiPS Resource Center. URL: http://ships.umn.edu/cases/ford/ (May 3, 2000).

Allchin, Douglas. 1997. "The Ecology of Stone-Washed Jeans." SHiPS Teachers' Network News 8(1):3. Also URL: ships.umn.edu/ethics/jeans.htm (May 3, 2000).

Allchin, Douglas. 1999a. "Values in Science: An Educational Perspective." Science & Education 8:1-12.

Allchin, Douglas. 1999b. "The Tragedy and Triumph of Minamata." American Biology Teacher 61(June):413-19.

Allchin, Douglas. 2000a. "To Err Is Science." American Association for the Advancement of Science (Washington, DC, Feb. 2000). URL: www.pclink.com/allchin/papers/2-err.htm (May 3, 2000).

Allchin, Douglas. 2000b. "The Epistemology of Error." Philosophy of Science Association Meeting (Vancouver, Nov. 2000).

Bijker, Wiebe E. 1995. Of Bicycles, Bakelites, and Bulbs. Cambridge, MA: MIT Press.

Bradie, Michael. 1986. "Assessing Evolutionary Epistemology." Biology and Philosophy 1:401-50.

Callebaut, Werner. 1993. Taking the Naturalistic Turn, or How Real Philosophy of Science Is Done. Chicago: University of Chicago Press.

Darden, Lindley. 1998. The Nature of Scientific Inquiry. URL: http://www.inform.umd.edu/EdRes/Colleges/ARHU/Depts/Philosophy/homepage/faculty/LDarden/sciinq/.

Dohrn-van Rossum, Gerhard. 1996. History of the Hour. Chicago: University of Chicago Press.

Harding, Sandra. 1991. Whose Science? Whose Knowledge? Ithaca, NY: Cornell University Press.

Hull, David. 1988. Science as a Process. Chicago: University of Chicago Press.

Johnson, Mark. 1993. Moral Imagination. Chicago: University of Chicago Press.

Longino, Helen. 1990. Science as Social Knowledge. Princeton, NJ: Princeton University Press.

Petroski, Henry. 1994. Design Paradigms. Cambridge: Cambridge University Press.

Pitt, Joseph C. 2000. Thinking About Technology. New York: Seven Bridges Press.

Sagoff, Mark. 1988. The Economy of the Earth. New York: Cambridge University Press.

Shrader-Frechette, Kristin. 1991. Risk and Rationality. Berkeley: University of California Press.

Sober, Elliott, ed. 1984. Conceptual Issues in Evolutionary Biology. Cambridge, MA: MIT Press.

Tenner, Edward. 1996. Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York: Knopf.

Winner, Langdon. 1986. The Whale and the Reactor. Chicago: University of Chicago Press.