Journal of Industrial Teacher Education

Current Editor: Dr. Robert T. Howell  bhowell@fhsu.edu
Volume 37, Number 4
Summer 2000



Cognitive Psychology as the Basis for Technical Instruction? I Don't Think So

Andrew E. Schultz
University of Nebraska–Lincoln

In the Vol. 36, No. 1 issue of the Journal of Industrial Teacher Education, Herschbach (1998) wrote about cognitive psychology research and the value of using its theoretical constructs in technical instruction. Herschbach's central thesis was that the integration of academic and technical skills can be achieved in ways that engage students in the construction, use, and reformulation of knowledge across fields of inquiry.

This response to Herschbach's article takes issue with his assertions about the proposed union of technical education and cognitive psychology, not because this marriage is a bad idea, but, rather, because Herschbach has not fully examined the assumptions underlying the theories of cognitive psychology or the consequences of implementing them in technical education. Once a fuller examination is undertaken, his recommendations can be seen as a mere shift of audience, and not the paradigmatic or revolutionary step forward that he suggests.

Two unexamined assumptions undermine cognitive psychologists' efforts to explain how we think. The first is the assumption of universality: that there is a universal, generalized structure of how people think. If this assumed universal structure exists, it must spring from innate sources; therefore, it must be a product of evolution. The second is the assumption of progress: that educators can teach students how to think better, that there is one "best way" to structure thinking, and that the job of a school is to foster progress toward this "best" way. In other words, advocates of cognitive psychology in education assume a progressive element to an evolutionary adaptation—human thought. These two assumptions are problematic and at odds with one another. This article deals with these problems, assumption by assumption.

Is There a Universal Structure for Human Thought?

Cognitive psychology pursues the idea of a universal structure as if it were a fundamental truth. The normative, experimental methodology of cognitive psychology practically compels researchers to view difference in this unitary fashion, as variance around a normative mean. To conduct this sort of research, the researcher first finds a pool of subjects, frequently undergraduate students, and makes intense efforts to ensure that they are randomly selected. Through random selection, it is assumed, a representative sample can be obtained that will contain the same variance from the norm as the sample's population. But cognitive structure—if such a thing truly exists—is massively and pervasively shaped by experience. To study the innate structure, researchers would need to see that structure without the societal, cultural, and technological shaping that occurs naturally during an individual's development. The empirical pursuit of this structure through random sampling is therefore an impossibility. If sampling were the only problem with this sort of research, perhaps we could infer the innate cognitive structure from studying the adult cognitive structure of individuals with similar life experience. But sampling is not the only problem.

Neurologists make it clear that the human thinking apparatus is haphazardly shaped in the womb, in infancy, in early childhood, in preschool and elementary school, and then in the sturm und drang of adolescence; thus, cognitive structures exhibit an almost infinite range of differences (Gazzaniga, 1998). Any parent who tends a child with fetal alcohol syndrome or a developmentally delayed baby is acutely aware that any misstep in any moment of the prolonged period of development that shapes human thought may impinge upon the thoughts that infant is later capable of bearing. Factors like nutrition, genes, and the washes of hormones or of illicit or ill-advised drugs that flood the womb may alter a fetus's neurological equipment and his or her later cognitive capabilities. Defining a cognitive structure is like aiming at a moving target that is changing shape rapidly and randomly while moving rapidly toward an undefined end. Cognitive structure cannot be captured by the static, one-dimensional model that cognitive psychology is attempting to identify.

Even if we accept the assumption that there is a normative development of the human thinking apparatus, developmental psychologists note fairly broad ranges of achievement within certain stages of development in children and adolescents. What cognitive psychology studies is predominantly the mature, adult thinking apparatus. We in secondary industrial education teach adolescents and young adults. Is their thinking the same as adult thought? Clearly not, and yet Herschbach suggests that these theories are appropriate to that audience.

Another aspect of the development of cognitive processes is what are known as critical periods. During development in humans and other animals, critical periods occur during which certain information must be present and encoded by the developing infant, or the infant will fail to develop certain attributes. For example, the ethologist Konrad Lorenz noted that the imprinting of goslings with their mother's visage is critical (Bleibtreu, 1968). Kittens raised in an experiment that allowed them to see through only one eye during the first months of their lives experienced a change in the neuronal wiring of the visual cortex. The ability to see through only one eye became a permanent condition, even after the patch was removed and a perfectly functioning other eye was available for seeing (Gazzaniga, 1998). Japanese babies are able to babble the "L" sound as infants; if this pronunciation is not reinforced, however, the behavior is extinguished from their language repertoire, and they are subsequently unable to pronounce "L." Donald (1991), reporting on his biological theory of language development, noted that "the language device matures at its own rate, during a definable 'critical period'" (p. 23). Similar studies have shown that people raised without the concept of "square" have great difficulty as adults in attaining that concept or in making sense of portrayals of our rectilinear world. Similarly, nonhearing people who regain their hearing have great difficulty understanding spoken sounds, and nonsighted people who regain their vision have more difficulty interpreting the messages that our eyes send to our brains than do persons who have always been sighted. How can we define a normative cognitive structure when we all very probably have deficits in neurological structure that have resulted from the unknowing pruning of neuronal networks, from the failure to reinforce innate capabilities during our infancies?

Among life scientists, one often hears the notion that "ontogeny recapitulates phylogeny." The crux of this idea is that one can observe in the development of a zygote the progress from single-celled animal through all evolutionary stages, until it arrives at the mature, bilaterally symmetrical, vertebrate structure we see in adult humans. Among evolutionary psychologists, the idea persists that humans recapitulate in thought as well as in physiology. Mithen (1996), for example, articulates a cautious case for cognitive recapitulation. Donald (1991), too, argues that recapitulation is commonly believed to occur.

But even if we accept the developmental aspect of thought, what does the end product of this development look like? Is our brain largely a memory and recognition device, or is it inherently a problem-solving, mathematics-doing, logic-using thinking machine that resembles a computer? Cognitive psychology would have us believe that the human thinking apparatus is more like the latter. By examining the written record of human thought, we can see that there certainly seems to be a penchant for accounting, at least during historic times, but not inherently a penchant for mathematics. It is also clear that although there seems to be a human thirst for explanation of the inexplicable, it is not necessarily manifested in science and hypothesis testing. Certainly there is a compulsion toward communication, but only for oral, symbolic language; writing and reading are not genetically predetermined. And although there seems to be a knack for tinkering and tool use as well, this is not necessarily an innate ability for engineering. If we consider the historical record, we can see that our brains have been preformatted as general apparatuses, which are more likely to be used to make astrological charts or to probe sheep entrails for portents of the future than to construct algebraic matrices. If there is a universal structure, it must be one that accommodates the dross of humanity as well as its sagacity. However, our current notions of what is good, i.e., problem solving, critical thinking, and a capacity for analysis, keep us from perceiving the truth: whatever structure we humans inherently have, it is likely to be very general and not highly specialized. Some few may be able to use their minds for very specialized work, but this is not evidence that it is so for everyone.

Before Gutenberg invented the printing press, our brains were largely devoted to memorization, as they had been for hundreds of thousands or even millions of years (Burke, 1985). Cognitive psychologists tend not to study the variety of memory processes that preliterate peoples exhibited or that exist in the human population now. Is it not more likely that the preliterate mind is a closer reflection of the naturally occurring cognitive structure than the postliterate mind? Neisser (1982) reported that "The primary scientific goal of the study of memory, then, is to understand the underlying architecture of that universal system…[the] three separate memories' [sic] have become the commonplaces of cognitive theory: the sensory register, the short-term store and the long-term store…the existence of some underlying machinery…is rarely called into question" (pp. 377–378). In other words, cognitive psychology has little incentive to study extraordinary memory; it would only undermine the notion of universality. Burke (1985) reported that prior to Gutenberg and the era of mass literacy, professors were required to memorize 100 lines of poetry in one hearing in order to qualify for a doctorate. How was a mind trained to accommodate such memory feats? Why can we not do such a thing now? Is it logical to assume that we are some new kind of creature, some higher order of human? No. Instead, 500 years ago, the technology of books enabled us to begin using our brains in a new fashion, primarily for information manipulation rather than for information storage. The 500 years of technological advance that we have enjoyed as a result are truly astounding. But it is folly to assume that we are a new kind of thinker, that our way of studying thinking is insightful, or that we have suddenly had a miraculous advance in our understanding of the way that people think. Our insights are better understood as the mere reconciliation of the ideas that we have about the way we should think with the temporary realities of the age in which we live.

It is no surprise that Herschbach (1998) stated that "cognitive psychology is the psychology of the computer age," for not only is the technology of our times clearly a mirror of the way we think, but the way we think is also clearly mirrored in our technology. The Germans have long had a saying for this phenomenon: "Die Arbeit macht den Mann, der Mann macht die Arbeit" ("The work makes the man; the man makes the work"). In other words, the Germans have long recognized that technology changes the way that we think and that the way that we think changes our technology. What is surprising, however, is that Herschbach blithely accepts the pervasive influence of computers as profound. This is ethnocentric thinking.

Does Herschbach's (1998) statement truly mean that our thinking apparatus is, and always has been, the way that we think it is now? No; given an evolutionary perspective on thinking about how we think, the reader can see that much current thinking is flawed. Moreover, Herschbach has fallen into an appealing intellectual trap in which, rather like what happens to the boy in the children's book The Neverending Story, the world that he imagines becomes real as he reads it, and yet the pages are blank until he imagines it. In his seminal work, The Mind's Past, Gazzaniga (1998) theorized, and provided evidence for, a left-brain "interpreter" that invents post hoc, quasi-rational explanations for behaviors that are controlled by the unconscious right brain. This provocative thesis calls into question the validity of much of what cognitive psychology proffers.

The reader should ask, "Why does an analogy of our thinking apparatus to an information-processing model work now, when it never did before? Did the reasoning of Socrates, Aristotle, St. Augustine, Shakespeare, or even B. F. Skinner resemble information processing?" One would suspect that it did not, or else they would have mentioned that it did—that it had attributes that we now characterize as information processing. Boorstin (1998), for example, suggests that Plato's Dialogues, which reflect Socratic reasoning, are a written account of thinking before literacy; in other words, they are a reflection of oral reasoning. The answer to the questions posed at the beginning of this paragraph is that our technology for thought is evolving, and that Socrates, Aristotle, St. Augustine, Shakespeare, and B. F. Skinner each may represent different stages in that evolution, which is occurring outside of the body. It is not our general-purpose thinking apparatus that is evolving; the neurological equipment of modern man does not differ from that of his Paleolithic counterpart. Technology, therefore, should be considered an evolution that occurs outside of the body.

When we examine thought from a cognitive science perspective, we are inventing cognitive science; it is a reflection of the cognitive scientist's manner of thinking. When the researcher is a neurologist, the physiological mechanisms of thought are examined. When behaviorists examine behavior as the progenitor of thought, overt behavior is examined. When computer scientists try to enable a computer to mimic human thought, they are using an analogy that makes sense to them. A Gestaltist's thinking about thought logically follows the manner in which that scientist thinks. The same could be said for Socrates, Plato, and St. Augustine; Chaucer and Bede; or Dante, Milton, and Shakespeare.

Cognitive science, and virtually all attempts to understand human thought, are inevitably culture-bound and flawed. They are a reflection of our time and our technology. That we want to constrain a truly unique and infinitely malleable feature, thought, to a single model is ultimately a dim idea, doomed to failure. It is the antithesis of intelligence.

Is Human Thought Progressive?

The notion of progress is a troubling one for educators and cognitive psychologists alike. As teachers we want to improve our pedagogy. We want to improve our students' level of achievement and understanding. We want to make our educational efforts consistent with the most up-to-date research. On the surface, evolutionary progress is intuitively appealing. As Deacon (1997) noted, "progress seems to be implicit in natural selection" (p. 29). On the other hand, most recognize that there is something inherently zany about the idea of improving the way that people, as a species, think. In his weekly radio show on National Public Radio, Garrison Keillor alludes to this zaniness when he notes that his mythical Lake Wobegon is the town "where all the women are strong, all the men are good-looking, and all the children are above average." In other words, more intelligent people survive and reproduce in Lake Wobegon, but not elsewhere; otherwise, the children of Lake Wobegon would still be just average.

Given the work of Darwin (1859), it is clear that the idea of progress is antithetical to evolution. As Rudgley (1999) stated, "much of what is called progress is simply the result of the accumulation of knowledge" (p. 7). Evolution is a random process whereby chance genetic variation allows success or failure in varying environments. As Gazzaniga (1998) put it, "the brain exists to make better decisions about how to enhance reproductive success" (p. 5). But the variance that exists in the brain is not progressive. There is no evolution toward becoming super beings or super thinkers. Indeed, there is little evidence that intelligence or thinking skills are anything but an anomaly in the record of evolutionary adaptation. In no other species save Homo sapiens is there any inkling that intelligence or thinking or language or tool use is of any significance in terms of the survival of the species. No birds are evolving toward becoming super birds, no lizards are becoming literate, no squirrels will win Nobel prizes. And in the same fashion, it is a misconception to assume that we will become smarter as a species because it is our destiny, our genetic heritage, to do so.

Evolution, then, does not work the way we would like to think it works. If our minds are evolutionary adaptations, they certainly are not evolving in the direction which we assume and want, unless the forces of evolution have been suspended for human beings. Darwin (1859) would strongly suggest that this is not the case.

To consider evolution from another perspective, let's look at gardening. I love to garden and I save seeds. Occasionally I accumulate a bunch of partially filled packages of seeds, all three to five years old, of, say, tomatoes. In my cruel, heartless, cavalier fashion, I'll broadcast all of these leftover seeds in some sunny spot in my garden. It is fascinating to watch the results. Most of the seeds germinate in five to eight days. Of these, some are spindly, some are deformed, some are leggy, and some are perfect, robust, healthy, vibrant, vigorous tomato plants. What's more remarkable, however, is that over the course of the spring and summer, odd tomato plants keep cropping up. I deduce from these observations that there is some significant survival mechanism at work—that there is no one ideal time for germination, because in some years the hot weather doesn't come until later.

That variance is at work in our thinkers too. In every class I teach, I find noteworthy variety in thinking. My job as teacher, however, is not to weed the classroom, leaving only those who could benefit from one manner of cultivation; rather, it is to maximize the learning of each student.

Conclusion

In sum, although Herschbach (1998) did a fine job of recapitulating the research of cognitive psychologists over the last half century, when he prescribed incorporating these theoretical results into educational practice, he stepped over the line from skeptical researcher to enthused cheerleader.

Herschbach ignored too much in his article to warrant the action he so enthusiastically advanced. Gazzaniga (1998) stated flatly, "The abilities to learn and think come with the brain" (p. 59). Neisser (1982), with equal aplomb, asserted that there is much more variance in memory than can be accommodated by a universal model of memory (p. 379). Herschbach's assumptions trample over the debate over whether we are vocational or general educators, over whether group instruction is just a practical educational reality or the bane of education, and, the perennial bugaboo, over what we teach and why. To go, without prolonged and significant deep thinking, from learning theory to advancing notions about how to teach is an adventure in folly that teachers no longer wish to endure. We have made many a stumble in the last 20 years. Indeed, our pattern of educational effort looks more like the half-hearted efforts of channel surfers than of concerned, committed educational policy-makers.

This response to Herschbach's article does not deny the merits of cognitive psychology. It has many merits and makes some limited sense for this, our limited time. With the passage of time, however, cognitive psychology will prove no more successful than the idea of the geocentric universe. Therefore, we should be exceedingly careful of reformatting, yet again, the manner in which teachers teach.

Is cognitive psychology the panacea for technical education that Herschbach suggests? Probably not. Will it work for some students? Certainly. It may even prove to be the ideal vehicle for technical teachers to use for some significant number of their students.

Herschbach (1998) reported that "Cognitive psychology asks 'How can the most meaningful learning take place?' rather than 'What is the most important content to learn?'" (p. 57). He stated that cognitive psychology has shifted the debate from the teaching of specific content to how learning itself occurs and how instruction can best be applied. Does it really? It is not difficult to think of situations where competence is of more concern than meaning. In choosing a heart surgeon or a brake mechanic, for example, the patient or the driver, respectively, is more comforted by mastery of the content.

Herschbach (1998) concluded his article with an assertion that should never be uttered. As a fellow dispassionate seeker of truth, this writer shudders for him. He stated: "If technical educators are going to become a part of what is probably the most revolutionary educational development in our time, their work, too, is going to have to be informed by a cognitive perspective which places primary emphasis on thinking and making meaning" (p. 57). Is cognitive psychology's take on how thinking happens really revolutionary? Probably not, and I cannot concur in Herschbach's analysis.

Just as I play at god in my garden, we often play at a god–like understanding as educators, and we make our society, schools, and culture comfortable or uncomfortable for certain portions of the population. We do so with the best possible intent, but largely unthinkingly. As we educators learn about cognitive psychology, which has indeed revealed some remarkable aspects of how our most attractive thinkers think, we need to consider its implications for our disciplines.

What are some of the notable byproducts of modern thought? An industrialized society, a sheep-like consumer mentality, an avid materialism, and a self-confident quasi-rationality. Do we really want to continue in this direction?

This is not the writing of some Luddite rejecting the benefits of technology, the methodology of cognitive psychology, or our material culture. On the other hand, one should be wary of those who "know," who profess to understand, or who shape our schools to foster those temporary, culturally and technologically bound truths of the moment. The monoculture of schools weakens us and makes us terribly vulnerable to the vagaries of evolutionary forces. These forces are gathering strength on the horizons of the next century. Collectively, we wealthy industrialized Westerners are dancing an economic danse macabre, as did the unfortunates of plague-riddled medieval times or the cabaret crew in Germany between the wars. Our lifestyles are unsustainable and terribly precarious, as Burke (1978) so eloquently illustrated. In a recent Chronicle of Higher Education article, Rees (1999) notes, "renewable resource scarcities of the next 50 years will probably occur with a speed, complexity, and magnitude unprecedented in history" (p. B5).

As we reach the end of our 1000-year era, this industrialized, technology-driven society, which was spawned in late medieval scholasticism and came of age in the Renaissance, must either change dramatically or perish. The inevitable consequences of overconsumption and overpopulation are dramatically before us. Only the masses of ostrich-like demagogues entrenched in positions of power refuse to see the disaster lying ahead.

Nowhere must this change be more dramatic than in our schools. Instead of narrow training or narrow educating, we must broaden our curriculum and methodology to foster self-reliance, self-sufficiency, and a more parsimonious nature. We must nurture real creativity and individualism. We are at the point where we must sacrifice our individual self-interest to save our nation and world. One such sacrifice is to use our enormous capital to fund an educational system that fosters the thrifty, creative, adaptive impulses that will enable our children to survive in a world of want, a world that is just over the horizon for us as Americans and already a reality in much of the nonindustrialized world.

Author

Schultz is an Assistant Professor in the Center for Curriculum and Instruction at the University of Nebraska–Lincoln.

References

Bleibtreu, J. N. (1968). The parable of the beast. New York: Macmillan.

Boorstin, D. J. (1998). The seekers: The story of man's continuing quest to understand his world. New York: Random House.

Burke, J. (1978). Connections: An alternate view of change. Boston: Little, Brown.

Burke, J. (1985). The day the universe changed. Boston: Little, Brown.

Darwin, C. (1859). On the origin of species by means of natural selection. London: John Murray.

Deacon, T. W. (1997). The symbolic species: The co–evolution of language and the brain. New York: W. W. Norton.

Donald, M. (1991). Origins of the modern mind: Three stages in the evolution of culture and cognition. Cambridge, MA: Harvard University Press.

Gazzaniga, M. S. (1998). The mind's past. Berkeley: University of California Press.

Herschbach, D. R. (1998). Reconstructing technical education. Journal of Industrial Teacher Education, 36 (1), 36–61.

Mithen, S. (1996). The prehistory of the mind. New York: Thames & Hudson.

Neisser, U. (Ed.). (1982). Memory observed. San Francisco: W. H. Freeman.

Rees, W. E. (1999). Life in the lap of luxury as ecosystems collapse. The Chronicle of Higher Education, XLV (47), B4–B5.

Rudgley, R. (1999). The lost civilizations of the stone age. New York: The Free Press.

