Nanomachine: One Word for Three Different Paradigms
Bernadette Bensaude-Vincent and Xavier Guchet
Centre d’histoire et de philosophie des sciences
Université Paris
Abstract
Scientists and engineers who extensively use the term “nanomachine” are not always aware of the philosophical implications of this term. The purpose of this paper is to clarify the concept of nanomachine through a distinction between three major paradigms of the machine. After a brief presentation of two well-known paradigms - Cartesian mechanistic machines and Von Neumann’s complex and uncontrolled machines - we will argue that Drexler’s model was mainly Cartesian. But what about the model of his critics? We propose a third model - Gilbert Simondon’s notion of concrete machines - which seems more appropriate for understanding nanomachines than the notion of “soft machines”. Finally, we review a few strategies currently used to design nanomachines, in an effort to determine which paradigm they belong to.
Key words: complexity, concretization, machine, nanotechnology, philosophy
1. Introduction
The convergence of nanotechnology, biotechnology, information technology and cognitive sciences, officially encouraged by the NSF under the label NBIC since 2002, was prepared by a number of multidisciplinary collaborations. Among them, the 1997 Albany Conference on “Biomolecular motors and nanomachines”, aimed at exchanging information and ideas among physicists, chemists and biologists, suggests that the meeting point was the notion of machine. Five years later the convergence between nano-engineering and molecular biology materialized in the form of electronic circuitry using a living bacterium. 1 Engineering and hybridizing inorganic and organic materials to design functional structures is now one of the most promising technological routes, one that will presumably produce common artefacts in the next few decades. Whatever the potential of such hybrid artefacts, it is in their linguistic practices that nanotechnologies and biotechnologies are presently converging. The metaphor of the machine is undoubtedly the pivot of their convergence.
On the one hand, in the biology community the machine metaphor has superseded all alternative metaphors, such as the image of the cell as a society. Cells’ molecular components are described as tools or machines operating at the macromolecular level: ribosomes are assembly lines, myosins are motors, polymerases are copy machines, proteases and proteasomes are bulldozers, membranes are electric fences, and so on. Although biologists generally agree that living systems are the product of evolution rather than of design, they describe them as devices designed for specific tasks. Descriptions of organisms and cells as little factories are not entirely novel; such metaphors were occasionally used for teaching or popularizing purposes. But following the introduction of the genetic code in the early days of molecular biology, these metaphors became more than expository devices. Now the machine seems to be a heuristic model, guiding the interpretation of experiments. Even though a number of biologists concede that the model is not to be taken literally and that the notion of program is just a cliché, 2 they use the metaphor as a convenient language, providing clues about the inner functioning of living systems.
On the other hand, nanotechnology can be seen as the outcome of the new approach to nature initiated and developed by Materials Science and Engineering since the 1960s, with its core notion of “design”. Materials, unlike matter, are “for something”: their structure has been processed to perform a specific task. This functional approach reconfigured the intellectual space by merging science and engineering. 3 It has also affected the language of chemists and materials scientists, who adopted the terms “devices”, “motors” and occasionally “machines” because they are concerned with the design of functional structures. 4 In looking for multi-functional and efficient materials they frequently take their inspiration from nature: spider silk, abalone shell, or lotus leaves provide engineers with model materials that they seek to mimic in their own ways and with their own tools. Some of them describe nature as an “insuperable engineer” and use such phrases as “nanosciences aim at investigating … how matter self-industrializes”. 5
The convergence of nanoscience and biology is nurtured by the shared assumption that nature works as human beings do: all its operations are supposed to be based on “devices” designed to achieve specific functions, even though scientists and engineers are unable to ascribe a definite function to each part of each “natural device”. This is not a trivial assumption. It is striking, however, that the users of such metaphors take little care to refine their underlying assumptions and are content with a rather vague notion of machine. They use the terms “machine”, “machinery” and “device” more or less interchangeably. As the machine metaphor spreads to molecules, proteins, cells …, the concept loses in comprehension what it gains in extension. Yet we know that linguistic practices matter: metaphors are not neutral, and they have an impact on technological choices. 6
This paper is an attempt at clarifying the notions of machine used by nanoscientists in various contexts and at outlining the philosophical assumptions underlying such linguistic uses. What do nanoscientists mean by molecular motor or molecular machinery? Is it just a convenient metaphor or is it a heuristic model for understanding how nature works? And what kind of machine do they have in mind: a classical mechanical system such as Cartesian automata, or something like complex systems “made up of many elements interacting in nonlinear ways”, with unpredictable and spontaneous behaviors (the so-called “emergent properties”)? 7 This alternative deserves particular attention because of the controversial issue at stake. Part of the concern about NBIC is related to the possibility of making molecular machines that would be out of control because of their capabilities for self-organization, self-reparation and self-replication. The latter prompted the famous grey goo scenario - the putative result of the action of replicators breeding out of control. The relations between complexity and uncertainty about the future have been emphasized in particular by Jean-Pierre Dupuy. 8 He argues that by achieving complexity, converging technologists are doomed to behave as sorcerer’s apprentices, or at least to commit technological practices to an era of non-control. It is therefore important to examine closely what kind of nanomachines are being described and designed. Are they classical machines shrunk to the scale of atoms and molecules, or are they complex systems that would gradually acquire the capacity to escape the control of their creators? In other words, how will nanomachines affect our relation to the material world?
After a description of the Cartesian paradigm of mechanistic machines and of Von Neumann’s paradigm of complex and uncontrolled machines, we will argue that Drexler’s model was mainly Cartesian. In order to understand the model of his critics we propose a third model - Gilbert Simondon’s notion of concrete machines. We will then review a few strategies currently used to design nanomachines, in an effort to determine which paradigm they belong to.
2. Preliminary definitions
In this paper we take the terms nanoscience and nanotechnology in their broad and common uses, as “the study of phenomena and manipulation of materials at atomic and macromolecular scales, where properties differ significantly from those at the larger scale”. 9 This definition, retaining two aspects - the length scale and the emergence and exploitation of size-sensitive properties - embraces the crafting of artefacts “atom by atom” or by manipulating a single molecule. It also includes certain aspects of materials science, supramolecular chemistry and bioengineering, fields that antedated the emergence of nanoscience and have extended their scope to the nanoscale. In this broad perspective, the core project of nanotechnologies is to take advantage of the properties emerging at the nanometer scale and to turn nanostructures into functional materials. 10 Science and technology are thus tightly interwoven: making nanomachines and knowing how atoms and molecules behave are indistinguishable programs.
While we choose to adopt a loose notion of nanotechnology, we need more precision for the notion of machine. The standard definition of nanomachine (also called nanite) as “a mechanical or electromechanical device whose dimensions are measured in nanometers” is too loose for our analysis. 11 Let us start with more refined definitions.
The term “device”, coming from the French devis, itself forged on the Latin verb dividere (to divide), does not imply parts. 12 A device is “a thing made for a particular purpose, especially a mechanical or electronic contrivance”. Like machines, devices are made on purpose, to the point that purpose is the only idea retained in the second meaning listed in the OED: “a plan, a scheme or trick”. But even when a device involves various operations, there is no effort at creating a sequence generating one movement after the other.
The term “machine”, coming from the Greek mekhos, which gave mekhanê, retains a connotation of trickery: it means contrivance, something ingenious and even cunning. In medieval times it was associated with forgery. According to Hugh of Saint-Victor, the term machina derived from moicheia (adultery): the machine feigns to perform a natural work, just as the adulterer feigns and pretends to be a husband. 13 Machines and alchemical operations were both considered as hubris, as illegitimate attempts at overtaking nature and challenging God’s creation. The current OED definition includes two meanings. The first one - “i) an apparatus using mechanical power and having several parts for performing a particular task” - emphasizes that machines have a finality: they are meant for a specific function. The second one - “ii) an efficient and well-organized group of powerful people” - is close to the French term “machination”, meaning a stratagem or conspiracy. In both cases, a machine necessarily requires multiple components. In Engines of Creation, chapter 1, Eric Drexler quoted the definition provided by The American Heritage Dictionary of the English Language: “Any system, usually of rigid bodies, formed and connected to alter, transmit, and direct applied forces in a predetermined manner to accomplish a specific objective, such as the performance of useful work.” 14 Three aspects are noticeable in this definition: i) a machine is made on purpose out of rigid or stable components; ii) a machine is something which converts energy and transfers forces in a specific direction; iii) a machine is meant to produce work, to perform useful tasks. All machines, whether they are simple machines like levers, combustion engines, or information machines, fulfill at least these three requirements. Nanomachines will have to do the same if they are to qualify as machines.
3. Cartesian and complex machines
Within this notion of machine two paradigms have been distinguished. The earlier paradigm, which stabilized in the seventeenth century, was modeled on the mechanical automata described by Descartes and materialized by artists such as Jacques de Vaucanson, among others. The more recent one is the paradigm of complexity, supported in particular by John Von Neumann at the Hixon Symposium (Caltech) in September 1948.
A Cartesian automaton - made of pumps, gears and levers - is a multi-component machine designed to produce movements. It can be divided up into parts, like the difficulties in the first rule of Descartes’s Discourse on Method. To its designer, a Cartesian machine is transparent, perfectly understandable and predictable, without mystery. The designer (clockmaker or engineer) has full control over his machine because he has designed each component in all its details. The Cartesian machine is partes extra partes: each part, being independent, has to be assembled with the others (wired, clipped or welded). Each individual component is ascribed a definite function, which is its raison d’être. The parts are independent but they have no individuality: they contribute to the whole but the whole does not maintain them. 15 Each part is necessary, none is sufficient. Each one is made on purpose to fit into the system and has to be adjusted to the others. A machine is exquisitely functionalized in all its details. As the French philosopher Georges Canguilhem argued, a machine is much more teleological than living organisms. 16
Von Neumann’s General and Logical Theory of Automata 17 was developed as an alternative to the model of the central nervous system shaped by the cyberneticians Warren McCulloch and Walter Pitts. They had described the brain as a computing machine, a communication network of elementary arithmetical calculators (neurons), each computing a function of its antecedents. This machine would work provided the neurons were activated by stimuli beyond a critical point. Von Neumann emphasized that it was still possible to describe the behavior of McCulloch’s logical machine in a finite number of words: the structure of the machine was much more complicated than the model describing its behavior. But what about automata whose behavior is so complicated that it is impossible to characterize it fully in a finite number of words? In that case, Von Neumann argued, it would be simpler to describe their structure. The best model of the automaton would be the automaton itself. This is a complex machine. Instead of designing a structure to perform a predefined task (the function determining the structure), you have to build the structure in order to know what it is capable of.
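Von Neumann’s point about description length can be made vivid with a toy example of our own (a later and far simpler cousin of the automata he actually discussed, offered here purely as an illustration): an elementary cellular automaton whose rule table takes one line to state, while its long-run behavior resists any comparably short summary.

```python
# A minimal sketch, ours rather than Von Neumann's: Rule 110, an elementary
# cellular automaton. The structure (an 8-entry lookup table) is trivial to
# describe in "a finite number of words"; the emergent behavior is intricate
# enough that the most practical model of the automaton is the automaton
# itself, run step by step.

RULE = 110  # local update rule, encoded as an 8-bit lookup table

def step(cells):
    """Apply the rule once to a row of 0/1 cells (periodic boundary)."""
    n = len(cells)
    return [(RULE >> (4 * cells[i - 1] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * 60 + [1]  # start from a single live cell
for _ in range(30):   # print 30 generations; order and irregularity coexist
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Nothing in this sketch is self-replicating, of course; it only illustrates the epistemic asymmetry Von Neumann stressed between describing a structure and predicting its behavior.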
The contrast between Cartesian machines and complex machines also concerns the part/whole relationship. Cartesian machines are artificial totalities, i.e., the parts exist prior to the whole and the whole is nothing but the sum of its components. Cartesian machines, just like McCulloch’s computing machine, are devices transforming inputs into outputs. By contrast, complex machines are close to natural totalities. Unlike aggregates, whose unity is accidental, they are made up of various elements interacting in loosely determined ways, resulting in non-linear effects. Complex automata are autonomous, self-organized totalities made up of several integrated levels with a hierarchy of structures. From the interaction between the elements, a spontaneous and collective order emerges. The properties of the machine are novel and non-deducible from the properties of the elements. In return, the emergent order imposes constraints on elementary interactions. “The whole and its elements therefore mutually determine each other.” 18 This codetermination relies on feedback loops between the various levels, and specifies the notion of complexity in artificial and natural automata. 19
Finality marks a third major difference between Cartesian and complex machines. A Cartesian machine is heteronomous: its purpose lies outside the machine itself. The intention is part of the definition of the machine: such a machine is designed to perform a defined task. The machine pre-exists in the mind of the designer. It thus instantiates the subjective notion of finality: the designer’s intentions are embedded in the mechanism, which is just their materialization. A perfect machine would be one presenting a strict isomorphism between the subjective goal and the objective mechanism.
By contrast, a complex machine is autonomous in the sense that it does not translate any subjective goal. The major feature of a complex machine is that it escapes the control of its inventor. Its behavior is strictly unpredictable, so that one has to wait and see the machine in action in order to know how it behaves. As Dupuy often emphasizes, in complex machines the designer’s purposes have to be superseded by the machine. The fear of the sorcerer’s apprentice subdued by his own creation is not a potential hazard, an accident: it is the very essence of complex machines. Von Neumann himself prophesied that “the builders of automata would find themselves as helpless before their creations as we ourselves feel in the presence of complex natural phenomena”. 20
According to Dupuy, this sort of machine is what nanoengineers have in mind. The lack of control is an essential feature of nanotechnology, although it is not necessarily linked to the existence of self-replicating devices such as Drexlerian replicators.
“In keeping with that philosophy, the engineers of the future will no longer be the ones who devise and design a structure capable of fulfilling a function that has been assigned to them. The engineers of the future will be the ones who know they are successful when they are surprised by their own creations […]. It will be an inevitable temptation, not to say a task or a duty, for the nanotechnologists of the future to set off processes upon which they have no control”. 21
Most scientists and engineers active in the field of nanotechnologies are eager to demarcate their projects from what they view as speculations and fantasies. It is important, however, to examine whether the design of complex machines is part of their program. With the conceptual distinction between Cartesian and complex machines in mind, we can now review the current literature on nanotechnology to see if there are candidates for the latter category.
4. Drexler’s molecular manufacture
Drexler is an obvious candidate. As early as 1986, his prophecies of “molecular manufacture” were guided by the description of proteins and ribosomes in terms of machinery; as a postgraduate, he had studied in the laboratory of Marvin Minsky, a leading figure of Artificial Intelligence. 22 According to Otavio Bueno, Drexler’s views of self-replicating nanorobots were inspired by Von Neumann. 23 His argument is based on a few references to Von Neumann in Engines of Creation and on an interview with Drexler. However, the influence of Von Neumann on Drexler is far from obvious.
Drexler started with a conventional definition of machine in Chapter 1, 24 and he often claimed that his molecular manufacture was the extrapolation of today’s automated factories to the smallest scale, by a process of ‘mental shrinking’. "Just as ordinary tools can build ordinary machines from parts, so molecular tools will bond molecules together to make tiny gears, motors, levers [...] and assemble them to make complex machines". 25 He described molecules as rigid building blocks, similar to the parts of Tinkertoys, to be assembled like the elements of Lego construction sets. The functions performed by the various parts of molecular machinery are also essentially mechanical. They position, move, transmit forces, carry, hold, store, etc. The assembly process itself is described as a “mechanosynthesis”, positioning the components under mechanical control.
However, there are four occurrences of the phrase “complex machines” in Engines of Creation. One relates to protein machines: “the forces that stick proteins together to form complex machines are the same ones that fold the protein chains in the first place”. 26 The others are related to artificial machines: "Just as ordinary tools can build ordinary machines from parts, so molecular tools will bond molecules together to make tiny gears, motors, levers [...] and assemble them to make complex machines"; 27 the third one concerns the feasibility of nanotechnology and assemblers: “the heart of the case rests on two well-established facts of science and engineering. These are (1) that existing molecular machines serve a range of basic functions, and (2) that parts serving these basic functions can be combined to build complex machines”. 28 Finally, each advanced assembler can contain “an average of one hundred atoms – enough parts to make up a rather complex machine”. 29
The three references to artificial complex machines derive from bioengineering, which globally rests on the view of cells as factories full of individual machines. In Drexler’s view, genetic engineers have full control over the individual machines. They pick and place them; they re-engineer DNA and proteins in order to perform pre-determined specific tasks. In short, they rely on a Cartesian paradigm. Although he never refers to Descartes, Drexler shares Descartes’s famous claim that the combinations of the visible parts of our machines are analogous to the combinations of the tiny (of course Descartes didn't say "nano") invisible components of animal organisms. 30 "Molecules have simple moving parts, and many act like familiar types of machinery". 31
Drexler nevertheless stressed one big difference between cells and artificial machines. Unlike our machines, natural molecular "machines" (in cells) are self-assembling. If we put the different parts of a car in a big box and shake the whole, we never get a car. Drexler’s program comes down to reducing this ultimate difference. Unlike bulk technology, molecular technology provides a way for parts to self-assemble. Tomorrow’s nanoengineers will design artificial nanomachines, new protein tools that will be able to assemble parts. They will act like automated machine tools programmed by punched tapes. These programmable protein machines, inspired by ribosomes and enzymes, will bond molecules together with great precision. They will be made of tougher stuff than the soft and weak molecular machines of the cell.
“Protein machines will thus combine the splitting and joining abilities of enzymes with the programmability of ribosomes […] Enzyme-like second-generation machines will be able to use as “tools” almost any of the reactive molecules used by chemists – but they will wield them with the precision of programmed machines”. 32
Drexler’s programmed assemblers have nothing in common with Von Neumann’s automata. The universal assembler is not self-replicating: it needs materials and energy, and instructions for use. His molecular manufacture, built partes extra partes through an assembly process, is a mixture of conventional mechanics and computer science. A complex machine, in Drexler’s view, is just an aggregate of simple machines. Insofar as he relies on the view of both natural and artificial machines as systems reducible to their parts, Drexler has no choice but to describe the assembly process by analogy with a macroscale manufacture.
Descartes’s analogy between living beings and artificial machines presupposed the fiction of an artisan-God manufacturing natural bodies part after part. Drexler does not explicitly need such a metaphysical prerequisite, although his nano-fingers have the creative power of God’s finger. Drexler’s world is in the hands of a magic engineer, the so-called “replicator”, which inspired the grey goo scenario based on a process of uncontrolled self-replication. A replicator is made of a reader, a tape, several assemblers and other nanomachines. According to Richard Dawkins (quoted by Drexler), a replicator is a thing that makes a copy of itself; RNA molecules and cells qualify. Replicators manufacture nanosystems by means of assemblers, just as cells manufacture proteins by means of ribosomes, and they are supposed to bridge the gap between human and natural machines. 33 Drexler suggests a sort of “network of factories” forming a self-expanding, self-replicating system. In such a system, “robots could do all the robot-assembly work, assemble other equipment, make the needed parts, run the mines and generators that supply the various factories with materials and power, and so forth”. 34
Here automated engineering and molecular manufacturing are closely intertwined. But could we go further and characterize replicators as complex machines in the sense of Von Neumann? Replicators have two remarkable features of complex machines: autonomy and self-replication. Drexler remained elusive on the feasibility of his replicators. He just mentioned that “the chief requirement will be programming the first replicator, but AI systems will help with that. The greatest problem will be deciding what we want”. 35 It comes as no surprise that the controversy raised by Drexler focused on the feasibility of his self-replicating nanorobots. As Whitesides argued: “The assembler, with its pick-and-place pincers, eliminates the many difficulties of fabricating nanomachines and of self-replication by ignoring them”. It is clear that Drexler did not really explore the feasibility of such complex machines. In fact, Drexler conceded that his concept of molecular manufacture does not require self-replicating nanorobots; when confronted with the public anxieties raised by this fiction, he admitted: "I wish I had never used the term 'grey goo’". 36 The fact that he could so easily drop his replicators suggests that they were just one more independent piece of his machinery, performing a specific task. They were parts of a Cartesian machine.
To sum up, Drexler’s molecular manufacture is described as a collection of independent parts, even in its effort to include attributes of complex machines. His grand vision basically rests on a mechanical view of machines combined with the literary theme of the uncontrolled robot. The choice of the term “robot”, coined by Karel Capek in the context of utopian (or dystopian) literature, is an indication that his work belongs to the literary genre of science fiction rather than to the technical literature on automata. The image of the grey goo revitalized a literary tradition expressing the public’s fear of technology. 37
Drexler’s model has been subjected to merciless criticism by chemists such as Richard Smalley and George Whitesides, and by other scientists who clearly established that Drexler’s model of the machine was inadequate to operate at the nanolevel. 38
5. Soft Machines or Concrete Machines
Drexler’s machines have been proved unfeasible because they are not adapted to the special features of the nanoworld. As Whitesides emphasized, a nanoscale submarine would be impracticable because of Brownian motion, which would defeat all efforts to guide it. However, neither Smalley nor Whitesides tried to promote an alternative concept of machine. 39
Philip Ball pointed to chemistry as an alternative to the mechanical approach:
I feel that the literal down-sizing of mechanical engineering popularized by nanotechnologists such as Eric Drexler - whereby every nanoscale device is fabricated from hard moving parts, cogs, bearings, pistons and camshafts - fails to acknowledge that there may be better, more inventive ways of engineering at this scale, ways that take advantage of the opportunities that chemistry and intermolecular interactions offer. 40
Richard Jones, another critic of Drexler’s machines, tried to delineate the profile of more plausible nanomachines. His concept of “soft machines” was a clear response to Drexler’s rigid machines and mechanosynthesis. Whereas Drexler’s assemblers were downsized versions of familiar machines, Jones stresses that nanomachines cannot be small-scale versions of industrial macromachines, because of the special physics of the nanoworld: “Physics is different in the nanoworld, and the design principles that serve us so well in the macroscopic world will lead us badly astray when we try to apply them at these smaller scales”. 41 This means that engineers will have to abandon their familiar frameworks. Jones encourages a decisive step: to start addressing the question “how will artefacts function” prior to “how are they to be made”. 42 His conviction is that the model for nanoengineering lies in biology. Jones argues that biological soft machines are not “the unhappy consequences of the contingencies of evolution”; rather, they are “the most effective way of engineering in the unfamiliar environment of the very small”. 43 In his view, biological mechanisms and materials have been designed at the nanoscale: they are perfect to work at that level, completely adapted to the special physics of the nanoworld, even though they are not always efficient at the macroscale.
“A steam engine is better than a horse, strong and lightweight aluminum alloy is a better material to make a wing out of than feather and bone […] Big organisms like us consist of mechanisms and materials that have been developed and optimized for the nanoworld, that evolution has had to do the best it can with to make work the macroworld”. 44
Biology would then be the model of choice for engineering at the nanoscale. Jones therefore outlined the general principles of biological molecular processes and pointed out three major differences between bio-machines and conventional human technologies. a) Instead of channelling traffic with tubes and pipes, living systems take advantage of Brownian motion, which moves molecules around and continuously bombards nano-objects. b) Living systems do not use rigid molecules as synthetic chemists do: their molecules easily change shape and conformation. c) The constraints on building machines at the molecular level differ from those of “bulk technology”: inertia is no longer a crucial parameter, while surface forces and viscosity become major constraints that prompt nano-objects to stick together. 45 Whereas Drexler considered the distinctive features of the nanoworld as obstacles to be overcome by means of tricks, Jones insists that nanomachines will have to make do with Brownian motion. Nanomachines will not be designed until engineers abandon “conventional engineering” and invent new concepts of machines. The key is to understand that “a different feature of the physics that leads to problems for one type of design may be turned to advantage in a design that is properly optimized for this different world”. 46 The properties characteristic of the nanoscale, which are problems for conventional machines, will have to be used as positive opportunities by nanoengineers. Jones thus contrasted two “design philosophies” for making nanoscale artefacts. Conventional design, based on the principles that have served us so well on the macroscopic scale, “would rely on rigid materials, components that are fabricated to precise tolerances, and the mutually free motion of parts with respect to each other. As we attempt to make smaller and smaller mechanisms, the special physics of the nanoworld - the constant shaking of Brownian motion and the universal stickiness that arises from the strength of surface forces - will present larger and larger obstacles that we will have to design around”. 47 Nanodesign should instead be based on the principles used by cell biology, labeled ‘soft engineering’: “The advantage of soft engineering is that it does not treat the special features of the nanoworld as problems to be overcome, instead it exploits them and indeed relies on them to work at all”. 48
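To give a rough quantitative sense of this change of physics (our illustration, with assumed order-of-magnitude values rather than figures taken from Jones), the balance between inertial and viscous forces can be expressed by the Reynolds number:

```latex
\mathrm{Re} \;=\; \frac{\rho v L}{\eta}
\;\approx\; \frac{(10^{3}\,\mathrm{kg\,m^{-3}})\,(10^{-6}\,\mathrm{m\,s^{-1}})\,(10^{-8}\,\mathrm{m})}{10^{-3}\,\mathrm{Pa\,s}}
\;=\; 10^{-8}
```

For a 10 nm object drifting at a micron per second in water, Re is of the order of 10^-8: inertia is utterly negligible, motion ceases the instant forcing ceases, and viscosity and surface forces, not momentum, set the design constraints.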
Changing obstacles into positive principles of work is exactly what the French philosopher Gilbert Simondon called “concretization”. In his famous book Du mode d’existence des objets techniques (1958), Simondon elaborated a new concept of machine, which differed both from the Cartesian model of mechanistic machines and from Von Neumann’s concept of complex machines. He started with a general distinction between abstraction and concretization. A machine is “abstract” when each part has been designed independently, each for a definite and unique function. Cartesian machines are typical abstract machines because the concept of the machine in the designer’s mind precedes the machine itself. The operations performed by the machine result from its conceptual consistency: there is nothing more in the machine than in the designer’s mind. And of course the machine has to be built before it starts to operate.
By contrast, a concrete machine is not deduced from general principles. Its feasibility depends on its operating conditions rather than on scientific principles. In fact, it is the machine itself that creates the conditions required for its operation. The environment where the machine operates is not an external feature or a simple parameter that engineers have to take into account in the design process. The milieu is not something to which the machine has to be adapted; it is an intrinsic aspect of the design of the machine. A concrete machine works precisely because of (and not despite) its association with a specific environment.
Simondon illustrated the contrast between abstract and concrete machines with the example of a hydraulic power station, the Guimbal turbine. The problem was to build an electric generator small enough to be immersed in a water pipe. The major obstacle was the heat produced by the generator, which would cause its explosion beyond a critical point. Conventional engineers would typically mobilize all the relevant physical principles in order to reduce the size of the generator and thereby prevent its explosion; then they would adapt the system to underwater conditions. The machine resulting from this conventional design is what Simondon labeled an “abstract” machine. By contrast, the “concrete engineer” imagines how an immersed generator would work, instead of striving to make the generator smaller and smaller before introducing it into a water pipe. The generator has to be placed in a container filled with oil, coupled to the turbine by means of a shaft, and immersed in the pipe. In this configuration, water performs various functions: it supplies power to the turbine; it keeps the machine working; it also carries away the heat generated by the rotation of the turbine. Oil is also multifunctional: it lubricates the generator; it conveys the heat released by the generator to the surface of the container, which is cooled by the water; and it prevents water from entering the container, owing to the difference of pressure between oil and water. The two liquids thus cooperate: the faster the turbine and the generator rotate, the greater the agitation of oil and water, and the better the cooling of the system. As Simondon emphasized, the aqueous milieu determined the design of the generator. The Guimbal turbine would never work in open air: it would explode. The concrete machine is tightly associated with its environment (in this case, the couple oil and water). Simondon calls such a machine an individu technique (a technological individual) because it is self-conditioned: it does not exist as a possible machine prior to being in operation. Since the interactions between the various elements of the machine are not deducible from any set of scientific laws, technology is not science-based. It follows that there is always more in a working machine than in the mind of its inventor.
At this point Simondon introduced a second distinction, between “constitution” and “invention”. In his view, the constitution of artefacts is just the materialization of an abstract machine: all effects can be deduced from the analysis of the concept of the machine, and design and operation are two independent tasks. By contrast, to “invent” a machine is not just to assemble logical functions and then put the system into action. The machine is designed according to its operating conditions and, in fact, it invents its own environment. The associated environment cannot be anticipated and becomes an integral part of the machine. Therefore the "mode of existence" of a “technological individual” cannot be defined prior to its functioning. 49
Simondon’s concrete machines thus differ deeply both from Cartesian machines and from programmed automata. They are not built partes extra partes but invented straight off, by envisioning, “imaging”, the feedback loops between the machine and its milieu associé. But do they also differ from Von Neumann’s complex machines? To a certain extent, the system made up of a concrete machine and its associated environment is complex. First, since the machine is self-conditioned, it is autonomous, and Simondon suggested that concrete machines were close to the mode of existence of natural beings and that engineers should deal with them as they do with living beings. Second, concrete machines are unpredictable, since their inventors do not know how to make them until they actually start building them. However, unlike Von Neumann’s complex automata, Simondon’s concrete machines are not self-replicating, and their unpredictability does not mean that they are out of control. Never did Simondon suggest that we were about to face a terrifying lack of control over human artefacts. On the contrary, the incorporation of special features of the associated environment into the machine, and the conversion of external data into essential working conditions (such as oil and water in the example of the Guimbal turbine), warrant better control over the system. The machine indeed supersedes the plan that its inventor had in mind, but it never supersedes the inventor. More precisely, by contrast with Von Neumann’s approach to complexity, a concrete machine still relies on the reference to a human subject. Such a machine involves the very special ability of human beings to draw analogies between biological and technological operations. Simondon assumed that we can invent self-conditioned machines because we are ourselves self-conditioned living beings. To be sure, Simondon’s subject is no longer a Cartesian maître et possesseur de la nature. Nevertheless, concrete machines rely on human subjects.
To sum up this section: the strong similarity between Simondon’s concrete machines and Jones’s soft machines rests on two key ideas - looking first at how the machine will function, and turning obstacles into conditions. However, thanks to its additional features - individuality, incorporation of the milieu, and reference to a human subject - Simondon’s notion of the concrete machine may provide us with more robust conceptual resources for understanding what is going on in nanotechnology than Jones’s metaphorical notion of soft machines.
Now that the controversy raised by Drexler seems to be closed, and Drexler marginalized, it is time to examine what kind of nanomachines are effectively being designed in laboratories and (perhaps in the near future) in factories. Are nanoscientists and engineers designing conventional Cartesian machines, are they aiming at creating uncontrolled machines in the sense of Von Neumann and Dupuy, or something more akin to Simondon’s concrete machines? Let us look at a sample of machines described in scientific publications. The purpose of this review is not, of course, to deliver a kind of “philosophical evaluation”; it is rather aimed at encouraging reflection on the ways of designing nanomachines.
6. Nanorobotics and Smart Structures
In September 2004 many newspapers reported a “mechanical miracle”: Metin Sitti, director of the Nanorobotics Lab at Carnegie Mellon University, had built a tiny robot that walks on water like water spiders. This artificial insect was inspired by the mode of locomotion of the Gerridae, a variety of water striders recently studied by an MIT team, which move at 1 m/s, the equivalent of 700 km/h. Sitti’s prototype raised great excitement because it could be equipped with a chemical sensor to detect contaminants in water, or with a camera to act as a spy. But what kind of machine is it? The body is made of carbon fibers linked to eight steel-wire legs coated with water-repellent plastic. Its “muscles” are flat plates of piezoelectric material. The power is supplied and controlled through three circuits. The “miracle” is precisely that it is a simple automaton. As Sitti emphasized, those insects have no brain; they do not need one, given such simple control. 50 Indeed it is a tiny insect - 1 gram - but it is not nano at all. Using only piezoelectricity (the ability of certain materials to convert pressure into electricity, and conversely) for the actuator, it does not rely on size-dependent properties.
Building true nanorobots confronts us with a communication problem: how to exchange instructions, energy or information with nanoscale objects? Their manipulation with macroscopic instruments such as the STM is just a primitive stage. More refined tools have to be invented in order to “translate” information into quantum-physical terms understandable by nanoscale objects. This is undoubtedly a major challenge for nanorobotics. Yet it will lead neither to concrete nor to complex machines.
The basic principles of such robots are borrowed from automated engineering. They consist of a sensor, a processor and an actuator. Since these functions are more or less similar to human ones, such devices are named “smart” or “intelligent” structures. They are so interesting for technological applications that they have been one of the major goals of materials science over the past decade. However, these robots do not require complexity. Smart structures of Micro-Electro-Mechanical Systems (MEMS) are like Cartesian machines: one material acts as a sensor, another as an actuator, and a third one - generally silicon - is the processor. Access to the nanoscale would increase the performance of microsensors, since they could exploit the huge surface area of nano-objects in order to detect biochemicals or contaminants. Ideally a nanorobot would be made of one molecule playing the role of a sensor, a second one that of a processor, and a third that of an actuator. Such an ideal robot would nonetheless still be designed like a Cartesian partes extra partes machine, with a component for each specific task, and would have none of the features of complex machines or concrete machines.
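The partes extra partes character of such designs can be made concrete with a deliberately naive sketch of our own (every name, gain and threshold below is hypothetical, not taken from any device discussed in this paper):

```python
# Illustrative sketch of a "smart structure" as a Cartesian machine: three
# independently designed components, each with one fixed function, merely
# wired together. The whole is exactly the sum of its parts.

def sensor(stimulus: float) -> float:
    """Transduce a physical quantity into a signal (say, a piezoresistive
    element); the gain is chosen once and for all by the designer."""
    return 0.5 * stimulus

def processor(signal: float) -> bool:
    """Decide whether to act, by a rule fixed in advance."""
    return signal > 1.0  # designer-chosen threshold

def actuator(command: bool) -> str:
    """Convert the decision into mechanical action."""
    return "deflect" if command else "rest"

for reading in (0.8, 2.6):  # two environmental readings
    print(actuator(processor(sensor(reading))))  # -> rest, deflect
```

Each function is fixed in advance and merely composed with the next; nothing in the running whole feeds back into how the parts are made, which is precisely what separates this architecture from Simondon’s concrete machines.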
7. Molecular motors
Molecular machines are extremely fashionable. Following the design of a variety of tools - gears, rotors, levers, tweezers, switches - in the 1990s, the design of motors has been a major concern since 2000. In fact, prior to the take-off of nanoscience, a few molecules capable of moving and rotating had been designed by supramolecular chemists. For instance, the rotaxanes designed by Jean-Pierre Sauvage as early as 1983, with a macrocyclic ring trapped on a “thread” by two bulky “stoppers”, were initially considered as curiosities resulting from a difficult and low-yielding synthesis. The chemists who relied on the principles of chemical topology to interlock those molecules used to describe them as “architectures” rather than as machines. Over the past decade, these few exotic molecules have become a whole collection of molecular machines, whose synthesis has been made easier thanks to the use of non-covalent interactions (hydrogen bonds or metal-ligand bonds), with the help of templates holding the molecular precursors in the correct orientation. 51
Another example - the molecular wheelbarrow - will help to “anatomize” a molecular motor designed from the bottom up. The designers of the molecular wheelbarrow use the phrase “technomimetic molecules”, since their project was to recreate at the molecular level the functions of macroscale machines. Interestingly, they define the molecular machine as a “molecule responding to the orders of its operator”. Whether the operator is the tip of an STM, another molecule or a human hand, the concept is the same: the molecular machine is under control and has no autonomy whatsoever. The purpose of such challenges is less to make useful technological artefacts than to understand the properties of isolated molecules. After a first attempt at designing a non-directional rotor made of decacyclene in 1998, 52 Christian Joachim and his group reported the design of a uni-directional rotor. It uses a C60 molecule bouncing between two electrodes to transport individual electrons from the source to the drain; the dissymmetric position of the molecule allows the control of the rotation movement. The wheelbarrow consists of the rotor (C60), a stator and a ball-joint (a ruthenium atom); its body is an organometallic structure shaped like a three-legged piano stool. The wheelbarrow does not move as its designers predicted, and the identification of the obstacles is probably the most interesting result they could get. One reason is the molecule’s flexibility: instead of standing rigid like a crystal, it changes shape “like Dali’s famous clocks”. A second and major obstacle comes from quantum fluctuations that prevent the stabilization of the device. There is no way to control such fluctuations; molecular designers have to make do with them. Here, maybe, is a promising pathway towards a concrete machine, capable of taking advantage of contingent fluctuations to achieve the specific task assigned to it by the designer.
8. Molecular Electronics
Up to this point we have only considered machines performing mechanical functions. What about machines performing logical tasks such as storing information, or even computing? Would molecular electronics be a more serious candidate for concretization?
Embedding computing capacities in a single molecule has been a dream since the dawn of the computer age. In 1974, Mark Ratner (then at New York University, now at Northwestern) and Ari Aviram of IBM envisaged building computers from the bottom up by turning individual molecules into circuit components. This remained a thought experiment (and a stimulating dream) until the 1980s, when the scanning tunneling microscope (STM) came into use. 53 Over the past decades a host of molecular electronic devices have been designed, and the breakthrough of 2001 was connecting those devices to make a circuit. The step from the device level to the circuit level was indeed a major achievement, legitimizing the term nanomachine. However, we are still far from both complex and concrete machines. The nanocircuit is nothing more than a collection of independent parts, each one performing a particular task. It is an "abstract" machine meant for an external purpose; there is no indeterminacy apart from the conventional margins of failure. To achieve a real move towards a non-Cartesian machine, one would have to get rid of the concept of circuitry and to design a radically new concept of electronic machine. Such a problem was clearly formulated by Christian Joachim:
The machine that we are trying to design has no parts. Our aim is precisely to get rid of parts, be they electronic devices or qubits. Mechanics is still practiced in a sensorial space with parts to assemble. Such was also the case in the early times of molecular electronics: we had to divide the circuit into small parts - molecules, quantum bits. But it turned out that it is difficult to control the whole system on a wafer. Now we are exploring a partless approach. In quantum dynamics, we deal with the space of states and no longer with the usual space. The approach is formally similar to that of the thermodynamics of computation. We need to be out of equilibrium at the quantum level, by preparing the molecule in a non-stationary quantum state. The molecule has to be out of equilibrium in order to have it perform a task. But this is costly in design, because we have to keep the quantum evolution free of decoherence during one computation cycle. It is also costly in control, because we have to control the full quantum trajectory in a gigantic state space for each logic function. 54
This project points to a new sort of machine. Will it be a complex or a concrete machine? Any answer would be premature.
9. Wet Technology
Over the past decades molecular biologists and biophysicists have jointly investigated the motors that move muscles, sperm and cells in living systems, with a variety of medical applications in view. These natural phenomena are invariably described by analogy with human technology. 55 The conditions under which proteins such as myosins, kinesins and dyneins act as motors have been studied for many decades, but now biologists and nano-engineers want to know exactly how they operate at the molecular level. In this respect, the research field now established as bionanotechnology differs from the research tradition in biomechanics initiated by D’Arcy Thompson. The structures and processes displayed in biology came to epitomize a new technological paradigm often labeled “wet technology”, since operations in living systems are usually performed at room temperature, in an aqueous milieu, with soft materials much more flexible and versatile than the parts of our rigid machines.
The Bioengineering Nanotechnology Initiative launched in 2002 by the US National Science Foundation prompted a reorganization of research, with interdisciplinary teams aiming at identifying the molecular components of living systems and understanding the process of their synthesis in situ in order to take inspiration from them. Understanding the ways of nature and exploring new technological avenues merge into one single research program. In this program, it is more or less tacitly assumed that understanding one biological motor comes down to understanding a fundamental process, because nature tends to use and re-use the same solution to a problem. And it is more or less expected that access to the “fundamental” level secured by molecular biology will provide us with THE bottom-up method that nature and art can share. Nanotechnology and molecular biology rest on the same epistemological credo: that a detailed knowledge of structure will lead to the control of functions and sometimes even of processes. 56 As long as such programs tend to capture an essential structural element and rely on it, while neglecting all the messiness created by molecular agitation at the nanoscale, they are not really leading to a new technological paradigm. Whatever the promise and prowess of the sophisticated nanomedicines under study, from a philosophical perspective they look extremely conventional.
At the crossroads of biology and nanotechnology, two different strategies are being pursued: either re-engineering biological machines to make artefacts, or mimicking them - making artefacts inspired by the technical solutions displayed in nature.
Since the mechanisms designed by living systems are the best candidates for the title of complex machines, it is tempting and promising to take hold of them and divert them to technological purposes. But are we sure that re-engineering machines designed by living systems, in order to perform tasks they are not meant to perform, will help build complex machines?
Molecular recognition is the most enviable property that engineers seek to use for the design and synthesis of all kinds of machines. DNA is a very efficient tool for building nanomachines, provided it is re-engineered for technological purposes. For instance, branched DNA molecules - instead of linear sequences - with sticky ends can be used as scaffolding to organize the components of nanoelectronics. DNA can also be used to produce mechanical devices because it is robust. But the huge organizational potentialities of DNA cannot be efficiently exploited unless DNA is combined with inorganic components such as nanotubes or nanocrystals, whose physical properties are directly needed for applications. The “soft machines” designed by nature are not directly fit for the conditions of dry technology. Researchers have begun to harness biological structures to optimize existing functions of nucleic acids and proteins, or to create new ones. As Ronald Breaker argued, “the challenge for biochemists is to take RNA and DNA beyond their proven use as polymers that form a double helix”. 57
Although this option is sometimes considered the most promising for commercial applications, 58 from a technological perspective it may prove disappointing. First, nanobioengineers tend to isolate a few interesting mechanisms from their context of operation, and they overlook the difference between the contexts of human design and of nature’s design. The former relies on plans and aims at standardization, while evolution is a blind process generating variability through mutation and recombination over a long period of time and later selecting a few structures. As Steven Vogel emphasized, each domain has acquired a coherence and consistency, a rationality of its own, so that it may be nonsense to pick out a few local recipes and try to copy them. 59 Moreover, the current examples of hybrid devices relying on the convergence of technologies are just designed by aggregation of functions. They are deduced from scientific principles and built up partes extra partes. Hybridization comes down to downplaying the complex machines “invented” by nature in order to turn them into simple Cartesian machines. Hybrid machines are “constituted” rather than “invented”. Even the grandiose programme aimed at making hybrid machines or robots assisting and repairing human bodies and brains, through the convergence of nanotechnology, biotechnology and cognitive science, belongs to the old Cartesian paradigm, since the basic assumption is that living organisms are “chemical computers”, i.e. machines with internally stored information. 60 The brain itself is described as a machine ruled by algorithms. 61 The “mechanization of the mind” may well lead to building useful devices, but less plausibly to complex machines or concrete machines.
The alternative strategy - biomimetism - was first developed by materials scientists who realized that nature had built multifunctional and high-performance structures, and that they could well draw lessons from nature. This approach has resulted in the design of a number of already commercialized structures, as well as in a better understanding of biomineralization in marine organisms or of the production of fiber by spiders. However, this approach does not belong exclusively to nanotechnology, since it is based on the clear recognition that the performances of natural structures are due to their hierarchical structure, and consequently involve multiple length scales.
The interest of chemists in processes as well as structures has prompted their attention - and admiration - for the process cells use to reproduce when they divide. “Self-assembly is a process in which molecules or parts of molecules spontaneously form ordered aggregates, usually by non-covalent interactions”. 62 Self-assembly involves two major features. First, it is a spontaneous process: components of living systems assemble without the intervention of orders coming from outside. Instructions for the design of the “machine” are built into the components, and the environment is involved as a component. Second, self-assembly uses reversible interactions, i.e. non-covalent bonds. Continuous thermal agitation allows molecules to move around, to adjust and re-adjust. These reversible arrangements are crucial to obtain defect-free aggregates.
Self-assembly is more similar to self-organization than to conventional engineering. Creating order out of disordered moving elements is so typical of life that it was long ascribed to a mysterious vital force. Today molecular biologists rather look at protein folding or the formation of lipid bilayers as exquisite and optimized mechanisms. Yet self-assembly remains a process of making things through generation rather than through engineering: instructions are built into the components, instead of being provided by an external program or engineer. To what extent can self-assembly be considered a technological process of “invention” or “concretization”?
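Why reversibility matters can be shown with a toy simulation of our own (a schematic sketch, not a model drawn from the literature cited here): a one-dimensional Ising chain relaxed by the standard Metropolis rule, where thermal agitation occasionally undoes a favorable arrangement, and this very ability to undo mistakes is what lets a defect-free order emerge.

```python
# Minimal sketch of "order out of disorder" through reversible moves:
# Metropolis dynamics on a 1-D Ising chain of aligned/misaligned components.
import math
import random

random.seed(0)
N, J, T = 30, 1.0, 0.4  # number of sites, bond energy, temperature

spins = [random.choice((-1, 1)) for _ in range(N)]  # disordered start

def energy_change(i):
    """Energy cost of flipping component i (free chain ends)."""
    dE = 0.0
    if i > 0:
        dE += 2 * J * spins[i] * spins[i - 1]
    if i < N - 1:
        dE += 2 * J * spins[i] * spins[i + 1]
    return dE

for _ in range(20000):
    i = random.randrange(N)
    dE = energy_change(i)
    # Reversible step: unfavorable moves are accepted with Boltzmann
    # probability, so misplaced components can detach and re-attach.
    if dE <= 0 or random.random() < math.exp(-dE / T):
        spins[i] = -spins[i]

# After annealing, the chain shows a few large aligned domains, often a
# single one: thermal "mistakes" have erased the defects along the way.
print("".join("+" if s > 0 else "-" for s in spins))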
Because of its spontaneity, self-assembly has encouraged the prospect of a new era of technology without a human subject. In 1995, Whitesides believed in a future of autonomous machines:
“Our world is populated with machines, non-living entities assembled by human beings from components that humankind has made. Our automobiles, computers, telephones, toaster ovens and screwdrivers far outnumber us. Despite this proliferation, no machine can reproduce itself without human agency. In the twenty-first century, scientists will introduce a manufacturing strategy based on machines and materials that virtually make themselves.” 63
However, this autonomy is extremely limited. First, the various techniques of self-assembly developed by chemists and biologists over the past decades are not self-replicating techniques. Moreover, far from suggesting a process of making things without human intervention, the techniques of self-assembly display treasures of ingenuity: playing with weak forces whose energies are close to that of thermal agitation (such as hydrogen bonding, van der Waals, electrostatic, capillary, hydrophobic and hydrophilic interactions), building templates to grow the aggregate under geometrical constraints… To be sure, nanoscientists and nanoengineers are learning a lot from biology, but they are not simply “mimicking” natural processes. They are using all possible resources of thermodynamics and chemistry in order to take advantage of molecular interactions for creating order out of disorder, with a view to making useful things. So far, however, most molecular self-assembly strategies have been confined to static devices, resting on equilibrium at an energy minimum. To invent “concrete machines”, the next step should be to make dynamic systems that turn the obstacle of molecular agitation into a condition for the machine to operate. 64 Just as Guimbal did in designing his turbine, chemists and nanoengineers will have to imagine functional structures as “individuals” with their own associated environment.
Conclusion
This paper is only a preliminary attempt at a conceptual clarification of nanotechnology. It may nevertheless be useful for the current debates about so-called revolutionary nanotechnology. Drexler claimed that nanomachines would open up a new technological era, but his own “engines of creation” rather suggest the resilience of the old Cartesian paradigm. Although self-assembly and biomimetism may lead to more “concrete machines”, most nanomachines currently designed are old wine in new flasks. Dealing with individual molecules does not necessarily entail a deep revision of conventional engineering methods.
The debates over the control of nanomachines seem to be undermined by a confusion between two distinct notions: Von Neumann’s complexity, which would result in non-deterministic and uncontrolled machines, and Simondon’s “technological individuality”, which would result in deterministic machines associated with their environment and consequently better controlled than conventional machines.
In our view, the most immediate dangers do not come from self-replicating nanorobots. They may come instead from the uncontrollable interactions between the various nanomachines being designed and the environment. The relations between machines and their associated environments, between the technosphere and the biosphere, have not been seriously investigated and deserve much more attention.
In its ambition to explore the nanoworld by making machines, nanoscience may be seen as the continuation of the chemists’ centuries-old endeavour to know nature through making artefacts. In this respect, nanoscientists and engineers tend to dissolve the unity of nature constructed by classical mechanism, and the grand narratives provided by Newton or Einstein, into a multitude of tiny machines. Nanoscientists hold the local view but lose the global one. The famous slogan “shaping the world atom by atom”, associated with an image of space, is misleading: it diverts attention from the fact that a jungle of nanomachines is not a cosmos. How those nanomachines fit together and how they operate within a complex system remains unclear.
Acknowledgements
We are very grateful to Dr Christian Joachim, Prof. Jean-Pierre Dupuy and an anonymous referee for their critical comments on an earlier version of this article. Part of the research for this essay was funded by the Bionanoethics programme of the Agence nationale de la recherche (Project n°NT05-4_44955).
References
Ball, P., 2002: ‘Natural Strategies for the molecular engineer’, Nanotechnology , 13, 15-28.
Ball, P., 2003: ‘Nanotechnology in the Firing Line’, Nature, 23 December.
Baum, R., 2003: ‘Nanotechnology. Drexler and Smalley make the case for and against molecular assemblers’, Chemical & Engineering News , 81, N°48, 37-42.
Bensaude-Vincent, B., 2001: “The Construction of a Discipline: Materials Science in the U.S.A.”, Historical Studies in the Physical and Biological Sciences, 31, part 2, 223-248.
Bensaude-Vincent, B., Arribart, H., Bouligand, Y., Sanchez, C., 2002: “Chemists at the School of Nature”, New Journal of Chemistry , 26, 1-5.
Boncheva, M., Whitesides, G.M., 2005: “Making Things by Self-Assembly”, MRS Bulletin, 30, October 2005, 736-742.
Breaker, R., 2004: ‘Natural and Engineered Nucleic Acids as Tools to Explore Biology’, Nature, 16 December 2004, p. 838.
Breen, T.L., Tien, J., Oliver, S.R., Hadzic, T., Whitesides, G., 1999: ‘Design and Self-Assembly of Open, Regular, 3D Mesostructures’, Science, 284 (7 May), 948-951.
Bueno, O., 2004: “Von Neumann, Self-Reproduction and the Constitution of Nanophenomena”, in Baird, D. et al. (eds), Discovering the Nanoscale, IOS Press, 101-118.
Canguilhem, G., 1952: “Machine et organisme”, in La connaissance de la vie, Paris, Hachette; quoted from the fourth edition, Paris, Vrin, 101-128.
Canguilhem, G., 1979: “Le tout et la partie dans la pensée biologique”, Études d’histoire et de philosophie des sciences, Paris, Vrin, 319-334.
Drexler, K.E., 1981: ‘Molecular engineering: An approach to the development of general capabilities for molecular manipulation’, Proceedings of the National Academy of Sciences , 78, N°9, chemistry section, 5275-78.
Drexler, K.E., 1986: Engines of Creation , Anchor Books. Quoted from the 2nd ed. 1990.
Drexler, K.E., 1992: Nanosystems: Molecular Machinery, Manufacturing and Computation, Palo Alto, John Wiley & Sons.
Drexler, K.E., 2001: ‘Machine-Phase Nanotechnology’, Scientific American, September, 74-75.
Dupuy, J.P., 2000: The Mechanization of the Mind, Princeton, N.J., Princeton University Press.
Dupuy, J.P., 2004: “Complexity and Uncertainty”, in Foresighting the New Technology Wave , High-Level Expert Group, European Commission, Brussels.
Fox Keller, E., 1995: Refiguring Life: Metaphors of Twentieth-Century Biology, New York, Columbia University Press.
Guchet, X., 2005: Les sens de l’évolution technique , Paris, Editions Léo Scheer.
Joachim, C., Gimzewski, J.K., Aviram, A., 2000: “Electronics using hybrid-molecular and mono-molecular devices”, Nature, 408, 541-548.
Joachim, C., 2005: “To be nano or not to be nano?”, Nature Materials, 4, February, 105-109.
Jones, R.A.L., 2004: Soft Machines, Oxford University Press, Oxford, New York.
Kaminuma, Tsuguchika (eds), 1991: Biocomputers: The Next Generation from Japan, New York, London, Chapman and Hall.
Lafitte, J., 1932: Réflexions sur la science des machines , Librairie Bloud & Gay, Paris.
Leigh, D.A., Wong, J.K.Y., Dehez, F., Zerbetto, F., 2003: ‘Unidirectional rotation in a mechanically interlocked molecular rotor’, Nature, 424, 174-179.
Lehn, J.M., 1985: ‘Supramolecular Chemistry: Receptors, Catalysts and Carriers’, Science, 227, 849-56.
Maurel, M.C., Miquel, P.A., 2001: Programme génétique : concept biologique ou métaphore ?, Editions Kimé, Paris.
Merkle, R., 1992: ‘Self Replicating Systems and Molecular Manufacturing’, www.zyvex.com/nanotech/selfRepJBIS.html
Neumann, J. von, 1951: “The General and Logical Theory of Automata”, in Cerebral Mechanisms in Behavior: The Hixon Symposium, New York, John Wiley and Sons.
Newman, W., 1989: ‘Technology and the Alchemical debate in the Late Middle Ages’, Isis , 80, 423-445.
Phoenix, C., Drexler, K.E., 2004: ‘Safe exponential manufacturing’, Nanotechnology, 15, 869-872.
The Royal Society and the Royal Academy of Engineering, 2004: Nanoscience and Nanotechnology: Opportunities and Uncertainties, London, Document 19/04; http://www.nanotec.org.uk.
Saunier, C., 2005: L’évolution du secteur des semiconducteurs et ses liens avec les micro et les nanotechnologies, rapport Assemblée nationale (N°566) et Sénat (N°138), Paris, Assemblée nationale, 3 vols.
Sauvage, J.P.: “Les nanomachines moléculaires : de la biologie aux systèmes artificiels et aux dispositifs”, http://culturesciences.chimie.ens.fr/NanomachinesJPSauvage.pdf
Seeman, N.C., Belcher, A.M., 2002: ‘Emulating biology: building nanostructures from the bottom up’, Proceedings of the National Academy of Sciences USA, 99, 6451-6455.
Simondon, G., 1989: Du mode d'existence des objets techniques, Aubier, Paris.
Whitesides, G.M., 1995: ‘Self-Assembling Materials’, Scientific American, September, 146-149.
Whitesides, G.M., 2001: ‘The Once and Future Nanomachine’, Scientific American, September, 78-83.
Whitesides, G.M., Wong, A.P., 2006: “The Intersection of Biology and Materials Science”, MRS Bulletin, 31, January 2006, 19-27.
Wood, S., Jones, R.A.L., Geldart, A., 2003: The Social and Economic Challenges of Nanotechnology, Economic and Social Research Council Report, available at www.esrc.ac.uk/esrccontent/DownloadDocs/Nanotechnology.pdf
Zhang, S., 2003: ‘Fabrication of novel biomaterials through molecular self-assembly’, Nature Biotechnology, 21, N°10, 1171-78.
_____________________________
1 In 2002 the NASA Argonne laboratory made circuits smaller than micro-circuits by using genetically modified proteins, extracted from bacteria tolerant to high temperatures, as templates to create hexagonal patterns onto which gold nanoparticles were deposited.
2 See Maurel, M.-C., Miquel, P.-A. (2001)
4 Supramolecular chemists, for instance, used such metaphors before the term nanotechnology was coined. See for instance Jean-Marie Lehn (1985). This paper was a source of inspiration for Drexler; see Drexler, E. (1986), p. 244.
5 See for instance Saunier, C. (2005), vol. 1, p. 65. On p. 70, one can read: “The DNA computer tries to take inspiration from a rather efficient model of computer existing in nature, i.e. living organisms”.
6 According to J.L. Austin’s theory of speech-acts, the function of language is not only descriptive but performative. The scientific effectiveness of metaphors in biology is illustrated in Fox Keller, E. (1995). It is important to try to assess the impact of this loose terminology on the future artefacts that will be manufactured. In particular, the machine metaphor may express a deep change in the relations between nature and artefact that would consequently affect patent policy.
7 Dupuy, J.-P. (2000) p. 7
9 The Royal Society and the Royal Academy of Engineering, 2004.
10 This broad meaning of nanoscience is in stark contrast with Joachim’s narrow definition of nanoscience. Joachim, C. (2005)
11 This definition was used by George Whitesides in his criticism of Drexler. Whitesides, G. (2001)
12 Most nanoscientists do not care about the difference between a machine and a device, even though many of them emphasize that the goal is actually to make a single molecule perform a specific task.
13 Jerome Taylor (ed.), The Didascalicon of Hugh of Saint Victor, New York, Columbia University Press, 1981, pp. 55-56; quoted from Newman, W. (1989), p. 424.
14 The American Heritage Dictionary of the English Language, edited by William Morris, Boston, Houghton Mifflin, 1978; quoted in Drexler, E. (1986), p. 5.
15 See Canguilhem, G. (1979), “Le tout et la partie dans la pensée biologique”, and his distinction between the technological model and the political model (the whole is also at the service of the parts: the entire organism contributes to the life of the cells).
19 Von Neumann’s talk was not the sole attempt at the Hixon Symposium to introduce the notion of complexity against McCulloch’s constructive approach. The neurophysiologist Karl Lashley and the embryologist Paul Weiss also argued that the brain was not a computing machine but rather a continuous field with emergent features. Although Lashley’s and Weiss’s approach to the nervous system was clearly antireductionist (complex totalities being irreducible to their components), it was not holistic either: complex totalities are neither reducible to the properties of their parts nor Leibnizian monads whose unity is substantial. Between reductionism and holism, between nominalism and substantialism, the theory of complexity offered a third model of whole/parts relations. Unlike Von Neumann, however, Lashley and Weiss drew a sharp boundary between living and non-living beings. For them, complexity was the exclusive property of biological systems, whereas Von Neumann assumed that complexity could be embedded in artificial automata.
20 Dupuy, J.-P. (2000) p. 142
21 Dupuy, J.-P., Grinbaum, A. (2004) p. 8
22 Drexler did his PhD at MIT in Marvin Minsky’s laboratory; Minsky in turn had been supervised as a doctoral student by Von Neumann. Minsky wrote a preface for Engines of Creation in 1986.
24 Drexler (1986) p. 5
25 Drexler, E. (1986) p. 12. See also Drexler, E. (2001), p. 74.
26 Drexler, E. (1986) p. 10.
27 Drexler, E. (1986) p. 12. See also Drexler, E. (2001), p. 74.
28 Drexler, E. (1986) p. 17.
29 Drexler, E. (1986) p. 56.
30 Descartes (1637), Discours de la méthode , 5th section.
31 Drexler, E. (1986) p. 102
32 Drexler, E. (1986) p. 14
33 Drexler, E. (1986) p. 56: “Some of these replicators will not resemble cells at all, but will instead resemble factories shrunk to cellular size. They will contain nanomachines mounted on a molecular framework and conveyor belts to move parts from machine to machine. Outside they will have a set of assembler arms for building replicas of themselves, an atom or a section at a time”.
34 Drexler, E. (1986) p. 54
35 Drexler, E. (1986) p. 121
36 Phoenix, C., Drexler, E. (2004)
37 Daniel P. Thurs and Stephen Hilgartner rightly noted that the threat of the expansion of the grey goo is the mirror image of the threat of an uncontrolled public opinion – like the Luddites or the opponents of GMOs refusing new technologies. See Conference on Nano-Ethics, South Carolina, March 2005.
38 See the articles by Richard Smalley, George Whitesides and Robert Buderi in Scientific American, September 2001; Chris Phoenix, “Of Chemistry, Nanobots and Policy”, Center for Responsible Nanotechnology, December 2003.
39 When Whitesides asked “What is a machine?”, he contented himself with a very traditional answer: “A machine is a device for performing a task”. It has “a design, it is constructed following some process, it uses power, it operates according to information built into it when it is fabricated”. [Whitesides, 2001, p. 78]
40 Ball, P. (2002), p. 16
41 Jones, R. (2004), p. 85
42 See Jones’ Soft Machines blog, entry of Wednesday, June 29th, 2005: “Debating the feasibility of nano-manufacturing”.
43 Jones, R. (2004), pp. 2-3
44 Jones, R. (2004), pp. 6-7
45 Jones, R. (2004), pp. 56-86
46 Jones, R. (2004), p. 86
47 Jones, R. (2004), p. 127
48 Jones, R. (2004), p. 127
50 http://www.me.cmu.edu/faculty1/sitti/nano/index.html; Le Monde, Wednesday 15 September 2004, p. 25.
51 As an example of the use of hydrogen bonds see: Leigh, D.A., Wong, J.K.Y., Dehez, F., Zerbetto, F. (2003). For an overview of molecular motors see ?
52 Joachim, C. et al., Science, 281 (1998), 531-533.
53 For a historical sketch of molecular electronics see Joachim, C., Gimzewski, G., Aviram, A. (2000)
54 Personal interview, Toulouse, February 15, 2005.
55 The “power station” fuelling “living motors” is ATP synthase. It provides the chemical energy that proteins transform into mechanical energy for cellular locomotion, division, maintenance and intracellular traffic.
56 This shared assumption is noticed by the anonymous editorialist of “Why small matters”, Nature Biotechnology, 21, number 10 (October 2003), p. 1113. The research program conducted by the Curie Institute in Paris on myosins, aiming at unveiling their atomic structure with the help of X-ray crystallography, exemplifies the assumed path from structure to functions.
58 For instance Ball, P. (2002)
59 Vogel, S. (1998), in particular chapter 14 on the contrasts between nature and technology.
62 Boncheva, M., Whitesides, G. (2005) p. 736
63 Whitesides, G. (1995) p. 146