Calvin, W.H., 1997; The Six Essentials? Minimal Requirements for the Darwinian Bootstrapping of Quality.
Journal of Memetics - Evolutionary Models of Information Transmission, 1.
http://www.fmb.mmu.ac.uk/jom-emit/1997/vol1/calvin_wh.html



THE SIX ESSENTIALS?
Minimal Requirements for the
Darwinian Bootstrapping of Quality

William H. Calvin

University of Washington
Seattle WA 98195-1800 USA
WCalvin@U.Washington.edu
http://www.WilliamCalvin.com

Abstract: Selectionism emphasizes the carving of patterns, and memes remind us of minimal replicable patterns, but a full-fledged Darwinian process needs six essential ingredients to keep going, to recursively bootstrap quality from rude beginnings. There may be situations ("sparse Darwinism") in which a reduced number suffice, and another five ingredients, while not essential, greatly enhance the speed and stability of a Darwinian process. Although our best examples are drawn from species evolution, the immune response, and evolutionary epistemology, the Darwinian process may well be a major law of the universe, right up there with chemical bonds as a prime generator of interesting combinations that discover stratified stabilities.


Since Richard Dawkins' The Extended Phenotype got me to thinking about copying units in the mid-1980s, I have been trying to define a cerebral code (the spatiotemporal firing pattern that represents a word, image, metaphor, or even a sentence) by searching for what can be successfully replicated in the brain's neural circuitry, a minimum replicable unit.
      I indeed found such circuitry (it implies that the firing pattern within several hundred minicolumns of neocortex, contained in a 0.5 mm hexagon, is such a copying unit). But to explore creativity in higher intellectual function, I wanted to see if the resulting copies could compete in a Darwinian manner, the process shaping up quality as it goes. And that forced me to try to boil down a lot of evolutionary biology, attempting to abstract the features that were essential (for what I came to call "the full-fledged Darwinian process") from those that merely contributed to speed or stability.
      This isn't the place to describe the neural outcome -- it's in The Cerebral Code and, more briefly, in the seventh chapter of my other 1996 book, How Brains Think -- but this does seem an appropriate place to review what I started calling "The Six Essentials." They seem applicable to a wide range of problems within memetics [9] as the field attempts to cope with evolutionary models of information transmission. For a more general history of memetics, see the useful bibliographies [22] of McMullin, Speel, and Wilkins; I will only mention a few (mostly cautionary!) contributions from neuroscience along the way.


Selectionism

Looking back into the history of biology, it appears that wherever a phenomenon resembles learning, an instructive theory was first proposed to account for the underlying mechanisms. In every case, this was later replaced by a selective theory. Thus the species were thought to have developed by learning or by adaptation of individuals to the environment, until Darwin showed this to have been a selective process. Resistance of bacteria to antibacterial agents was thought to be acquired by adaptation, until Luria and Delbrück showed the mechanism to be a selective one. Adaptive enzymes were shown by Monod and his school to be inducible enzymes arising through the selection of preexisting genes. Finally, antibody formation that was thought to be based on instruction by the antigen is now found to result from the selection of already existing patterns. It thus remains to be asked if learning by the central nervous system might not also be a selective process; i.e., perhaps learning is not learning either.
Niels K. Jerne, 1967

The term "selectionism" covers a wide range of cases, ranging from fancy biology with sexual selection to examples that are called "selective survival" because they lack any notion of replication. Brain development offers many examples of this simple end of selectionism.

If the removal of connections or cells is carried too far (a common problem in carving wood blocks for printing), there may be no way back (unless, of course, unmodified copies survive elsewhere). Selective strengthening of interconnections, in the face of a culling process, probably accounts for most neural examples of selectionism.
     Quality can emerge from such carving or selective strengthening. For example, the perception of a speech sound involves the creation of a prototype category that standardizes the meaning despite a range of variations, and selective survival of synaptic connections within a neural feltwork is thought to contribute to categorical perception. But -- and this is the important point for memetics -- there is nothing recursive about this type of quality enhancement.
     Which of the selectionist examples should also be called "Darwinian"? I won't review the Darwinian claims except to note that, if we are to blame anyone for the frequent confusion of selective survival with the full Darwinian process, we would have to start with Charles Darwin himself, who named his multipart theory (more in a minute) to emphasize one particularly novel aspect: Natural Selection.
     I don't want to seem to be prescribing what's "Darwinian" and what isn't, but I think that we must be cautious about ascribing recursive bootstrapping of quality (what I take to be the heart of the matter, what makes evolution so interesting) to any process that has only a few elements of the process that Darwin and his followers have worked out over the last 160 years. Simulations may eventually demonstrate a semidemihemiquasi-Darwinian bootstrap, another self-organizing process that gets better and better -- but, until the capacities of sparse solutions are well demonstrated, caution is in order.

Sparse Darwinian Possibilities

There are two "halfway houses" which may prove to be more interesting than environmental carving of patterns. First, since brain development (to continue the earlier story) is never really over (it just slows down, and the gene repertoire may shift), and since new synapses may form during adulthood, one is initially reminded of a biological population with replacement and growth -- and Darwinian shaping up. But observe that there isn't a pattern being replicated with variations; there isn't a population of such patterns competing with other patterns, etc. -- which is what population usually means, not merely a number of constituents comprising the pattern being carved.
     While Gerald Edelman's "population" (in his 1987 book, Neural Darwinism; see my book review in Science [7]) lacks patterned individuals, he goes beyond selectionist carving in an interesting, nontraditional way: he has a notion of interacting maps that shape up one another in a manner rather like the sometimes creative back-and-forth interactions between author and editor (my analogy, not his -- as is my perhaps shopworn name for it, revisionism). I have a difficult time identifying either an individual unit or a distinctive copying mechanism for it in Edelman's lots-of-neurons notion of a "population," even if his reentrant loops are reminiscent of generations. His differential amplification via reentrant loops, however, is undoubtedly an important process (I particularly like it for the consolidation of episodic memories [20]).
     On closer inspection, neither developmental patterning nor Edelman's reentry fits my concept of Darwin's process. Populations -- in ecology and evolutionary biology and immunology -- usually involve lots of patterned individuals somehow making near copies of themselves, all present at the same time, interacting with one another and with the environment.
     Yet analogies always leave something out; we don't expect them to be perfect fits, exactly the same thing. As the poet Robert Frost once said, we have to know how far we can ride a metaphor, judge when it's safe. That's exactly our problem in memetics, and why Edelman's notions have proved controversial. When, then, are we forced to ascribe, to a candidate such as developmental patterning or reentry, the potential for the recursive bootstrapping of quality that we associate with Darwin's process, which we regularly see operating on the biological species and the antibody?
      To approach an answer to that question, it will be useful to enumerate what has contributed to Darwin's process, while trying to strip it of the biological particulars -- and then ask how well it could limp along with a reduced number of components (what I've started calling "sparse Darwinism").


The Full-fledged Darwinian Process

The six essentials aren't a settled issue. What I was aiming for, however, were the essential ingredients of an algorithmic quality-improvement process [20], stated in a way that didn't impose a lot of biological preconditions. I wanted, for example, to avoid making use of the genotype-phenotype distinction [21], or a universal translation table like the genetic code; I wanted to describe a process, not make an analogy. John Holland's computational technique [10] known as the "genetic algorithm" comes close to what I had in mind, but Holland was trying to mimic recombination genetics in a computational procedure for discovering solutions, and I wanted to abstract more general principles that avoided the presumption of recombination.
     Since many of us think that (properly defined) the Darwinian process is a major law of the universe, right up there with chemical bonds as a prime generator of interesting combinations that discover stratified stabilities [2], we want it to be able to run on different substrates, each with its own distinctive properties that may, or may not, correspond to those seen elsewhere. So our abstraction should fit the species evolution problem, as well as the immune response, but also be independent of media and time scale. Here, paraphrased from The Cerebral Code, is what I ended up with:

    1. There must be a pattern involved.

    2. The pattern must be copied somehow (indeed, that which is copied may serve to define the pattern). [Together, 1 and 2 are the minimum replicable unit -- so, in a sense, we could reduce six essentials to five. But I'm splitting rather than lumping here because so many "sparse Darwinian" processes exhibit a pattern without replication.]

    3. Variant patterns must sometimes be produced by chance -- though it need not be purely random, as another process could well bias the directionality of the small sidesteps that result. Superpositions and recombinations will also suffice.

    4. The pattern and its variant must compete with one another for occupation of a limited workspace. For example, bluegrass and crabgrass compete for back yards. "Limited" means that the workspace forces choices, unlike a wide-open niche with enough resources for all to survive. Observe that we're now talking about populations of a pattern, not one at a time.

    5. The competition is biased by a multifaceted environment: for example, how often the grass is watered, cut, fertilized, and frozen, giving one pattern more of the lawn than another. That's Darwin's natural selection.

    6. New variants always preferentially occur around the more successful of the current patterns. In biology, there is a skewed survival to reproductive maturity (environmental selection is mostly juvenile mortality) or a skewed distribution of those adults who successfully mate (sexual selection). This is what Darwin later called an inheritance principle. Variations are not just random jumps from some standard starting position; rather, they are usually little sidesteps from a pretty-good solution (most variants are worse than a parent, but a few may be even better, and become the preferred source of further variants).

Neural patterning in development is a sparse case: just a pattern and a multifaceted environment. There is no replication of the pattern, no variation, no population of the pattern to compete with a variant's population, and there's nothing recursive about achieving quality because there's no inheritance principle.
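
For readers who want the abstraction in runnable form, here is a minimal sketch in Python. It is only an illustration of my own devising, not a model of the neocortical circuitry: the bit-string "pattern," the fixed target standing in for the environment's many facets, and the workspace of fifty are arbitrary stand-ins. Each numbered comment marks the corresponding essential.

    import random

    WORKSPACE = 50                     # essential 4: a limited workspace forces choices
    PATTERN_LENGTH = 20                # essential 1: the "pattern" is just a string of bits here
    TARGET = [1] * PATTERN_LENGTH      # arbitrary stand-in for the environment's facets

    def fitness(pattern):
        # essential 5: a multifaceted environment biases the competition; here the
        # facets collapse to how many positions happen to match the target
        return sum(1 for a, b in zip(pattern, TARGET) if a == b)

    def copy_with_variation(pattern, rate=0.05):
        # essentials 2 and 3: copying, with occasional chance variation
        return [bit if random.random() > rate else 1 - bit for bit in pattern]

    def generation(population):
        # essential 6: new variants are preferentially made from the more successful
        # of the current patterns (the inheritance principle)
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: WORKSPACE // 5]
        return [copy_with_variation(random.choice(parents)) for _ in range(WORKSPACE)]

    population = [[random.randint(0, 1) for _ in range(PATTERN_LENGTH)]
                  for _ in range(WORKSPACE)]
    for gen in range(100):                      # populations of the pattern, not one at a time
        population = generation(population)
    print(max(fitness(p) for p in population))  # quality ratchets upward, generation by generation

The fixed workspace size is what makes essential 4 bite: a new variant can only enter by displacing something already there.
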


Example: History as a Darwinian Process

History qua history -- what it includes, what it leaves out, and how these change over time -- provides us with a memetic example of these six essentials at work. Of the many happenings, some are captured in patterned sentences that describe who did what to whom, why, and with what means.

     Some of these patterns are retold (copied), often with little confusions (variation) and conflations (superpositions). Alternative versions of stories compete for the limited space of bookstore shelves or the limited time of campfire storytelling. There is a multifaceted environment that affects their success: the association of the described events with those of everyday life. In particular, the environment contains mental schemas and scripts; as Aristotle noted and all four-year-olds demanding bedtime stories seem to know, a proper narrative has a beginning, middle, and end -- and so "good stories" fare much better in the memorized environment. (Especially those conveyed by historical novels that strengthen the narrative aspects!) Finally, because historians rewrite earlier historians, we see Darwin's inheritance principle in action: new variations are preferentially based on the more successfully copied of the current generation of historical stories, and so history drifts toward better and better fits to our language instincts (such as chunking and narrative), because current relevance is shifting and ephemeral.
After many generations, only those stories of timeless relevance are left alongside the likely-ephemeral contemporary ones.
      Quality emerges, in some sense, as in the way that the nine-part epic tales studied by Misia Landau (youth sets out on a quest, fails, returns, sets out again with a helper, survives a new set of trials and tribulations, finally succeeds and returns home, and so on [12]) seem to have emerged in many cultures from the retelling of simpler narratives, generation after generation. Our modern origin stories, the anthropological scenarios about human ancestors during the tribulations of the ice age climate changes, are even said to follow the epic template!
     Can history, as we know it, run on a reduced set -- say, without the inheritance principle? (Imagine storytellers always reviewing a videotape before telling the story again, so variations were always done from an unchanging "standard version.") Certainly, a pattern that copied and varied, with retelling biased by resonances with the current population's memories, would be impressive -- but anchoring the center of variation to the standard version would keep stories from drifting very far and prevent the recursive bootstrapping of quality.
     Suppose that, instead of eliminating inheritance, we loosened the environmental influence -- say, individuals' memories [20] for unique episodes faded within a year. The often-told tales would simply drift, adapting to current concerns, losing those of the antepenultimate generation. It would be about like the whale songs that drift from one year to the next. What you would lose, lacking a good memorized environment that persisted a lifetime to overlap several generations, would be the shaping up of quality (those timeless stories with universal relevance, the resonance with episodes recurring only twice in a lifetime, and so forth).
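
In terms of the earlier Python sketch (whose definitions the lines below reuse), both knock-outs amount to a line or two; again, these are only illustrations with the same arbitrary stand-ins.

    # Knock-out 1: no inheritance principle. Variants are always made from a single
    # unchanging "standard version," so nothing accumulates across the generations.
    STANDARD = [random.randint(0, 1) for _ in range(PATTERN_LENGTH)]

    def generation_without_inheritance(population):
        return [copy_with_variation(STANDARD) for _ in range(WORKSPACE)]

    # Knock-out 2: an ephemeral environment. The target itself drifts each generation
    # (call TARGET = drift(TARGET) inside the generation loop), so the population merely
    # tracks current concerns instead of converging on anything of timeless relevance.
    def drift(target, rate=0.2):
        return [bit if random.random() > rate else 1 - bit for bit in target]
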
     My first "knock-out mutation" sounds, of course, like what we try to train scholars to do ("Avoid secondary sources! Read the original!"), while my second is merely an exaggerated version of the ahistoricism of preliterate societies16 (the Navajo emigrated from the Yukon to the American Southwest about 500 years ago, but this great migration has been lost to them, recovered only through a linguistic and genetic analysis of the Athabascan peoples). History, however, is not merely the retention of facts: it involves detecting patterns and attempting to understand them -- and this involves making good guesses and refining them. That intellectual endeavor is, I suggest in How Brains Think, another full-fledged Darwinian process.
      Competition between concepts is, of course, one of the ways in which science advances; evolutionary epistemology [13] treats this as a Darwinian process. The advance of science differs from ordinary history because the environment biasing the competition between concepts involves a broad range of testing against reality.

Nonessentials: Catalysts and Stabilizers

There are another five features that, while not essential, do notably influence the rate of evolutionary change:

    7. Stability may occur, as in getting stuck in a rut (a local minimum -- or maximum -- in the adaptational landscape). Variants happen, but they're either nonviable or backslide easily.

    8. Systematic recombination (crossing over, sex) generates many more variants than do copying errors and the far-rarer point mutations. Or, for that matter, nonsystematic recombination such as bacterial conjugation or the conflation of ideas.

    9. Fluctuating environments (seasons, climate changes, diseases) change the name of the game, shaping up more complex patterns capable of doing well in several environments. For such jack-of-all-trades selection to occur, the climate must change much faster than efficiency adaptations can track it (more in a minute).

    10. Parcellation (as when rising sea level converts the hilltops of one large island into an archipelago of small islands) typically speeds evolution. It raises the surface-to-volume ratio (or perimeter-to-area ratio) and exposes a higher percentage of the population to the marginal conditions on the margins.

    11. Local extinctions (as when an island population becomes too small to sustain itself) speed evolution because they create empty niches. The pioneers that rediscover the niche get a series of generations with no competition, enough resources even for the odder variants that would never grow up to reproduce under any competition. For a novel pattern, that could represent the chance to "establish itself" before the next climate change, for which it might prove better suited than the others.

There are also catalysts acting at several removes, as in Darwin's example of how the introduction of cats to an English village could improve the clover in the surrounding countryside: The (i) cats would (ii) eat the mice that (iii) attack the bumblebee nests and thus (iv) allow more flowers to be cross-pollinated. (You can see why I call these the "Rube Goldberg Variations.")
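
Some of these catalysts can also be grafted onto the earlier Python sketch in a few more lines. The sketch below reuses the earlier definitions and only illustrates recombination (#8) and a fluctuating environment (#9); the two "seasonal" targets and the ten-generation season are arbitrary stand-ins of my own, not derived from any biological particulars.

    def recombine(mother, father):
        # catalyst 8: crossing over generates far more variants than point mutation alone
        cut = random.randrange(1, PATTERN_LENGTH)
        return mother[:cut] + father[cut:]

    half = PATTERN_LENGTH // 2
    SUMMER = [1] * PATTERN_LENGTH                        # two environments that disagree
    WINTER = [1] * half + [0] * (PATTERN_LENGTH - half)  # about the second half of the pattern

    def seasonal_fitness(pattern, season):
        # catalyst 9: the environment changes the name of the game from season to season
        target = SUMMER if season % 2 == 0 else WINTER
        return sum(1 for a, b in zip(pattern, target) if a == b)

    def generation_with_catalysts(population, season):
        ranked = sorted(population, key=lambda p: seasonal_fitness(p, season), reverse=True)
        parents = ranked[: WORKSPACE // 5]
        return [copy_with_variation(recombine(random.choice(parents), random.choice(parents)))
                for _ in range(WORKSPACE)]

    for gen in range(100):
        population = generation_with_catalysts(population, season=gen // 10)
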

The Augmented Darwinian Set

Although a Darwinian process will run without these catalysts, using Darwinian creativity often requires some optimization for speed. In the behavioral setting I analyze in my two 1996 books, quality must be achieved within the time span of thought and action.
     A lack of accelerating factors is the problem in what the French call l'esprit de l'escalier (staircase wit): finally thinking of a witty reply, but only after leaving the party. Some accelerating factors are almost essential in mental darwinism, simply because of the time windows created by fleeting opportunities, and so this "augmented Darwinian set" may also prove to be important for other memetic applications of the universal Darwinian process.


DISCUSSION

I am proposing a standard Darwinian set (six ingredients, in my formulation), with nonstandard cases often described via the sparse and augmented sets. As with Edelman's reentry, some cases may both be sparse and have a novel feature like revisionism (mixed cases). I was delighted to discover that my (neocortical circuitry) candidate process was capable of implementing not only all six essentials but stability and the four catalysts as well.
     At what point can we carry over the traditional implications of the best-studied case, the species-evolution Darwinian process, to a candidate process? My present answer would be: When the six essentials are present, and no obvious stability or relative-rate issue seems to be precluding "progress," we are then entitled to predict that our candidate process is capable of repeatedly bootstrapping quality.
     The extent of "coverage" of memetic theories varies widely. For example, I was able to spend much of my last chapter of The Cerebral Code discussing the Darwinian implications of minor circuit malfunctions for a broad range of pathophysiology such as seizures, hallucinations, delusions, amnesia, déjà vu, and so forth. My point is that candidate processes in other memetic fields are also likely to be judged by similar nonevolutionary considerations, so we must remember that possessing the six essentials is only a "threshold" consideration, mostly relevant to the sorts of quality that can be bootstrapped -- and for how long that improvement can continue.

Stratified Stability and Relative Rates

One coverage issue that seems relevant, however, is whether new levels of organization emerge from the candidate evolutionary process. Can, for example, a candidate process form categories? Can it progress to evolving analogies [4] or metaphors? Are these new levels ephemeral, or stable for a while?
      Jacob Bronowski spoke of "stratified stability" and observed [2], "The stable units that compose one level or stratum are the raw material for random encounters which produce higher configurations, some of which will chance to be stable.... Evolution is the climbing of a ladder from simple to complex by steps, each of which is stable in itself." Does the process self-limit when reaching an angle of repose [18], so that piling on another layer is self-defeating? Does the process backslide in a catastrophic manner, requiring something like the Weismannian barrier between genotype and phenotype [21] to provide a ratchet?
     Relative rates play an important role in any process involving change, one that can trivialize or magnify. Relative rates of expansion are the major principle underlying most household bimetallic-layer thermostats, and the same principle is familiar in development (the way curved surfaces are made is to have two sheets of cells in contact, one growing faster than the other). And we've already seen two examples here: the history example, where episodic memories faded quickly compared to generation times and lifespans, and again in #9, where climate changes had to be much faster than adaptations could track if jack-of-all-trades abilities were to accumulate in the face of competition from lean, mean machines.

Repackaging the Essentials

Someone will surely try to condense my six essentials to a phrase more memorable than "a pattern that copies with occasional variation, where populations of the variants compete for a limited workspace, biased by a multifaceted environment, and with the next round of variations preferentially done from the more successful of the current generation." Indeed, Alfred Russel Wallace did a pretty good job back in 1875 ("...the known laws of variation, multiplication, and heredity... have probably sufficed....") [14].
     It's just that I make explicit the pattern, the workspace competition between populations, and the environmental biases. As noted in #2, I'm trying to avoid lumping where I know that splitting is going to be required later, to deal with some important partial cases. Wallace shows us that only three items cover a lot of essential ground, and there are surely other profitable ways to split and lump, if context allows the inference of other factors. A list of essentials -- at least one that aspires to some universality -- can't omit the context, can't skip over the potential confusions. Bronowski once observed [1] that, even if six sentences might serve to sum up one of his lectures, the rest of the hour was really essential for disambiguating the meaning of those summary sentences. The name of the game here isn't compression but abstraction, an abstraction that is just general enough to cover a number of situations that differ widely in media and time scale -- but not so abstract as to lose important associations.
     Of course, all the definition in the world can be upset by one little existence proof, a simulation of a self-organizing quality bootstrap that runs on a different set of principles. Until then, we are simply trying to clarify our thinking about the creation-of-quality process we know best, the one first stumbled upon by Charles Darwin.



Notes and Bibliography
  1. Jacob Bronowski, The Origins of Knowledge and Imagination (Yale University Press 1978, transcribed from 1967 lectures), p. 105.
  2. Jacob Bronowski, The Ascent of Man (Little, Brown 1973), pp. 348-349. In introducing stratified stability, Bronowski says: "Nature works by steps. The atoms form molecules, the molecules form bases, the bases direct the formation of amino acids, the amino acids form proteins, and proteins work in cells. The cells make up first of all the simple animals, and then the sophisticated ones, climbing step by step. The stable units that compose one level or stratum are the raw material for random encounters which produce higher configurations, some of which will chance to be stable.... Evolution is the climbing of a ladder from simple to complex by steps, each of which is stable in itself."
  3. William H. Calvin, How Brains Think: Evolving Intelligence, Then and Now (BasicBooks 1996).
  4. William H. Calvin, The Cerebral Code: Thinking a Thought in the Mosaics of the Mind (MIT Press 1996). In searching for Hebb's cell assembly (a committee of neurons whose firing represents a color, word, thought, etc.), I looked for neural circuitry in the brain that was capable of copying spatiotemporal firing patterns. The top layers of the newer parts of cerebral cortex indeed have recurrent excitatory circuitry that should produce synchronized triangular arrays of neurons which extend themselves over a few centimeters, or recruit distant populations via corticocortical pathways. Collections of such arrays constitute a spatiotemporal pattern with great redundancy. The smallest "tile" in this mosaic, which has no redundancy within it, is hexagonal in shape and about 0.5 mm across; it probably has a few hundred independent elements. This is a minimal replicable unit (and a candidate for a cerebral code for, say, a word or concept); a hexagonal mosaic of it can compete with another pattern's hexagonal mosaic for territory in association cortex.
  5. Richard Dawkins, "Selective neurone death as a possible memory mechanism," Nature 229:118-119 (1971).
  6. Richard Dawkins, The Extended Phenotype (Oxford University Press 1982).
  7. Gerald M. Edelman, Neural Darwinism (BasicBooks 1987), and Bright Air, Brilliant Fire (BasicBooks 1992, especially chapter 9 where he addresses his critics). The book review is: William H. Calvin, "A global brain theory," Science 240:1802-1803 (24 June 1988).
  8. Robert Frost, in Selected Prose of Robert Frost, edited by H. Cox and E. C. Lathem (Collier 1986), pp. 33-46.
  9. Douglas R. Hofstadter, Metamagical Themas (BasicBooks 1985).
  10. John Holland, Adaptation in Natural and Artificial Systems, revised edition (MIT Press 1992).
  11. Niels K. Jerne, "Antibodies and learning: Selection versus instruction," in The Neurosciences: A Study Program, edited by G. C. Quarton, T. Melnechuk, & F. O. Schmitt (Rockefeller University Press 1967), pp. 200-205 at p. 204.
  12. Misia Landau, "Human evolution as narrative," American Scientist 72:262-268 (May/June, 1984).
  13. Ernst Mayr, This is Biology (Harvard University Press, 1997), p.99.
  14. Alfred Russel Wallace, "The limits of natural selection as applied to man," chapter 10 of Contributions to the Theory of Natural Selection (Macmillan 1875).
  15. John Z. Young, A Model of the Brain (Clarendon Press 1964). His "The organization of a memory system," Proceedings of the Royal Society (London) 163B:285-320 (1965) introduces the mnemon concept in which weakened synapses serve to tune up a function. A later version is his "Learning as a process of selection," Journal of the Royal Society of Medicine 72:801-804 (1979). The modern chapter of mental darwinism starts in 1965 with Daniel C. Dennett's D. Phil. thesis, published as Content and Consciousness (Routledge and Kegan Paul 1969), but it all began more than a century ago with William James, "Great men, great thoughts, and the environment," The Atlantic Monthly 46(276):441-459 (October 1880).
  16. Ahistoricism is discussed in the first chapter of my book, The River That Flows Uphill. But Loren Eiseley said it best: "Man without writing cannot long retain his history in his head. His intelligence permits him to grasp some kind of succession of generations; but without writing, the tale of the past rapidly degenerates into fumbling myth and fable. Man's greatest epic, his four long battles with the advancing ice of the great continental glaciers, has vanished from human memory without a trace. Our illiterate fathers disappeared and with them, in a few scant generations, died one of the great stories of all time."
  17. An algorithm is a simple systematic procedure for solving a problem, usually involving repeated steps, e.g., long division (try multiplying the divisor by two and subtracting; if the remainder is too large, try 3, etc.).
  18. The angle of repose is a geological term for how steep-sided a pile can become before little landslides remove any further additions.
  19. Bootstrapping ("Pulling yourself up by your own bootstraps") is a term that describes the ability of simple patterns to bring forth more complicated ones. A familiar example is the start-up procedure for a computer: booting involves a simple program stored in a ROM chip that, in turn, reads and starts up the operating system from a hard disk -- which, in turn, starts up the application programs.
  20. Episodic memories are those of unique events, such as a particular conversation; they're much more difficult to maintain than memories of repeated events. Consolidation of memory is the often-fallible process of making short-term "volatile" memories into more permanent long-term memories with an enduring structural basis. I suggest, in chapter 6 of The Cerebral Code, that Edelmanian revisionism, between a hippocampal map and a cortical map, would be a useful way of "learning" (via repeated trials) what was originally only a unique episode and embedding it in neocortex. It is likely that, as in a photographic development process and in the manner of those three-million-year-old footprints that were preserved by volcanic ashfall (cement on the fly, preventing the usual erosion and backfilling), both culling and fixation are involved in biological memories.
  21. Weismann's genotype-phenotype distinction in biology is not a necessary condition for a Darwinian process, as recent experiments on "RNA evolution" have shown (there isn't a body that lives and dies, carrying the genes along, but rather patterns directly exposed to environmental selection). Envelopes such as bodies (phenotypes) are an example of stratified stability; they nicely illustrate why strict one-trait-at-a-time adaptationist reasoning is insufficient. Genes often live -- and die -- in a collection called an "individual," which means that success is often via particular combinations of traits.
  22. Memetic history is covered in a number of places; see, for example, the excellent webbed bibliographies of Barry McMullin, Hans-Cees Speel, and John S. Wilkins.

copyright ©1997 by
William H. Calvin