Austhink is a critical thinking research, training and consulting group specializing in complex reasoning and argumentation. 

The Red Queen and the Slingshot

Paul Monk on the history of the human mind

“…we can understand neither ourselves nor our world until we have finally understood what language is and what it has done for our species.”

- Derek Bickerton (1990) [i]

“There is no step more uplifting, more momentous in the history of mind design, than the invention of language. When Homo sapiens became the beneficiary of this invention, the species stepped into a slingshot that has launched it far beyond all other earthly species in the power to look ahead and reflect.”

-         Daniel Dennett (1996)[ii]

“Lengthening approach distances, as African herds evolved to become more wary since early [Homo] erectus days, would have gradually required more accurate throws (just double your target distance and it will become about eight times more difficult). This is the Red Queen Principle, as when Alice was told [in Through the Looking Glass] that you had to run faster just to stay in the same place. The Red Queen may be the patron saint of the Homo lineage.”

-         William Calvin (2004)[iii]

To borrow a phrase from Nietzsche, our species has built its cities on the slopes of Mount Vesuvius and sailed its ships into uncharted seas.[iv] We ‘live dangerously’. We feel amazed at the achievements, appalled by the savageries of our kind. We are, like Hamlet, awed at ourselves as the paragon of animals, but disgusted by ourselves, when we contemplate our defects and depravities.[v]

At the root of both our achievements and depravities is the human mind. It defines what we are. More precisely, it enables us to engage in the act of ‘defining’ (and redefining) things at all, ourselves included.[vi] But what is this thing we call ‘mind’? Where did it come from? That question has preoccupied shamans, priests, philosophers and cognitive scientists for millennia.

“The concept of something known as ‘mind’ and regarded as somehow problematic goes back to the dawn of recorded thought”, wrote linguist Derek Bickerton in 1990. “Were there two parallel realities, one physical and one not? Was one set of phenomena unreal?...Over the centuries, almost every possible answer to these questions has been espoused by someone. No solution seems to have worked, least of all the solution of claiming that there was no problem.”[vii]

The longstanding debates about an ‘apparent’ versus a metaphysical world and about the mind (or soul) and body showed certain constant themes, he suggested. Few people, however, “seem to have suspected that the mind-body problem and the problem of language origins might turn out to be related.”[viii] What if the very nature of language was such that it created both ‘mind’ and the metaphysical puzzles philosophers have wrestled with for thousands of years?

What was needed was a cognitive archaeology - an archaeology of the evolution of primate and hominid cognitive capacities and of the neurological foundations of language itself. No philosopher, from Plato to Heidegger, was equipped to undertake any such task. It has only become possible in recent decades, with rapid advances in both the cognitive and archaeological sciences. Those advances have gradually revealed the origins of the human mind within the natural order of things.[ix]

Colin Renfrew’s Archaeology and Language (1988), Bickerton’s Language and Species (1990), Merlin Donald’s Origins of the Modern Mind: Three Stages in the Evolution of Culture and Cognition (1991), Steven Mithen’s The Prehistory of the Mind: A Search for the Origins of Art, Religion and Science (1996), Donald Johanson and Blake Edgar’s From Lucy to Language (1996), Terence Deacon’s The Symbolic Species: The Co-Evolution of Language and the Brain (1997), and Ian Tattersall’s Becoming Human (1998) were all excellent syntheses of the ongoing scientific work.

William Calvin is at the forefront of this work.[x] In 2000, he co-authored a book with Bickerton called Lingua ex Machina: Reconciling Darwin and Chomsky with the Human Brain.[xi] In it, they argued that an adequate theory of what language is could not, with Chomsky, simply posit an innate language module and leave it at that. It had to supply an evolutionary pedigree for such a module. They offered the elements of just such a pedigree.

What they were looking for was the evolutionary origin of the neural circuitry that makes language possible at all. Their finding had to be consistent with several given realities. First, that hominids had evolved over an enormous period of time (some seven million years) compared to that in which Homo sapiens had overrun the biosphere (50,000 years). Second, that evolution works by mutation, adaptation (or exaptation)[xii] and accumulation, not through linear development or holistic, teleological design. Third, that hominids had had large brains for hundreds of thousands of years before there was any evidence of the kind of thinking we associate with language.

What all this meant, they suggest, is that the syntax which makes language what it is had to be an emergent property of a number of other neural and physiological capacities that had evolved over a very long period of time. “It might be like adding a capstone to an arch,” they suggested, “which permits the other stones to support themselves without scaffolding and, as a committee, they can defy gravity.” Their task, as they explained, was to conceive of “the scaffolding that could have initially put such a stable structure in place”.

In A Brain For All Seasons (2003), Calvin argued that abrupt climate flips and selection pressure on pre-sapient hominids to enhance their capacity for accurate throwing may have played crucial roles in the development of human brains and specifically in paving the way for language. The ‘scaffolding’ consisted of the physical and neurophysiological changes that such pressures induced in large-brained hominids, over the past two million years or more.

In his new book, A Brief History of the Mind, he sketches out not only these hypotheses about how the brain evolved a sapient mind, but also how that mind has since invented human culture - and the challenges it faces in the 21st century. It’s an extraordinarily ambitious project. He describes it as having been inspired by Stephen Hawking’s A Brief History of Time and David Fromkin’s The Way of the World, which attempted similarly ambitious tasks.[xiii]

Nothing in the book is more intriguing than Calvin’s reiteration of the hypothesis that selection pressures for accurate throwing could do much to explain the emergence of a capacity for syntax in the brain. Of the tasks facing pre-sapient hominids, he argues, “It is only the accurate throws that make a lot of demands on the brain compared with what the great apes can do.”[xiv] This is because accurate throwing requires elaborate coordination of many muscles and an intentional stance that can produce flexible variations of such coordination at any given instant.

This takes us back before language, looking at real, non-metaphysical processes which might have paved the way for it. In other words, it overcomes the mind-body problem by reconstructing a time when there was a human body without ‘mind’ as we know it, then showing how this ‘mind’ could have emerged within the body. Better still, it shows how language itself may have emerged, on an unforeseen basis, within a brain trying to cope with various other challenges.[xv]

After all, it is the appearance of syntax - the rules governing the construction of sentences - itself that needs to be explained in naturalistic terms, if we are to account for the emergence of language and with it ‘mind’ from among the primates. More precisely, it is structured thought that requires explanation. As Calvin puts it, “syntax is not the only type of structured thought, and perhaps not even the first to evolve. Other structured aspects of thought are multi-stage planning, games with rules that constrain possible moves, chains of logic, structured music and a fascination with discovering hidden order, with imagining how things hang together.”[xvi]

It’s vital to remember that Calvin is not saying all this was caused by accurate throwing alone. A set of conditions had to be in place before the ‘capstone’ of structured thought or syntax could be slipped into place. The archaeological evidence clearly indicates that early humans engaged in group hunting, camp-building, tool-making of a kind requiring joint attention and intentional instruction[xvii] and thus social interactions of a sophistication unknown among apes and monkeys.

All this, plus the physiological changes entailed in upright posture (including the freeing up of the throat and chest muscles and nerves for more subtle vocalizations)[xviii], the change of diet from vegetarian to high meat content (allowing for a shrinkage of the gut and a diversion of blood resources to the growing brain)[xix], and the development of the neocortex (which is all about forming new associations between things) within the brain, plainly were among these conditions.

Even so, how could hundreds of thousands of years of relentless work on accurate throwing have hauled the capstone up the scaffolding and dropped it into place, as it were? Calvin makes three basic claims. First, that just getting set for such a throw “involves a major amount of reassignment of cerebral resources, quite unlike most cognitive tasks.”[xx]

Second, that there is considerable neural overlap between hand/arm, oral/facial and linguistic queuing or ‘planning’ in our brains.[xxi] Third, that the more accurate casting of projectiles would have had both immediate, tangible pay-offs (“your family eats high calorie, non-toxic food for additional days of the month”[xxii]) and a long growth curve (if you keep improving, you get better returns).

This third claim prompts one of his most colourful analogies - the one I’ve used as an epigraph. As African herds became more and more wary of canny, biped hunters with projectiles, they would move away at any hint of human approach. The task of the hunters became progressively more demanding. Like Alice, in Through the Looking Glass, they had to run harder and harder just to stay in the same place.

More precisely, they had to throw more and more accurately, over greater and greater distances to keep getting their meat. Hence Calvin’s quip that Lewis Carroll’s Red Queen might well be considered the “patron saint of the Homo lineage” because ‘she’ put our kind on a treadmill which forced us to develop post-pongid neural processes. Far from keeping us in the same place, these finally launched us, in Daniel Dennett’s phrase, as if from a slingshot, into our strange and wondrous sapient condition.

However, we are very much ‘through the looking glass’ here, and you are, in all probability, still wondering how Calvin deduces that accurate throwing for two million years gave us art, agriculture, writing, philosophy and science within 50,000 years. He puts it like this: “The structuring that I have in mind for higher intellectual functions is not just a simple chain of events and intermediate products. It is more like a symphony - and that reminds me of another important symphony that the brain had been producing for a million years before intellect. Accurate throwing…involves a structured plan. Indeed, planning a throw has some nested stages, strongly reminiscent of syntax.”[xxiii]

To achieve the desired effect, the brain must coordinate about a hundred muscles, from the spine via the shoulder, elbow and wrist to the fingers, be able to pick exactly the right moment to cast and execute in an eighth of a second “in just the right way or …miss the target and go hungry.”[xxiv] Calvin offers the intriguing observations, based on this root, that “the notion of being impelled down a path is very strong in us”[xxv] and that many of our cognitive processes involve arbitrary frameworks of “allowed moves, against which possible moves must be checked before acting.”[xxvi]

That linguistic capacity took so long to develop from this root is less problematic than one might intuitively think. There are other cases of capability long antedating development. For example, as Calvin points out, “we were likely capable of structured music…just after the big transition (around 50,000 years ago) long before Western music got around to using it about a thousand years ago. Just because novel symphonies of hand-arm movement commands had been getting better and better for a million years doesn’t mean that secondary use for spoken language had to happen.”[xxvii]

 He does believe, however, that proto-language very likely existed long before the throwing ‘syntax’ was exapted to create full-blown language. What is unclear is when and to what extent such proto-language emerged - given that chimps, for example, from whom we diverged 7 million years ago, use various common sounds to register warnings as specific as ‘eagle’ and ‘snake’.[xxviii] Yet, by this account, (full-blown, syntactical) language did not emerge until relatively late in the development even of Homo sapiens.

What is clear is that, once it did emerge, it made us a formidable competitor for any other creature on the planet - as thousands of species have since found. The slingshot that Dennett understandably celebrates made us to other beasts what the cyborgs and rebellious robots of current science fiction are to us (in our imaginations) - uncanny beings of higher and masterful intelligence, entirely capable of exterminating us at will and even bent on doing so, for reasons of their own.

We are uncanny. We need to remind ourselves of that. Our greatest literature reflects an awareness of how strange we are - strangers in a strange world. The world is full of extraordinary things, Sophocles wrote, 2,400 years ago, but nothing is more uncanny than human beings.[xxix] That literature is itself the product of millennia of ‘slingshot’ momentum, though the oldest myths of pre-literate human beings already exhibit a sense that we are unusual and somehow acquainted with ‘gods’.

Calvin’s eighth through thirteenth chapters briefly trace the beginnings of this slingshot trajectory, including, in one chapter (the eleventh) of just twelve pages, how we ‘civilized ourselves’ by inventing agriculture, writing and ‘mind medicine’. If I have a complaint about his book, it is that here he might have taken a little more space to cover crucial and fascinating topics that we now need to understand clearly in order to grasp what we are.

I have in mind the origins of writing and mathematics, in particular, and the curious history of their highly uneven development. He does not really address the question of how writing was invented and deals only perfunctorily with its cognitive implications and limitations. He is plainly aware of the findings of the best scholarship, as he indicates by simply remarking that “writing developed from tax accounting about 5,200 years ago in Sumer”.[xxx] Given how immense a step this was, however, it merited at least a paragraph or two to spell out what that breakthrough entailed.

He does not discuss the rise of mathematics at all.[xxxi] This is a strange omission for a natural scientist. Without mathematics, we could not measure and analyze the trajectory of a stone fired from a slingshot, however well we might be able to fire it and however much we might marvel at the force so achieved. We have achieved altogether extraordinary things through the apprehension and application of mathematical ideas in the past few thousand years. They require explanation.

The twentieth-century physicist Eugene Wigner famously spoke of the “unreasonable effectiveness of mathematics in the natural sciences”, by way of arguing that mathematics could not be simply the invention of human thought.[xxxii] There are those who argue that it is, nonetheless, just such an invention. These are the so-called mathematical humanists, as distinct from the Platonists and formalists.[xxxiii] It is disappointing, even in a brief history of the mind, to find no reckoning with this important and profound subject.

This is all the more so because Calvin’s derivation of syntax from bodily motion and neural exaptation bears a decided resemblance to the argument of the mathematical humanists George Lakoff and Rafael Nunez that abstract mathematical ideas have arisen as a nested set of conceptual metaphors rooted in bodily experiences of spatial perception, tactile encounter, movement and containment. As they declare, “Mathematics is deep, fundamental and essential to the human experience. As such, it is crying out to be understood. It has not been.”[xxxiv]

What Calvin does do, on the other hand, is provide a sensible and accessible reflection on the cognitive roots of many of our confusions and failings. Even more importantly, in his final chapter, ‘The Future of the Augmented Mind’, he argues for a down-to-earth approach to coping with the defects of the mind we have acquired by natural selection - now that millennia of feedback, via our record systems, have enabled us to get some perspective on how it works.

In his characteristically plainspoken style, he comments, “much of the higher intellectual function seems half-baked; what you ordinarily see is a prototype rather than a finished, well-engineered product. Perfect you don’t get; not from Darwinian evolution. And the quality controls are spotty. But culture - especially education and medicine - can sometimes patch things up, if society works hard enough at it.”[xxxv]

He offers quite a catalogue of ‘bugs’ in the ‘prototype’ - far more and much better documented than are offered in the old Christian concept of ‘original sin’. These include our susceptibility to colourful rhetoric, our fateful capacity for working in really large-scale operations which overwhelm our emotional autonomy and critical faculties, our “compulsive search for coherence” often leading us to “finding hidden patterns where none exist”, and our susceptibility to being blindsided by our logic to the extent that we act on errors or prejudices with great conviction.[xxxvi]

Our memories are selective and unreliable, and our decision-making is easily swayed by the last thing to have made a vivid impression on us. Our intuitions about logic, probability and causation are powerful but flawed in numerous ways, with consequences that are magnified, not diminished, by our creation of complex social and cognitive ways of life. Our capacities for dissociation and deception, including even self-deception, complicate our affairs and all too often issue in cruelties and moral vacuities that have terrible consequences.

On top of all this, we have, as Carl Sagan expressed it, both created a civilization enormously dependent on science and technology, and “arranged things so that almost no-one understands” them - “a prescription for disaster.” The challenges we now face are such that unless we can invent better means for teaching critical thinking and scientific understanding and applying both to public policy, we are at risk of being overwhelmed by ‘crashes’ that will occur in part because of the breakneck and accelerating speed with which we are now traveling - on our uncanny (if not monstrous) slingshot trajectory.

Calvin believes we need to augment our cognitive abilities and aim to culturally ‘debug’ the prototype mind we inherited. His pivotal point is the claim that “Very little education or training is currently based on scientific knowledge of brain mechanisms…To imagine what a difference [such knowledge] could make, consider the history of medicine…One century ago, medicine was still largely empirical and only maybe a tenth had been modified by science…It is only a slight exaggeration to say that the transition from an empirical to a semi-scientific medicine has doubled life-span and reduced suffering by half.” He believes similar gains can be made in education.

Yet education at the beginning of the twenty-first century is “largely still empirical and only slightly scientific”, almost as much so as medicine was before 1800. “We know some empirical truths about education, but we don’t know how” good learning and advanced critical thinking skills “are implemented in the brain, and thus we don’t know rational ways of improving them.”[xxxvii] New information technologies and even genetic engineering could, he believes, enable us - if used wisely - to make gains in education similar to those made in medicine since 1800.

‘Wisely’ is, of course, the key word. For as long as we self-described ‘sapient’ human beings have wondered about the origins of the world and the ‘mind-body problem’, we have also sought wisdom for dealing with one another and the natural world. Many still seek such wisdom - and find a modicum of it - in the old religions of the Iron Age, or in some version of half-baked primitivism. We need a larger vision now, though. We need a universal cognitive humanism - a wisdom based not on platitude, sentiment or good old-time religion, but on a deep understanding of what makes us what we all are.

What Calvin’s little book offers is a doorway into such a larger understanding. If you seek wisdom for our time, this is a good place to start. If, of course, what you really seek is consolation, you may find it elsewhere. But the time has come when, in order to overcome our savageries, our defects and our depravities, we surely need to dig deeper, learn profoundly and find ways of thinking that serve us all better. Otherwise, we run an increasing risk of our hour strutting and fretting upon the stage ending as Hamlet ends: the stage strewn with corpses and the prince’s luminous mind snuffed out - after all too brief a history.


 

[i] Derek Bickerton Language and Species, University of Chicago Press, 1990, p. 257.

 

[ii] Daniel Dennett Kinds of Minds, Basic Books, Science Masters, New York, 1996, p. 147.

 

[iii] William H. Calvin A Brief History of the Mind: From Apes to Intellect and Beyond, Oxford University Press, 2004, p. 70.

 

[iv] Friedrich Nietzsche The Gay Science - With a Prelude in Rhymes and an Appendix of Songs, #283, translated, with commentary, by Walter Kaufmann, Vintage Books, New York, 1974, p. 228.

 

[v] William Shakespeare The Tragedy of Hamlet, Prince of Denmark, Act II, Sc. ii, ll. 304-320. The remarks are addressed by Hamlet to Rosencrantz and Guildenstern: “…I have of late, but wherefore I know not, lost all my mirth, foregone all custom of exercises: and indeed it goes so heavily with my disposition, that this goodly frame the earth seems to me a sterile promontory, this most excellent canopy the air, look you, this brave o’erhanging firmament fretted with golden fire, why it appeareth nothing to me but a foul and pestilent congregation of vapours…What a piece of work is a man, how noble in reason, how infinite in faculties, in form and moving how express and admirable; in action, how like an angel; in apprehension, how like a god: the beauty of the world, the paragon of animals: and yet to me, what is this quintessence of dust? Man delights not me, no, nor woman neither, though by your smiling you seem to say so. ” The mood is famously summed up in the words of the guardsman Marcellus, “Something is rotten in the state of Denmark.” Act I, Sc. iv, ll. 89-90.

 

In his Shakespeare: The Invention of the Human (Riverhead Books, 1998), Harold Bloom hailed Hamlet - not the play, but the character - as “too immense a consciousness for Hamlet” and “the leading Western representation of an intellectual” (p. 383). “The phenomenon of Hamlet,” he went on to remark, “…is unsurpassed in imaginative literature” (p. 384); he is “the most intelligent character in all of literature” (p. 388). He suggests that Shakespeare poured his own will and spirit into Hamlet, as into no other character. He then comments: “Whatever his precise relation to Shakespeare might have been, Hamlet is to other literary and dramatic characters what Shakespeare is to other writers: a class of one, set apart by cognitive and aesthetic eminence.” (pp. 412-13) “Hamlet’s linguistic scepticism coexists with a span and control of language even greater than Falstaff’s, because its range is the widest we have ever encountered in a single work…the copiousness of Hamlet’s language…utilizes the full and unique resources of English syntax and diction.” (p. 423). James Joyce’s Stephen Dedalus, in Ulysses, he reminds us, “scarcely distinguishes between Shakespeare and Hamlet.” (p. 429)

 

This raises the question of who, exactly, William Shakespeare was. For nothing we know about the grain merchant and bit part actor from Stratford has the slightest thing to do with Hamlet, prince of Denmark, or a capacity to write such a play, any more than the content of the sonnets shows even the faintest resemblance to the life of the man from Stratford. It is considerations such as these that led Joseph Sobran, in 1997, to argue that the man from Stratford was not the author of the plays and sonnets at all. His name had simply been appropriated as a literary pseudonym, a nom de plume, by Edward de Vere, the seventeenth Earl of Oxford, whose life matches that of Hamlet, that of the implicit author of the English history plays, as well as of the tragedies and comedies set in Italy and of the poet of the sonnets to an extraordinary degree (Alias Shakespeare: Solving the Greatest Literary Mystery of All Time, Free Press, New York, 1997, 311 pp).

 

Offered a chance to debate Sobran’s claim with nine other writers, in a special edition of Harper’s, in April 1999 (pp. 35-62), Harold Bloom wrote an hilarious, but utterly irrelevant screed, dismissing the Oxfordians, without in any way addressing their argument. Yet Sobran’s thesis is fascinating and, at a minimum, deserves a serious response.

 

[vi] One of the great, neglected classics of contemporary philosophy is Robert Brandom’s Making It Explicit: Reasoning, Representing and Discursive Commitment (Harvard University Press, 1994). It begins with a marvellous reflection on what makes us the kind of being that we are. It is too long to have incorporated into the essay, but too beautiful not to incorporate into its subtext. It reads as follows:

 

                “‘We’ is said in many ways. We may be thee and me. We may be all that talks or all that moves, all that minds or all that matters. Since these boundaries are elastic, we have a task of demarcation: telling who or what we are, distinguishing ourselves from the other sorts of objects or organisms we find in our world. Saying who we are can contract to an empty exercise in self-congratulation - a ritual rehearsal of the endless, pitiable disabilities of clockworks, carrots, cows and the clan across the river. Such a mean-spirited version of the demarcational enterprise is not forced on us by the way things are, however.

For what we are is made as much as found, decided as well as discovered. The sort of thing we are depends, in part, on what we take ourselves to be. One characteristic way we develop and make ourselves into what we are is by expressing, exploring and clarifying our understanding of what we are. Arbitrary distinctions of biology, geography, culture or preference can be and have been seized on to enforce and make intelligible the crucial distinction between us and them (or it). But philosophical thought is coeval with the impulse to understand ourselves according to a more principled, less parochial story - and so to be a more principled, less parochial sort of being.” (p. 3). It is tempting to quote much more, but the key point concerning the mind as that which defines us in defining itself is captured, I think, by these two opening paragraphs of Brandom’s master work.

 

[vii] There is a marvellous passage in Nietzsche, in which he attempted to summarise the history of these philosophical debates from Plato to nineteenth-century materialism. It seems worth reproducing here:

 

“How the ‘Real World at Last Became a Myth’

HISTORY OF AN ERROR

 

1.        The real world, attainable to the wise, the pious, the virtuous man - he dwells in it, he is it. (Oldest form of the idea, relatively sensible, simple, convincing. Transcription of the proposition ‘I, Plato, am the truth.’) [In the original German, Nietzsche plays on the meanings of truth (Wahrheit) and real world (wahre Welt)]

2.        The real world, unattainable for the moment, but promised to the wise, the pious, the virtuous man (‘to the sinner who repents’) (Progress of the idea: it grows more refined, more enticing, more incomprehensible - it becomes a woman, it becomes Christian…)

3.        The real world, unattainable, undemonstrable, cannot be promised, but even when merely thought of, a consolation, a duty, an imperative. (Fundamentally the same old sun, but shining through mist and scepticism; the idea grown sublime, pale, northerly, Königsbergian) [Nietzsche is alluding here to Immanuel Kant, who lived and died in the northern German city of Königsberg]

4.        The real world - unattainable? Unattained, at any rate. And, if unattained, also unknown. Consequently, also no consolation, no redemption, no duty: how could we have a duty towards something unknown? (The grey of dawn. First yawnings of reason. Cockcrow of positivism.) [By positivism here, Nietzsche seems to have meant empiricism, philosophy founded on observation and experiment].

5.        The ‘real world’ - an idea no longer of any use, not even a duty any longer - an idea grown useless, superfluous, consequently a refuted idea: let us abolish it! (Broad daylight; breakfast, return of cheerfulness and bon sens [good sense]; Plato blushes for shame; all free spirits run riot.)

6.        We have abolished the real world: what world is left? The apparent world perhaps?...But no! with the real world we have also abolished the apparent world! (Mid-day; moment of the shortest shadow; end of the longest error; zenith of mankind; INCIPIT ZARATHUSTRA.” [This is where Zarathustra begins. Zarathustra being the eponymous hero of Nietzsche’s famous work of philosophical fiction  - phi fi as distinct from sci fi? - Thus Spake Zarathustra, in which he used the name of the ancient Persian inventor of metaphysical dualism, Zarathustra, or Zoroaster, as the name of a character who systematically repudiates the dualist worldview.]

 

Friedrich Nietzsche Twilight of the Idols/The Antichrist, translated by R. J. Hollingdale, Penguin, 1968 (1982) pp. 40-41.

 

[viii] Derek Bickerton Language and Species, University of Chicago Press, 1990, pp. 196-97.

 

[ix] My own first attempt to digest a lot of this was written up in an essay called ‘Christianity as Antiquity and the Cathedral of the Mind’, published in Quadrant, in December 1998 (pp. 35-41), as a Christmas reflection.

 

[x] For direct access to the range of Calvin’s work, including work in progress, see www.williamcalvin.com.

 

[xi] William H. Calvin and Derek Bickerton Lingua ex Machina: Reconciling Darwin and Chomsky with the Human Brain, MIT Press, 2000.

 

[xii] As Ian Tattersall puts it, “…perhaps the most important lesson we can learn from what we know of our own origins involves the significance of what has in recent years increasingly been termed ‘exaptation’. This is a useful name for characteristics that arise in one context before being exploited in another, or for the process by which such novelties are adopted in populations.” ‘How We Came to be Human’ Scientific American, December 2001, p. 43.

 

[xiii] William H. Calvin A Brief History of the Mind: From Apes To Intellect and Beyond, Oxford University Press, 2004, Preface, p. xiv. Hawking’s little book is famous, if only as something many people talk about, even buy, but never actually read. Fromkin’s history is surely less well known, though it may have been actually read by more people, being rather less daunting. He begins with a chapter headed ‘Becoming Human’ and it is not until several chapters later, looking at the dawn of the modern world, that he has a chapter headed ‘Achieving Rationality’. He is an Enlightenment man and a modernist and his account of our condition will not much please the devotees of post-modernism and deconstruction. The Way of the World: From the Dawn of Civilizations to the Eve of the Twenty-First Century, Vintage Books, New York, 1998.

 

[xiv] Ibid. p. 47.

 

[xv] This is a good context in which to reflect on Nietzsche’s famous polemic against ‘reason’, as it had been practiced within the limits of traditional philosophy, from Plato to Kant and beyond:

 

“You ask me about the idiosyncrasies of philosophers?…There is their lack of historical sense, their hatred of even the idea of becoming, their Egyptianism. They think they are doing a thing honour when they dehistoricize it, sub specie aeterni [from the viewpoint of eternity] - when they make a mummy of it. All that philosophers have handled for millennia has been conceptual mummies; nothing actual has escaped from their hands alive. They kill, they stuff, when they worship, these conceptual idolaters - they become a mortal danger to everything when they worship. Death, change, age, as well as procreation and growth, are for them objections, refutations even. What is does not become; what becomes, is not…Now they all believe, even to the point of despair, in that which is. But since they cannot get hold of it, they look for reasons why it is being withheld from them. ‘It must be an illusion, a deception which prevents us from perceiving that which is: where is the deceiver to be found?’ - ‘We’ve got it’, they cry in delight, ‘it is the senses! These senses, which are so immoral as well, it is they which deceive us about the real world. Moral: escape from sense deception, from becoming, from history, from falsehood - history is nothing but belief in the senses, belief in falsehood. Moral: denial of all that believes in the senses, of all the rest of mankind: all of that is mere ‘people’. Be a philosopher, be a mummy, represent monotono-theism by a gravedigger mimicry! - And away, above all, with the body, that pitiable idée fixe of the senses! Infected with every error of logic there is, refuted, impossible even, notwithstanding it is impudent enough to behave as if it actually existed!’…” Twilight of the Idols/The Antichrist, translated by R. J. Hollingdale, Penguin, 1968 (1982) p. 35.

 

[xvi] Op. cit. pp. 89-90.

 

[xvii] “Stone tool-making is first seen about 2.6 million years ago in Ethiopia, and the earliest of the bigger-brained Homo species have been traced back to 2.4 million years. So, first it’s tool-making, then the spin-off.” Ibid. p. 27. Yet, by Calvin’s own account, this must have been rudimentary. As he remarks, “There isn’t much instructive joint attention outside of modern humans.” Ibid. p. 72.

 

[xviii] Finer control of tongue and chest muscles and breath developed markedly between 1.6 and 0.4 million years ago, coincident with (very slow) refinement in projectile crafting and use. By 400,000 years ago, big-brained pre-sapient hominids in Africa and Europe were wielding spears several metres long, with hafted shafts, plainly designed with some skill for throwing, not merely thrusting. This was a major development compared with the sharpened stones used by earlier hominids for well over a million years before that. Ibid. pp. 53-57.

 

[xix] “Some marsupials devote less than 1 per cent of what their heart pumps to servicing their brains. In the average mammal, it is about 3 per cent. In humans, it is close to 16 per cent. If something had to give, it was likely the overextended (pongid) guts. An ancestral diet of low quality food, on this argument, keeps bigger-brained variants ‘vegetating’ while digesting a meal.” Ibid. p. 25.

 

[xx] Ibid. p. 48.

 

[xxi] Ibid. p. 40.

 

[xxii] Ibid. p. 39.

 

[xxiii] Ibid. pp. 94-95.

 

[xxiv] Ibid. p. 96.

 

[xxv] “What is it we do that is so special?” Robert Brandom asks. “The answer to be explored here - a traditional one, to be sure - is that we are distinguished by capacities that are broadly cognitive. Our transactions with other things, and with each other, in a special and characteristic sense, mean something to us, they have a conceptual content for us, we understand them in one way rather than another. It is this demarcational strategy that underlies the classical identification of us as reasonable beings. Reason is as nothing to the beasts of the field. We are the ones on whom reasons are binding, who are subject to the peculiar force of the better reason.” Making It Explicit: Reasoning, Representing, and Discursive Commitment, Harvard University Press, 1994, pp. 4-5. After reading Calvin, Brandom’s remarks read quite differently than if one has merely read Aristotle, Descartes and Kant. The last sentence, in particular, becomes almost hair-raisingly suggestive of remote antiquity’s shaping of our cognitive being.

 

[xxvi] Ibid. pp. 91-92.

 

[xxvii] Ibid. p. 99.

 

[xxviii] Ian Tattersall argued last year that “the dead hand of linear thinking still lies heavily on palaeoanthropology”, but that there have been, at a minimum, 20 species of hominid and that clearly “the story of human evolution has not been one of a lone hero’s linear struggle.” “Over the past five million years”, he wrote, “new hominid species have regularly emerged, competed, co-existed, colonized new environments and succeeded - or failed. We have only the dimmest of perceptions of how this dramatic history of innovation and interaction unfolded, but it is already evident that our species, far from being the pinnacle of the hominid evolutionary tree, is simply one more of its many terminal twigs.” (emphasis added)

 

He then asks how it is that we have ended up alone. We can see unambiguously from the archaeological record that our direct ancestors were incomparably more inventive than earlier hominids, including the Neanderthals. The sapient hominids of 50,000 to 20,000 years ago - the last phase of the most recent ice age - created art “in the form of carvings, engravings and spectacular cave paintings; they kept records on bone and stone plaques; they made music on wind instruments; they crafted intricate personal adornments; they afforded some of their dead elaborate burials with grave goods…and their living sites were highly organized, with evidence of sophisticated hunting and fishing.” We have “no direct clues as to the nature of the interaction between the two species”, he reflects. “In light of the Neanderthals’ rapid disappearance and of the appalling subsequent record of H. sapiens (us), though, we can reasonably surmise that such interactions were rarely happy for the former. Certainly, the repeated pattern found at archaeological sites is one of short-term replacement, and there is no convincing biological evidence of any intermixing of [species] in Europe.” ‘Once We Were Not Alone’, Scientific American, Special Edition ‘New Look at Human Evolution’, August 2003, pp. 20-27, quotes from pp. 24, 26.

 

In an exchange of letters in the Australian press on the nature of human evolution, in January 2001, anthropologist Nick Modjeska recorded an intriguing story suggesting that pre-sapient hominids may have had such an unhappy encounter with our lot in New Guinea some 22,000 or more years ago (which would mean that they had survived there long after they had died out elsewhere and, specifically, until our kind arrived in the islands and wiped them out): “Thirty years ago, I was an ANU postgraduate collecting oral traditions in a remote part of Papua New Guinea’s Southern Highlands. Among the stories I recorded were several that told of epic battles between the first humans to arrive in the region and a pre-existing population of ‘wild men’, described as cannibals - ogres without the power of speech. The ‘wild men’ were eventually exterminated and the ‘real men’ became the ancestors of today’s population.” The Australian, Letters to the Editor, 16 January 2001.

 

[xxix] Sophocles Antigone ll. 339-370. In his 1947 translation for Penguin, E. F. Watling rendered the key expressions “wonders are many on earth and the greatest of these is man”. Sophocles The Theban Plays, Penguin, 1975, p. 135. David Constantine renders into English Friedrich Hölderlin’s early nineteenth-century German translation with far darker phrasing: “Monstrous, a lot. But nothing more monstrous than man.” Hölderlin’s Sophocles: Oedipus and Antigone, Bloodaxe Books, Highgreen, Tarset, Northumberland, 2001, p. 81. The translation “uncanny” is derived from Walter Kaufmann, though he also used “awesome” in this case. As he remarks, “Reading Sophocles’ tragedies, one certainly does not gain the impression that he found man as such very wonderful. Rather, the poet’s world is governed by merciless powers, and men are strange, even frightening.” Tragedy and Philosophy, Princeton University Press, Ch VII ‘Sophocles: Poet of Heroic Despair’, p. 278.

 

[xxx] Calvin op. cit. p. 139. The standard account is Denise Schmandt-Besserat How Writing Came About, University of Texas Press, Austin, (abridged edition) 1996. For a clear account of how the Roman alphabet developed out of archaic writing, some 2,500 years after cuneiform was invented in Sumer, see David Sacks The Alphabet, Thames and Hudson, London, 2003.

 

[xxxi] For a good general history, see Georges Ifrah From One to Zero: A Universal History of Numbers, Viking, New York, 1985.

 

[xxxii] Quoted in Stanislas Dehaene The Number Sense: How the Mind Creates Mathematics, Penguin, 1997, p. 249.

 

[xxxiii] See, in particular, Reuben Hersh What Is Mathematics Really? Jonathan Cape, London, 1997, and George Lakoff and Rafael Nunez Where Mathematics Comes From: How the Embodied Mind Brings Mathematics Into Being, Basic Books, New York, 2000.

 

[xxxiv] Ibid. Preface, p. xi.

 

[xxxv] Calvin op. cit. p. 104.

 

[xxxvi] Ibid. p. 114.

 

[xxxvii] Ibid. pp. 183-84.