the bifold mind

This is a chapter from my book Going Inside (1999). The book is mostly about consciousness and the brain - the biological story. But in this chapter I summarise the "bifold" or Vygotskian argument that grammatical speech and human culture scaffold the brain's biological capacities to create a mind that is something different again.

It is obvious that the consciousness of humans is different from that of animals, but the question is in what way? Humans appear to have all sorts of added extras, such as self-awareness, rational thought and freewill. Yet it is hard to tell whether the distinction is one of degree or kind. Is the human mind just a scaled-up chimpanzee brain? Or is the gap so great that there can be no comparison between the mental lives of animals and humans? Do animals even have subjective states?

The place to start looking for answers is the archaeological record. The hominid family line is thought to have broken off from the rest of the African apes about five to six million years ago. The first adaptation appears to have been upright walking. Early hominid skeletons have completely modern hips and legs but still an ape-sized brain—and presumably an ape-like mind. However it was not long before the brain of the hominids began to swell. With the arrival of Homo erectus about a million years ago, it had trebled in size. And there is plenty of evidence that this extra brain power was put to good use. Homo erectus made simple stone tools, built rough shelters, and by about half a million years ago, was keeping warm beside fires.

In evolutionary terms, this was certainly a swift, but not unprecedented, rate of development. The dramatic increase in brain size was achieved by the quite simple mechanism of letting the brain continue to grow for a year or two after birth. The brain of an ape normally does almost all its growing in the womb. But humans develop as big a brain as they can in the womb—large enough to make childbirth a risky, painful, affair — and then keep on going.

It is true that half of the neurons actually die back during the first year of life as the brain's pathways prune themselves to shape. But overall, the brain gets bigger as the remaining cells swell in size, sprout their connections, and develop a fatty myelin insulation. There is also a huge increase in the number of support cells that keep the neurons healthy and fed. So within a year, the brain doubles in weight. By the time it reaches full adult size at the age of six or seven, its size has trebled.

As far as the genetics go, producing the human brain could hardly have been simpler. Whatever our mental abilities depend upon, it does not seem to have been the development of any radically new brain structure. The human brain looks exactly like a scaled-up monkey brain. All that needed to be changed was the timing of the genetic clock ruling the starting and stopping of the various phases of brain growth.

What makes the self-conscious human mind all the more puzzling is that until very recently, the ballooning of the hominid brain did not even appear to have much effect on the nature of consciousness. Of course, it is impossible to say anything certain about the mind of Homo erectus, yet there is no evidence of a huge mental gap between this ape-man and the apes of today.

Chimpanzees might not use fire or put a roof over their head, but they can be reasonably skilled at fashioning tools. They will strip twigs to fish termites out of a nest or crumple leaves to act as a sponge to get water out of a tree bole. One chimp was observed to make four different kinds of tool to get at honey in a bees' nest, using successively finer gauges of splintered branch as chisels to pierce its wall, then finally nipping off a thin whip of vine to collect the honey.

Chimps show many other signs of intelligent behaviour, such as making nests of branches in which to sleep, seeking out medicinal plants when they have worms, and wiping their bottoms with leaves when they have diarrhoea. Chimps even hunt co-operatively. In the wild, a pair of males will creep up on a group of feeding colobus monkeys, panicking them and driving them into an ambush by the rest of the troop.

The fact that Homo erectus was a tool maker and fire user is impressive. But it seems equally significant that Homo erectus then hung around for about a million years without making any further spectacular advance. Having mastered the skill of chipping a flint into a teardrop shaped hand axe, Homo erectus went on producing exactly the same tool for thousands upon thousands of generations. This ancestor had a mind that was intelligent, but not explosively inventive.

About 100,000 years ago, Homo sapiens—true modern humans—appeared on the scene. Homo sapiens was a puny creature compared to the Neanderthals who happened to be the dominant species of hominids at the time. And Homo sapiens did not even measure up in brain size, having a cranial capacity of about 1500 millilitres compared to the 1600 millilitres for the Neanderthals. Yet despite this, Homo sapiens took over the world. The Neanderthals were shoved aside, the last few disappearing about 30,000 years ago, while Homo sapiens spread rapidly to fill every corner of the planet.

From the first appearance of Homo sapiens in the fossil record, it is plain that we are dealing with a fully modern mind. The tools made by early humans represent a quantum leap over anything produced by a Neanderthal hand. As well as finely crafted harpoon tips and arrowheads, archaeologists have dug up bone needles for sewing clothes and small fat-burning lamps. Homo sapiens made the first proper campsites, with huts and hearths, rather than crude windbreaks and open fires. The dwellings were also arranged in a way that suggests a clear social pecking order.

The clincher, however, is that by about 40,000 years ago, Homo sapiens was capable of art. Bone knives were decorated with carvings of deer. Shells and bones were whittled into beads to be strung on necklaces. Most famously of all, people were crawling deep down into caves to paint the walls with awe-inspiring murals of animals and hunting expeditions.

So the archaeological record tells of a long haul to produce a large brained ape. By resetting the brain's genetic clocks, the hominids grew into smart chimpanzees. But then something else happened. For some reason, in a blink of geological time, one line of hominids suddenly became symbolically minded and self-aware. The only possible explanation for this overnight change appears to be the development of language—or more precisely, of articulate, rapidly-spoken and grammatical, speech.

setting the evolutionary scene

Speech would have emerged out of the extremely close social lives that the hominids were leading. Living in a group depends on good communication. Chimps and gorillas may not have a formal language system, but they are expert at reading the moods and intentions of one another from subtle signs such as facial expressions, direction of gaze, general posture and, of course, grunts, screeches, pants and other emotional noises.

A lot of these signs are involuntary and so depend entirely on the intelligence of the receiver to notice them and interpret them. But the apes also make a deliberate use of gestures. Chimpanzees have been seen holding out a hand as an invitation to groom, or directing attention with a flick of their eyes.

Even more revealingly, chimps are capable of deceiving. The primatologist Jane Goodall, who studied wild chimps on the shores of Lake Tanganyika, reported how a bunch of bananas was once dropped in front of a junior chimp after the rest of its group had wandered on ahead. The chimp made excited "food barks" which immediately brought the others racing back and so the youngster lost out. As a test, the same thing was tried again the next day. This time the chimp stayed silent—although a faint choking noise could be heard coming from its throat as it struggled to stifle its glee.

Many other examples of concealment, misdirection and feigned nonchalance have proved that apes are smart enough not only to read another's emotions but also to make some predictions about how their own signals might be interpreted. An ape like the chimpanzee seems very close to language. However, two further developments were probably needed to clear the way for a symbol-based system of communication.

As many speech researchers, such as Doreen Kimura of the University of Western Ontario, Canada, have argued, the first of these would have been the gradual lateralisation of the brain as an adaptation for tool use. Making and handling tools demands the ability to plan a focused sequence of actions. A flint has to be turned in the hand as precise blows are struck to produce a sharp edge.

And, as said earlier, doing this would require two kinds of thinking to be going on simultaneously. While one side of the brain was narrowing its focus to isolate the separate steps of the sequence, the other would have to be taking a more global view, holding the overall outcome in mind. Then having spent a million years or so developing the brain so that it could handle complicated sequences of hand and finger movement, the same ability would have smoothed the path for language. Speaking needs both the ability to chain words together and also to keep in mind a general idea of where a conversation is going. There would have been an existing motor and planning hierarchy for language to hitch a ride on.

The other less obvious pre-condition for the development of speech would have been the social change of stable pair bonds. Almost every other species of animal that lives in a group depends upon rivalry. The males compete for dominance and so the right to father most of the children in the troop. The result of this pattern is that the males have little reason to help in the care of the young—they either have no offspring or are too busy keeping their place at the top of the pack.

This tournament style of mating works fine in smaller brained animals such as baboons and hyenas. But big brains are metabolically expensive. The young need to be well fed. It also takes longer for the young to grow up. The strain of this already shows in chimpanzee and gorilla mothers who have to space out their children. A chimpanzee mother can only have a baby once every four or five years. A simple extrapolation shows that the apes were heading up an evolutionary blind alley—they could not grow brains much bigger before they ran into problems raising enough babies to start the next generation.

To get out of this bind, anthropologists such as Donald Johanson of the Institute of Human Origins in Arizona say the early hominids would have had to make a drastic change in their approach to parenting. The need was for a more monogamous, more co-operative, reproductive strategy. By pairing off, fathers would have a stake in looking after their children.

The relaxation of tensions between males would also have allowed more group activities, such as hunting and shelter building. Food could be gathered collectively and then shared. As hominids like Homo erectus became more socially organised, there would also have emerged a division of labour between the sexes. Weighed down by their brood of children, the mothers would have stayed close to the safety of a base camp, spending the day collecting staple foods like roots and berries. Meanwhile, the males would have gone after riskier, but nutritionally more rewarding, food. They would hunt, fish, or climb trees in search of honey.

The advantages of these changes in food-collecting and child-rearing behaviour are easy to see in modern day hunter-gatherer tribes. Even though civilisation has pushed such people into the most marginal lands, they can still feed themselves with just a few hours' effort a day. Chimpanzees, by contrast, have to spend most of their time foraging. And instead of children being spaced out, humans have no trouble supporting another child every couple of years.

The importance of pair-bonding was that it would have both put a premium on the development of communication skills and also preserved the first hesitant steps towards proper speech. Obviously, to live in a co-operative group would demand a more explicit method of communication than just meaningful grunts, face-pulling and eye-rolling. At some point, the bridge must have been crossed so that the use of gestures and sound became symbolic. Rather than things simply standing for what they were — a grimace of annoyance or a grunt of interest — they would refer to something else.

Learning such an association is not itself all that difficult. In experiments, chimpanzees have been taught to recognise and use hundreds of words. They can use hand signs or plastic tokens to ask for a cup of orange juice or to go for a walk. Chimps in the wild have also been seen to invent their own personal gestures, such as holding out one arm clasped in the other as a plea to be groomed, or shaking a head to say no. Even with a brain a quarter our size, chimps seem to be capable of making the first small steps towards the use of symbols.

But the hurly-burly of chimpanzee life also means that these first steps are unlikely ever to be preserved. Mother chimps have a close relationship with their infants and so might create some shared quirk of expression. However, this bond is not repeated in adulthood and any experiments with symbolism would get washed away. With the hominids, on the other hand, the necessary closeness between parents would close the cycle and allow the advances of one generation to be passed on to the next. So from minor beginnings, the use of symbolic communication could quickly snowball.

All these many kinds of evolutionary change seem locked together in a virtuous feedback spiral. The apes had reached a bit of a dead end, their brains being about as big as their social system would support. Then the hominids broke out. Walking upright freed the hands and encouraged the greater use of tools. Using tools demanded more brain power and also the ability to imagine sequential actions. To afford bigger brains, the hominids had to change their parenting style. Closer bonds, in turn, fostered developments in communication. Closing the feedback loop, every small advance in the ability to communicate would have further tightened the social ties, helped formalise the use of tools, and so paid for the growth of yet larger brains. Once started down this evolutionary path, the hominids were propelled forward until eventually a true speech capacity began to develop.

Language would not have appeared all at once. Words must have come before grammar. The first words would have been names for everyday objects such as firewood and antelopes. These would have made possible simple one word sentences whose meaning was made clear with expressions and gestures. A tribal elder would only have to say "wood" with a commanding tone and a nod of the head towards a dying fire to tell a passing youngster to go and add a few more branches. Or a hiss of "antelope", said with a quieting slash of the hand, might have been enough to warn a band of hunters to hush up and spread out.

It is not hard to teach chimpanzees and gorillas to communicate at this grammarless, single word, level. So with a much larger brain, it seems almost certain that Homo erectus would have been using just such a proto-language for most of its million years on the planet. And for the daily round of a hunter-gatherer species, single words would have been plenty. Speech does not need grammar to be useful. But eventually, somewhere along the line, the next step was taken and a race of hominids began to talk in complex sentences.

the logic of grammar

Inventing the rules of grammar was probably more of a jump than it seems. When speaking in single words, the only choice to be made is one of topic. But as soon as words are chained into sentences, there is the problem of what to say first. We can only transmit one word at a time, so a sentence has to have a backbone logic connecting its parts. A standard formula is needed for breaking down an often complex tangle of thought into a linear stream of symbols. In other words, to be grammatical, our ancestors had to learn how to think like reductionists!

There are over 6,000 languages and some 200 language families in the world today. Superficially, the grammar of each looks very different, with almost any rule of word order or declension of tense appearing to apply. Yet the first feature shared by every known language is that all of them divide the flow of words into sentences—speech modules with a self-contained logic. The second is that these sentences always have the three fundamental components of a subject, a verb, and an object—a doer, an action, and a done-to. The story they tell is a straight-line, cause and effect, tale of who did what to whom.

Of course, these three essential components can be arranged in any order. The standard English order of subject-verb-object seems the most sensible because it is what we are accustomed to. But other languages, like Japanese, use subject-object-verb. Instead of saying "the cat sat on the mat", a Japanese speaker would say "the cat on the mat sat." Then some languages, like Gaelic, use a verb-subject-object order, so the sentence would read "seated was the cat on the mat".

In a few rare cases, the object can even come before the subject. This reverse logic sounds as though it would be difficult to follow, but even to English-speaking ears, the sentence, "On the mat was sitting the cat," still makes sense. What matters is that it seems to tell us the complete story. The sentence has the three components needed to express a simple linear relationship.

So where did this reductionist formula come from? Tool use would have pre-adapted the brain for grammar, giving it the resolving power to think in sequences. The three-part logic then adopted would have been no more than an accurate reflection of life as seen through human eyes. We view ourselves as creatures of purpose, taking decisions and making things happen. The world we inhabit is primarily one of actors and actions. So it was only natural that, forced to find a serial logic to structure our speech, we should choose to break each sentence down into a tale of cause and effect.

The in-built reductionism of human language is clearly a problem for the would-be dynamicist. Its straight-line logic becomes a handicap as soon as we want to talk about systems which are the product of competition and complex feedback relationships—systems like the brain, for instance. But, for a tribe of hunter-gatherers, it was more than adequate. Their life depended not on philosophical clarity but on being able to communicate what were essentially social ideas. A grammar centred on actors and their actions would capture precisely what most needed to be said about any social situation.

The question then is: when did grammatical, articulate, speech enter the picture? It is a query that has caused surprising heat among language scholars. But many researchers, such as Philip Lieberman of Brown University in Rhode Island, believe that grammar was only perfected with the arrival of Homo sapiens.

The most convincing line of evidence for this comes from the shape of the human vocal tract. Reconstructions of the throat and mouth of Neanderthals suggest that they would have simply lacked the equipment for talking fast. They did not have the right-shaped lips or tongue to nip the flow of air into crisp sequences of syllables. The shallowness of their throats would also have greatly limited the range of vowel sounds they could make. It has been guessed that the articulation rate of a Neanderthal may have been as much as five to ten times slower than a human.

The actual changes needed to evolve a better vocal tract were all relatively slight—a resetting of the genetic growth clocks to arch the palate, thicken the muscles of the tongue, jut the chin, refine the lips and drop the voice-box lower down the throat. So, as Lieberman argues, the fact that the Neanderthals did not make them was a sign they had no use for them. Equally, the fact that humans did make the changes necessary for rapid speech appears proof that we were also the first to want to speak in sentence-length bursts of words.

Thus far, the general story is well enough accepted. Yet then comes the puzzle of how grammatical speech made a difference. Why was there the sudden explosion in art and creativity which heralded the arrival of the abstract-thinking, self-aware, human mind?

To answer this, we have to be able to say something more about what distinguishes the mental life of an animal from that of a human. In one way, the difference is terribly simple. Animals are locked into the present tense. As numerous philosophers from John Locke to Ludwig Wittgenstein have remarked, animals live entirely in the here and now, their minds responding to whatever is currently going on around them or to whatever urges happen to be welling up from their bodies. Humans, by contrast, have broken free of this tyranny of the moment. We have a consciousness that can wander, thinking back to review memories from our past, or thinking forwards to imagine how life might be in the future. We can even take a step away from ourselves to contemplate the fact of our own conscious existence.

Animals have awareness, feelings, associations, anticipations, memories, self-control—seemingly the full complement of parts needed to make a mind. However, all these mental abilities are tied to the moment. For example, an animal can recognise but not recollect. If a monkey sees a banana or a picture of a ship, the mapping of the sensation will surge forwards through the hierarchies of the cortex, stirring a sense of familiarity and meaningfulness in the memory areas of the temporal lobe. And yet there is no reason to think that a monkey ever sits around mulling over the story of its life.

Of course, an animal as intelligent as a monkey can make associations and this will seem to bring memories back. Feeling a pang of hunger might trigger a memory for where a monkey had earlier visited a tree of ripe figs. Or being placed in an experimenter's box would rouse a state of expectation about the kind of events likely to follow.

However the monkey's response is still being prompted by outside cues—the pains, urges and alarms of the body being as much something "outside" as far as the brain is concerned. There is no hint that a monkey has the independence of thought to be able to indulge in fond reminiscences about the fig trees it used to clamber up as a youth, or to wonder whether it might face some new kind of experimental test for a change today. Wittgenstein summed this up neatly when he commented: "We say a dog is afraid his master will beat him, but not he is afraid his master will beat him tomorrow. Why not?"

So the mind of an animal is caught in the flow of life. It is always taking an intelligent view of the moment, but has no freedom to take flight. Language would have changed all this. By taking a tool developed for organising their social world and turning it around to organise what went on inside their own heads, our recent ancestors would have been able to interrupt life's ceaseless tide and begin directing their thoughts elsewhere. Words have the power to spark images.

So by getting into the habit of talking to ourselves, using a self-questioning, self-prompting, inner voice, we are able to transport ourselves to imaginary viewpoints. We can roam our memory banks to relive past moments or spin fantasy images. The invention of articulate, grammar-driven, speech was also the invention of articulate, logic-driven, thought.

The idea that speech must be responsible for the special mental abilities of humans is also a very old one. It was present in the writings of Enlightenment philosophers such as Locke and Thomas Hobbes. Charles Darwin and many of the Victorian evolutionists speculated about the idea. Then in this century, a strong case was made by the Russian psychologist, Lev Vygotsky, and his colleague, Alexander Luria. Working in the 1930s, Vygotsky observed children closely and realised that the development of their capacity for self-awareness, memory and thought went hand in hand with their learning of speech. From this, he argued that such abilities were not innate—wired into the genes—but habits which every child had to pick up during the first seven or eight years of life.

However, while the suggestion has been frequently advanced, it has been just as easily dismissed. For most people, the claim that language could make the human mind falls at the first hurdle. After all, it seemed only logical that before our hominid ancestors could have felt the urge to speak, and so put a premium on the development of their vocal equipment, they must have already had something on their minds they wanted to say. The special abilities of humans had to precede grammar and words rather than result from them. Speech would be the outward sign of what was an inner revolution.

The belief that words can only clothe thought has seemed so axiomatic that until a rediscovery of Vygotsky's work in the late 1980s, it was exceptional to find a recent Western philosophy or psychology text that even mentioned the possibility that language might make a difference. And in truth, introspection seems to suggest that words are indeed mostly secondary.

Sometimes speech clearly appears to lead our thoughts—as when we feel that we did not know what our opinion was going to be until we heard ourselves expressing one. But at most other times, thought definitely seems to come first. There is at least an inkling of what we intend to say before the words start spilling through consciousness or out of our mouths. So if this is the way we are now, then it seems only sensible to think that the same was true for our ancestors and that they must have been thinkers before they were speakers.

The realisation that the brain is a dynamic system—a fluid, adapting, bag of circuits—of course changes the whole conception of the problem. There can be no chicken and egg dilemma because simple ideas of cause and effect do not apply. The brain is constructed according to a genetic blueprint, but it is a blueprint that codes more for degrees of plasticity and bursts of growth than actual pathways or processing circuits.

As said, even the way the brain sees and hears is something that is largely learnt—or at least self-organised to fit. There is an unbending level of neural structure—some basic proportions of cell types, cell numbers and cell branching patterns needed to give the brain its correct shape. But the pathways are responsive to the world they find themselves in. It takes months for areas like the IT cortex to become even partially organised in a human child. Then all our lives, we are adding extra memories—further habits of sensory processing that allow us to see people, places and events as familiar or novel. The circuits of the brain are tailored by experience and although they may eventually accumulate a thick sludge of habits, they never actually settle.

What this plasticity means is that the pathways of the brain are open to being colonised by culture. Language—and the habits of thought which it supports—may have developed first in the social sphere, being the product of a cultural rather than a biological evolution. But because nerve tissue is plastic, words and grammar would immediately grow into the brain. Every tiny advance in the usage of one would show up immediately as circuit changes in the other. Speech would not be a small add-on, a symbol-processing module strapped belatedly to the side of the brain. Instead, as scanner images have so graphically revealed, the structure of language would penetrate just about every part of the brain.

The early PET studies at Washington University in St Louis dazzled those who expected language processing to be confined to the two classic, coin-sized, speech centres, Broca's in the frontal cortex and Wernicke's in the bend of the temporal lobe. The experiments showed the stain of activity all through the higher cortex and also down low in the thalamus, basal ganglia and cerebellum.

As new generations of scanners came into service with better resolution, language researchers found more and more of the brain lighting up. The traditional language centres handled core tasks such as chaining words into grammatically correct sequences and storing representations of the sound of words—the aural mappings used to drive the muscles of the throat and mouth. However, as might be expected, the meaning of any word—the associations and imagery it might spark—was spread right across the cortex.

If the word was the name of a tool, such as a screwdriver or hammer, then thinking about it would cause a stir of activity in the motor areas of the frontal cortex. The verbal symbol "screwdriver" was connected to memories of what it was actually like to handle such an instrument. Likewise, if the word was the name of a colour, an area around V4 would light up. And if a person was asked to think about a yellow screwdriver, then both areas would fire.

Like any kind of memory, word meaning was something that was distributed back across all the mapping areas that would have to process aspects of what that word stood for. This did not turn all sensory and motor areas into outposts of the language hierarchy. But it does show how something socially invented could come to texture the whole processing landscape of the brain.

In fact, giving the plasticity story another twist, the already plastic cortex of the mammalian brain—with its extra dollop of uncommitted circuitry in the form of the ever-swelling prefrontal cortex—has taken on yet new levels of plasticity in humans so as to allow language to have its maximum impact. As is obvious, human children are born much more helpless than the animals of any other species. It takes a year before they show much ability even just to move about and handle objects with any skill.

The reason for this is that the maturation of the brain — the insulation of its many paths with myelin to make the connections fast and efficient — is delayed quite abnormally, particularly in the cortex areas critical to language and the higher levels of thought. The main language areas do not really begin myelinating until the second year of life and it takes six or seven years before the process approaches completion — which is why learning a second language is easy for an infant but hard for adults. In other regions of the brain, such as the prefrontal lobes and the area around the hippocampus, the delay is even more dramatic. These parts do not reach full maturation until the late teens or early twenties.

The slowness of the brain's maturation is an immense evolutionary burden. Children need total care for the first few years of their life, which would have made the caring, co-operative, lifestyle of our hominid ancestors all the more essential. But it also creates a wide window of opportunity in which children can be exposed to language and soak up their culture's rhythms of speech and thought. So our genes have set us up with not only a large brain, but a brain that patiently awaits its coming brush with culture.

What this means is that the evolution of a language system was always something that was socially scaffolded. The advances were not individual—a lucky genetic break that allowed some distant ancestor to start expressing some of the thoughts he or she had been having — but collective. It would be the group that developed new habits of sentence forming or new words. Then because of the extreme plasticity of the infant brain, the next generation would find themselves doing with ease what had been moderately difficult for their parents. Ways of thought that were once alien would become second nature for those who grew up with them — a situation that might be repeating itself today with the operation of video recorders and home computers.

There is still a question about whether the human brain has made any specific genetic adaptation for grammar — whether it has a wired-in motor template for generating subject-verb-object shaped sentences, as so many linguists have argued, most notably the towering figure of Noam Chomsky, the MIT theorist who was one of the major inspirations of the cognitive science movement. But the more distributed the language system looks in scanning studies, and the more that is discovered about the way children can pick up the rules of grammar from simple statistical inference—from noticing patterns and regularities—the less likely it seems that there is much that needs to be hardwired.

As said, most of the genetic changes are permissive. Human brains are four times bigger than an ape's, they myelinate many years later, and they were probably already lateralised and skilled at sequencing motor operations after several million years of tool use. Once some changes to the vocal equipment had been made to allow faster, more distinct, articulation, not much else seems to have been necessary.

Babies do seem to have some useful instinctive behaviours to get them on the path to learning, such as a keenness to experiment with babbling noises. Babies also have an instinct for conversational turn-taking. They will alternate between periods of babbling and listening. And they will naturally follow another's gaze — a trick which means they can more quickly learn the connection between an object and its name. So there are a host of genetically simple adaptations which make the infant brain fertile soil for the establishment of language. But it is the human brain's plasticity — its capacity to respond to a culturally-imposed need and develop some language-textured circuitry — that seems the main advance.

the power of words

The dynamism of the brain's design means that our ancestors did not have to be thinkers before they were speakers. Cultural change and genetic change could go hand in hand. So exactly how does language make a difference? As Vygotsky and others have argued, just having words would have started the process of breaking us free of the moment. A word is no more than a puff of air, a growl in the throat. It is a token. But saying a word has the effect of grabbing the mind of a listener and taking it to some specific spot within their memories.

The animal brain is tied to the present by its very dynamism. The evolutionary story of the brain has been one of an ever increasing ability to wrap itself around the shape of a moment. It is a living memory surface, tuned by learning and experience. But this means that the brain has no such thing as memories in the digital sense used by computer scientists. A computer memory is made up of discrete bits that can be picked up and shuffled about. But in the brain, memory is embedded. It is the processing landscape. So while an animal can use its circuits to react—to bear down on each moment with the full weight of a lifetime of experience—it has no mechanisms to fetch and replay arbitrary chunks of data.

Words, however, allow us to treat our brains as if they actually were digital warehouses. We cannot shift the data. That always has to stay in place. But we can use words to trick the brain into making a shift in its point of view—to open up an angle into an area of experience. Hearing a word like rhinoceros, camel or cat will cause an adequately trained brain to react as if it had just seen the real thing—or to be more accurate, to react with the anticipation of seeing the real thing.

The power of words is all the greater because we can slap a label on anything. Large or small, simple or complex, it takes the same effort to speak its name. A word can take us as quickly to the idea of the Universe as the idea of a rhinoceros or the colour blue. We can even give names to abstractions like love and honour. This gives the human mind another kind of freedom.

The animal mind is not only trapped in the present tense, but it is also stuck with a concrete level of categorisation. A monkey can recognise a banana or a fellow troop member when it sees them, but it does not classify these experiences in abstract terms, thinking about them as examples of a fruit or some relation like an uncle. An animal develops natural categories of processing—ones that lump experiences in terms of their basic sensory qualities. But words allow humans to create artificial categories for organising and exploring memory. Things which might otherwise be very difficult to think about, such as bravery, outer space, or family relationships, get given a convenient handle.

Using these, we build our way out of the world of immediate sensation and move our consciousness into a realm of culturally-evolved thought. We can reach points of view that are only possible because a symbol serves to bind the meaning together.

Words are a critical first step towards taking control of the brain's memory landscape, making it possible to wrest the focus of attention away from the events of the moment. However grammar would have been needed for any real breakthrough.

Simply possessing a proto-language based on single word utterances would not have made that much difference to the mental abilities of our distant ancestors. An isolated word could be used to force a big shift in the thoughts of others, causing them to focus on something like firewood or big game which had been far from their minds. But the speaker would still be left trapped in the here and now. The urge to utter a word would arise only because of some immediate need or circumstance the person was experiencing. There would be no mechanism for calling up a word unless something happened to jog its use. However, developing a grammar with a driving reductionist logic would instantly have put a motor into our thoughts.

As has been seen, the language centres hang off the prefrontal lobes to form a third flank of the frontal motor hierarchy. The focus of every moment is mapped within the prefrontal cortex's broad expanse and then ripples back down through the motor, orientation and speech output areas, so sparking ideas about suitable reactions. This means that the itch to say something in response to each fresh shift in focus is automatic. Once the reductionist logic of grammar has taken root in our brains, sheer force of habit will make us look at each moment and search for the way it can best be broken down into a story with a subject, verb and object.

Of course, as with any stirring impulse, we can always cut short the urge to speak—or more likely, we will be interrupted and have moved onto something else before the urge has got very far. But having grammar means that every instant of our lives is viewed through a reductionist prism. We are always just about to launch into a sentence given half a chance. Our thoughts are always just about to head off somewhere.

And then as soon as we do let ourselves speak — or imagine the same words through our inner voice — our minds will quickly be carried to far places. A sentence may start out as a comment about the moment, but it will immediately open up its own chain of thought. The words we have just used will trigger their own associations and images, creating the potential focus for a fresh sentence. The act of sentence-making feeds on itself to draw us into an entirely private world of thoughts about thoughts.

For example, seeing the glassy eye of a fish staring out at us on a trip to the supermarket might spark an inner comment about the fish not looking too fresh—its opaque gaze would be recognised as significant and our words would then serve to make explicit a link with the idea of freshness.

Throwing the emphasis on the idea of freshness might next prompt an inner puzzling about how this fish got to the shop. Before we know it, our minds would be filled with images of trawler boats tossing in the waves, or crates of glistening fish being slung across market floors, together with some inner comment about it taking days or even weeks to get back to shore to unload—do they freeze the fish solid or just chill them? Borne along by the trick of grammar, we will rapidly find that our thoughts have travelled a long way from the here and now of standing by a supermarket fish counter.

Importantly, we do not actually have to voice full sentences for our thoughts to be pushed along like this. One of the awkward points faced by those who, like Vygotsky, wanted to claim that speech drives thought was that too often our inner dialogue seems sketchy at best. A lot of the time, it is no more than a string of half-formed phrases or even just the feeling of being about to say something.

When we strike some difficult moment in a train of thought, we usually do seem to try to prompt ourselves with fully articulated questions and suggestions. We may even talk out loud to ourselves or attempt to clarify our ideas by putting them down on paper. But mostly our thoughts seem to consist of a confused jumble of inner mutterings and fleeting glimpses of imagery. Our minds appear to move along too fast to be dependent on the laborious probings of self-directed speech.

However, as has been seen, the brain does not have to have everything "in consciousness" to do useful work. A speech intention must go through many levels of decomposition before it arrives at the primary motor surfaces as a fully fledged set of muscle instructions ready to drive the mouth and throat. Like a "spontaneous" flexing of the finger or any other motor action, it may take half a second or more for the brain to gear up to speak a sentence. As well as selecting the actual words, the brain has to decide the pattern of emotional emphasis and plan for any face or hand gestures that will accompany the message. Even the lungs have to be instructed to take a breath proportional to the expected length of the sentence.

Yet the part of the speech act that is crucial to thought — the extraction of a logical link from the current focus of the moment—actually happens very early in the process. So just getting to the stage of having a first inkling of what we want to say — an idea of what should be the subject, verb and object — will set our thoughts moving along quite nicely. Going the whole way and voicing the fully formed sentence might well produce a greater impact on our thinking. The more clearly we state an assertion, the easier it is for us to notice its implications or shortcomings. But long before this stage, the key step of establishing a logical connection and rousing the relevant areas of memory will already have been achieved.

Exactly the same is true of the mental imagery that is the other half of the equation. Given time, our brains can respond to a word like rhinoceros with a whole succession of rhinoceros-related views — each fleeting mental picture taking about half a second to generate. But long before a vivid, fully fledged, sensory experience comes together, our anticipatory state will already be doing work.

Merely having the sense of being in the right spot to start seeing rhinoceros images would be enough for us to feel we understand the meaning of the word. We do not need to unpack the mental pictures that go with each word of a sentence for its meaning to carry us along. Letting the imagery blossom will always give a sentence greater impact. But we only tend to pause to allow this to happen at critical stages in a train of thought.

As usual, the brain prefers to operate at the lowest, most habitual, level that it can. Like any other kind of action, much of our thinking is actually rather stereotyped. The things we say and the imagery we rouse will tend to repeat what went through our heads the last time we faced a similar situation. So unless we have good reason to want consciously to check the links in a chain of thought, there would be no need to slow down and make each step explicit.

Yet what matters is that the full structure of grammatical speech and reductionist thought is always there at the back of whatever we do. We will either be using it to deal with the moment, or it will have been used successfully to deal with an almost identical moment in the past, establishing the necessary connections to do the same job swiftly and preconsciously. So even when speech does not seem to be acting overtly, our thoughts will still be moved along on an "as if" basis—as if each word and image were making the individual, full-blown, trip through the spotlight of consciousness.

memory and self-awareness

Grammatical speech put an engine into human thought, allowing our minds to break free of the tyranny of the present. We could then start going places in both our imagination and our memories. Our brains are still the same, always reacting to whatever has just been placed in front of them. But with words, we can start feeding our brain fake moments — pseudo angles in. And this control over our state of mental representation gave us two new powers in particular — recollective memory and self-awareness.

Memory is a confusing term because it is used to cover such a wide spectrum of the brain's activity. Any measurable change in the brain's organisation — even the fleeting pattern of a working memory state — is seen as a type of memory. But what we really mean by memory in humans is recollection.

The animal brain has memory in that it can accumulate pathway changes and sensory habits. However, as said, these circuits only show themselves during the processing of a moment. A patch of "memory" will allow the brain to make recognition matches and flesh out an experience with a halo of associative meaning, but an animal has no independent mechanism for returning to moments in its past.

Language acts in several ways to make recollection possible in humans. At the most obvious level, we can use self-questioning to steer our minds back to some occasion. For instance, we just have to ask ourselves what we had for breakfast or what it was like back in our schooldays for the words immediately to open up an angle into an area of past experience. The words will stir an anticipatory sense of what it would be like to be back there, living those same moments again. But we do not need to use such overt questioning for memories to flow. A lot of the time images will be jogged free simply as a result of a train of thought. A moment from our past will come to mind because some association has been struck.

Thinking about breakfast may remind us of the time when we ate out under an awning at a hotel while on holiday. But the point is that words are needed for our brains to wander away from the moment in the first place. And if we wanted to, the potential for control is always there. We could say to ourselves: "Stop daydreaming. It's this morning's breakfast that we want to get back to."

Naturally, the accuracy and vividness of our recollections would depend on how crisply the moments had been trapped by the hippocampus at the time. Some episodes from our past, like accidents and embarrassing situations, are caught with a flash-bulb crispness because of their emotion-laden impact. But psychologists analysing eyewitness accounts of staged dramas like a bank robbery find that much of any remembered scene tends to be an invention.

When asked to report details of the robbery, subjects inserted facts that they felt ought to be in their memories. And leading questions about what one of the robbers might have been wearing frequently implanted the conviction that such clothing was actually witnessed. In other words, there is not much difference between our imaginations and our memories except that with one, we know when we are inventing. With our recollections, the gist is usually accurate enough to serve our purposes. But the background details will be mostly a generalisation — what we might reasonably expect the scene to have looked like based on the blurred recall of many similar experiences.

If recollection is based on the power of language, then that other hallmark of the human mind, self-awareness, is based in turn on the ability to recall past states of awareness. Animals live with their noses pressed hard up against life. They are always in the thick of the moment and so have little chance to contemplate the fact of their own existence. But with words comes the ability to step back and start appreciating ourselves as beings enjoying states of representation.

Broadly speaking, the self-awareness of humans has two elements. The first is the act of being self-aware—of adopting a retrospective stance to each moment and taking notice of the fact that things are happening in our heads. The second is having knowledge of being a self. We learn to form a detached view of ourselves as a mental being.

Neither of these is natural to the brain. Even an introspective slant to awareness is probably absent in animals. As argued, the brain has no evolutionary use for contemplation. It exists for the representation of reactions rather than sensations. Even the brain's fixing of memories is really a prospective rather than a retrospective step. The sole reason for the brain trapping the significant aspect of a moment is to save a viewpoint or habit which might improve its processing of future moments. The hippocampus is not taking snapshots to create a diary of where consciousness has been. It is merely hanging onto a potentially useful pattern of information for the minutes, hours and days it takes for it to become built into the processing circuitry of the brain.

It may be a subtle point but animals do not see into a moment, rather they look out from it. Subjectively, the animal brain would always be facing forward—focused not on where the latest shift in viewpoint has come from, but where it is heading. Rather than feeling like an observer or a passenger, an animal would have a feeling of simply being the vehicle — of doing the journey. This suggests that even our feeling of being there during a moment, observing, supervising and taking decisions, is a habit grafted onto consciousness.

Language does several things to foster the human trick of self-consciousness. Simply being forced to speak grammatically — to break the world down into a story of actors and their actions — would have nurtured the idea of being a self in early humans. Phrases like "I did this" or "you must do that" would focus attention on the fact of there being an I and a you. The power that self-directed speech gives over thought and memory could then be used to think about the experiences of this newly-discovered self.

But the habit of being self-aware, of looking inwards and noticing, would not have been just a happy accident. Instead, it would have been developed to serve a social purpose. It was not that our hunter-gatherer ancestors wanted to create a tribe of moody, philosophising, individualists. Their lifestyle depended upon an ability to share and co-operate. So the main reason for instilling a habit of watching the self would be to make members of the group self-policing.

Animals are creatures of impulse. Even for a highly intelligent, highly social, species such as the chimpanzee, self-restraint is difficult. Any co-operation or sharing of food is a fragile affair. When a group of chimps catch a colobus monkey, most of the spoils go to the strongest. And, certainly, it is hard to imagine a chimp stumbling on a bunch of bananas only to scoop it up and rush off excitedly to divide it with the rest of its troop. But language allowed humans to internalise a socially-developed framework of control.

Words stand for knots of ideas, and the ability to encode abstract social qualities, such as duty, patience, kinship, fairness and conscience, would have been even more valuable to early humans than having names for talking about common objects like deer, stone axes or fires. All the virtues needed by a society could be represented in the vocabulary passed from one generation to the next so that children would learn the words, then learn the complex social attitudes that went with the words.

The next step after knowing what to do has to be the ability to tell ourselves to do it. Young children encouraged to get into the habit of looking inwards can be taught to guard against their anti-social impulses. The temptation to sneak food, be disrespectful, show fear in hunting, or leave the collecting of firewood until another day, can be resisted. The mind would become a supervised place, with the self acting as an outpost for social thinking. The teaching of this kind of self-awareness is obvious whenever parents are heard telling their children to think about what they are doing, or asking how they would like it if the same thing were done to them.

Of course, the social control over behaviour would never be perfect. However, anthropologists who have studied the few remaining modern day hunter-gatherer tribes often remark how harmoniously they live. Rules are bent and voices often raised, yet self-awareness allows for a skilfully negotiated trade-off between the needs of the individual and the needs of the group. The socialisation of the brain was not about imposing a rigid, unquestioning, pattern of behaviour, but taking the opportunity presented by language to make humans even more socially attuned.

So society educates us to look backwards through the moment and take responsibility for what we see. Even the normal definition of the word consciousness is synonymous with introspection and control — with being in charge as things are happening. The idea of degrees of awareness or delays in brain processing makes us uncomfortable because we are supposed to be the "I" that watches the tennis ball on to the strings or makes the spontaneous decision to flex a finger. It is important that we believe our consciousness to be instant and all-seeing so that society can hold us accountable for our slips.

But it is only once we have gone inside the making of a single moment of consciousness and seen how much is involved — how much planning, settling, escalating and reacting — that our conscious selves can really begin to come into some sort of focus.


How the human mind is different: The story of human evolution and the learnt nature of our mental skills is dealt with in detail in The Ape That Spoke: Language and the Evolution of the Human Mind by John McCrone (London: Macmillan, 1990), and The Myth of Irrationality: The Science of the Mind From Plato to Star Trek by John McCrone (London: Macmillan, 1993).

Rise of Homo sapiens: For a good general account, see Lucy: The Beginnings of Humankind by Donald Johanson and Maitland Edey (New York: Warner Books, 1982), and The Origin of Modern Humans by Roger Lewin (New York: WH Freeman, 1993).

Chimpanzee intelligence and tool-use: See The Chimpanzees of Gombe by Jane Goodall (Cambridge, Massachusetts: Harvard University Press, 1986), Chimpanzee Politics: Power and Sex Among Apes by Frans de Waal (London: Jonathan Cape, 1982), and Chimpanzee Material Culture: Implications for Human Evolution by William McGrew (New York: Cambridge University Press, 1992).

Art as sign of modern mind: Becoming Human: Evolution and Human Uniqueness by Ian Tattersall (New York: Harcourt Brace, 1998).

Tool use preadapted brain for speech: "Neuromotor mechanisms in the evolution of human communication," D Kimura, in Neurobiology of Social Communication in Primates: An Evolutionary Perspective, edited by Horst Steklis and Michael Raleigh (New York: Academic Press, 1979). Interestingly, another early proponent of this idea now believes that lateralisation exists in most animals and so tool use could only have further refined an existing feature of the brain. See "Towards a unified view of cerebral hemispheric specializations in vertebrates," PF MacNeilage, in Comparative Neuropsychology, edited by David Milner (Oxford: Oxford University Press, 1998).

Johanson on change in parenting style: Lucy: The Beginnings of Humankind (Johanson and Edey, Warner Books).

Chimps can be taught words: For review of the vexed issue of ape language competence, see Aping Language by Joel Wallman (New York: Cambridge University Press, 1992).

Over 6,000 dialects and 200 language families: This is a popularly quoted figure—see "Hard Words," PE Ross, Scientific American, p70-79 (April 1991)—but others suggest the true number may be double this. See "Language diversity," JA Allan, P Baker and M Farmer, New Scientist, p48 (10 February 1996).

Variations in sentence order: Universals of Language, edited by Joseph Greenberg (Cambridge, Massachusetts: MIT Press, 1963), and The Language Instinct: The New Science of Language and Mind by Steven Pinker (New York: William Morrow, 1994).

Neanderthals probably not articulate: Uniquely Human: The Evolution of Speech, Thought and Selfless Behavior by Philip Lieberman (Cambridge, Massachusetts: Harvard University Press, 1991).

Animals locked into the present: See The Ape That Spoke: Language and the Evolution of the Human Mind (McCrone, Macmillan), Animal Thought by Stephen Walker (London: Routledge and Kegan Paul, 1983), An Essay Concerning Human Understanding by John Locke, edited by Peter Nidditch (Oxford: Clarendon Press, 1975), and The World as Will and Idea by Arthur Schopenhauer, translated by R Hackforth (Cambridge: Cambridge University Press, 1972).

Wittgenstein on dog not afraid of a beating tomorrow: Philosophical Investigations by Ludwig Wittgenstein (Oxford: Basil Blackwell, 1976).

Idea speech responsible is old: See The Myth of Irrationality: The Science of the Mind From Plato to Star Trek (McCrone, Macmillan), and Understanding Vygotsky: A Quest for Synthesis by René van der Veer and Jaan Valsiner (Oxford: Blackwell, 1991).

Our ancestors were thinkers before being speakers: This is the standard line taken by cognitive scientists. So, for example, Steven Pinker says: "it seems uncontestable, even banal, to say that the language faculty was selected for its ability to communicate thought." See "Facts about human language relevant to its evolution," S Pinker, in Origins of the Human Brain, edited by Jean-Pierre Changeux and Jean Chavaillon (Oxford: Oxford University Press, 1995). See also How the Mind Works by Steven Pinker (London: Allen Lane, 1998), and Language, Learning and Thought, edited by John MacNamara (New York: Academic Press, 1977).

PET studies of the mapping of vocabulary: "Discrete cortical regions associated with knowledge of colour and knowledge of action," A Martin et al, Science 270, p102-105 (1995), "Neural correlates of category-specific knowledge," A Martin et al, Nature 379, p649-652 (1996), and "A neural basis for lexical retrieval," H Damasio et al, Nature 380, p499-505 (1996).

Myelinisation is slow in humans: See Brain Development and Cognition: A Reader, edited by Mark Johnson (Cambridge, Massachusetts: Blackwell, 1993), "Development of cortical circuitry and cognitive function," PS Goldman-Rakic, Child Development 58, p601-622 (1987), and "Myelinisation of cortical-hippocampal relays during late adolescence," FM Benes, Schizophrenia Bulletin 15, p585-593 (1991).

Children pick up grammar from inference: Ever since Noam Chomsky drew a line in the sand with Syntactic Structures (The Hague: Mouton, 1957), there has been a bitter divide between those who believe in nature and those who argue for nurture. It has only been with the advent of neural networks and theories about self-organising systems that it has become possible to see a mutual resolution—see Rethinking Innateness: A Connectionist Perspective on Development, edited by Jeffrey Elman et al (Cambridge, Massachusetts: MIT Press, 1996), "Innateness, autonomy, universality? Neurobiological approaches to language," R-A Mueller, Behavioral and Brain Sciences 19, p611-675 (1996), and "Statistical learning by 8-month-old infants," JR Saffran, RN Aslin and EL Newport, Science 274, p1926-1928 (1996). For predisposing instincts like turn-taking and gaze-following, see Language Development: A Reader, edited by Andrew Lock and Eunice Fisher (London: Croom Helm, 1984), and Joint Attention: Its Origins and Role in Development, edited by Chris Moore and Philip Dunham (Hillsdale, New Jersey: Lawrence Erlbaum Associates, 1995).

Often our inner dialogue seems sketchy: For evidence that it takes time mentally to unpack speech acts, see Speaking: From Intention to Articulation, edited by Willem Levelt (Cambridge, Massachusetts: MIT Press, 1989), and "Brain activity during speaking: from syntax to phonology in 40 milliseconds," M van Turennout, P Hagoort and CM Brown, Science 280, p572-574 (1998).

Recollection and self-awareness are word-based: For review, see The Myth of Irrationality: The Science of the Mind From Plato to Star Trek (McCrone, Macmillan). Eyewitness experiments in Eyewitness Testimony by Elizabeth Loftus (Cambridge, Massachusetts: Harvard University Press, 1979). See also Private Speech: From Social Interaction to Self-Regulation, edited by Rafael Diaz and Laura Berk (Hillsdale, New Jersey: Lawrence Erlbaum Associates, 1992), and The Disappearance of Introspection by William Lyons (Cambridge, Massachusetts: MIT Press, 1986).