
Is cruelty a human instinct?


I started studying human behaviour and human instincts a few months ago; the question arose after watching how humans treat animals (their prey). Animals hunt their prey in order to eat; on the other hand, I have seen kids pelting street cats and dogs with stones. The human approach to life also goes beyond survival, extending to dominating its fellow beings. Can I call this behaviour of mankind cruel? If so, is it instinctual? I am very sorry for the late response. I hope this clarifies the question.


It sounds like a philosophical question, but it also has evolutionary roots. All species are instinctually selfish, in that there is a drive to move their genes into the next generation. Even so, misbehaviors (like "cruelty") should be held in check in a social species, because they are likely to be punished. We evolved as a social species that lives in family groups, with a high probability of meeting nearby family groups repeatedly. If you are cruel to your neighbors, you would expect to meet the points of their spears sometime in the future. This expectation of future encounters with neighbors can explain altruism as well. But it's a balance: if cruelty to your neighbors is in your long-term selfish interest, then you would expect it to be instinctual, just as you might expect kindness to be. It's all about different strategies for sending our genes forward. As a side issue, chimps, our close relatives, exhibit violence toward their neighbors (while bonobos, equally close, are much less violent).



Humans are selfish. It’s so easy to say. The same goes for so many assertions that follow. Greed is good. Altruism is an illusion. Cooperation is for suckers. Competition is natural, war inevitable. The bad in human nature is stronger than the good.

These kinds of claims reflect age-old assumptions about emotion. For millennia, we have regarded the emotions as the fount of irrationality, baseness, and sin. The idea of the seven deadly sins takes our destructive passions for granted. Plato compared the human soul to a chariot: the intellect is the driver and the emotions are the horses. Life is a continual struggle to keep the emotions under control.

Even compassion, the concern we feel for another being’s welfare, has been treated with downright derision. Kant saw it as a weak and misguided sentiment: “Such benevolence is called soft-heartedness and should not occur at all among human beings,” he said of compassion. Many question whether true compassion exists at all—or whether it is inherently motivated by self-interest.

Recent studies of compassion argue persuasively for a different take on human nature, one that rejects the preeminence of self-interest. These studies support a view of the emotions as rational, functional, and adaptive—a view which has its origins in Darwin’s The Expression of the Emotions in Man and Animals. Compassion and benevolence, this research suggests, are an evolved part of human nature, rooted in our brain and biology, and ready to be cultivated for the greater good.

The biological basis of compassion

First consider the recent study of the biological basis of compassion. If such a basis exists, we should be wired up, so to speak, to respond to others in need. Recent evidence supports this point convincingly. University of Wisconsin psychologist Jack Nitschke found in an experiment that when mothers looked at pictures of their babies, they not only reported feeling more compassionate love than when they saw other babies, but they also demonstrated unique activity in a region of their brains associated with positive emotions. Nitschke’s finding suggests that this region of the brain is attuned to the first objects of our compassion—our offspring.

But this compassionate instinct isn’t limited to parents’ brains. In a different set of studies, Joshua Greene and Jonathan Cohen of Princeton University found that when subjects contemplated harm being done to others, a similar network of regions in their brains lit up. Our children and victims of violence—two very different subjects, yet united by the similar neurological reactions they provoke. This consistency strongly suggests that compassion isn’t simply a fickle or irrational emotion, but rather an innate human response embedded into the folds of our brains.

In other research by Emory University neuroscientists James Rilling and Gregory Berns, participants were given the chance to help someone else while their brain activity was recorded. Helping others triggered activity in the caudate nucleus and anterior cingulate, portions of the brain that turn on when people receive rewards or experience pleasure. This is a rather remarkable finding: helping others brings the same pleasure we get from the gratification of personal desire.

The brain, then, seems wired up to respond to others’ suffering—indeed, it makes us feel good when we can alleviate that suffering. But do other parts of the body also suggest a biological basis for compassion?

It seems so. Take the loose association of glands, organs, and cardiovascular and respiratory systems known as the autonomic nervous system (ANS). The ANS plays a primary role in regulating our blood flow and breathing patterns for different kinds of actions. For example, when we feel threatened, our heart and breathing rates usually increase, preparing us either to confront or flee from the threat—the so-called “fight or flight” response. What is the ANS profile of compassion? As it turns out, when young children and adults feel compassion for others, this emotion is reflected in very real physiological changes: Their heart rate goes down from baseline levels, which prepares them not to fight or flee, but to approach and soothe.

Then there’s oxytocin, a hormone that floats through the bloodstream. Research performed on the small, stocky rodents known as prairie voles indicates that oxytocin promotes long-term bonds and commitments, as well as the kind of nurturing behavior—like care for offspring—that lies at the heart of compassion. It may account for that overwhelming feeling of warmth and connection we feel toward our offspring or loved ones. Indeed, breastfeeding and massages elevate oxytocin levels in the blood (as does eating chocolate). In some recent studies I’ve conducted, we have found that when people perform behaviors associated with compassionate love—warm smiles, friendly hand gestures, affirmative forward leans—their bodies produce more oxytocin. This suggests compassion may be self-perpetuating: Being compassionate causes a chemical reaction in the body that motivates us to be even more compassionate.

According to evolutionary theory, if compassion is truly vital to human survival, it would manifest itself through nonverbal signals. Such signals would serve many adaptive functions. Most importantly, a distinct signal of compassion would soothe others in distress, allow people to identify the good-natured individuals with whom they’d want long-term relationships, and help forge bonds between strangers and friends.

Research by Nancy Eisenberg, perhaps the world’s expert on the development of compassion in children, has found that there is a particular facial expression of compassion, characterized by oblique eyebrows and a concerned gaze. When someone shows this expression, they are then more likely to help others. My work has examined another nonverbal cue: touch.

Previous research has already documented the important functions of touch. Primates such as great apes spend hours a day grooming each other, even when there are no lice in their physical environment. They use grooming to resolve conflicts, to reward each other’s generosity, and to form alliances. Human skin has special receptors that transform patterns of tactile stimulation—a mother’s caress or a friend’s pat on the back—into indelible sensations as lasting as childhood smells. Certain touches can trigger the release of oxytocin, bringing feelings of warmth and pleasure. The handling of neglected rat pups can reverse the effects of their previous social isolation, going as far as enhancing their immune systems.

My work set out to document, for the first time, whether compassion can be communicated via touch. Such a finding would have several important implications. It would show that we can communicate this positive emotion with nonverbal displays, whereas previous research has mostly documented the nonverbal expression of negative emotions such as anger and fear. This finding would also shed light on the social functions of compassion—how people might rely on touch to soothe, reward, and bond in daily life.

In my experiment, I put two strangers in a room where they were separated by a barrier. They could not see one another, but they could reach each other through a hole. One person touched the other on the forearm several times, each time trying to convey one of 12 emotions, including love, gratitude, and compassion. After each touch, the person touched had to describe the emotion they thought the toucher was communicating.

Imagine yourself in this experiment. How do you suppose you might do? Remarkably, people in these experiments reliably identified compassion, as well as love and the other ten emotions, from the touches to their forearm. This strongly suggests that compassion is an evolved part of human nature—something we’re universally capable of expressing and understanding.

Motivating altruism

Feeling compassion is one thing; acting on it is another. We still must confront a vital question: Does compassion promote altruistic behavior? In an important line of research, Daniel Batson has made the persuasive case that it does. According to Batson, when we encounter people in need or distress, we often imagine what their experience is like. This is a great developmental milestone—to take the perspective of another. It is not only one of the most human of capacities; it is one of the most important aspects of our ability to make moral judgments and fulfill the social contract. When we take the other’s perspective, we feel an empathic state of concern and are motivated to address that person’s needs and enhance that person’s welfare, sometimes even at our own expense.

In a compelling series of studies, Batson exposed participants to another’s suffering. He then had some participants imagine that person’s pain, but he allowed those participants to act in a self-serving fashion—for example, by leaving the experiment.

Within this series, one study had participants watch another person receive shocks when he failed a memory task. They were then asked to take the shocks on behalf of the other participant, who, they were told, had experienced a shock trauma as a child. Those participants who had reported feeling compassion for the other individual volunteered to take several shocks for that person, even when they were free to leave the experiment.

In another experiment, Batson and colleagues examined whether people feeling compassion would help someone in distress, even when their acts were completely anonymous. In this study female participants exchanged written notes with another person, who quickly expressed feeling lonely and an interest in spending time with the participant. Those participants feeling compassion volunteered to spend significant time with the other person, even when no one else would know about their act of kindness.

Taken together, our strands of evidence suggest the following. Compassion is deeply rooted in human nature; it has a biological basis in the brain and body. Humans can communicate compassion through facial gesture and touch, and these displays of compassion can serve vital social functions, strongly suggesting an evolutionary basis of compassion. And when experienced, compassion overwhelms selfish concerns and motivates altruistic behavior.

Cultivating compassion

We can thus see the great human propensity for compassion and the effects compassion can have on behavior. But can we actually cultivate compassion, or is it all determined by our genes?


Recent neuroscience studies suggest that positive emotions are less heritable—that is, less determined by our DNA—than the negative emotions. Other studies indicate that the brain structures involved in positive emotions like compassion are more “plastic”—subject to changes brought about by environmental input. So we might think about compassion as a biologically based skill or virtue, but not one that we either have or don’t have. Instead, it’s a trait that we can develop in an appropriate context. What might that context look like? For children, we are learning some answers.

Some researchers have observed a group of children as they grew up, looking for family dynamics that might make the children more empathetic, compassionate, or likely to help others. This research points to several key factors.

First, children securely attached to their parents, compared to insecurely attached children, tend to be sympathetic to their peers as early as age three and a half, according to the research of Everett Waters, Judith Wippman, and Alan Sroufe. In contrast, researchers Mary Main and Carol George found that abusive parents who resort to physical violence have less empathetic children.

Developmental psychologists have also been interested in comparing two specific parenting styles. Parents who rely on induction engage their children in reasoning when they have done harm, prompting their child to think about the consequences of their actions and how these actions have harmed others. Parents who rely on power assertion simply declare what is right and wrong, and resort more often to physical punishment or strong emotional responses of anger. Nancy Eisenberg, Richard Fabes, and Martin Hoffman have found that parents who use induction and reasoning raise children who are better adjusted and more likely to help their peers. This style of parenting seems to nurture the basic tools of compassion: an appreciation of others’ suffering and a desire to remedy that suffering.

Parents can also teach compassion by example. A landmark study of altruism by Pearl and Samuel Oliner found that children who have compassionate parents tend to be more altruistic. In the Oliners’ study of Germans who helped rescue Jews during the Nazi Holocaust, one of the strongest predictors of this inspiring behavior was the individual’s memory of growing up in a family that prioritized compassion and altruism.

A more compassionate world

Human communities are only as healthy as our conceptions of human nature. It has long been assumed that selfishness, greed, and competitiveness lie at the core of human behavior, the products of our evolution. It takes little imagination to see how these assumptions have guided most realms of human affairs, from policy making to media portrayals of social life.

But clearly, recent scientific findings forcefully challenge this view of human nature. We see that compassion is deeply rooted in our brains, our bodies, and in the most basic ways we communicate. What’s more, a sense of compassion fosters compassionate behavior and helps shape the lessons we teach our children.

Of course, simply realizing this is not enough; we must also make room for our compassionate impulses to flourish. In Greater Good magazine, we feature articles that can help us do just that. Our contributors provide ample evidence to show what we can gain from more compassionate marriages, schools, hospitals, workplaces, and other institutions. They do more than make us reconsider our assumptions about human nature. They offer a blueprint for a more compassionate world.


Humans evolved to have an instinct for deadly violence, researchers find

Humans have evolved with a propensity to kill one another that is six times higher than the average mammal, according to new research.

Scientists calculated that, when we first developed into modern humans, about two per cent of deaths were caused by fellow Homo sapiens, according to an article about the research in the journal Nature.

While this rate is well below the highest figure – found among meerkats where nearly 20 per cent of deaths are caused by other meerkats – many mammals kill each other only rarely or not at all.

For all their ferocious reputation, tigers are much less likely to fight each other to the death – with a rate of 0.88 per cent.

And we are also prone to periods of extreme violence that can put even meerkats in the shade. Between about 1200 and 1500 in the Americas, more than 25 per cent of the people there were killed by other humans.


The researchers compiled information about more than four million deaths among more than 1,000 mammal species drawn from 80 per cent of the mammalian families, including some 600 human populations from the Palaeolithic era to the present day.

They then used this information to create an evolutionary tree of different mammals’ propensity towards violence.

Humans, they found, are closely related to mammals that are more likely than most to kill members of their own species.

Writing in Nature, the researchers said: “Lethal violence is considered by some to be mostly a cultural trait.

“However, aggression in mammals, including humans, also has a genetic component with high heritability. Consequently, it is widely acknowledged that evolution has also shaped human violence.


“From this perspective, violence can be seen as an adaptive strategy, favouring the perpetrator’s reproductive success in terms of mates, status or resources.”

The researchers found that lethal violence was used by nearly 40 per cent of mammals, but suggested this was probably an under-estimate.

The average percentage of deaths caused by members of the same species was about 0.3 per cent.

But about 160,000 to 200,000 years ago, the same figure for humans was estimated to be about two per cent, more than six times higher than the average.
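As a quick sanity check on that comparison, here is a minimal arithmetic sketch (in Python, written for this summary rather than taken from the study) that uses only the two percentages quoted above:

```python
# Figures as quoted in this article (illustrative check only, not the study's dataset).
mammal_average = 0.3   # average share of deaths caused by members of the same species (%)
early_humans = 2.0     # estimated share for humans roughly 160,000-200,000 years ago (%)

ratio = early_humans / mammal_average
print(f"Early-human rate is about {ratio:.1f} times the mammalian average")  # ~6.7x
```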

The Nature paper said their analysis “suggests that a certain level of lethal violence in humans arises from the occupation of a position within a particularly violent mammalian clade, in which violence seems to have been ancestrally present”.

“This means that humans have inherited their propensity for violence,” it added.

“We believe that this effect entails more than a mere genetic inclination to violence. In fact, social behaviour and territoriality, two behavioural traits shared with relatives of Homo sapiens, seem to have also contributed to the level of lethal violence.”

The researchers stressed this inherited tendency towards violence did not mean humans were unable to control themselves.

“This prehistoric level of lethal violence has not remained invariant but has changed as our history has progressed, mostly associated with changes in the socio-political organization of human populations,” they wrote.


14 Final Fantasy XI: The Shadow Lord Is A Vengeful Spirit Bent On Dividing The Nations Of Vana'diel

Originally a kind Galkan Talekeeper named Raogrimm, the Shadow Lord is an entity bent on revenge. He was betrayed by his teammate Ulrich during the expedition at Xarcabard, resulting in the deaths of both Raogrimm and his lover, Cornelia.

Odin answers Raogrimm's dying rage by granting him a corporeal form to enact vengeance. The Shadow Lord goes on to murder everyone who attended the Xarcabard expedition with him and eventually recruits the beastmen into waging war on San d'Oria, Windurst, and Bastok.


Is Language an Instinct?

In my recent book, The Language Myth, I investigate one of the dominant themes that has preoccupied the study of language for the last 50 years or so: whether the rudiments of the human capacity for grammar—central to language—are innate. This idea originated with the research of the American linguist and philosopher Noam Chomsky, beginning in the 1950s and gathering momentum from the 1960s onwards. The idea, in essence, is that human infants are born equipped with a species-specific Universal Grammar—a genetic pre-specification for grammatical knowledge that ‘switches on’ at an early point in the process of acquiring their mother tongue and, this being the case, takes much of the pain out of language learning. From this perspective, human infants acquire language because they come with hard-wired knowledge of aspects of grammar—although there is no meaningful consensus on what these aspects might amount to, even after over 40 years of looking. This, so the party line claims, is what enables a child to ‘pick up’ their native language. I presented a very partial, thumbnail sketch of just some of the relevant issues in a short popular-science essay published in Aeon magazine, and I have discussed them further in a full-length radio interview.

In a series of recent posts, a number of distinguished linguists who broadly adhere to Chomsky’s proposition that there is an innate Universal Grammar suggest that I have misrepresented and/or misunderstood the claims associated with this research programme, and, in three specific cases they draw attention to, that I have supported my arguments with findings they claim have been refuted. In at least one of those cases, concerning what is known in the jargon as Specific Language Impairment, they appear to be referring to the short Aeon essay rather than the fuller discussion in the book.

The Language Myth is written for a general audience—not specifically professional linguists—and takes the form of an evidence-based rebuttal of aspects of the world-view developed in the popular, best-selling books written by Professor Steven Pinker of Harvard University. Indeed, Pinker’s first popular book, The Language Instinct, published back in 1994, provides my book with its title, albeit with a twist: The Language Myth plays on Pinker’s title, which I cast as the eponymous ‘language myth’. Indeed, claiming language to be an instinct is self-evidently a myth, as the psychologist Michael Tomasello first pointed out in a 1995 book review.

But importantly, The Language Myth directly takes on what I see as the larger theoretical and ideological world-view of what I have elsewhere dubbed ‘rationalist’ language science. While my target is the presentation in Pinker’s various books, it necessarily encompasses more than just the research programme initiated by Chomsky and his co-workers.

It also addresses fundamental issues and questions in cognitive science more generally, and the range of Anglo-American linguists, psychologists and philosophers of the second half of the twentieth century who helped shape it. For instance, I consider: the nature of concepts, our ‘building-blocks’ of thought, and whether these might be innate in some meaningful sense; the relationship between language and the communication systems of other species; whether language, and the mind more generally, might consist of distinct, encapsulated neurological systems—sometimes referred to as ‘modules’—which evolved independently of one another, each for a specific mental function; whether the human mind has its own innate mental operating system—sometimes referred to as ‘Mentalese’, or our Language of Thought; and whether language can, in some shape or form, influence habitual patterns of thought—sometimes referred to as the Principle of Linguistic Relativity, famously proposed by Benjamin Lee Whorf. (This is not to be confused with the straw-man argument for linguistic determinism, the idea that thought is not possible without language; thought clearly is possible without language, as we know from research on pre-linguistic infants, on adults who have suffered language loss—known as ‘aphasia’—and on other species, which often have sophisticated conceptual capacities in the absence of language. Whorf explicitly argued against linguistic determinism.)

The rationalist world-view boils down to the claim that the linguistic and cognitive capacities of humans must ultimately, and at least in outline, be biologically pre-programmed: that there’s no other way, ultimately, to account for what appears to be unique to our species. In The Language Myth, I argue that there are six component ‘sub-myths’ that make up, mutually inform, and sustain this particular stance. I dub them ‘myths’ because they were proposed, in most cases, before any real evidence for or against was available. And since evidence has become available, most objective commentators would be hard-pressed to say that any of these ‘myths’ have much in the way of clear-cut evidence to support them. I take a slightly stronger position, of course: my assessment is that there is almost no credible evidence. So, here are the six:

Myth #1: Human language is unrelated to animal communication systems.
The myth maintains that language is the preserve of humans, and humans alone; it cannot be compared to anything found amongst non-humans, and is unrelated to any non-human communicative capability. The myth thus reinforces a view that there is an immense divide separating human language from the communicative systems of other species, and, more generally, humans from all other species. But recent findings on the way other species communicate, from apes to whales, from vervets to starlings, increasingly suggest that such a view may overstate the divide between human language and non-human communicative systems. Indeed, many of the characteristics exhibited by human language are found, to varying degrees, across a broad spectrum of animal communication systems. In point of fact, we can learn more about human language, and what makes it special, by seeking to understand how it relates to and is derived from the communication systems of other species. This suggests that although human language is qualitatively different, it is related to other, non-human communication systems.

Myth #2: There are absolute language universals.
Rationalist linguistics proposes that human babies enter the world pre-equipped to learn language. Language emerges effortlessly and automatically. And this is because we are all born with a Universal Grammar: a pre-specification for certain aspects of grammar, whatever the ultimate form of these putative ‘universals’ might be—a universal being a feature of grammar that is, at least in principle, capable of being shared by all languages. Moreover, as all languages are assumed to derive from this Universal Grammar, the study of a single language can reveal its design—an explicit claim made by Chomsky in his published writing. In other words, despite having different sound systems and vocabularies, all languages are basically like English. Hence, a theoretical linguist aiming to study this innate Universal Grammar doesn’t, in fact, need to learn or study any of the exotic languages out there—we need only focus on English, which contains the answers to how all other languages work. But like the myth that language is unrelated to animal forms of communication, the myth of language universals is contradicted by the evidence. I argue, in the book, that language emerges and diversifies in and during specific instances of language use.

Myth #3: Language is innate.
No one disputes that human children come into the world biologically prepared for language—from speech production apparatus, to information processing capacity, to memory storage, we are neurobiologically equipped to acquire spoken or signed language in a way no other species is. But the issue under the microscope is this: the rationalist linguistics world-view proposes that a special kind of knowledge—grammatical knowledge—must be present at birth. Linguistic knowledge—a Universal Grammar that all humans are born with—is hard-wired into the micro-circuitry of the human brain. The view that language is innate is, in a number of respects, highly attractive: at a stroke, it solves the problem of trying to account for how children acquire language without receiving negative feedback from their parents and caregivers when they make mistakes—it has been widely reported that parents, for the most part, don’t systematically correct errors children make as they acquire language, and children can and do acquire their mother tongue without correction of any sort. Moreover, children have acquired spoken language before they begin formal schooling: children are not taught spoken language, they just acquire it, seemingly automatically. But such a strong view arguably eliminates the need for much in the way of learning—apart from the relatively trivial task of learning the words of whatever language it is we end up speaking. The fundamentals of grammar, common to all languages, are, at least in some pre-specified form, present in our brains prior to birth, so the language myth contends. But as I argue in the book, a large body of evidence now shows these specific assumptions to be incorrect.

Myth #4: Language is a distinct module of the mind.
In western thought there has been a venerable tradition of conceiving of the mind in terms of distinct faculties. With the advent of cognitive science in the 1950s, the digital computer became the analogy of choice for the human mind. While the idea that the mind is a computer has been a central and highly influential heuristic in cognitive science, the radical proposal that the mind, like the computer, is also modular was made by the philosopher of mind Jerry Fodor. In a now-classic book, The Modularity of Mind (1983), whose reverberations are felt to this day, Fodor proposed that language is the paradigmatic example of a mental module. And this view makes perfect sense from the perspective of rationalist linguistics. According to Fodor, a mental module is realised in dedicated neural architecture. It copes with a specific and restricted type of information, and is impervious to the workings of other modules. As a consequence, a module can be selectively impaired, resulting in the breakdown of the behaviour associated with that module. And as a module deals with a specific type of information, it will emerge at the particular point in the life cycle when it is needed. Hence, a mental module, in developmental terms, follows a characteristic schedule. The notion that the mind is modular might, on the face of it, make intuitive sense. In our everyday lives we associate component parts of artefacts with specific functions. The principle of modularity of design is both a practical and sensible approach to the manufacture not just of computers but of many, many everyday commodities, from cars to children’s toys. However, as I argue in the book, the evidence provides very little ground for thinking that language is a module of mind, or indeed that the mind is modular.

Myth #5: There is a universal Mentalese.
The language myth contends that meaning in natural languages, such as English, Japanese or whatever, derives, ultimately, from a universal language of thought: Mentalese. Mentalese is the mind’s internal or private language, and makes thought possible. It is universal in the sense that all humans are born with it. It is language-like, consisting of symbols, which can be combined by rules of mental syntax. Without Mentalese we could not learn the meanings of words in any given language—spoken or signed. But as I show in the book, Mentalese assumes a view of mind that is wrong-headed: it assumes that human minds are computer-like. It also suffers from a number of other difficulties, which make this supposition deeply problematic.

Myth #6: Language does not influence (habitual patterns of) thought.
While everyone accepts that language affects thought in the sense that we use language to argue, persuade, convince, seduce and so on, according to the myth, thought is, in principle, independent of language. The myth contends that the Principle of Linguistic Relativity—the claim that systematic differences in grammatical and semantic representations across languages give rise to corresponding differences in patterns of thought across communities—is utterly wrong. As I show in the book, not only does Pinker, along with other rationalists, mischaracterise the thesis of linguistic relativity—that the language we speak influences how we habitually think, categorise and perceive the world—he is also wrong in another way. Despite Pinker’s assertion to the contrary, there is now a significant amount of scientific evidence suggesting that, in point of fact, the linguistic patterning of our native tongue has indelible and habitual consequences for how we perceive the world. Of course, the question then arises as to how significant, in terms of influencing individual and cultural world-views, one takes this evidence to be. In a recent book, The Language Hoax, its author, John McWhorter, plays down the significance of the relativistic effects of different languages on the minds of distinct communities of language users. While I disagree with McWhorter’s position—and his review of the relevant evidence is at best partial—given the sophisticated methodologies that now exist for directly and indirectly investigating brain function during routine cognitive and perceptual processing, any objective commentator would be hard-pressed to deny the relativistic influence of language on non-linguistic aspects of mental function.

Ultimately, whether or not one accepts the general argument I make in The Language Myth boils down to one’s ideological as well as one’s theoretical commitments. Academic research, like any other human endeavour, inhabits a socio-cultural niche. And ideas arise from assumptions and principles, sometimes explicitly rehearsed, sometimes not, cocooned within the institutional milieu that helps give them life and sustain them. In terms of the specifically Chomskyan element(s) of the rationalist world-view that I argue against, my view is that perhaps most damaging of all has been the insistence that the study of language can be separated into two distinct realms: ‘competence’—our internal, mental knowledge of language—and ‘performance’—the way in which we use language. Chomsky’s position is that performance arises from competence, given his assumption that fundamental aspects of competence—our Universal Grammar—are, in some sense, present at birth. Hence competence, rather than performance, constitutes the proper object of study for language science. But I, and a great many other linguists, believe that the evidence now very clearly shows this perspective to be wrong-headed: our knowledge of language, so-called ‘competence’, in fact arises from use, from ‘performance’. And Chomsky’s logical error, as I characterise it, has held the field of (Anglo-American) linguistics back for too long.

My rationale for writing The Language Myth, and for debunking the world-view presented in Pinker’s popular writing, was the following. At least amongst undergraduate and beginning graduate students, and the informed lay audience, Pinker’s popular presentations of rationalist cognitive science are arguably better known than the work of Chomsky, Fodor and the other leading lights of rationalist cognitive science. And his characterisation of language and the mind as, ultimately, biological constructions (whether or not one likes the analogy of language as an ‘instinct’ that Pinker coined) is widely believed. Many of the standard textbooks used in the stellar universities across the English-speaking world promote Pinker’s works as essential readings. Moreover, they portray the sorts of arguments he promotes as established fact. Things are really not that clear-cut. At the very least, the (popularisation of the) rationalist world-view is on very shaky ground indeed. I, of course, didn’t write The Language Myth for committed rationalists; I don’t pretend to be able to convince them. It appears, to me at least, that in the case of many such colleagues their commitment is ideological, rather than being based on an objective and critical evaluation and appreciation of the voluminous evidence. And of course, while they may accuse me of being partial and/or prone to misunderstanding in my presentation, as I show in The Language Myth the same accusation must then be applied to Pinker, but with several greater degrees of magnitude!

In my next few posts, I’ll be examining some of the evidence, for and against, each of the component myths that make up the rationalist world-view. In so doing, I’ll also address some of the criticisms raised by Chomskyan colleagues who have objected to my portrayal of things. Whatever one thinks on these issues, these are fascinating times in the study of language and the mind, and an exciting time to be an academic linguist. My advice to all objective and curious-minded people is to read The Language Myth and make up your own mind. Some representative and high-profile reviews of the book are listed below, to give you a flavour of what’s in store.

Book review in New Scientist, 18 October 2014
Book review in Times Higher Education, 13 November 2014


Maternal Instinct And Biology: Evolution Ensures We Want Sex, Not Babies

Many women hear an ominous ticking of their “biological clock” when they reach their 30s, while others never hear it at all.

Some believe the compulsion to bear babies is biologically inbuilt – even suggesting women who refuse their supposed evolutionary duty are being selfish.

Others hold the view that this so-called “maternal instinct”, also referred to as “baby fever”, has nothing to do with biology and is a social construct.

It’s unhelpful to explore this debate through a strictly dichotomous “nature vs nurture” prism. Both biology and culture likely contribute to our reproductive behaviour.

Reproduction doesn’t require any “inherited” preference to have children, since natural selection already favours mechanisms that result in reproduction, most significantly through the sexual urge.

But that version of the maternal instinct that relates to a mother’s ability and need to nurture and protect her child may indeed be hardwired, facilitated by the release of certain hormones and other necessary biological changes.

Sexual Urge

The exquisite diversity of past and present lifeforms comes from a single critical feature – reproduction.

Individuals genetically disposed to be indifferent to sex will theoretically be selected out of the population, in favour of those with a greater commitment.


This is a self-evident feature of the evolutionary process.

Imagine a population of people or animals who enjoy sex, where that enjoyment has a genetic basis. This would determine their reproductive success. Now introduce into this population those genetically predisposed to be sexually inactive.

These sexually inactive individuals will not produce offspring, so there will be no sexually inactive individuals in the next generation.

In other words, a genetic disposition to avoid sex will neither become established nor maintained.
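The logic of that thought experiment can be made concrete with a small, purely illustrative simulation (not from the article; the population size, starting frequencies, and single-parent inheritance rule are simplifying assumptions):

```python
import random

# Toy model of the thought experiment above: individuals carrying a heritable
# disposition to avoid sex leave no offspring, so the trait vanishes after one generation.
def next_generation(population, size=1000):
    parents = [trait for trait in population if trait == "active"]  # only the sexually active reproduce
    return [random.choice(parents) for _ in range(size)]            # offspring inherit a parent's trait

population = ["active"] * 900 + ["inactive"] * 100
for generation in range(3):
    inactive_share = population.count("inactive") / len(population)
    print(f"generation {generation}: {inactive_share:.0%} sexually inactive")
    population = next_generation(population)
# Prints 10% in generation 0 and 0% in every generation after that.
```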

Some argue the so-called “biological clock”, triggering an enhanced awareness of reproduction among childless women in their 30s, is natural selection at work. Maybe.

There is some evidence that fertility decisions may have a genetic basis. For instance, studies that looked at the age of first attempt to have a child in Finnish populations showed children had similar patterns to those of their parents.

But these only proved there is a genetic influence for when women decided to have a child, rather than whether they decided to at all.

We are notoriously susceptible to the influence of others (witness the broad success of advertising and, one hopes, education).

So, like many other aspects of human behaviour, it remains unclear whether the strong longing for a child – “baby fever” – is driven by our genes or is a social construction.

Defying Biology

Until recently, sex and reproduction were inextricably entwined in all organisms. The discovery of contraceptive technology severed that nexus for one species.

With varying reliability, humans can now have sex without having babies. So in terms of biological evolution, a genetic preference for sexual activity is no longer equivalent to a maternal (or paternal) instinct to have offspring.


There are many women in our society who aren’t interested in having children.

For instance, the number of US women between 34 and 44 who have never had children has increased by around 10% since 1976. And a survey of more than 7,000 Australian women aged 22 to 27 found nearly 10% didn’t want children.

My guess is that childless women aren’t necessarily sexually inactive – as natural selection likely dictates. But there may be little opportunity for selection to act on their personal choice.

It’s an impressive example of human behaviour defying biological evolution. But culture and technology have immunised humans from many selection pressures. Clothing, for example, allows us to inhabit cold environments unsuitable even for naturists.

Sex isn’t one of them though. Indeed, most cultures express more than a passing interest in sex – from the widespread inclusion of fertility rites in ancient societies to the almost unseemly obsession with sex in contemporary television advertising campaigns.

Nurturing Instinct

In many cases, successful reproduction requires care of the developing offspring. This is often, but not exclusively, undertaken by the mother.

Nurturing offspring is then a form of “maternal instinct”, as distinct from “baby fever”. And nature has built in biological mechanisms to ensure this.


For mammalian mothers, a demanding infant stimulates the release of the hormone oxytocin, which in turn triggers a flow of milk.

Oxytocin is also implicated in a suite of maternal behaviours throughout pregnancy, strengthening a mother’s bond to her fetus, which impacts on the fetus’s development.

The crucial, instinctive, nurturing response to feed the child, through the release of oxytocin, occurs only during pregnancy and after birth – otherwise the hormones don’t kick in.

For instance, virgin mice given oxytocin injections could learn to hear and respond to distressed calls of pups, something they were unable to do before the injections.

So it could be argued that the “urge” to have and nurture children is only ensured biologically through the urge to have sex, while the nurturing instinct is biologically inbuilt.

The so-called “biological clock”, then, may be ticking to a social key.

This article was originally published on The Conversation. Read the original article.


Excerpt: 'Less Than Human'

Less Than Human: Why We Demean, Enslave, and Exterminate Others
By David Livingstone Smith
Hardcover, 336 pages
St. Martin's Press
List price: $24.99

Before I get to work explaining how dehumanization works, I want to make a preliminary case for its importance. So, to get the ball rolling, I'll briefly discuss the role that dehumanization played in what is rightfully considered the single most destructive event in human history: the Second World War. More than seventy million people died in the war, most of them civilians. Millions died in combat. Many were burned alive by incendiary bombs and, in the end, nuclear weapons. Millions more were victims of systematic genocide. Dehumanization made much of this carnage possible.

Let's begin at the end. The 1946 Nuremberg doctors' trial was the first of twelve military tribunals held in Germany after the defeat of Germany and Japan. Twenty doctors and three administrators — twenty-two men and a single woman — stood accused of war crimes and crimes against humanity. They had participated in Hitler's euthanasia program, in which around 200,000 mentally and physically handicapped people deemed unfit to live were gassed to death, and they performed fiendish medical experiments on thousands of Jewish, Russian, Roma and Polish prisoners.

Principal prosecutor Telford Taylor began his opening statement with these somber words:

The defendants in this case are charged with murders, tortures and other atrocities committed in the name of medical science. The victims of these crimes are numbered in the hundreds of thousands. A handful only are still alive; a few of the survivors will appear in this courtroom. But most of these miserable victims were slaughtered outright or died in the course of the tortures to which they were subjected ... To their murderers, these wretched people were not individuals at all. They came in wholesale lots and were treated worse than animals.

He went on to describe the experiments in detail. Some of these human guinea pigs were deprived of oxygen to simulate high altitude parachute jumps. Others were frozen, infested with malaria, or exposed to mustard gas. Doctors made incisions in their flesh to simulate wounds, inserted pieces of broken glass or wood shavings into them, and then, tying off the blood vessels, introduced bacteria to induce gangrene. Taylor described how men and women were made to drink seawater, were infected with typhus and other deadly diseases, were poisoned and burned with phosphorus, and how medical personnel conscientiously recorded their agonized screams and violent convulsions.

The descriptions in Taylor's narrative are so horrifying that it's easy to overlook what might seem like an insignificant rhetorical flourish: his comment that "these wretched people were ... treated worse than animals". But this comment raises a question of deep and fundamental importance. What is it that enables one group of human beings to treat another group as though they were subhuman creatures?

A rough answer isn't hard to come by. Thinking sets the agenda for action, and thinking of humans as less than human paves the way for atrocity. The Nazis were explicit about the status of their victims. They were Untermenschen — subhumans — and as such were excluded from the system of moral rights and obligations that bind humankind together. It's wrong to kill a person, but permissible to exterminate a rat. To the Nazis, all the Jews, Gypsies and others were rats: dangerous, disease-carrying rats.

Jews were the main victims of this genocidal project. From the beginning, Hitler and his followers were convinced that the Jewish people posed a deadly threat to all that was noble in humanity. In the apocalyptic Nazi vision, these putative enemies of civilization were represented as parasitic organisms — as leeches, lice, bacteria, or vectors of contagion. "Today," Hitler proclaimed in 1943, "international Jewry is the ferment of decomposition of peoples and states, just as it was in antiquity. It will remain that way as long as peoples do not find the strength to get rid of the virus." Both the death camps (the gas chambers of which were modeled on delousing chambers) and the Einsatzgruppen (paramilitary death squads that followed in the wake of the advancing German army across Eastern Europe) were responses to what the Nazis perceived to be a lethal pestilence.

Sometimes the Nazis thought of their enemies as vicious, bloodthirsty predators rather than parasites. When partisans in occupied regions of the Soviet Union began to wage a guerilla war against German forces, Walter von Reichenau, the commander-in-chief of the German army, issued an order to inflict a "severe but just retribution upon the Jewish subhuman elements" (the Nazis considered all of their enemies as part of "international Jewry", and were convinced that Jews controlled the national governments of Russia, the United Kingdom, and the United States). Military historian Mary R. Habeck confirms that "soldiers and officers thought of the Russians and Jews as 'animals' ... that had to perish. Dehumanizing the enemy allowed German soldiers and officers to agree with the Nazis' new vision of warfare, and to fight without granting the Soviets any mercy or quarter."

The Holocaust is the most thoroughly documented example of the ravages of dehumanization. Its hideousness strains the limits of imagination. And yet, focusing on it can be strangely comforting. It's all too easy to imagine that the Third Reich was a bizarre aberration, a kind of mass insanity instigated by a small group of deranged ideologues who conspired to seize political power and bend a nation to their will. Alternatively, it's tempting to imagine that the Germans were (or are) a uniquely cruel and bloodthirsty people. But these diagnoses are dangerously wrong. What's most disturbing about the Nazi phenomenon is not that the Nazis were madmen or monsters. It's that they were ordinary human beings.

When we think of dehumanization during World War II our minds turn to the Holocaust, but it wasn't only the Germans who dehumanized their enemies. While the architects of the Final Solution were busy implementing their lethal program of racial hygiene, the Russian-Jewish poet and novelist Ilya Ehrenburg was churning out propaganda for distribution to Stalin's Red Army. These pamphlets seethed with dehumanizing rhetoric: they spoke of "the smell of Germany's animal breath," and described Germans as "two-legged animals who have mastered the technique of war" — "ersatz men" who ought to be annihilated. "The Germans are not human beings," Ehrenburg wrote. "... If you kill one German, kill another — there is nothing more amusing for us than a heap of German corpses."

This wasn't idle talk. The Wehrmacht had taken the lives of 23 million Soviet citizens, roughly half of them civilians. When the tide of the war finally turned, a torrent of Russian forces poured into Germany from the east, and their inexorable advance became an orgy of rape and murder. "They were certainly egged on by Ehrenburg and other Soviet propagandists ..." writes journalist Giles MacDonogh:

East Prussia was the first German region visited by the Red Army ... In the course of a single night the Red Army killed seventy-two women and one man. Most of the women had been raped, of whom the oldest was eighty-four. Some of the victims had been crucified ... A witness who made it to the west talked of a poor village girl who was raped by an entire tank squadron from eight in the evening to nine in the morning. One man was shot and fed to the pigs.

Excerpted from Less Than Human by David Livingstone Smith. Copyright 2011 by the author and reprinted by permission of St. Martin's Press, LLC.


8 Moro Reflex

Parents may not recognize the name of this reflex, but they've seen it. When a child is placed on their back and their arms and legs immediately shoot into the air, that's the Moro reflex in action. In fact, many moms see it happen after they have rocked their babies to sleep and then attempt to place them in a crib. Uncomfortable with the change, the children throw their hands and legs up and usually wake themselves in the process.

The Moro reflex can also occur when a child hears a loud noise they weren't expecting and are startled by it. When a child's head is left unsupported, they will also try to cling with their appendages to make sure they don't fall.

This reflex is useful because it lets a parent know that the child may feel they are being handled in a manner that is too rough. Babies are delicate, and though they are also tough little people, they need head support and for their caregivers to be gentle when moving them around.


Do humans really have a killer instinct or is that just manly fancy?

The author of this essay is a lecturer on the history of science at Harvard University. She wrote Constructing Scientific Psychology: Karl Lashley’s Mind-Brain Debates (1999) and, with John P Jackson, Jr, co-authored Race, Racism, and Science: Social Impact and Interaction (2004).

Horrified by the atrocities of the 20th century, an array of scientists sought to explain why human beings turned to violence. The founder of psychoanalysis Sigmund Freud argued that ‘man is a wolf to man’, driven to hatred, destruction and death. The neuroscientist Paul MacLean maintained that humans’ violent tendencies could be traced to their primitive ‘reptilian brain’. The social psychologist Albert Bandura countered that aggression was not inborn but resulted from imitation and suggestion. Despite the controversy they provoked, such theories often attained the status of conventional wisdom.

What makes claims about human nature become truisms? How do they gain credibility? They might rely on experiments, case studies or observation, but evidence alone is never enough to persuade. Such theories – by virtue of the very fact that they seek to encompass the human – must always go beyond their evidence. They manage to persuade by appealing to common experience and explaining familiar events, by creating a shock of recognition in their audiences, a sudden realisation that ‘this must be true’. They employ characters and a narrative arc, and draw moral lessons. In short: they tell a good story.

In the 1960s, alongside prevailing psychological and neuroscientific theories of human aggression, a new claim appeared, that aggression was a human instinct. Relying on the sciences of evolution and animal behaviour, this ‘instinct theory’ held that human aggression was a legacy of our deep ancestral past and an inbuilt tendency shared with many other animal species. One important novelty of this theory was its assertion that human aggression was not wholly destructive, but had a positive, even constructive side. Its proponents were talented writers who readily adopted literary devices.

Robert Ardrey’s bestseller African Genesis (1961) won a big American audience. A Hollywood scriptwriter turned science writer, Ardrey travelled to South Africa, then a hotspot for the excavation of prehistoric human remains. In Johannesburg, he met Raymond Dart, the discoverer of a 2 million-year-old fossilised skull, which Dart believed to be the most ancient human ancestor ever unearthed. Although this creature walked upright, its braincase was small and distinctly apelike, so Dart named it Australopithecus africanus, the southern ape from Africa.

Dart found that Australopithecus remains were typically surrounded by equally fossilised animal bones, especially the long, heavy leg bones of antelopes evidently hunted for food. But these bones had been shaped and carefully carved. He noticed that they rested comfortably in his own hand. With a shock, he realised that they were weapons. Their double-knobbed ends corresponded perfectly to the holes and dents that Dart observed in other fossilised Australopithecus skulls. Two conclusions seemed inescapable: first, this proto-human ancestor was not simply a hunter; he was also a killer of his own kind. Second, the wielding of bone weapons was not solely a destructive act; rather, it had far-reaching consequences for human evolution. Freed from their role in locomotion, forelimbs became available for finer manipulations, which then drove the enlargement of the human brain. Picking up a weapon, Dart theorised, was the thing that triggered human advancement.

In Ardrey’s retelling, Dart’s hypothesis became even more dramatic. The ancient African savannah was home also to Australopithecus robustus, a vegetarian, unarmed cousin of africanus – and his victim. In Ardrey’s account, the lithe and ruthless africanus, brandishing bone weapons, had exterminated his competitor, an ancient conflict that Ardrey couldn’t resist comparing to the Biblical murder of Abel by his brother Cain. The weapon had propelled africanus toward full humanity while robustus slouched toward extinction. Human beings were, quite literally, Cain’s children.

Thanks to Ardrey’s embroidered telling, Dart’s theory inspired perhaps the most famous scene in cinematic history. In the opening sequence of 2001: A Space Odyssey (1968), the leader of a band of ape-men smashes the remains of his defeated antagonists with a crude weapon fashioned out of bone. The victors are carnivorous and armed; the losers, gentle and defenceless. At the end of the sequence, the leader tosses his bone weapon into the air, where it is transformed into a spaceship gliding silently through darkness. Arthur C Clarke, the scriptwriter for Stanley Kubrick’s film, had read Ardrey’s book, and the scene echoed Dart’s claim: human ingenuity begins in violence.

Ardrey was disturbed by the image he had conjured. What could be more frightening than man the irascible ape, with a penchant for violence inherited from his ancestors in his heart and, in his hand, weapons much more powerful than antelope bones? What would prevent this evolved australopithecine from detonating an atomic bomb?

In African Genesis, Ardrey turned to a different branch of science – ethology, the study of animal behaviour in the wild – for an answer. The Austrian ornithologist Konrad Lorenz developed the foundations of ethology by sharing his home with wild animals, mainly birds of many different species. By living with animals, Lorenz revealed some of the mysteries of animal instinct, including the phenomenon of imprinting, in which a baby bird follows the first parent-figure it sees after birth. In popular books in the 1950s, Lorenz enraptured war-weary audiences worldwide with tales of his life with jackdaws, geese and fish, presenting himself as a scientific King Solomon, the Biblical hero whose magic ring granted him the power to talk with the animals.

Through theories about human nature, readers made sense of race riots and assassinations, the Vietnam War and the threat of nuclear annihilation

By the 1960s, Lorenz had begun to notice a curious feature of the aggression that his animals directed at members of their own species. Unlike predator-prey relationships, these intraspecies encounters rarely ended in killing. Instead, the aggressor animals diverted their violent impulses into harmless or even productive channels. Two rival greylag ganders, spoiling for a fight, cackled and threatened each other, but never physically clashed. Their aggression thus discharged in these playacting rituals, each gander returned to his mate in triumph. Lorenz observed that not only was outright violence avoided, but the social bond between each gander and his own family was actually strengthened. Far from a drive purely toward destruction and death, aggression redirected against an outsider engendered the ties of affection and love among the in-group.

Lorenz’s ethology showed that aggression, when properly managed, had positive consequences. Ardrey realised that the answer to the problem of human aggression was not to try to eliminate it – an impossible task, since Dart had demonstrated that it was ingrained in our nature – but to acknowledge aggression as innate and ineradicable, and then channel it productively. In his book On Aggression (1966), Lorenz made his own suggestions for possible outlets, including the space race.

It would be difficult to overstate the popularity in the 1960s and ’70s of Lorenz’s and Ardrey’s hypothesis about human nature. In the United States, their books became bestsellers. Through their theories about human nature, readers made sense of race riots and assassinations, the Vietnam War and the threat of nuclear annihilation. Their warning – that humans must accommodate their aggression instinct and re-channel it, before it was too late – was cited by US senators and cabinet secretaries. The message made such a lasting impact that even in the 1980s, UNESCO found it necessary to endorse an official statement that biology didn’t condemn humans to violence.

How did the killer-instinct idea achieve such cultural power? Because it came embedded in story. Like the greatest fictional works, Lorenz’s and Ardrey’s books drew on an ancient motif: that man’s fatal flaw was also his greatest strength, deprived of which he would cease to be human. Their deft use of character, plot and scene-setting, their invocation of myth, their summing up in a moral that readers could apply to themselves, drove the theories of Lorenz and Ardrey to conventional wisdom status.

The sciences on which they built their theories might have been superseded. But today’s sciences of human nature – sociobiology and evolutionary psychology – have adopted the claim for an evolved predisposition for aggression. The 1960s bestsellers ushered in a genre of popular science that still depends on speculative reconstructions of human prehistory. It also still draws comparisons between the behaviour and emotions of humans and animals. The grudging compliment we pay a powerful man – ‘he’s an alpha male’ – is one hint of the genre. But we ought to be careful about what we believe. Theories of human nature have important consequences – what we think we are shapes how we act. We believe in such theories not because they are true, but because we are persuaded that they are true. The history of the claim for a killer instinct in humans encourages us to think of the ways in which scientists argue and try to persuade. Storytelling, in this view, is a crucial element of both the science and its public presentation.


Do humans really have a killer instinct or is that just manly fancy?

Nadine Weidman is a lecturer on the history of science at Harvard University. She is the author of Constructing Scientific Psychology: Karl Lashley’s Mind-Brain Debates (1999) and Race, Racism, and Science: Social Impact and Interaction (2004), co-authored with John P Jackson, Jr.

Horrified by the atrocities of the 20th century, an array of scientists sought to explain why human beings turned to violence. The founder of psychoanalysis, Sigmund Freud, argued that ‘man is a wolf to man’, driven to hatred, destruction and death. The neuroscientist Paul MacLean maintained that humans’ violent tendencies could be traced to their primitive ‘reptilian brain’. The social psychologist Albert Bandura countered that aggression was not inborn but resulted from imitation and suggestion. Despite the controversy they provoked, such theories often attained the status of conventional wisdom.

What makes claims about human nature become truisms? How do they gain credibility? They might rely on experiments, case studies or observation, but evidence alone is never enough to persuade. Such theories – by virtue of the very fact that they seek to encompass the human – must always go beyond their evidence. They manage to persuade by appealing to common experience and explaining familiar events, by creating a shock of recognition in their audiences, a sudden realisation that ‘this must be true’. They employ characters and a narrative arc, and draw moral lessons. In short: they tell a good story.

In the 1960s, alongside prevailing psychological and neuroscientific theories of human aggression, a new claim appeared, that aggression was a human instinct. Relying on the sciences of evolution and animal behaviour, this ‘instinct theory’ held that human aggression was a legacy of our deep ancestral past and an inbuilt tendency shared with many other animal species. One important novelty of this theory was its assertion that human aggression was not wholly destructive, but had a positive, even constructive side. Its proponents were talented writers who readily adopted literary devices.

Robert Ardrey’s bestseller African Genesis (1961) won a big American audience. A Hollywood scriptwriter turned science writer, Ardrey travelled to South Africa, then a hotspot for the excavation of prehistoric human remains. In Johannesburg, he met Raymond Dart, the discoverer of a 2 million-year-old fossilised skull, which Dart believed to be the most ancient human ancestor ever unearthed. Although this creature walked upright, its braincase was small and distinctly apelike, so Dart named it Australopithecus africanus, the southern ape from Africa.

Dart found that Australopithecus remains were typically surrounded by equally fossilised animal bones, especially the long heavy leg bones of antelopes evidently hunted for food. But these bones had been shaped and carefully carved. He noticed that they rested comfortably in his own hand. With a shock, he realised that they were weapons. Their double-knobbed ends corresponded perfectly to the holes and dents that Dart observed in other fossilised Australopithecus skulls. Two conclusions seemed inescapable: first, this proto-human ancestor was not simply a hunter; he was also a killer of his own kind. Second, the wielding of bone weapons was not solely a destructive act; rather, it had far-reaching consequences for human evolution. Freed from their role in locomotion, forelimbs became available for finer manipulations, which then drove the enlargement of the human brain. Picking up a weapon, Dart theorised, was what triggered human advancement.

In Ardrey’s retelling, Dart’s hypothesis became even more dramatic. The ancient African savannah was home also to Australopithecus robustus, a vegetarian, unarmed cousin of africanus – and his victim. In Ardrey’s account, the lithe and ruthless africanus, brandishing bone weapons, had exterminated his competitor, an ancient conflict that Ardrey couldn’t resist comparing to the Biblical murder of Abel by his brother Cain. The weapon had propelled africanus toward full humanity while robustus slouched toward extinction. Human beings were, quite literally, Cain’s children.

Thanks to Ardrey’s embroidered telling, Dart’s theory inspired perhaps the most famous scene in cinematic history. In the opening sequence of 2001: A Space Odyssey (1968), the leader of a band of ape-men smashes the remains of his defeated antagonists with a crude weapon fashioned out of bone. The victors are carnivorous and armed; the losers, gentle and defenceless. At the end of the sequence, the leader tosses his bone weapon into the air, where it is transformed into a spaceship gliding silently through darkness. Arthur C Clarke, the scriptwriter for Stanley Kubrick’s film, had read Ardrey’s book, and the scene echoed Dart’s claim: human ingenuity begins in violence.

Ardrey was disturbed by the image he had conjured. What could be more frightening than man the irascible ape, with a penchant for violence inherited from his ancestors in his heart and, in his hand, weapons much more powerful than antelope bones? What would prevent this evolved australopithecine from detonating an atomic bomb?

In African Genesis, Ardrey turned to a different branch of science – ethology, the study of animal behaviour in the wild – for an answer. The Austrian ornithologist Konrad Lorenz developed the foundations of ethology by sharing his home with wild animals, mainly birds of many different species. By living with animals, Lorenz revealed some of the mysteries of animal instinct, including the phenomenon of imprinting, in which a baby bird follows the first parent-figure it sees after birth. In popular books in the 1950s, Lorenz enraptured war-weary audiences worldwide with tales of his life with jackdaws, geese and fish, presenting himself as a scientific King Solomon, the Biblical hero whose magic ring granted him the power to talk with the animals.

By the 1960s, Lorenz had begun to notice a curious feature of the aggression that his animals directed at members of their own species. Unlike predator-prey relationships, these intraspecies encounters rarely ended in killing. Instead, the aggressor animals diverted their violent impulses into harmless or even productive channels. Two rival greylag ganders, spoiling for a fight, cackled and threatened each other, but never physically clashed. Their aggression thus discharged in these playacting rituals, each gander returned to his mate in triumph. Lorenz observed that not only was outright violence avoided, but the social bond between each gander and his own family was actually strengthened. Far from a drive purely toward destruction and death, aggression redirected against an outsider engendered the ties of affection and love among the in-group.

Lorenz’s ethology showed that aggression, when properly managed, had positive consequences. Ardrey realised that the answer to the problem of human aggression was not to try to eliminate it – an impossible task, since Dart had demonstrated that it was ingrained in our nature – but to acknowledge aggression as innate and ineradicable, and then channel it productively. In his book On Aggression (1966), Lorenz made his own suggestions for possible outlets, including the space race.

It would be difficult to overstate the popularity in the 1960s and ’70s of Lorenz’s and Ardrey’s hypothesis about human nature. In the United States, their books became bestsellers. Through their theories about human nature, readers made sense of race riots and assassinations, the Vietnam War and the threat of nuclear annihilation. Their warning – that humans must accommodate their aggression instinct and re-channel it, before it was too late – was cited by US senators and cabinet secretaries. The message made such a lasting impact that even in the 1980s, UNESCO found it necessary to endorse an official statement that biology didn’t condemn humans to violence.

How did the killer-instinct idea achieve such cultural power? Because it came embedded in story. Like the greatest fictional works, Lorenz’s and Ardrey’s books drew on an ancient motif: that man’s fatal flaw was also his greatest strength, deprived of which he would cease to be human. Their deft use of character, plot and scene-setting, their invocation of myth, and their summing up in a moral that readers could apply to themselves drove the theories of Lorenz and Ardrey to the status of conventional wisdom.

The sciences on which they built their theories may have been superseded, but today’s sciences of human nature – sociobiology and evolutionary psychology – have adopted the claim of an evolved predisposition to aggression. The 1960s bestsellers ushered in a genre of popular science that still depends on speculative reconstructions of human prehistory, and that still draws comparisons between the behaviour and emotions of humans and animals. The grudging compliment we pay a powerful man – ‘he’s an alpha male’ – is one trace of the genre’s influence. But we ought to be careful about what we believe. Theories of human nature have important consequences: what we think we are shapes how we act. We believe in such theories not because they are true, but because we are persuaded that they are true. The history of the claim for a killer instinct in humans encourages us to think about the ways in which scientists argue and try to persuade. Storytelling, in this view, is a crucial element of both the science and its public presentation.