Saturday, December 31, 2011

A few of my themes for 2012

Yakuzas (Japanese mafia). The largest Yakuza syndicate is over 70% Burakumin. Source

Here are a few themes I wish to write about during 2012:

Archaic admixture: A wild goose chase?

With the discovery that Europeans and Asians are 1 to 4% Neanderthal, there has been a rush to learn more. What genes are involved? Does this admixture explain why Eurasians are, well, hot stuff?

A few words of caution. The estimate of 1 to 4% is based on comparison of the Neanderthal genome with the modern Eurasian genome and the modern sub-Saharan African genome (Green et al., 2010). Neanderthals appear to be genetically closer to modern Eurasians than they are to modern sub-Saharan Africans. This increased closeness is therefore a measure of Neanderthal admixture in modern Eurasians. Right?

Well, not necessarily. It may also be a measure of non-Neanderthal admixture in modern sub-Saharan Africans. We now know that about 2% of the modern sub-Saharan African genome comes from a population that split from ancestral modern humans some 700,000 years ago (Hammer et al., 2011). Another 13% comes from archaics who were much closer to modern humans and probably related to the Skhul-Qafzeh hominins of the Middle East (Watson et al., 1997).

The figure of 1 to 4% Neanderthal admixture in modern Eurasians will thus have to be revised downward, just as our estimate of archaic admixture in modern sub-Saharans will have to be revised upward. This point has been made by Dienekes:

It is no longer tenable to propose that Eurasians are shifted towards Neandertals only because of Neandertal admixture: in fact some of the shift may be due to Africans being shifted away from Neandertals because of admixture with archaic African hominins.
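Dienekes’ point can be illustrated with a toy version of the D-statistic (the “ABBA-BABA” test) that underlies estimates like those of Green et al. (2010). This is a deliberately simplified sketch using made-up site counts, not real data; the point is only that the statistic measures relative closeness, not which population did the admixing:

```python
# Toy ABBA-BABA (D-statistic) sketch -- illustrative only, not real data.
# Each site is coded by the allele carried by (African, Eurasian, Neanderthal,
# Chimp); the chimp defines the ancestral state "A", and "B" is derived.

def d_statistic(site_patterns):
    """D = (nABBA - nBABA) / (nABBA + nBABA).
    ABBA sites: the Eurasian shares the derived allele with the Neanderthal.
    BABA sites: the African shares the derived allele with the Neanderthal.
    D > 0 means Eurasians look closer to Neanderthals than Africans do."""
    abba = site_patterns.count("ABBA")
    baba = site_patterns.count("BABA")
    return (abba - baba) / (abba + baba)

# Scenario 1: an excess of ABBA sites from Neanderthal gene flow into Eurasians.
sites = ["ABBA"] * 60 + ["BABA"] * 40
print(d_statistic(sites))  # 0.2: Eurasians shifted toward Neanderthals

# Scenario 2: the same D arises if archaic African admixture strips shared
# derived alleles from the African side (fewer BABA sites). A positive D does
# not by itself say which population received the archaic genes.
sites2 = ["ABBA"] * 48 + ["BABA"] * 32
print(d_statistic(sites2))  # also 0.2
```

The two scenarios yield the same statistic, which is exactly why the 1 to 4% figure is sensitive to assumptions about the African reference population.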

However great or small Neanderthal admixture may be, can it explain why modern Eurasians are “hot stuff”? Doubtful. It’s true that both populations had to adapt to arctic environments, but they did so in very different ways. Neanderthals adapted to the cold through their morphology: thick body fat and dense fur. Modern Eurasians adapted by making tailored clothing and building insulated shelters.

Please don’t get me wrong. If you’re doing research on Neanderthal admixture, I wish you the best of luck. Perhaps you’ll find a thing or two. But don’t get your hopes up.

Whither North Korea?

Whenever an authoritarian leader dies, the door is opened to change, often radical change. The new leader is less able to command authority, and the chain of command itself is called into question at all levels. Pent-up pressure for change can finally be released. This was the case after the deaths of Franco in 1975, Mao Zedong in 1976, and Brezhnev in 1982. Here in my home province, it was the death of Duplessis in 1959 that ushered in the end of Quebec as a conservative Catholic society.

Will we see the same in North Korea? Will the death of Kim Jong Il lead to liberalization and, ultimately, reunification with South Korea?

Yes and no. North Korea will pursue its transition to a market economy. And this process is already making the population more independent-minded. As an observer in Pyongyang recently noted:

The women who daily set out their wares on the streets do so in defiance of police prohibitions. This is one of the clearest indications of the erosion of the regime’s control over its people. (The author observed many others, such as the men who openly smoked under “No Smoking” signs, the peasants who simply ignored the traffic police and trundled their carts across intersections, and the people who—under the very eyes of the police—sat on the escalators in the Metro despite stern signs prohibiting this.) (Everard, 2011)

Private markets are also creating new spaces of social interaction that are independent of the State, and this trend will be assisted by the spread of cellphones and the strengthening of economic and social relations with China—itself a much more liberal society.

Finally, North Korea will drop all pretence of international socialism. This might seem to be just a matter of words—North Korea has long been a de facto nationalist regime—but semantics are important in the way people construct their perceived reality.

But, no, reunification is not in the cards, if only because the Chinese are adamantly opposed. There was a time in the 1990s when they were open to this idea. With reunification, U.S. troops would leave and Korea would become a more neutral country. It is now clear, however, that reunification has produced no such outcome in Germany. The Cold War may be over, but the U.S. still wants to have troops in mainland Eurasia, apparently as part of its geopolitical strategy.

So for now at least the Chinese will try to strengthen North Korea as a friendly buffer state. To this end, they will prod Pyongyang to pursue economic reforms and shed its pariah image, particularly by dismantling its nuclear program. In exchange, the Chinese may offer the protection of their own nuclear umbrella, as well as full membership in the Shanghai Cooperation Organisation (SCO).

It’s also unlikely that liberalization will lead to North Korea becoming more Westernized and Americanized. By “liberalization,” I mean the right of people to live their lives according to their own values—and not those imposed by the State or by a globalist elite. Hence the Arab Spring has brought the triumph of Islamist political parties that promise stricter adherence to Shariah law. This has surprised Western observers, but it should not have.

The Burakumin

Although Japanese society is often seen as being very homogeneous, it does have a distinct class called the Burakumin who were officially outcastes until 1871 and are still widely looked down upon. They seem to descend from Japanese who held stigmatized occupations that involved the taking of life or contact with dead bodies, like butchery, leather making, and preparation of corpses for burial. Today, despite many remedial efforts, an academic gap persists between the Burakumin and other Japanese:

According to research on Buraku pupil/students' scholastic ability conducted in the post-war period, nearly 1 standard deviation difference in achievement scores was found between Burakumin and non-Burakumin pupil/students regardless of when and where the research was conducted. This meta-analysis on Buraku pupil/students' scholastic ability leads us to conclude that the relative difference in scholastic achievements between the Burakumin and non-Burakumin pupil/student has been maintained to a considerable degree through the post-war period. (BLHRRI, 1997)

In 2012, I will try to shed new light on this question by applying Greg Clark’s model. Clark (2007) argued that the English gene pool in 1800 was quite different from what it had been only a few centuries earlier. Over the years, the English middle class had expanded demographically and, through downward mobility, had largely replaced the English lower classes. I will suggest that Japan followed a similar evolution but with an interesting twist. As outcastes with a monopoly on certain occupations, the Burakumin were spared this demographic replacement. They may thus represent the Japanese population as it existed several centuries ago.


BLHRRI (1997). Practice of Dowa Education Today, Buraku Liberation and Human Rights Institute.

Clark, G. (2007). A Farewell to Alms. A Brief Economic History of the World, Princeton University Press, Princeton and Oxford.

Dienekes. (2011). Neanderthal admixture. Why I remain skeptical, December 19, 2011.

Everard, J. (2011). The markets of Pyongyang, Korea Economic Institute, Academic Paper Series, 6(1), 1-7.

Green, R.E., J. Krause, A.W. Briggs, T. Maricic, U. Stenzel, M. Kircher, et al. (2010). A draft sequence of the Neandertal genome, Science, 328, 710-722.

Hammer, M.F., A.E. Woerner, F.L. Mendez, J.C. Watkins, and J.D. Wall. (2011). Genetic evidence for archaic admixture in Africa, Proceedings of the National Academy of Sciences (USA), early edition.

Watson, E., P. Forster, M. Richards, and H-J. Bandelt. (1997). Mitochondrial footprints of human expansions in Africa, American Journal of Human Genetics, 61, 691-704.

Saturday, December 17, 2011

2012. A year of turbulence?

Child making Nike shoes (source). Western business now has access to labor under conditions not seen since the days of Charles Dickens.

My predictions from last year:

It won’t be such a bad year. Stock markets will reach record highs and pundits will say we’ve entered a sustained boom. For many people, life will never again be so good as it will be this year.

The main worry will be price rises for many commodities. With a return to even modest rates of economic growth, demand will outstrip supply in several areas. Talk of “peak oil” will be joined by concerns over “peak food” and “peak water.” Serious water shortages will hit the American southwest and southeast.

Well, the stock markets have not reached record highs. And there have been no serious water shortages, largely because of an unusually wet winter.

But food prices have been rising ominously. It was this factor that triggered the “Arab Spring” and is now fueling discontent in Russia. Also, for a lot of people—especially our elites—life has never been so good. We are into an economic recovery, of sorts.

How long will the recovery last? Perhaps another twenty years if it were a normal one. But it isn’t. The last recession was not allowed to finish its job of purging the economy. A lot of corporate flab was spared the axe, and dysfunctional attitudes toward debt are still common, particularly among consumers. In addition, the recovery is heavily dependent on government spending and consumer debt, and there is no indication that the economy is ready to go “cold turkey.” We may need more and more of the same stimulus just to maintain sluggish growth.

This debt crisis comes on top of a looming commodity crisis. Prices for fuel, food, housing, and other basics are being pushed up by the new buying power of Asian consumers and by immigration to North America and Western Europe. Can supply be increased to meet the rising demand? Yes, of course. Don’t worry. Everything will be fine—say the business interests that profit from this spike in demand.

Finally, we are facing a globalization crisis. On the one hand, jobs are being outsourced to lower-wage countries. On the other, lower-wage labor is being insourced. The result? A steady downward leveling of incomes throughout the Western World, except for the very rich. The latter now have access to labor under conditions not seen since the days of Charles Dickens.

The current recovery might nonetheless go on indefinitely. The Japanese, for instance, have kept their economy afloat for the past two decades by piling up massive debt. But they are just one society, and it’s one with a strong sense of social cohesion. In contrast, the Western World is very fractious, as seen by the bickering within the European Union. These social and political divisions will probably abort the recovery long before the possibilities for debt financing and money printing have been completely exhausted. And so much the better.

If I have to make a prediction for 2012, it is that the recovery will continue—on life support, so to speak—but will run into increasing social turbulence. The “Arab Spring” will start to play out in the Western World as the elites begin to lose their legitimacy. This process is already under way in Europe, and we may see a domino effect, where change in one country facilitates change in others.

My research interests

There have been some developments in my areas of research interest.

Skin color and face recognition

Natural selection tends to hardwire recognition of objects that regularly appear in our visual environment. One such object is the human face. As shown by Zhu et al. (2009) through a twin study, the ability to recognize faces is innate and not learned. This heritability is further shown by the two extremes of prosopagnosics and “super-recognizers.” The former cannot recognize faces better than any other object, whereas the latter have exceptional face recognition ability (Russell, Chatterjee, & Nakayama, in press; Russell, Duchaine, & Nakayama, 2009).

The American psychologist Richard Russell has recently shown that face recognition relies about equally on face shape and facial skin color:

Shape and pigmentation cues were used in roughly equal measure by people with very good and very bad face recognition ability. […] People who are good at recognizing faces are good at using both shape and pigmentation cues to do so; people who are bad at recognizing faces are bad at using both shape and pigmentation cues to do so (Russell, Chatterjee, & Nakayama, in press).

This mental processing of skin color seems to take place in a lower-level module whose output then feeds into the face-recognition module.

Neural circuits related to face recognition ability must use both shape and pigmentation information about equally. This supports the idea that these circuits represent facial appearance by pooling lower-level patterns of shape and reflectance into combinations that include both types of information (Jiang, et al., 2006). Further, this is consistent with the notion that the location of the Fusiform Face Area is midway along the shape–reflectance gradient in ventral cortex (Cant & Goodale, 2011) because the region integrates these two kinds of cues to visually process faces. (Russell, Chatterjee, & Nakayama, in press)

Taschereau-Dumouchel et al. (2010) have likewise concluded that face shape and “skin properties” are the main cues for face recognition, even more so than the relative distances of facial features from each other.

Why does skin color matter so much for face recognition? Didn’t our ancestors evolve in a context where people interacted only with their own kind or with neighboring groups of similar appearance? Yes, but there was another source of variation in skin color—gender and age. Women and young infants are paler, having less melanin and hemoglobin in their skin. Men, in contrast, are ruddier and browner.

We are thus innately sensitive to differences in skin color, but this sensitivity didn’t evolve in response to ethnic differences. It evolved in response to much smaller gender and age differences (Frost, 2010; Frost, 2011; van den Berghe & Frost, 1986).

At present, two research teams have the means and motivation to pursue this line of research: Richard Russell’s team at Gettysburg College and Frédéric Gosselin’s team at the Université de Montréal. We’ll probably see more findings by both teams over the next year.

The puzzle of European hair and eye colors

European populations have an unusually broad palette of hair and eye colors. This diversity doesn’t have a common genetic cause: it is due to a proliferation of alleles at two separate genes, MC1R for hair color and OCA2 for eye color. This proliferation did not come about through relaxation of selection for dark skin as ancestral Europeans moved into higher latitudes. Most of the new alleles have little or no effect on skin color, and in any case the timeframe is too narrow for this evolutionary scenario.

A likelier cause is sexual selection, which favors bright or novel colors that catch the attention of potential mates. If sexual selection is strong enough, a polymorphism of color variants may develop. A new color appears through mutation and, depending on its brightness or novelty, steadily rises in frequency until it is as common as the established color. Over time, these variants will increase in number. Humans have the potential for this kind of frequency-dependent sexual selection, e.g., darker-haired women are sexually preferred to the extent that they are less common. Such selection is consistent with the high number of alleles for hair color and eye color in European populations, the high ratio of nonsynonymous to synonymous variants among these alleles, and the relatively short time over which this hair and eye color diversity developed.
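The dynamic described above can be sketched in a toy simulation of negative frequency-dependent selection. The fitness function and parameters below are illustrative assumptions, not an empirical model: each variant’s mating success declines with its own frequency, so a rare mutant rises until the variants are equally common.

```python
# Toy model of negative frequency-dependent sexual selection on color variants.
# The fitness function (1 - s * frequency) and all parameters are illustrative
# assumptions, not empirical estimates.

def next_gen(freqs, s=0.5):
    """Advance one generation. Each variant's mating success falls with its
    own frequency, so rarer variants gain ground."""
    fitness = {v: 1.0 - s * f for v, f in freqs.items()}
    mean_w = sum(freqs[v] * fitness[v] for v in freqs)
    return {v: freqs[v] * fitness[v] / mean_w for v in freqs}

# A new color appears by mutation at 1% against an established color at 99%.
freqs = {"established": 0.99, "novel": 0.01}
for _ in range(200):
    freqs = next_gen(freqs)

# The mutant rises until the two variants are equally common.
print(round(freqs["novel"], 2))  # 0.5
```

The stable polymorphism at equal frequencies is the signature outcome: repeat the process with each new mutation and the number of co-existing color variants grows over time.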

Sexual selection occurs when too many of one sex must compete for too few of the other. Among early modern humans, such imbalances resulted from (1) polygyny (to the degree that women could provide for themselves and their children without male assistance) and/or (2) higher mortality among men than among women (to the degree that men covered longer distances while hunting or changing camp). Wherever the polygyny rate was low and male mortality high, the result was strong sexual selection of women. Such selection was particularly strong on continental steppe-tundra, where men had to provide almost all of the food by hunting migratory game animals over long distances. Although this type of environment is now fragmentary, until 10,000 years ago it covered a much larger territory that matches the current range of European hair and eye color diversity (Frost, 2006).

This hypothesis would predict some degree of sex linkage among European alleles for hair and eye color, since the sexual selection was acting on women. Over time, there would have arisen alleles that produce non-black hair and non-brown eyes more so in women than in men, and these alleles would have gradually replaced their non-sex-linked counterparts. This process should not have gone very far, though, because of the narrow timeframe.

This prediction is borne out by a twin study on the genetics of hair color. Shekar et al. (2008) found that the women had lighter hair on average than the men and a higher proportion of red hair. Hair color was also more diverse in the women than in the men:

Females had, on average, lighter hair, on the A650t scale, than males.

[…] The correlation within brother–sister twin pairs was significantly lower than the correlation within brother–brother and sister–sister dizygotic twin pairs (P ≈ 0.01). This suggests that there may be qualitative differences in the genetic influences on the A650t index between sexes.

[…] Additive genetic influences explain 55% and 58% of variation in the A650t index within females and males, respectively. The additive genetic influence on the A650t index in males was, predominantly, qualitatively different from those that influence the index in females.

[…] Females had, on average, redder hair (P < 0.00001) and greater variation in R index scores (P ≈ 0.001) than males.

The sexual selection hypothesis would also predict that this evolutionary change took place over a relatively short time, specifically during the last ice age (25,000 to 10,000 years ago), well after the entry of modern humans into Europe some 35,000 to 40,000 years ago. Is this prediction supported by evidence?

At present, no one is trying to date the diversification of European hair and eye colors. The closest research effort would be the work by Norton and Hammer (2007) showing that Europeans became white-skinned long after their entry into Europe. Heather Norton is now trying to get a firm date on this phenotypic change.


Dupuis-Roy, N., I. Fortin, D. Fiset, and F. Gosselin. (2009). Uncovering gender discrimination cues in a realistic setting, Journal of Vision, 9(2):10, 1-8. doi:10.1167/9.2.10.

Frost, P. (2011). Hue and luminosity of human skin: a visual cue for gender recognition and other mental tasks, Human Ethology Bulletin, 26(2), 25-34.

Frost, P. (2010). Femmes claires, hommes foncés. Les racines oubliées du colorisme, Quebec City: Presses de l’Université Laval.

Frost, P. (2006). European hair and eye color: A case of frequency-dependent sexual selection? Evolution and Human Behavior, 27, 85-103.

Norton, H.L. & M.F. Hammer (2007) Sequence variation in the pigmentation candidate gene SLC24A5 and evidence for independent evolution of light skin in European and East Asian populations, Program of the 77th Annual Meeting of the American Association of Physical Anthropologists, p. 179.

Russell, R., G. Chatterjee, and K. Nakayama. (In press). Developmental prosopagnosia and super-recognition: no special role for surface reflectance processing, Neuropsychologia.

Russell, R., B. Duchaine, and K. Nakayama. (2009). Super-recognizers: People with extraordinary face recognition ability. Psychonomic Bulletin & Review, 16(2), 252-257.

Shekar, S.N., D.L. Duffy, T. Frudakis, G.W. Montgomery, M.R. James, R.A. Sturm, & N.G. Martin (2008). Spectrophotometric methods for quantifying pigmentation in human hair—Influence of MC1R genotype and environment, Photochemistry and Photobiology, 84, 719–726.

Taschereau-Dumouchel, V., B. Rossion, P.G. Schyns, and F. Gosselin. (2010). Interattribute distances do not represent the identity of real-world faces, Frontiers in Psychology, 1, 159.

van den Berghe, P. L. & P. Frost. (1986). Skin color preference, sexual dimorphism, and sexual selection: A case of gene-culture co-evolution? Ethnic and Racial Studies, 9, 87-113.

Zhu, Q., Y. Song, S. Hu, X. Li, M. Tian, Z. Zhen, Q. Dong, N. Kanwisher, and J. Liu. (2009). Heritability of the specific cognitive ability of face perception, Current Biology, 20, 137-142.

Saturday, December 10, 2011

Suicide and Inuit youth

Canadian suicide rates (per 100,000 people): Inuit, First Nations, all Canadians. Source

From Alaska to Greenland, young Inuit have unusually high rates of suicide, attempted suicide, and suicidal ideation. According to a 1992 survey of Inuit 15 to 24 years old from northern Quebec, 28% of the males and 25% of the females had attempted suicide (Kirmayer et al., 1998). Before the 1970s, suicide was rare among Inuit youth. Today, it has reached epidemic proportions.

Public authorities have responded largely by targeting those factors, like alcohol and drug abuse, that make it easier to go from thinking about suicide to actually doing it. While these efforts are having some success, there still remains the problem of suicidal ideation.

Why do so many Inuit youth contemplate suicide? Kirmayer et al. (1998) point to a prevailing sense of uselessness:

Inuit youth are confronted with the values of an individualistic, consumption-oriented society through mass media but have few opportunities to achieve the life-style portrayed. The result may be a sense of frustration, limited options, and difficulty imagining an optimistic future. This may extend to an impaired sense of self-continuity that contributes to attempted suicide.

Dufour (1994) argues that Inuit society has a long tradition of people ending their lives when they feel they have become useless. In the past, however, this kind of suicide involved only the elderly:

Suicide in early Inuit society was viewed positively when the individual had become a burden for the group. “Senilicide” in particular was deemed to be acceptable and appropriate. Its pattern: a usually elderly person motivated by illness, helplessness, bereavement, dependence on the group, famine, or resource shortage who would decide after consulting family members who sometimes could be called upon to assist. In contemporary Inuit society, the elderly no longer commit suicide. The young people do.

TV and video present young Inuit with an affluent lifestyle that is unattainable for all but a few. Meanwhile, school presents learning goals and standards of behavior that are likewise difficult to attain, especially for boys. By postponing adulthood in order to extend the learning process, school also has the unintended effect of humiliating Inuit youth. In another age, they were treated as young adults, often being parents in their own right. Today, they are just “children.”

Many young Inuit thus perceive themselves as being socially useless. And this self-perception is triggering suicidal ideation.

Such ideation may seem irrational from an individualistic Western standpoint. You cannot make your life better by ending it. Yet it is less irrational from the standpoint of one’s kin group, especially in a context of limited resources. Such was the case with elderly Inuit who would choose death so as not to burden the younger members of their band, such people being close relatives for the most part.

In such a context, natural selection—specifically kin selection—might have favored suicide as a response to perceived uselessness. Such selection is possible. Suicidal ideation is significantly heritable and seems to be inherited as a specific behavioral response:

Suicidal behavior is highly familial, and on the basis of twin and adoption studies, heritable as well. Both completed and attempted suicide form part of the clinical phenotype that is familially transmitted, as rates of suicide attempt are elevated in the family members of suicide completers, and completion rates are elevated in the family members of attempters. A family history of suicidal behavior is associated with suicidal behavior in the proband, even after adjusting for presence of psychiatric disorders in the proband and family, indicating transmission of attempt that is distinct from family transmission of psychiatric disorder. (Brent & Mann, 2005)

According to a twin study using American subjects, suicidal ideation has 36% heritability and suicide attempt 17% heritability (Fu et al., 2002).
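Heritability figures like these are typically derived from twin correlations. Falconer’s formula gives the classic rough estimate: h² = 2(r_MZ − r_DZ), the doubled gap between identical-twin and fraternal-twin concordance. The sketch below uses hypothetical correlations chosen only so that the answer lands near the 36% figure; they are not the actual values from Fu et al. (2002):

```python
# Falconer's formula: a rough heritability estimate from twin correlations.
# The correlations below are hypothetical, picked only so h2 comes out near
# the 36% reported for suicidal ideation -- they are NOT from Fu et al. (2002).

def falconer(r_mz, r_dz):
    """Return (heritability, shared-environment) estimates from the
    identical-twin correlation r_mz and fraternal-twin correlation r_dz.
    h2 = 2 * (r_mz - r_dz); shared environment c2 = 2 * r_dz - r_mz."""
    h2 = 2.0 * (r_mz - r_dz)
    c2 = 2.0 * r_dz - r_mz
    return h2, c2

h2, c2 = falconer(r_mz=0.48, r_dz=0.30)
print(f"h2 = {h2:.2f}, c2 = {c2:.2f}")  # h2 = 0.36, c2 = 0.12
```

The logic: identical twins share all their genes, fraternal twins on average half, so any excess similarity of identical twins is attributed to the missing half of the genes.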

De Catanzaro (1991, 1995) has argued that suicidal ideation has evolved as a response to a situation where an individual has become a burden to immediate kin. In studies of the general public and high-risk groups (elderly and psychiatric patients), he found that the strongest correlate of suicidal ideation was burdensomeness to family and, for males, lack of heterosexual activity. As Buss (1999, p. 94) concludes: “If a person is a burden to his or her family, for example, then the kin’s reproduction, and hence the person’s own fitness might suffer as a result of his or her survival.”

The threshold for suicidal ideation may be lower in some human populations than in others, depending on one’s risk of becoming a serious burden on kinfolk. This risk is high in Arctic hunting bands because their members are almost entirely close kin and because their nomadic lifestyle limits food storage for lean times. When food is scarce, who eats and who doesn’t? The question is especially difficult because close kin are involved. The easiest solution, in terms of keeping the peace and maintaining group cohesion, is one where the burdensome individual voluntarily bows out.
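The kin-selection argument can be made explicit with Hamilton’s rule: a trait that costs its bearer c in direct fitness can still be favored if the summed benefits to relatives, weighted by relatedness r, exceed that cost (Σ rb > c). A toy calculation with hypothetical numbers, chosen only to illustrate the inequality:

```python
# Toy inclusive-fitness calculation illustrating Hamilton's rule (sum rb > c).
# All numbers are hypothetical and purely illustrative.

def inclusive_fitness_gain(cost, benefits):
    """Net inclusive-fitness change from a self-sacrificial act:
    the sum of relatedness * benefit over relatives, minus the direct cost.
    benefits is a list of (relatedness, benefit) pairs."""
    return sum(r * b for r, b in benefits) - cost

# A burdensome individual with little remaining reproductive prospect
# (small direct cost) whose death frees scarce food for two siblings
# (r = 0.5) and four nephews/nieces (r = 0.25), each gaining a modest
# survival benefit.
gain = inclusive_fitness_gain(
    cost=0.1,
    benefits=[(0.5, 0.3)] * 2 + [(0.25, 0.3)] * 4,
)
print(round(gain, 2))  # 0.5: positive, so such a trait can be favored
```

Note how the sign of the result flips when the direct cost is large (a young, healthy individual) or when the band members are only distant kin, which is why the argument applies specifically to burdensome individuals in bands of close relatives.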

What does all of this mean for young Inuit who are thinking of suicide? Clearly, it is not enough to focus on things that facilitate the transition from suicidal ideation to actual suicide. That approach might work in southern Canada, where suicide tends to result from transient episodes that push people up and over the threshold of suicidal ideation. Among the Inuit, the threshold seems to be lower and the focus should be more on preventing ideation, specifically by giving young Inuit a greater feeling of self-worth and social usefulness.


Brent, D.A. & J.J. Mann. (2005). Family genetic studies, suicide, and suicidal behavior, American Journal of Medical Genetics Part C: Seminars in Medical Genetics, 133C, 13-24.

Buss, D.M. (1999). Evolutionary Psychology. The New Science of the Mind, Boston: Allyn and Bacon.

de Catanzaro, D. (1991). Evolutionary limits to self-preservation, Ethology and Sociobiology, 12, 13-28.

de Catanzaro, D. (1995). Reproductive status, family interactions, and suicidal ideation: Surveys of the general public and high-risk group, Ethology and Sociobiology, 16, 385-394.

Dufour, R. (1994). Pistes de recherche sur les sens du suicide des adolescents inuit, Santé mentale au Québec, 19, 145-162.

Fu, Q., A.C. Heath, K.K. Bucholz, E.C. Nelson, A.L. Glowinski, J. Goldberg, M.J. Lyons, M.T. Tsuang, T. Jacob, M.R. True & S.A. Eisen. (2002). A twin study of genetic and environmental influences on suicidality in men, Psychological Medicine, 32, 11-24.

Kirmayer, L.J., L.J. Boothroyd, S. Hodgins (1998). Attempted Suicide among Inuit youth: Psychosocial correlates and implications for prevention, Canadian Journal of Psychiatry, 43, 816–822.

Saturday, December 3, 2011

Were native Europeans replaced?

Spread of farming in Europe. Cultural diffusion or population replacement? Source

Between 9,000 and 3,000 years ago, farming spread through Europe and replaced hunting, fishing, and gathering. Was this process just a change in lifestyle? Or was it also a population change? Did Middle Eastern farmers replace native Europeans?

For Greg Cochran, the answer is clear:

Increasingly, it looks as if the hunter-gatherers who lived in Europe at the end of the ice age have been largely replaced. Judging from all those U5 mtdna results from ancient skeletons, I’d say that the hunters don’t account for more than 10% of the ancestry of modern Europeans. (Cochran, 2011)

Actually, the U5 haplogroup remained common after the transition to farming. This was the conclusion of a study of 92 Danish human remains that ranged in time from the Mesolithic to the Middle Ages. The study found genetic continuity from late hunter/gatherer/fishers to early farmers:

The extent to which early European farmers were immigrants or descendents of resident hunter-gatherers (replacement vs. cultural diffusion) has been widely debated, and new genetic elements have recently been added. A high frequency of Hg U lineages, especially U5, has been inferred for pre-Neolithic Europeans based on modern mtDNA data, with Hg U5 being fairly specific to Europe. [...] Our study therefore would point to the Early Iron Age and not the Neolithic Funnel Beaker Culture as suggested by Malmstrom et al. (2009), as the time period when the mtDNA haplogroup frequency pattern, which is characteristic to the presently living population of Southern Scandinavia, emerged and remained by and large unaltered by the subsequent effects of genetic drift. (Melchior et al., 2010)

Thus, the sharp genetic divide was not between late hunter/fisher/gatherers and early farmers. It was between the earliest farmers and groups that had been farming for at least a millennium or so. The evidence is more consistent with natural selection than with population replacement.
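Conclusions like this ultimately rest on comparing haplogroup counts across time periods. Below is a minimal sketch of such a comparison, a 2x2 chi-square test on made-up counts (not the Melchior et al. data), to show how continuity in haplogroup frequencies is assessed:

```python
# Sketch of how genetic continuity is assessed from ancient-DNA samples:
# compare haplogroup counts between two periods with a 2x2 chi-square test.
# The counts below are made up for illustration -- NOT Melchior et al. (2010).

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts of U5 vs. other haplogroups in two periods.
hunter_u5, hunter_other = 4, 8      # late hunter/fisher/gatherers
farmer_u5, farmer_other = 5, 12     # early farmers

chi2 = chi_square_2x2(hunter_u5, hunter_other, farmer_u5, farmer_other)
# A statistic this far below the p < 0.05 threshold (~3.84 at 1 df) gives no
# evidence that U5 frequency changed -- consistent with genetic continuity.
print(round(chi2, 3))  # 0.051
```

With ancient-DNA sample sizes this small, only large frequency shifts are detectable, which is one reason the replacement question remains hard to settle.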

But isn’t mtDNA unresponsive to natural selection? That’s what I used to think. There is growing evidence, however, that some mtDNA loci respond to natural selection. In particular, some haplogroups seem to reflect a trade-off between thermogenesis and ATP synthesis (Balloux et al., 2009). This trade-off might explain differences in disease risk between different mtDNA haplogroups. Haplogroup U, in particular, is associated with a lower risk of glaucoma (Wolf et al., 2010). There also seems to be an age-related association between this haplogroup and risk of Alzheimer’s (Santoro et al., 2010).

If so, the eventual decline of U-type haplogroups in farming populations may reflect differences in physical activity between them and hunter/fisher/gatherers.

So was it cultural diffusion or population replacement?

The jury is still out, but the consensus is moving towards a position where Middle Easterners initially established pioneer farming settlements in central Europe but were over time largely replaced by native farmers. Rowley-Conwy (2011, p. S434) describes this new model:

Our explanations must now rest on two major foundations: most Neolithic genes were native, but the major domesticates were exotic. Small-scale rather than continent-wide migrations are the best way to integrate these into one model. Agriculture in a region may have been introduced by immigrants, but that does not mean that the immigrants carried mainly Near Eastern genes (Richards 2003; Rowley-Conwy 2004b; Zvelebil 2005). The LBK, for example, originated in the Carpathian Basin; the population that moved westward emerged there carrying a complex mix of European and Near Eastern mtDNA and no doubt picking up more as it moved.

There is evidence that these pioneer farming settlements assimilated local hunter-gatherers, especially women. In at least some cemeteries, the female skeletons are likelier than the male skeletons to have come from outside the local farming community (Rowley-Conwy, 2011, p. S439). Thus, over time, this recruitment of local hunter-gatherers would have steadily diluted the original gene pool, and this dilution would have been more advanced in later, secondary settlements that budded off from the early centers of colonization.

This process was hastened by the extinction of many of the early farming settlements. In northwestern France, the Villeneuve-Saint-Germain culture represented the furthest westward extension of these colonizing farmers. After a couple of centuries, however, it disappeared and was replaced by farming cultures of local origin (Rowley-Conwy, 2011, p. S439).


Balloux, F., L.J. Handley, T. Jombart, H. Liu, and A. Manica. (2009). Climate shaped the worldwide distribution of human mitochondrial DNA sequence variation, Proceedings of the Royal Society B: Biological Sciences, 276(1672), 3447-3455.

Cochran, G. (2011). First-mover advantage, West Hunter, November 25, 2011.

Melchior, L., N. Lynnerup, H.R. Siegismund, T. Kivisild, J. Dissing. (2010). Genetic diversity among ancient Nordic populations, PLoS ONE, 5(7): e11898

Rowley-Conwy, P. (2011). Westward Ho! The Spread of Agriculturalism from Central Europe to the Atlantic, Current Anthropology, 52 (S4), S431-S451

Santoro A., V. Balbi, E. Balducci, C. Pirazzini, F. Rosini, et al. (2010). Evidence for Sub-Haplogroup H5 of Mitochondrial DNA as a Risk Factor for Late Onset Alzheimer's Disease. PLoS ONE, 5(8): e12037. doi:10.1371/journal.pone.0012037

Wolf, C., E. Gramer, B. Müller-Myhsok, F. Pasutto, B. Wissinger, & N. Weisschuh. (2010). Mitochondrial haplogroup U is associated with a reduced risk to develop exfoliation glaucoma in the German population, BMC Genetics, 11, 8