Monday, August 15, 2022

Comparing an incomparable?


Stigmata Siciliana (1964), David McLure (Wikicommons)



What is the mean IQ of sub-Saharan Africans? There’s no clear answer. Current estimates come from an early stage of the Flynn effect and are also distorted by qualitative differences in cognition. Furthermore, mean IQ differs among African groups.




At present, there is little consensus on the mean IQ of sub-Saharan Africans. Estimates have ranged from a low of 66 to a high of 82 (Lynn 2010; Wicherts et al. 2010). Rindermann (2013) put forward a "best guess" of 75, which is inexplicably much lower than the estimated African American mean of 85. Yes, African Americans are about 20% European by ancestry, but that degree of admixture would not cause a 10-point difference. Malnutrition? That might depress IQ scores in some African countries but not most.


Noah Carl (2022) has reopened the debate by inferring mean IQ from harmonized test scores and GDP per capita. Sub-Saharan Africa looks somewhat better on the first measure and somewhat worse on the second. Both measures correlate roughly with mean IQ, but the correlation isn’t strong enough to tell us whether the mean is 62, 75, or 82. Moreover, the first measure suffers from the same problem that plagues IQ tests: Africa is just starting to experience the secular increase in mean IQ that the West experienced during the 20th century, i.e., the Flynn effect. By how much should we increase the estimate of mean African IQ to adjust for Africa being at an earlier stage of the Flynn effect?


As for the second measure, GDP per capita, the ability to create wealth is determined not only by cognitive ability but also by other mental traits: future time orientation (also known as time preference), willingness to follow rules and enforce them, feelings of guilt over breaking rules, reluctance to use violence to settle disputes, tendency toward individualism rather than nepotism and familialism, and so on.


In a reply to Carl’s article, Emil Kirkegaard (2022) infers mean IQ from the Social Progress Index. But that measure is no less problematic than GDP per capita. Social progress is driven by a basket of mental qualities, of which cognitive ability is only one. Emil himself makes that point:


One cannot just impute IQs reliably from non-IQ data in order to get some kind of unbiased estimates of a region's IQ because the regions themselves may under- or overperform on international rankings for whatever reason, [including] legacy of or current communism, nonWEIRDness, low individualism, or any other difference you can imagine.


Emil concludes: “There’s no avoiding having to collect more African IQ data.”


More data would be nice, but no amount of data will provide us with a mean African IQ that can be usefully compared with the mean IQs of other populations. There are several reasons:


·        Again, estimates of African IQ come from an early stage of the Flynn effect. They are not comparable with estimates of IQ that come from a later stage in other populations.

·        The genetic architecture of cognition is not the same. To date, alleles for cognitive ability have been identified only in people of European descent, yet sub-Saharan Africans seem to have such alleles that do not exist in other populations and thus remain unidentified.

·       Recent cognitive evolution, particularly in societies near the Niger, has created differences in mean cognitive ability among African groups. It is no more meaningful to talk about a single mean African IQ than it is to talk about a single mean European IQ.


Differences in the stage of the Flynn effect


IQ data from Western societies are not comparable with IQ data from African societies. The latter are just beginning to experience the rise in mean IQ that took place earlier in the West, specifically the increase of 13.8 points between 1932 and 1978 (Flynn 1984). The Flynn effect seems to be not so much an increase in cognitive ability as an increase in familiarity with the “test paradigm” at school and, more broadly, in society. Flynn (2013) situates the cause in the modernist paradigm: “We freed ourselves from fixation on the concrete and entered a world in which the mass of people began to use logic on abstractions and universalize their moral principles.”


Keep in mind that competitive exams began to appear in the West only in the late 19th century, first for entry into the civil service and then more generally for the educational system (Wikipedia 2022). Previously, people entered the civil service through patronage appointments, and education took the form of apprenticeship and imitation of role models. In those days, people were less inclined to formulate questions and look for the answers. The answers were already known, and you had to learn them. In fact, there was a stigma attached to asking too many questions, especially in rapid-fire succession.


Differences in the genetic architecture of cognition


As a means to estimate cognitive ability, the IQ test is being superseded by the educational polygenic score. This measure is based on SNPs that have been shown to be associated with educational attainment. Your polygenic score is higher to the extent that the alleles at those SNPs are associated with higher educational attainment. It is thus a measure of innate cognitive ability. At present, we have identified 1,271 SNPs that are associated with educational attainment and which, together, explain 11-13% of the variance in educational attainment among individuals (Lee et al. 2018). The educational polygenic score has shown good reliability in predicting the IQ of individuals and even better reliability in predicting the mean IQ of populations.
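The arithmetic behind a polygenic score is just a weighted sum of allele counts, which a minimal sketch can make concrete. The SNP weights and genotype below are invented for illustration; a real score would use the effect sizes estimated for the 1,271 SNPs in Lee et al. (2018).

```python
import numpy as np

# Hypothetical per-allele effect sizes (GWAS betas) for four SNPs.
# Real values come from an educational-attainment GWAS.
weights = np.array([0.021, -0.015, 0.008, 0.030])

# One individual's genotype: count of trait-increasing alleles
# (0, 1, or 2) at each of the same four SNPs.
genotype = np.array([2, 0, 1, 2])

# The polygenic score is the effect-weighted sum of allele counts.
pgs = float(np.dot(weights, genotype))
print(pgs)
```

In practice the same sum runs over hundreds of thousands of SNPs, and the resulting score is standardized within the sample before being used as a predictor.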


Again, we have identified alleles associated with educational attainment only in people of European descent. For this reason, the educational polygenic score is five times worse at predicting the cognitive ability of African Americans (Lasker et al. 2019). The loss of predictive power seems greatest in the domain of language ability, according to two studies:


·        Guo et al. (2019, p. 27) found that the educational polygenic score is ten to eighteen times worse at predicting the verbal ability of African Americans, in comparison to White, Asian, and Hispanic White Americans. They attributed this difference to the smaller size of the African American sample, to gene-environment interactions, and to “significantly less than full coverage of African genetic variants related to cognitive ability.”

·        With a sample of school-age African Americans, Rabinowitz et al. (2019) found that the educational polygenic score fails to predict performance on a standardized reading test but does predict pursuit of postsecondary education, getting a criminal record (only among boys), and performance on a standardized math test (only for one of the three cohorts).


When modern humans began to spread out of Africa some 60,000 years ago, those left behind began to pursue their own trajectory of cognitive evolution. The evolutionary change seems to have been greatest in the domain of language, i.e., the ability to express oneself in speech and writing. Polygenic scores cannot predict innate reading ability because too many of the relevant alleles are exclusive to the African gene pool and remain unidentified. Other relevant alleles may simply be more important or less important in other gene pools.


Although the educational polygenic score is based on alleles identified in Europeans, it can still be used for rough predictions of cognitive ability among people of African descent. Lasker et al. (2019, pp. 444-445) were able to increase its predictive power for African Americans by almost a factor of three, i.e., an increase from 20% to 54% of its predictive power for European Americans. They achieved this improvement by using alleles from a much smaller subset of SNPs that are less sensitive to decay of linkage disequilibrium.


Differences among African groups in the trajectory of cognitive evolution


Within the larger African trajectory of cognitive evolution, various African populations have pursued their own sub-trajectories. This has been especially true for populations in West Africa over the past millennium and a half. Their educational polygenic scores vary as you go from west to east, being lowest among the Mende (Sierra Leone) and progressively higher among Gambians, the Esan (Nigeria), and the Yoruba (Nigeria). The Yoruba have almost the same educational polygenic score as that of African Americans, who nonetheless are about 20% admixed with Europeans (Piffer 2021, see Figure 7).


Before European contact, West African societies were more complex in the north and the east, i.e., in the Sahel and the Nigerian forest. Those areas saw the creation of towns, the formation of states, and an increasing use of metallurgy and luxury goods from the fourth century onward. The increase in social complexity seems to have been driven by the development of trade along the Niger, which served as the main trading route between the coast and the interior (Frost 2022).


In West Africa, cognitive evolution seems to have gone the farthest among the Igbo of the Niger delta. We have no educational polygenic data on them, but their record of academic achievement in Nigeria, the UK, and elsewhere indicates an unusually high level of cognitive ability (Chisala 2015).




We should get more data, while recognizing the limits of what the data may tell us. IQ tests will always be problematic, and future research should focus on educational polygenic scores. In particular, we need to identify relevant alleles in non-European populations. Some of those alleles may be population-specific, and others may be universal but more important in some populations than in others. Finally, Africa is not a monolith. Different African populations have pursued different trajectories of cognitive evolution.





Carl, N. (2022). How useful are national IQs? Noah’s Newsletter, July 13.  


Chisala, C. (2015). The IQ gap is no longer a black and white issue. The Unz Review, June 25.   


Flynn, J.R. (1984). The mean IQ of Americans: Massive gains 1932–1978. Psychological Bulletin 95(1):29–51.  


Flynn, J.R. (2013). The “Flynn Effect” and Flynn’s paradox. Intelligence 41: 851-857.   


Frost, P. (2021). Polygenic scores and Black Americans. Evo and Proud, April 27.   


Frost, P. (2022). Recent cognitive evolution in West Africa: the Niger’s role. Evo and Proud, April 30.  


Guo, G., Lin, M.J., and K.M. Harris. (2019). Socioeconomic and Genomic Roots of Verbal Ability. bioRxiv, 544411.  


Kirkegaard, E.O.W. (2022). African IQs without African IQs: it’s complicated. Just Emil Kirkegaard Things. August 7.  


Lasker, J., B.J. Pesta, J.G.R. Fuerst, and E.O.W. Kirkegaard. (2019). Global ancestry and cognitive ability. Psych 1(1).  


Lee, J. J., Wedow, R., Okbay, A., Kong, E., Maghzian, O., Zacher, et al. (2018). Gene discovery and polygenic prediction from a genome-wide association study of educational attainment in 1.1 million individuals. Nature Genetics 50(8): 1112-1121.


Lynn, R. (2010). The average IQ of sub-Saharan Africans assessed by the Progressive Matrices: A reply to Wicherts, Dolan, Carlson & van der Maas. Learning and Individual Differences 20(3): 152-154.   


Piffer, D. (2021). Divergent selection on height and cognitive ability: evidence from Fst and polygenic scores. OpenPsych.     


Rabinowitz, J.A., S.I.C. Kuo, W. Felder, R.J. Musci, A. Bettencourt, K. Benke, ... and A. Kouzis. (2019). Associations between an educational attainment polygenic score with educational attainment in an African American sample. Genes, Brain and Behavior, e12558.   


Rindermann, H. (2013). African cognitive ability: Research, results, divergences and recommendations. Personality and Individual Differences 55: 229-233.   


Wicherts, J.M., C.V. Dolan, and H.L.J. van der Maas. (2010). A systematic literature review of the average IQ of sub-Saharan Africans. Intelligence 38: 1-20.   


Wikipedia. (2022). Imperial examination – Influence - West.  

Monday, August 8, 2022

Vampirism and bloodlust


Ishbosheth is slain (Wikicommons – Maciejowski Bible)


Before the State monopoly on violence, an adult male was expected to spill another man’s blood in the course of life. Such action would be authenticated by the sight, feel, and taste of that blood.



Vampirism is the desire to see, feel, and taste blood. Today, we encounter it in horror movies or Gothic fiction, yet it does exist in real life. A “vampire” derives intense pleasure, bordering on sexual excitement, from the sight, feel, and taste of blood (Jaffé and DiCataldo 1994; Vanden Bergh and Kelly 1964). The following is a classic case:


In 1978, during a two-day rampage in the Mayenne region of France, a 39-year-old man attempted to rape a preadolescent girl, also biting her deeply in the neck, murdered an elderly man whose blood he drank and whose leg he partially devoured, killed a cow by bleeding it to death, murdered a married couple of farmers, and almost succeeded in doing the same with their farm hand. (Jaffé and DiCataldo 1994)


Most of the literature on vampirism comes from societies where the State has long held a monopoly over the use of violence and where non-State violence has long been criminalized and even pathologized. In many parts of the world, however, that monopoly is either recent or ineffective. The average man is still expected to use violence to defend himself and his family against threats that may seem trivial in a State-pacified society.


We need a cross-cultural study of the desire to shed blood. An initial step in that direction was taken by Frantz Fanon, who worked as a hospital psychiatrist in Algeria. He described vampirism as a frequent characteristic of murder cases in that country. The Algerian murderer “needs to feel the heat of blood and steep himself in his victim’s blood. […] A number of magistrates even go so far as to say that killing a man for an Algerian means first and foremost slitting his throat” (Fanon 2004[1963], p. 222).


Additional cross-cultural perspective has been provided by two recent papers. One of them is a case report from Sri Lanka:


A 20-year old single, unemployed male was referred from a drug rehabilitation center to the psychiatry clinic. He presented with poor anger control, impulsive behavior and the urge to drink blood, against a background of multiple substance dependence. He had been adopted in his early childhood and there was no childhood features to suggest developmental delays, hyperactivity, impulsivity, or conduct disorder.


[…] Although he experienced a sense of satisfaction after ingestion of blood, this act was not associated with obsessions, delusions, hallucinations, sexual gratification or paraphilic behaviour. He did not have any other psychiatric illnesses. (Adicaram et al. 2021)


The other paper presents two case reports from Turkey. In that country, vampirism usually takes the form of self-mutilation, if only because personal bloodletting is less likely to invite legal retribution. The authors describe it as following a stereotypical behavioral sequence:


We propose the term "hemomania" to describe an impulse control disorder characterized by impaired functioning due to at least one of the following urges: seeing one's own blood, self-bloodletting, and tasting/drinking one's own blood. We argue that hemomania progresses from an urge to see one's own blood to the urge to drink it (Kandeğer et al. 2021).


The “vampire” responds positively to the sight of blood and is thus driven to spill more blood and ultimately bathe in it and taste it. If this is indeed an impulse-control disorder, it should exist in many apparently normal people, among whom it would be unexpressed and under strong inhibition.


A desire to shed blood may have been much more common before the State monopolized the use of violence—at a time when an adult male was expected to spill another man’s blood in the course of life. In that context, it would be counterproductive to feel nauseated. In fact, one should feel excited. And the final triumph over an adversary would be authenticated by the sight, feel, and taste of that man’s blood.


We still have a word for that: “bloodlust.” There is also the word “bloodthirsty.” Today, we hear and say those words without fully understanding their original meaning. They refer to a mental state that used to be common in another time, but which has since been expunged from normal life … to the point that we now see it as weird and pathological.


A Middle English ballad describes the pleasure that a group of men felt when drinking the blood of a freshly killed deer:


They eat of the flesh, and they drank of the blood,

And the blood it was so sweet,

Which caused Johny and his bloody hounds

To fall in a deep sleep. (Haughey 2011, p. 350)


Those men were breaking a taboo against drinking an animal’s blood or eating its bloody flesh. That taboo went back to Anglo-Saxon times, when meals would bring many men together around the same table. It was feared that consumption of animal blood would excite the male mind and lead to violence, murder and, ultimately, consumption of human blood:


From this savage sharing of raw food with dogs, it is a short logical leap to cannibalism, the ultimate food taboo, for once one is able to devour bloody flesh, one has lost inhibitions concerning food. […] Johny Cock eats raw meat with his dogs, many Robin Hood ballads fixate on the sublimated violence in overblown feast scenes, and uncouth outlaw heroes like Hereward, Gamelyn, and Fulk Fitz Waryn repeatedly break taboos against mixing raw human blood with their meals when they bleed on their plates or tables and insist on continuing their feasts. (Haughey 2011, pp. 29-30)


Many cultural traditions insist on the removal of blood from flesh before it can be eaten. This taboo is described in the Hebrew Scriptures:


But you must not eat meat that has its lifeblood still in it. And for your lifeblood I will surely demand an accounting. I will demand an accounting from every animal. And from each human being, too, I will demand an accounting for the life of another human being. Whoever sheds human blood, by humans shall their blood be shed.

Genesis 9:4-6


With the rise of State societies, male violence became criminalized in most cases, with the notable exceptions of self-defense and war. Those new circumstances favored a different sort of man, one who would react negatively to the sight of blood. With the marginalization of bloodthirsty individuals, and their gradual removal from the gene pool, there was likewise a removal of bloodlust from real life.


Today, bloodlust survives as a deactivated behavior that normally remains dormant. This is the situation that prevails in long-pacified societies: vampirism has literally become pathological—it is reactivated only by environmental or genetic accidents that cause many other pathologies. The “vampire” looks and acts like a freak.


This is less true in societies that have been pacified more recently. The “vampire” seems more normal and shows fewer signs of mental disorder.





Adicaram, D.R.S., E.S. Wijayamunige, and S.C.A. Arambepola. (2021). Vampires! Do they exist? A case of clinical vampirism. Sri Lanka Journal of Psychiatry 12(2): 38-40.


Fanon, F. (2004[1963]). The Wretched of the Earth. New York: Grove Press.


Haughey, S. (2011). The 'Bestli' Outlaw: Wilderness and Exile in Old and Middle English Literature. PhD dissertation, Cornell University.


Jaffé, P. D., and F. DiCataldo. (1994). Clinical vampirism: Blending myth and reality. Bulletin of the American Academy of Psychiatry & the Law 22(4): 533–544.


Kandeğer, A., F. Ekici, and Y. Selvi (2021). From the urge to see one’s own blood to the urge to drink it: Can hemomania be specified as an impulse control disorder? Two case reports. Journal of Addictive Diseases 39(4): 570-574.


Vanden Bergh, R.L., and J.F. Kelly. (1964). Vampirism: A Review with New Observations. Archives of General Psychiatry 11(5): 543–547.

Friday, July 29, 2022

Recent evolution in Estonia


Estonian women at a song festival (Wikicommons – Anastasia Lakhtikova)


Estonian women had more reproductive success during the late 20th century if they possessed a more masculine body build, narrower hips, and shorter legs. Such women married earlier instead of staying on the mate market as long as possible.


Human evolution didn’t end in the Pleistocene. In fact, there has been more genetic change within our species over the past 10,000 years than over the previous 100,000, and perhaps more than over the previous million. The growing importance of culture did not slow down the pace of genetic change. In fact, culture became the main driving force of genetic evolution by replacing adaptation to a limited number of natural environments with adaptation to an ever-widening range of cultural environments (Cochran and Harpending 2009; Hawks et al. 2007; Rinaldi 2017).


Two years ago, I reviewed a study on recent evolution in the Estonian population (Frost 2020; Hõrak and Valge 2015). Among Estonians born between 1937 and 1962, women with only primary education had 0.5 to 0.75 more children than did women with tertiary education. This difference in reproductive success correlated with differences in cranial volume: children with larger crania were more likely to go on to secondary or tertiary education, independently of sex, socioeconomic position, and rural vs urban origin (Valge et al. 2019). Thus, for Estonian women in the late 20th century, higher education decreased fertility, probably by postponing the age of marriage.


That pattern was found only for women. Perhaps Estonian men with higher education enjoyed greater reproductive success, in which case selection for less intelligent women may have been cancelled out by selection for more intelligent men.


The same research team has now published a new study of the same dataset, this time on both sexes. They confirm the original finding that female fertility correlated negatively with education and cranial volume. As for male fertility, although it correlated positively with education, the most fertile males had only average cranial volume. The authors had no explanation for that finding:


Stabilizing selection on the cranial volume of boys was an unexpected result, given that cranial volume in our study population predicts educational attainment independently of sex, socioeconomic background, and height. Since educational attainment was a strong predictor of fatherhood in our study, we would have expected positive directional selection on cranial volume. However, we found only evidence for stabilizing selection (Valge et al. 2022)


Perhaps women prefer men who are well-educated but not excessively intelligent. As one goes farther and farther away from the mean IQ of a population, higher intelligence becomes more and more often due to genetic “accidents”—unusual genetic variants or combinations of variants that may adversely affect other aspects of mind and behavior. A very intelligent person may seem autistic or have poor social skills.


The new study also shows that women had greater reproductive success if they possessed a more masculine body build, narrower hips, and shorter legs. That finding may seem counterintuitive. Don’t men prefer feminine-looking women? They do. However, as the authors show by citing earlier findings, shorter women are also less selective and likelier to marry earlier:


Similar reasoning might also explain why selection favored girls with masculine body build, narrow hips, and absolutely and relatively shorter legs in our study. If choosiness in women increases with desirability, this could lead to women with more feminine phenotypes engaging in a more time-consuming mate selection process, delaying their age of first birth, and thereby negatively affecting reproduction. (Valge et al. 2022)


Finally, the new Estonian study shows that heavier and stronger boys had more reproductive success.


The results relating to height and strength are consistent with studies of sexual selection showing that men who are taller, stronger, and more physically fit are generally perceived as more physically attractive by women, and therefore, have better opportunities for partnering and becoming a father. For instance, in a sample of Polish men born in the 1930s, childless men appeared significantly shorter than those with at least one child. In West Point graduates, the number of children increased linearly with height because taller men had higher probabilities of marrying more than once. Barclay and Kolk showed in a sample of 405,427 Swedish conscripts born between 1965 and 1972 that men in the lowest deciles of height, and in particular, physical fitness in early adulthood, had the lowest probabilities of transition to parenthood. (Valge et al. 2022)


Final thoughts


This is a study of Estonians who were born more than a half-century ago, long before the breakup of the Soviet Union. Things may be different now. Estonians have rapidly converged on Western social, behavioral, and ideological norms over the past three decades. Although their country is nominally independent, they are now strongly influenced by the inflow of Western culture via the media, and this new media environment is having a decisive impact on how they think and act (Karlin 2018).


Estonia is generally following the lead of the West. With respect to education and fertility, the negative correlation has become stronger throughout the West: “In all countries [Australia, United States, Norway, Sweden], however, education is negatively associated with childbearing across partnerships, and the differentials increased from the 1970s to the 2000s” (Thomson et al. 2014).


This differential is increasing not only between families but also within “families.” Second and third children are born increasingly to women who have divorced and are in relationships with low-quality fathers who often seem to be little more than sperm donors. In Norway, multi-partner fatherhood has become most common among men with the lowest level of education (10 years of schooling, "i.e., compulsory education"):


At age 45, about 15 percent of all men in the 1960-62 cohort with a compulsory education had had children with more than one woman, compared to about 5 percent among men with a tertiary degree. If looking at fathers only (Figure 6), the pattern becomes even more pronounced. At the lowest educational level, 19.3 percent of those who had become fathers, had children with more than one woman, compared to 6.1 percent of those at the highest educational level. (Lappegård et al. 2011)


This trend may partly explain the slowing down and reversal of the Flynn effect, i.e., the steady rise in mean IQ over the 20th century. There is some debate over whether the Flynn effect was a real increase in intelligence or simply an increase in familiarity with doing tests. In any case, its reversal seems real enough.


With respect to Norway, Bratsberg and Rogeberg (2018) have shown that the decline in mean IQ can be explained by “within-family variation.” In other words, mean IQ is declining among people who supposedly share the same genetic background, i.e., siblings. In Norway, however, siblings are increasingly half-siblings. Among Norwegian women with only two children, 13.4% have had them by more than one man. The figure rises to 24.9% among those with three children, 36.2% among those with four children, and 41.2% among those with five children (Thomson et al. 2014). 


The family unit is decomposing throughout the West. It is becoming little more than an administrative entity that can be repeatedly dissolved and reconstituted (Frost 2018a; Frost 2018b).





Bratsberg, B., and O. Rogeberg. (2018). Flynn effect and its reversal are both environmentally caused. Proceedings of the National Academy of Sciences 115 (26) 6674-6678


Cochran, G. and H. Harpending. (2009). The 10,000 Year Explosion: How Civilization Accelerated Human Evolution. Basic Books: New York.


Frost, P. (2018a). Why is IQ declining in Norway? Evo and Proud, June 19.


Frost, P. (2018b). Yes, the decline is genetic. Evo and Proud, June 26.


Frost, P. (2020). Declining intelligence in the 20th century: the case of Estonia. Evo and Proud, August 3.


Hawks, J., E.T. Wang, G.M. Cochran, H.C. Harpending, and R.K. Moyzis. (2007). Recent acceleration of human adaptive evolution. Proceedings of the National Academy of Sciences (USA) 104: 20753-20758.


Hõrak, P., and M. Valge. (2015). Why did children grow so well at hard times? The ultimate importance of pathogen control during puberty. Evolution, Medicine, and Public Health (1): 167–178.


Karlin, A. (2018). Gay marriage in Estonia. The Unz Review, October 30.


Lappegård, T., Rønsen, M., and Skrede, K. (2011). Fatherhood and fertility. Fathering 9: 103-120.


Rinaldi, A. (2017). We're on a road to nowhere. Culture and adaptation to the environment are driving human evolution, but the destination of this journey is unpredictable. EMBO reports 18: 2094-2100.


Thomson, E., T. Lappegård, M. Carlson, A. Evans, and E. Gray (2014). Childbearing across partnerships in Australia, the United States, Norway, and Sweden. Demography 51(2): 485-508. 


Valge, M., R. Meitern, and P. Hõrak. (2022). Sexually antagonistic selection on educational attainment and body size in Estonian children. Annals of the New York Academy of Sciences, early view.

Sunday, July 17, 2022

Cognitive evolution on the Italian Peninsula


A recent polygenic study has shown that mean cognitive ability is higher in the North of Italy than in the South. Cognitive evolution seems to have gone the farthest in the Northeast, perhaps because the Northwest went through the Industrial Revolution earlier, and industrialization severed reproductive success from economic success, thereby halting selection for cognitive ability.




As a country, Italy came into existence only a century and a half ago. Regional differences are still strong, particularly between the North and the South. The “Southern question” is usually said to date from the unification of Italy in the 19th century:


In the decades following the unification of Italy, the northern regions of the country, Lombardy, Piedmont and Liguria in particular, began a process of industrialization and economic development while the southern regions remained behind. At the time of the unification of the country, there was a shortage of entrepreneurs in the south, with landowners who were often absent from their farms as they lived permanently in the city, leaving the management of their funds to managers, who were not encouraged by the owners to make the agricultural estates produce to the maximum. Landowners invested not in agricultural equipment, but in such things as low-risk state bonds. (Wikipedia 2022a)


De Rosa (1979) argues that the South had already fallen behind the North by the 18th century. At that time, its middle class was small, and economic relations were still structured by paternalism and familialism. One could go back even farther, to the Renaissance or even the late Middle Ages, to identify the moment when northern Italy, and Western Europe in general, embarked on sustained economic growth and thus pulled ahead of the rest of the world.


That sustained economic growth brought sustained demographic growth, particularly of the middle class. Gregory Clark found that the English middle class expanded steadily from the twelfth century onward, its descendants not only growing in number but also replacing the lower classes through downward mobility. By the 1800s, its lineages accounted for most of the English population. That demographic change coincided with mental and behavioral changes: higher cognitive ability, lower time preference, and a higher threshold for violent behavior. In a word, the English became more middle-class in character. “Thrift, prudence, negotiation, and hard work were becoming values for communities that previously had been spendthrift, impulsive, violent, and leisure loving” (Clark 2007, p. 166).


Elsewhere in Western Europe, the middle class similarly expanded during late medieval and early modern times. The result would be a growing contrast between regions that had participated in this economic and demographic change and those that had not, such as southern Italy. The contrast can be seen not only on purely economic measures but also on mental ones, like the INVALSI standardized test, an annual assessment of skills in Italian schools. It has two sections: Italian language skills and mathematics skills. On both sections, northern Italian students do better than southern Italian students, the difference being a little over half a standard deviation.
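To give that half-standard-deviation figure some intuition, here is a quick sketch of what such a gap implies, under the simplifying assumption (mine, not something the INVALSI data confirm) that scores in both regions are normally distributed with equal spread:

```python
from statistics import NormalDist

# Standardized North-South gap (Cohen's d); "a little over half a
# standard deviation" is treated here as exactly 0.5 for illustration.
d = 0.5

# Share of southern students scoring above the northern mean
above_northern_mean = 1 - NormalDist().cdf(d)
print(f"{above_northern_mean:.1%}")  # about 30.9%

# Probability that a randomly chosen northern student outscores a
# randomly chosen southern one (the common-language effect size)
cles = NormalDist().cdf(d / 2**0.5)
print(f"{cles:.1%}")  # about 63.8%
```

In other words, even with a gap of this size, the two regional distributions overlap heavily; the difference is in the means, not a separation of the populations.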


Yes, the North-South gap in academic achievement could have a purely environmental cause—and this is a recurring problem when we try to tease apart genetic and cultural evolution. If economic development is held back by a culture of poverty, that same culture may discourage students from trying to do better at school. Those students may also have less access to proper nutrition, medical care, libraries, and so on.


Polygenic scores for cognitive ability


That is why there is so much interest in measures of innate cognitive ability. The most promising one is the polygenic score (PGS): a weighted sum of the alleles (genetic variants) a person carries that have been associated with cognitive ability, usually proxied by educational attainment. At present, we have identified enough of these alleles to explain 11-13% of the overall variation in cognitive ability (Lee et al. 2018).
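As a rough illustration of the computation, a polygenic score sums each individual's count of trait-associated alleles (0, 1, or 2 copies per variant), weighted by the effect size estimated in a genome-wide association study. The SNP effect sizes and genotypes below are invented for the example, not drawn from Lee et al. (2018):

```python
import numpy as np

# Hypothetical GWAS effect sizes (betas) for three illustrative variants
effect_sizes = np.array([0.02, -0.01, 0.03])

# Allele counts for two individuals (rows) at the three variants (columns)
genotypes = np.array([
    [2, 0, 1],   # individual A
    [1, 2, 0],   # individual B
])

# PGS = sum over variants of (allele count x effect size)
pgs = genotypes @ effect_sizes
# individual A: 2*0.02 + 0*(-0.01) + 1*0.03 = 0.07
# individual B: 1*0.02 + 2*(-0.01) + 0*0.03 = 0.00
```

Real scores are built from thousands of variants and are usually standardized within a reference population, but the arithmetic is the same weighted sum.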


Yes, those alleles are just a sample of the total number, but why would they be an unrepresentative sample? More to the point: why would PGS data show consistent geographic patterns rather than random noise? The mean PGS does indeed differ geographically among human populations. It is highest in Eurasia, with East Asians, Ashkenazi Jews, and Finns having the highest scores. That geographic pattern is in line with IQ data (Piffer 2019).


Polygenic scores on the Italian Peninsula


In a recent study, Piffer and Lynn (2022) have found regional differences in Italy for alleles associated with educational attainment. They used two datasets: one encompassing 129 Italian individuals and the other 947. All of these individuals had all four grandparents born in the same part of Italy (this requirement was imposed to eliminate the effects of recent interregional migration). When the authors grouped the data into three large regions—North, Central, and South—they found “a clear north-south gradient, with central Italians occupying an intermediate position.” There was more overlap between central and southern Italians than between central and northern Italians.


The datasets were too small to show genetic differences within each of the three large regions. If we go back to the INVALSI data, we see that academic achievement is much stronger in the North-Northeast (Lombardia, Trentino, Veneto, Friuli) than in the Northwest (Valle d’Aosta, Liguria).


At first glance, that geographic pattern may seem counterintuitive. In northern Italy, industrialization began in the northwest and came later to the northeast: “the diffusion of industrialisation that characterised the northwestern area of the country largely excluded Venetia and, especially, the South” (Wikipedia 2022b). If economic development had driven cognitive evolution on the Italian Peninsula, why would this evolution have gone farther in the northeast? Why would it be negatively associated with industrialization?


Because the Industrial Revolution put a stop to cognitive evolution. It severed the link between economic success and reproductive success. Previously, businesses were family-run, and the family provided the workforce. Successful business owners were incentivized to have larger families, and their children would have the means to marry at a younger age. Then, in the late 19th century, that stage of economic development began to give way to industrial capitalism. Financial success no longer translated into early marriage and large families who could help with the work. If more workers were needed, they would simply be hired. Business owners now tended to have smaller families because of the high maintenance costs of middle-class children (Canlorbe and Frost 2020; Frost 2018).


Cognitive evolution thus ended earlier in the Northwest of Italy than in the Northeast. By the same token, interregional migration has had more time to erode the cognitive advantage that evolved in the Northwest. Yes, the datasets were limited to people who had all four grandparents born in the region, but, for most people, that limitation would not eliminate the effects of interregional migration before the mid-20th century.





Canlorbe, G., and P. Frost (2020). Why are human groups so different? American Renaissance, March 20.  


Clark, G. (2007). A Farewell to Alms. A Brief Economic History of the World, 1st ed. Princeton University Press: Princeton, NJ, USA.


De Rosa, L. (1979). Property Rights, Institutional Change, and Economic Growth in Southern Italy in the XVIIIth and XIXth Centuries. Journal of European Economic History 8(3): 531-551.


Frost, P. (2018). Rise of the West. Part II. Evo and Proud, December 27.


Lee, J. J., Wedow, R., Okbay, A., Kong, E., Maghzian, O., Zacher, et al. (2018). Gene discovery and polygenic prediction from a genome-wide association study of educational attainment in 1.1 million individuals. Nature Genetics 50(8): 1112-1121.


Piffer, D. (2019). Evidence for Recent Polygenic Selection on Educational Attainment and Intelligence Inferred from Gwas Hits: A Replication of Previous Findings Using Recent Data. Psych 1(1): 55-75.    


Piffer, D., & Lynn, R. (2022). In Italy, North-South Differences in Student Performance Are Mirrored by Differences in Polygenic Scores for Educational Attainment. Mankind Quarterly 62(4), Article 2.  


Wikipedia (2022a). Economy of Italy – Southern Question.


Wikipedia (2022b). Economic history of Italy.  


Thursday, July 7, 2022

Adapting to bubonic plague


Rash associated with familial Mediterranean fever (Wikicommons – Dr. H.J. Lachman)


In the eastern Mediterranean, people were likelier to survive bubonic plague if they had stronger inflammatory responses to infection in their lungs, gut, and other tissues. Today, that natural selection is attested by a high incidence of familial Mediterranean fever.


Familial Mediterranean fever is due to mutations that heighten the activity of pyrin, a protein that helps trigger inflammatory responses to infection of the lungs, gut, and other tissues. The condition is common among eastern Mediterranean peoples, such as Jews, Syrians, Armenians, Turks, Greeks, and Italians. The most common symptoms are inflammation of the abdominal lining, the joints, and the chest. Some kind of natural selection seems likely, since different mutations have evolved independently to produce the same disease within the same geographic region.


Several years ago, Greg Cochran suggested this fever might be an adaptation to Leishmania, a trypanosomatid parasite that causes leishmaniasis (Cochran 2015). Recent research now points instead to bacteria of the genus Yersinia. Y. pestis causes bubonic plague, whereas Y. pseudotuberculosis and Y. enterocolitica cause gastroenteritis. These pathogens are highly infectious partly because they suppress activation of pyrin, via the virulence factor YopM, and thus weaken the body’s inflammatory defenses (Chung et al. 2016). To compensate, there seems to have been selection for mutations that make pyrin hyperactive, hence the high incidence of familial Mediterranean fever in populations that have long coexisted with Yersinia bacteria (Loeven et al. 2020).


Yepiskoposyan and Harutyunyan (2007) argue that selection for familial Mediterranean fever must have begun more than 2,500 years ago, since one of the alleles responsible for it is found in different communities of the Jewish diaspora, notably Iraqi and North African Jews. That argument doesn’t convince me, since diaspora communities were not reproductively isolated. The mutation could have arisen in one community and then been spread to others by Jewish individuals moving from one place to another.


I’m inclined to believe that selection for this fever began with the earliest recorded outbreak of bubonic plague: the Plague of Justinian (541-549 AD), which killed an estimated 25 million people throughout the Mediterranean Basin and the Middle East. An earlier date is nonetheless possible, since Y. pestis has been attested in archaeological finds as far back as 5,000 years ago (Wikipedia 2022).





Chung, L.K., Y.H. Park, Y. Zheng, I.E. Brodsky, P. Hearing, D.L. Kastner, J.J. Chae, and J.B. Bliska. (2016). The Yersinia Virulence Factor YopM Hijacks Host Kinases to Inhibit Type III Effector-Triggered Activation of the Pyrin Inflammasome. Cell Host Microbe 20(3):296-306.


Cochran, G. (2015). Familial Mediterranean fever. West Hunter, January 9.


Loeven, N.A., N.P. Medici, and J.B. Bliska. (2020). The pyrin inflammasome in host-microbe interactions. Curr Opin Microbiol 54:77-86.


Wikipedia (2022). Bubonic plague.


Yepiskoposyan, L., and A. Harutyunyan. (2007). Population genetics of familial Mediterranean fever: a review. Eur J Hum Genet 15: 911-916.