Monday, March 26, 2018

Wishing to end it all

Inupiat family (Wikicommons). In traditional Inuit society, people had a desire to live if they were socially active. Conversely, social inactivity seemed to set off a cascade of mental events leading to suicide.

Until 10,000 years ago, humans lived in small groups of hunter-gatherers. This was a "thick" social environment where everyone interacted intensely with a relatively small number of people. These interactions were not only recurrent but also predictable, being constrained and structured by social rules. 

With the advent of farming, and the population growth it made possible, the social environment became "thinner": everyone now interacted with more people but less frequently on average with each person. These interactions were also less predictable. With the development of large urban communities and, even more so, the market economy, interaction became especially infrequent, unstructured, and voluntary.

This distinction between "thick" and "thin" social environments corresponds more or less to the German dichotomy of Gemeinschaft and Gesellschaft, except that the dichotomy is really a continuum. Social interaction has become especially “thin” in one group of populations, northwest Europeans, through a process of cultural evolution that goes back at least a millennium. Indeed, some authors have argued that this process goes even farther back, perhaps even into prehistory (Frost 2017; Macfarlane 1978, 1992, 2002; Seccombe 1992).

Given that over the past 10,000 years our genetic makeup has been adapting much more to cultural and social environments than to natural environments (Hawks et al. 2007), adaptation to "thin" social environments should vary from one population to another, being least advanced in those people who, until recently, were hunter-gatherers, i.e., who intensely interacted with a limited number of people. Such people would show the most mismatch between past and present environments.

In a previous post I argued that such a mismatch may explain the high rate of suicide among Inuit youth (Frost 2011). According to a 1972 survey of Inuit 15 to 24 years old from northern Quebec, 28% of the males and 25% of the females had attempted suicide (Kirmayer et al. 1998). Dufour (1994) argues that Inuit society has a long tradition of people ending their lives when they feel they have become useless. In the past, however, suicide involved only the elderly:

Suicide in early Inuit society was viewed positively when the individual had become a burden for the group. "Senilicide" in particular was deemed to be acceptable and appropriate. The typical pattern: an elderly person, motivated by illness, helplessness, bereavement, dependence on the group, famine, or resource shortage, would decide to die after consulting family members, who could sometimes be called upon to assist. In contemporary Inuit society, the elderly no longer commit suicide. The young people do.

In a forthcoming paper, anthropologist Frédéric Laugrand similarly argues that in traditional Inuit society the elderly did not fear death and would even welcome it if they considered themselves no longer useful. Birket-Smith (1929, p. 300) wrote: "Suicide is not rare, and it is the duty of pious children to assist their parents in committing it." Knud Rasmussen (1929, p. 96) described a visit to a sick Inuit woman: 

I straightened myself up [inside the hut] and went across at once to the spot where the sick woman used to lie. On coming nearer, I nearly cried out aloud: I found myself looking into a face that was perfectly blue, with a pair of great eyes projecting right out from the head, and the mouth wide open. I stood there a little to pull myself together, and now perceived a line fastened round the old woman's neck and from there to the roof of the hut. When I was able to speak once more, I asked those in the house what this meant. It was a long time before anyone answered. At last the son-in-law spoke up, and said: 'She felt that she was old, and having begun to spit up blood, she wished to die quickly, and I agreed. I only made the line fast to the roof, the rest she did herself.'

The Moon Spirit was said to help people commit suicide by calling out to them: "Come, come to me! It is not painful to die. It is only a brief moment of dizziness. It does not hurt to kill yourself" (Rasmussen 1929, p. 74). The literature on the Inuit is replete with examples of this and related customs. When a travelling family had run out of food and were facing starvation, the oldest members would offer their flesh as food, after death, so that the others might live. This kind of request appears in an account about a group of travelling Inuit who ran out of food in 1905, near Igloolik.

When he sensed his coming death, Qumangaapik said to his wife, 'It has already happened in the past, in times of starvation, that people survived by feeding on human flesh. When I die, I want you to eat my body to survive, for you have many relatives.' She refused his offer, but he insisted, 'Please, you'll have to eat me!'

The inhabitants of Kangiqsualujjuaq (George River) have similar memories about a family who faced starvation on their way to the Labrador coast. They had used up all of their reserves of food, and the grandmother convinced the other family members to let her die and then eat her to ensure their survival. She told them that if they respected her wishes, they would have an abundant posterity to preserve her memory (Saladin d'Anglure, forthcoming).

It is understandable why the elderly might wish to commit suicide. When they have become a burden, their deaths will free up food and other resources for their children. This is especially so in times of famine, which were common in the Arctic. But who benefits when young people kill themselves? Something about modern society is sending Inuit youth the wrong signal.

Inuit seem to receive this signal when they become socially inactive for a length of time. A feeling of uselessness then develops, and this feeling in turn triggers suicidal ideation. Previously, this cascade of mental events happened only in old people who could no longer help with hunting, food preparation, shelter building, or other strenuous activities. Such people would stay home most of the time. Saladin d'Anglure (forthcoming) recounts the story of an old shaman who could no longer get around. He shrank the outside world to the different parts of his igloo: the sleeping platform became the land, the floor the sea ice, the ice window the sun, the opening for the entrance the moon, and the dome the vault of the heavens. For elderly Inuit, this shrinking of their world foreshadows their departure for the next one.

Today, many young Inuit likewise stay home and become socially inactive. What else is there to do? On the one hand, the old economy of hunting and living on the land no longer exists. On the other hand, the new economy doesn't generate enough employment. There is also the difficulty that Inuit have in adapting to the Western model of working outside the family with non-kin for lengthy periods of time. For mine work, two or three weeks is the maximum they can stand being away from their families.


Social inactivity seems to trigger thoughts of suicide among the Inuit and, I suspect, among former hunter-gatherers in general. Once a hunter-gatherer people had transitioned to farming and hence to a larger and “thinner” social environment, selection would then raise the threshold for this trigger. Several factors probably determine how high the threshold has been raised: how recently this transition occurred and, more importantly, how far the population has gone down the path to a “thinner” social environment. Social networks can be relatively “thick” even in large urban settings.

In any case, no human population has fully adapted to the asocial environment we increasingly have in the Western world. Widespread asociality is recent, even in the West. 

All of this makes me wonder whether the “White Death” is due to something deeper than the opioid epidemic. When I visit my hometown the biggest change I notice is the large number of people who live alone, particularly men in their 40s and 50s—as a result of easy divorce and relationships that never went anywhere. Most of them work, and when they’re not working they drink or get stoned. When the inevitable happens, is it due to alcohol or drug abuse? Or is the ultimate cause a death wish?


Birket-Smith, K. (1929). The Caribou Eskimos. Material and social life and their cultural position. Copenhagen: Gyldendal.

Dufour, R. (1994). Pistes de recherche sur les sens du suicide des adolescents inuit, Santé mentale au Québec 19: 145-162.

Frost, P. (2017). The Hajnal line and gene-culture coevolution in northwest Europe, Advances in Anthropology 7: 154-174.

Frost, P. (2011). Suicide and Inuit youth, Evo and Proud, December 10.

Hawks, J., E.T. Wang, G.M. Cochran, H.C. Harpending, and R.K. Moyzis. (2007). Recent acceleration of human adaptive evolution. Proceedings of the National Academy of Sciences (USA) 104: 20753-20758. 

Kirmayer, L.J., L.J. Boothroyd, and S. Hodgins. (1998). Attempted suicide among Inuit youth: Psychosocial correlates and implications for prevention, Canadian Journal of Psychiatry 43: 816-822.

Macfarlane, A. (1978). The origins of English individualism: Some surprises. Theory and Society 6: 255-277. 

Macfarlane, A. (1992). On individualism. Proceedings of the British Academy 82: 171-199. 

Macfarlane, A. (2002). The making of the modern world. London: Palgrave.

Rasmussen, K. (1929). Intellectual Culture of the Iglulik Eskimos, Vol. 7 (1) of Report of the Fifth Thule Expedition 1921-24, Copenhagen, Gyldendalske Boghandel.

Saladin d'Anglure, B. (forthcoming). Inuit Stories of Being and Rebirth, University of Manitoba Press.

Seccombe, W. (1992). A Millennium of Family Change. Feudalism to Capitalism in Northwestern Europe. London: Verso.

Monday, March 19, 2018

We make the environments we adapt to

Distribution of malaria in Italy, 1944 (Wikicommons). Malaria used to be common in parts of Italy, particularly Sardinia.

Gene-culture coevolution seems to be attracting more interest. According to Google Scholar, this term is appearing in more and more scientific articles:

2010-2017 - 248 mentions per year

2000-2009 - 107 mentions per year

1990-1999 - 22 mentions per year

1980-1989 - 31 mentions per year

I’d like to take some of the credit, but most of it actually goes to John Hawks and the landmark paper he authored in 2007 with Eric Wang, Greg Cochran, Henry Harpending, and Robert Moyzis. That paper, more than any other, changed the way we view the relationship between genetic evolution and cultural evolution in our species.

Rinaldi (2017) provides a good review of this field of research. He starts off with a definition of gene-culture coevolution:

"[All] organisms adapt to their environment, and in humans much of our environment is defined by our culture. Hence, cultural change can actually spur on adaptive evolution in humans", wrote evolutionary biologist Alan Templeton at Washington University in St. Louis, MO, USA. Following this argument, culture, social learning and technology have not replaced biological adaptation. Rather, human evolution is driven by the environmental conditions we created ourselves through culture, a process that has been accelerating since the beginning of agriculture and urban civilization.

Indeed, human genetic evolution sped up more than a hundred-fold some 10,000 years ago, when hunting and gathering gave way to farming, which in turn led to population growth and larger, more complex societies. Our ancestors were no longer adapting to relatively static natural environments but rather to faster-changing cultural ones of their own making. They created new ways of life, which in turn influenced who would survive and who wouldn't (Hawks et al. 2007).

Among other changes, farming exposed humans to new diseases. "Virulent epidemic diseases, including smallpox, malaria, yellow fever, typhus, and cholera, became important causes of mortality after the origin and spread of agriculture" (Hawks et al. 2007). This causation has been amply documented in the case of malaria:

If malaria was contracted by humans in the Pleistocene, it likely would have been in isolated incidences. For example, recent genetic analysis of the glucose-6-phosphate dehydrogenase gene, some variants of which confer resistance to the infection, confirmed that malaria is a recent selective force in human populations, occurring within the last 10,000 years. Based on the mitochondrial genome of the parasite itself, Joy et al. concluded that though the parasite that causes falciparum malaria originated long ago (perhaps 50,000-100,000 YBP), a sudden increase in the population size of the parasite did not occur until around 10,000 years ago when humans began to practice agriculture.

[...] Livingstone argued that slash-and-burn agriculture in West Africa would have exposed populations to Anopheles gambiae, the mosquito that serves as the vector for Plasmodium falciparum, the cause of malaria. Slash-and-burn agriculture resulted in sedentary populations surrounded by the pools of sunlit water required for propagation of the Anophelese mosquito. (Harper and Armelagos 2010)

Farming changed the adaptive equilibrium between the human body and Plasmodium falciparum. A new equilibrium arose in those human populations that had to coevolve with a high incidence of this parasite. Now, modern health measures are upsetting this equilibrium, and those same populations are falling prey to certain diseases—ironically, because of efforts to fight another disease:

The increase in multiple sclerosis and probably other autoimmune diseases such as type 1 diabetes in Sardinia, Italy, has been linked to the elimination of malaria from the island in the early 1950s. Centuries of exposure to Plasmodium falciparum would have shaped the human immune system to aggressively fight the parasite with a tendency to over-respond to triggering factors even after the disappearance of the parasites. Recent research has indeed identified a number of gene variants involved in malarial resistance and increased risk of multiple sclerosis in Sardinians. (Rinaldi 2017)

Human culture has created new environments of biological adaptation, and these environments differ from one human population to the next because human culture likewise differs from one to the next. Subsequent cultural change will therefore have a greater biological impact on some populations than on others. This is a general principle and is not limited to the above example of malaria.

In my next post, I will explore the biological impacts of another cultural change: the shift from a "thick" to a "thin" social environment. A "thick" social environment is characterized by intense interaction with a relatively small number of people. This interaction is not only recurrent but also predictable because it is constrained and structured by social rules. A "thin" social environment is characterized by interaction with more people on a less frequent basis, and this interaction is less predictable because social relations are less constrained and less structured.

Human cultures fall along a continuum from "thick" social environments, such as exist in small bands of hunter-gatherers, to "thin" social environments, such as exist in large societies where conditions for personal autonomy are optimal. In the latter, people are freer not only to change their networks of social interaction but also to reduce them to the minimum necessary for personal survival.

For several centuries, the West has been expanding personal autonomy by transferring collective authority from personal “bottom-up” structures (family, clan, ethny) to impersonal “top-down” structures (the State). Beginning in the 19th century, we have exported this cultural model to the rest of the world. Have there been adverse effects? And have they been worse in some human populations than in others? Most experts would answer “Yes” to both questions, while adding that the adverse effects are temporary. Once everyone has grown accustomed to being merely individuals, the effects should be pretty much the same everywhere. 

I will argue that some of these adverse effects will be permanent in all human populations. This is because no population has ever fully adapted to a social environment where individualism is at a maximum and where the State has largely replaced traditional “bottom-up” structures. I will also argue that these permanent adverse effects will be worse in some populations than in others.


I've been trying to measure the degree to which my blog is being "deplatformed," i.e., intercepted by blocking software. This search took me to the site Easy Counter, which told me that Evo and Proud is "poorly socialized" and may be "penalized." I also learned something else: on March 13, 2018, my blog was set to expire in 4 months. To date, no one has notified me of this decision, and I would still be unaware of it if I hadn't gone to Easy Counter.

As I understand it, this sort of thing happens when a blog is inactive. Blogger is supposed to be free, and I've never had to renew my registration through an annual payment. So I'm asking you for advice. For whatever reason, Google wants to terminate this blog. Should I fight this decision or migrate to another platform? If so, which one would be best? And what is the easiest way to transfer all of my blog posts?


Harper, K. and G. Armelagos. (2010). The changing disease-scape in the third epidemiological transition, Int J Environ Res Public Health. 7(2): 675-697.

Hawks, J., E.T. Wang, G.M. Cochran, H.C. Harpending, and R.K. Moyzis. (2007). Recent acceleration of human adaptive evolution. Proceedings of the National Academy of Sciences (USA), 104: 20753-20758.

Rinaldi, A. (2017). We're on a road to nowhere. Culture and adaptation to the environment are driving human evolution, but the destination of this journey is unpredictable, EMBO reports 18: 2094-2100.

Monday, March 12, 2018

Thoughts on the Italian election

Matteo Salvini - leader of Lega and the center-right coalition (Wikicommons)

What do I think of the Italian election results? How well do they bear out the predictions I made last November? In some ways, the nationalists did better than I expected, and in some ways worse. First the good news.

Western Europe's first nationalist government

Lega Nord (now simply Lega) went into the election as a junior partner in a center-right coalition led by Silvio Berlusconi. It is now the senior partner. Berlusconi's party, Forza Italia, did poorly, getting only 14% of the popular vote in comparison to Lega's 17%. Given that 4% of all votes went to the other nationalist party in the coalition, Fratelli d'Italia, we see that Italian support for the center-right is much more nationalist than conservative.

With a plurality of seats in the Chamber of Deputies and the Senate, Matteo Salvini will likely form the next government. He will bring a new perspective to the job of Italian prime minister:

Matteo Salvini embraces a very critical view of the European Union (EU), especially of the euro, which he once described as a "crime against humanity". Salvini is also opposed to illegal immigration and the EU's management of asylum seekers.

On economic issues, he supports flat tax, tax cuts, fiscal federalism, protectionism and, to some extent, agrarianism. On social issues, Salvini opposes same-sex marriage, while he supports family values and the legalisation of brothels. In foreign policy he opposed the international embargo against Russia of 2014 and supported an economic opening to Eastern Europe and to countries of the Far East such as North Korea. (Wikipedia 2018)

Lega's success is in contrast to the situation in France, the Netherlands, and Germany, where nationalist parties have done well but have never been part of a ruling coalition. We thus have the strange sight of Angela Merkel looking for coalition partners on the left and even the far left, while studiously ignoring Alternative für Deutschland, a party that won 13% of the popular vote in her country's last general election.

Now the bad news:

A hung parliament and false friends

Without a majority in the Chamber of Deputies and the Senate, the center-right coalition will need support from the Movimento 5 Stelle (Five Star Movement), which came second with 33% of the popular vote. Unfortunately, that party will be far from supportive. It is not at all nationalist—contrary to what you may have read or heard.

Yes, the co-founder of the Five Star Movement, "Beppe" Grillo, has called for deportation of "terrorists" and people with no right to asylum:

"The migratory situation is out of control," Grillo wrote on his blog. "Our country is becoming a place where terrorists come and go and we are not able to recognise and report them and they can wander all over Europe undisturbed thanks to Schengen." "Those who have the right to asylum should stay in Italy, all the others should be repatriated at once, starting from today." "Schengen must be revised," he said, adding it should be suspended "immediately and border controls reinstated" when there is an attack until the suspects have been captured. (ANSA 2016)

Also, the current leader of the Five Star Movement, Luigi Di Maio, has called for "an immediate stop to the sea-taxi service", i.e., the ferrying of African migrants to Italy by NGOs (Reuters 2017).

Tough words. Keep in mind, however, that similar words have been spoken by conservative politicians elsewhere—Deport terrorists! No fake refugees! The problem here isn't that such promises have often been broken. The problem is that the issue of population replacement isn't even being addressed. The deconstruction of Europe thus continues, and at an ever higher rate.

Furthermore, if we look at actual party policy, and not personal opinions, we get a different picture of the Five Star Movement. In 2014 its members voted to decriminalize illegal immigration:

The Five Star Movement activists say no to the crime of illegal immigration. The majority of votes, which were cast online on Beppe Grillo's blog, were in favor of repealing the crime of illegal immigration. Yes for the repeal: 15,839. No: 9,093. There were 24,932 voters. (Corriere della Sera 2014)

Admittedly, that was four years ago, but only this year Luigi Di Maio reacted angrily when the center-right candidate for Lombardy, Attilio Fontana, made "extremist" remarks about immigration.

"Berlusconi says that we are worse than the post-communists, that they are moderate and we extremists, but after Fontana's phrase about the white race are we sure that they are the moderates? If they are moderate then I am Gandhi. [...] We want to know if Fontana remains their presidential candidate [for Lombardy]." (ANSA 2018b)

Were Fontana's remarks extremist? Judge for yourself:

This is not an issue of being xenophobic or racist, but a question of being logical or rational. We cannot [accept all asylum seekers] because we won’t all fit in, so we have to make choices. We must decide if our ethnicity, if our white race, if our society, should continue to exist or if it should be wiped out. A serious State should plan and program a situation of this type. It should say how many we consider it right to receive and how many migrants we don't want to allow in, how we want to assist them, what jobs to give them, what homes and schools to give them. At that point, when a government prepares a project of this type, it submits it to its citizens.

It is absolutely unacceptable to say that we have to accept them all. It is a scheme that we must react against, that it is necessary to rebel against. We cannot accept them all because, if we did, we would no longer be ourselves as a social reality, as an ethnic reality. Because there are many more of them than us, and they are much more determined to occupy this territory. (ANSA 2018a; ANSA 2018b)

On March 4, the people passed judgment on Fontana: he was elected governor of Lombardy.

In all this, the Five Star Movement comes across as being too worried about its image and not sufficiently concerned about offering a coherent policy. This is a common failing of populist movements.


With this election, the bloc of nationalist states has welcomed a new member—a country near the core of the Western world-system. There is now a continuous stretch of territory from the Baltic to the Mediterranean where post-nationalism is no longer a “consensus.”

This new reality has not gone unnoticed, and there will likely be efforts to turn back the clock. The Italian parliament will become mired in one stalemate after another, and Salvini may have to go directly to the people, using his bully pulpit to rally support for his measures. Don't expect to see the Five Star Movement play a constructive role.

Salvini will also face determined opposition from the courts, the civil service, and the media—what we call the deep state. The situation, however, isn't the same as in the United States, where the elites don’t feel much in common with the American people and see no reason why they should. If Salvini can present his arguments boldly and energetically, he will mobilize support even among his country’s elites.


ANSA (2018a). White race at risk - Fontana on migrants (2). Centre-right Lombardy candidate says not question of racism, ANSAen Politics, January 15

ANSA (2018b). Attilio Fontana si scusa per la 'razza bianca' ANSAit. Lombardia, January 17

ANSA (2016). Grillo calls for mass deportations (2). ANSAen Politics, December 23.

Corriere della Sera. (2014). Grillo, gli iscritti del M5S dicono no al reato di immigrazione clandestina, January 13.

Reuters (2017). Italian prosecutors widen investigation to include MSF over migrant rescues: source, World News, August 5

Wikipedia (2018). Matteo Salvini

Tuesday, March 6, 2018

Why universal human rights aren't universal

Jean Piaget (1896-1980). A renowned Swiss psychologist, he argued that moral development is linked to cognitive development.

Are intelligence and morality interlinked? This was what Swiss psychologist Jean Piaget concluded from his studies of child development. With increasing age, children develop not only intellectually but also morally, growing out of infantile self-centredness and into adult decentredness:

According to Piaget, moral development — the ability to judge ethical problems in an impartial and unbiased way — relies on prior cognitive development. Indeed, cognitive and moral development are structurally similar. In both is acquired a well-founded, reasonable structure. As Jean Piaget (1948/1932, p. 404) stated: "Parallelism exists between moral and intellectual development: ... Logic is the morality of thought just as morality is the logic of action." And this parallelism is based on the cognitive nature of morality, e.g. to behave ethically one has to take the perspective of third parties. (Rindermann and Carl 2018, p. 32)

This view has become popular and is even central to much of present-day thinking. If people are better educated, they will presumably become not only smarter but also more empathic and, thus, more considerate of their fellow humans. This view, as popular as it is, doesn't seem quite true. Many of us have known people who are intelligent and yet lacking in empathy. We call them psychopaths. Usually, they're explained away as aberrations. They're sick, aren't they? In reality, the line between 'normal' and 'psychopath' is arbitrary—like most mental traits, the capacity for empathy is distributed continuously along a bell curve. Lots of seemingly normal people have little empathy.

Nor does Piaget's view seem true if we look farther afield. Many moral systems attach little importance to empathy. Indeed, of all the world religions, Christianity seems unique in advocating the moral duty not only to help others but also to feel their pain, even when they aren't fellow Christians. Yes, most Christians fail to meet this standard of universal selflessness, but other religions don't set the bar so high. 

Indeed, the ideal of universal selflessness isn’t at all universal. It developed essentially within a single cultural context, the Christian world:

In Judaism and Christianity, "God created man in his own image" (Gen1:27 ESV). Humans being the image of God, "God-likeness", implies treating humans in a respectful way. Of course, at first blush, history reveals large discrepancies between the message of Christianity and the actual behavior of Christians. However, this does not mean that such behavior was consistent with the Christian message, and in many cases it was criticized by prominent Christians at the time. The Christian message had a corrective function. For instance, the inhumane treatment of American Indians by Spanish colonists was criticized by the Dominican priest Bartholomé de Las Casas (as mentioned above). The abolitionist movement was organized by Protestants and led by the Evangelical Christian William Wilberforce. The horrors of war were mitigated by charities such as the Red Cross, which was founded by the evangelical Christian, Henry Dunant. (Rindermann and Carl 2018, p. 34)

The Muslim world imported as many slaves as did the Christian world, yet a Muslim abolitionist movement never arose, and the trade was ultimately abolished worldwide through the intervention of Christian nations, particularly Great Britain. Today, the slave trade has left no legacy of guilt among Muslims, while it definitely has in those nations that strove to bring it to an end.

This apparent paradox has led Heiner Rindermann—a well-known psychologist in HBD circles—to challenge the Piagetian idea that moral development is linked to intellectual development. These two mental traits are distinct and have followed their own trajectories in different moral traditions.

To prove his point, he teamed up with sociologist Noah Carl to study how respect for human rights is related, cross-culturally, to cognitive ability and religion. They found a stronger relationship with religion than with cognitive ability: the percentage of Christians in a society correlated more strongly with respect for human rights (r = .62) than did educational level (r = .54) or cognitive ability (r = .50 to .51) (Rindermann and Carl 2018).
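For readers unfamiliar with the statistic, the r values above are Pearson correlation coefficients, which run from -1 (perfect negative association) to +1 (perfect positive association). A minimal Python sketch of the computation, using made-up country-level figures for illustration rather than the study's actual data:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance (numerator) and the two standard-deviation terms (denominator)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data for six imaginary countries: percentage of Christians
# and a human-rights index (higher = more respect for human rights).
pct_christian = [85, 70, 55, 40, 20, 5]
rights_index = [8.0, 7.5, 6.0, 6.5, 4.0, 3.5]

print(round(pearson_r(pct_christian, rights_index), 2))
```

With this toy sample, r comes out strongly positive but below 1, since the relationship is not perfectly linear; the study's reported values should be read the same way.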

One can quibble about the methodology. The study defines human rights largely as the right to make choices on one's own, regardless of existing social norms. Freedom of religion, for instance, is defined as the freedom not only to practice one's religion but also to convert to another. As the authors themselves note, this is not a legitimate freedom in much of the world, unless one is converting to the majority religion. Freedom doesn’t mean that a minority is free to become the majority.

Nonetheless, there does seem to be a correlation between Christianity and respect for human rights as long as we define the latter, at least in part, as maximization of personal choice and autonomy. 

Is Christianity confounded with European ancestry?

Correlation isn't causation. Couldn't Christianity be a proxy for "European-ness"? Indeed, most Christians are at least partly of European ancestry, and even more live in societies founded and still largely run by people of European origin.

To control for this confounding factor, one could compare Christian and non-Christian societies within a region where European ancestry is minimal. Sub-Saharan Africa comes to mind. Even in South Africa, the European minority is down to the single digits.

Rindermann and Carl (2018, p. 60) did make that comparison:

Within sub-Saharan Africa [...] the percentage of Christians is still positively (but weakly) related to human rights (r = .10; N = 48), and the percentage of Muslims is still negatively (but weakly) related to human rights (r = -.12).

Those correlations are indeed weak. Moreover, the one between Christianity and respect for human rights is largely due to the relatively stable societies of southern Africa, i.e., South Africa itself, Namibia, Botswana, Lesotho, and Swaziland. Those societies enjoy a judicial and administrative legacy that may not last much longer, given recent events and the example of Zimbabwe.

To be honest, I feel little in common with fellow Christians like Jacob Zuma and Robert Mugabe. Ironically, both of them have a better claim to being Christian than I do, since I refused to be confirmed after attending my confirmation classes.

Is European ancestry confounded with a genetically influenced trait?

If European ancestry is a confounding factor, could it be a proxy for some unknown genetically influenced trait? Rindermann and Carl tried to answer this question by estimating the average "skin brightness" of each country.

Skin brightness is more highly correlated with human rights than is cranial capacity (r = .25 vs. .18). Of course, skin color itself is unlikely to exert any effect; it constitutes a marker for evolutionary pressures that may be associated with culture. (Rindermann and Carl 2018, p. 53)

This is, I suspect, a reference to Arthur Schopenhauer (1788-1860) and his belief that humans had to become more intelligent as they spread into harsher northern climates: "those tribes that emigrated early to the north, and there gradually became white, had to develop all their intellectual powers, and invent and perfect all the arts in their struggle with need, want, and misery, which, in their many forms, were brought about by the climate. This they had to do in order to make up for the parsimony of nature, and out of it all came their high civilization" (Parerga and Paralipomena, Volume II, Section 92).

Rindermann and Carl seem to be assuming that European skin became white solely as an adaptation to the northern natural environment. They also seem to be assuming that moral development is linked to cognitive development—the very hypothesis they want to test.

A better genetic marker would be the long allele of 5-HTTLPR, a polymorphism of the serotonin transporter gene. This allele is less frequent in collectivistic cultures than in individualistic ones, the latter being the cultures of western and northern Europe, the very cultures that place the highest value on the rights of the individual (Chiao and Blizinsky 2010). In a study of American toddlers, carriers of the short allele were more likely to imitate the behavior of other people (Schroeder et al. 2016).

The study provides additional evidence for the view that Christianity, in itself, doesn't explain why Europeans, and especially northwest Europeans, see all individuals as being endowed with the same rights. Of the three main branches of Christianity, Protestantism has the strongest correlation with respect for human rights (r = .48), followed by Catholicism (r = .42), and finally Orthodoxy (r = -.07) (Rindermann and Carl 2018, p. 52). This suggests that Christianity changed as its geographic center progressively moved from the Middle East to southern Europe and then to northwest Europe, becoming along the way more focused on the individual and on individual responsibility.

Within Christianity, Protestantism stresses conscience, individual guilt, internal control, autonomy and self-responsibility (Weber, 2008/1904). All these traits are conducive for liberty, the rule of law, democracy and human rights (Rindermann and Carl 2018, p. 34)

[...] in Protestant countries, trust is higher, corruption is lower and levels of social and economic freedom are higher (Delhey & Newton, 2005; Harrison, 2013). People tend to be more self-controlled, having internalized social rules, meaning that harsh and violent control by the state is not needed. (Rindermann and Carl 2018, p. 37)

The two authors are aware of the Hajnal Line and its relationship to a suite of psychological and behavioral traits. In societies north and west of a line running approximately from Trieste to St. Petersburg, social relations have long shown a certain pattern:

- men and women marry relatively late

- many people never marry

- children usually leave the nuclear family to form new households

- households often have non-kin members

This is the Western European Marriage Pattern (WEMP). Everyone is single for at least part of adulthood, many stay single their entire lives, and a significant proportion of households have members not belonging to the immediate family or even to kin. In short, an individual is less fettered by the bonds of kinship even within his or her household (Frost 2017).

This led to late marriage, high rates of childlessness (of about half of the cohort), more rights for women, and large investments in education. Going further than Hajnal himself did, it arguably also enhanced delay of gratification, self-control (especially of sexuality), conscientiousness, frugality, industry and cognitive ability. The causes of this marriage pattern can be traced to Roman, Germanic and Christian traditions, to the interests of the church, and to the interests of landlords and guilds. (Rindermann and Carl 2018, p. 39)

The above view is also the one held by hbd chick, i.e., the WEMP developed after the introduction of Christianity and was, at least in part, a consequence of medieval Christian practices and institutions. Yet there is good evidence for the existence of the WEMP as early as ninth-century France and fragmentary evidence even earlier (Frost 2017). I have argued that the arrow of causality points in the other direction: a pre-existing mindset in northwest Europe was carried over into Christianity, much like the Christmas tree and other pagan traditions. Later, as the center of Christendom moved west and north, this mindset gained importance within Western Christianity and pushed it more and more toward the idea of individual salvation and an individual relationship with God.

The northwest European mindset is characterized essentially by four interrelated mental traits:

Independent social orientation - independence of the self from others, including stronger motivation toward self-expression, self-esteem, and self-efficacy and emphasis on personal happiness rather than social happiness. 

Universal rule adherence - capacity to obey universal and absolute moral rules, i.e., moral universalism and moral absolutism, as opposed to situational morality based on kinship. These rules are enforced by monitoring not only others but also oneself. Rule-breakers may be branded as morally worthless and expelled.

Affective empathy - capacity to experience the emotional states of other people in order to prevent harm and to provide help if needed. Help is conditional on the other person being judged morally worthy.

Guilt proneness - capacity to self-monitor thoughts and behavior for rule adherence in order to self-judge and, if necessary, to self-punish.


Are universal human rights truly universal?

If we look at cultures across space and time, we find that the notion of human rights was nonexistent in most cultures and historical periods. Not until the 18th and 19th centuries did some countries codify this notion in law, although it clearly has antecedents that go farther back, at least to the formulation of canon law by the Catholic Church and perhaps farther still. Northwest Europeans seem to have long been predisposed to think in terms of individual rights and universal moral rules.

Since the early 19th century, we in the West have tried to impose these rights on the entire world, initially through the suppression of the slave trade and then through the efforts of missionaries and colonial authorities to ban certain practices, like the custom of sati in India. Such efforts became an integral part of Western imperialism and "the white man's burden."

Although this burden has since been taken up by truly international bodies, like the U.N., the notion of universal human rights still reflects a Western view of people as atomized individuals who mainly seek to maximize their wealth, happiness, and personal autonomy. This is not how most humans view the purpose of existence. For that matter, this view was not originally held by northwest Europeans, whose understanding of moral universalism has steadily radicalized and expanded in scope over time.


Chiao, J.Y. and Blizinsky, K.D. (2010). Culture-gene coevolution of individualism-collectivism and the serotonin transporter gene. Proceedings of the Royal Society B 277: 529-537.

Frost, P. (2017). The Hajnal line and gene-culture coevolution in northwest Europe. Advances in Anthropology 7: 154-174.

Rindermann, H. and Carl, N. (2018). Human rights: Why countries differ. Comparative Sociology 17: 29-69.

Schopenhauer, A. (1974)[1851]. Parerga and Paralipomena, English translation by E. F. J. Payne, Clarendon Press, Oxford, 2 volumes.

Schroeder, K.B., Asherson, P., Blake, P.R., Fenstermacher, S.K., and Saudino, K.J. (2016). Variant at serotonin transporter gene predicts increased imitation in toddlers: relevance to the human capacity for cumulative culture. Biology Letters 12(4).