Thursday, October 10, 2019

Is this the Gay Germ? Part II

Courtyard with Lunatics, Francisco Goya (1746-1828). Why is HIV much more likely to cause cognitive impairment in the body of a gay man than in the body of an intravenous drug user? Has an unknown pathogen been caught in the dragnet of AIDS studies?

My last post focused on certain discrepancies in data on AIDS victims: as antiretroviral therapy becomes more widespread, there has been a decline in opportunistic infections, but the decline hasn't been the same for all pathogens. In particular, some brain infections have shown modest declines or no change at all. 

Has an unknown pathogen been caught in the dragnet of AIDS studies? This pathogen would coexist with HIV only because it, too, is associated with the gay lifestyle. It would not be a "cofactor" that makes the HIV infection worse. In fact, it probably precedes the HIV infection by many years. This unknown pathogen may target certain sites in the brain of its host early in life in order to change his sexual orientation and thereby increase its chances of transmission to another host. It thereafter remains in the background until its host has reached an age when he ceases to be useful. The pathogen is then no longer penalized if it causes damage to surrounding neural tissues. Various neurocognitive disorders could therefore develop in its host from late middle age onward.

AIDS in gay men and intravenous drug users

This post will focus on discrepancies in data from two other papers. The first one is a study of AIDS victims in the Italian city of Bologna. Some of them contracted AIDS via homosexual/bisexual behavior, and some via intravenous drug use. One finding strikes me as unusual: "Compared with injecting drug users, homosexual/bisexual and heterosexual participants had ORs of 9.6 (95% CI, 2.2-42.7) and 6.3 (95% CI, 2.2-18.3), respectively, for cognitive impairment" (De Ronchi et al. 2002).

In other words, among these AIDS victims, the odds of cognitive impairment were almost ten times higher for homosexual/bisexual participants than for intravenous drug users. That finding is curious because the ratio of ten to one doesn't correspond at all to the ratio of homosexuals/bisexuals to intravenous drug users among Italian AIDS cases. In fact, intravenous drug users made up about 60% of those cases in 1997 (Wikipedia 2019). The Bologna study took place between 1994 and 1997.
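For readers unfamiliar with odds ratios, here is a minimal sketch of how a figure like 9.6 (95% CI, 2.2-42.7) is computed from a 2x2 table. The counts below are invented for illustration; they are not the Bologna data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI for a 2x2 table:
       a = exposed & impaired,   b = exposed & not impaired
       c = unexposed & impaired, d = unexposed & not impaired
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR), Woolf's method
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts chosen only for illustration
print(odds_ratio_ci(24, 26, 4, 42))  # OR ≈ 9.7, CI ≈ (3.0, 31.1)
```

A wide confidence interval like the one reported (2.2-42.7) typically reflects small cell counts, i.e., only a handful of impaired intravenous drug users in the sample.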

Why is HIV much more likely to cause cognitive impairment in the body of a gay man than in the body of an intravenous drug user? Do druggies take better care of their mental health? The evidence actually suggests the reverse: HIV-associated dementia seems to progress more rapidly in intravenous drug users (Bouwman et al. 1998). The latter finding also points to a qualitative difference between the two groups: dementia seems to develop more slowly in gay men.


The second paper is a review of studies on HAND [HIV-associated neurocognitive disorders]. It notes that HAND can develop even in individuals on HAART [Highly active antiretroviral therapy] with no detectable traces of HIV:

Furthermore, 21% [of individuals in the CHARTER study] developed HAND despite effective HAART (although the precise number who were aviremic is unclear). Similarly, in a cohort of individuals with AIDS, 21% of aviremic individuals (who also had undetectable CSF HIV RNA) progressed to HAD [HIV-associated dementia]. A third prospective study also identified HAND in 8-34% (depending on the time point of the assessment) of aviremic patients without comorbidities and with a nadir CD4 cell count less than 200 cells/µl (McArthur and Brew 2010)

The authors suggest that HIV can produce irreversible neural damage that becomes noticeable only much later in life. Well, perhaps. Nonetheless, it seems to me more parsimonious to postulate a second pathogen.

Parting thoughts

Clearly, HIV does cause cognitive impairment. The Bologna study showed a strong association between HAND and low CD4 cell counts. But it looks like a certain proportion of HAND cases are due to a cause that exists independently of HIV infection.

Please note: I'm not arguing that HIV is interacting with an unknown pathogen to cause cognitive impairment. I am arguing that these two pathogens impair cognition independently of each other and in different ways. They have only one thing in common: a much higher incidence among gay men than in the general population.

Finally, I'm not arguing that this unknown pathogen is the only cause of male homosexuality. There are likely multiple causes. In a nutshell, male homosexuality seems to be due to a genetic predisposition interacting with something in the environment. The genetic predisposition is a smaller-than-average neuronal population that promotes a heterosexual orientation. Normally, natural selection keeps it from falling below the threshold needed to sustain attraction to women. Certain environmental agents, however, can cause this neuronal population to fall below the threshold: fraternal birth order effects, stressful events during pregnancy, exposure to environmental estrogens during childhood, and, yes, a pathogen.

I don't know whether my views on the "gay germ theory" are consistent with Greg Cochran's. I hope he will deign to provide his comments.


Bouwman, F., R. Skolasky, D. Hes, O. Selnes, J. Glass, T. Nance-Sproson, W. Royal, G. Dal Pan,  and J. McArthur. (1998). Variable progression of HIV-associated dementia. Neurology 50(6): 1814-1820. 

Cochran, G.M., Ewald, P.W., and Cochran, K.D. (2000). Infectious causation of disease: an evolutionary perspective. Perspectives in Biology and Medicine 43: 406-448. 

De Ronchi, D., I. Faranca, D. Berardi, et al. (2002). Risk Factors for Cognitive Impairment in HIV-1-Infected Persons with Different Risk Behaviors. Archives of Neurology 59(5): 812-818.

McArthur, J.C., and B.J. Brew. (2010). HIV-associated neurocognitive disorders: is there a hidden epidemic? AIDS 24(9): 1367-1370.

Wikipedia (2019). HIV/AIDS Public Health Campaigns in Italy

Wednesday, October 2, 2019

Is this the Gay Germ?

Poster for 1997 World AIDS Day (Wikicommons - Neil Curtis, Christian Michelides). Antiretroviral therapy has reduced infections in AIDS victims, but the decline hasn't been the same for all pathogens. Some infections have shown modest declines or no change at all. Could they be due to the "gay germ"?

Male homosexuality has low to moderate heritability (30 to 45%). A recent study in the UK Biobank and 23andMe has identified a number of genetic variants associated with same-sex sexual behavior. Together, they account for 8 to 25% of variation in male and female same-sex behavior (Ganna et al. 2019). There is thus a genetic predisposition, but it's weak and may simply reflect a smaller population of neurons for heterosexual orientation.

So this genetic predisposition seems to be interacting with something in the environment. But what?

There may be different environmental factors. One possibility would be a pathogen that alters its host's sexual orientation in order to enhance its chances of spreading to other hosts. This is Greg Cochran's "gay germ" theory (Cochran et al. 2000).

With the introduction of antiretroviral therapy for AIDS, we may have a chance to identify candidates for the "gay germ." Over time this therapy should reduce the incidence of infections in AIDS victims. Indeed it has, but the decline has been uneven. A retrospective study of AIDS autopsies in Vienna between 1984 and 1999 found a lower rate of decline for infections due to fungi and most bacteria than for infections due to protozoa, viruses, and mycobacteria:

Extracerebral protozoal (Pneumocystis carinii, toxoplasmosis), Mycobacterium avium complex, viral [e.g., cytomegalovirus (CMV)], multiple opportunistic organ and CNS infections, and Kaposi sarcoma significantly decreased over time. There was less decrease in fungal infections, while bacterial organ and CNS infections (except for mycobacteriosis), lymphomas, HIV-associated CNS lesions (around 30%), non HIV-associated changes (vascular, metabolic, etc.) and negative CNS findings (10-11%) remained unchanged. (Jellinger et al. 2000)

These findings are in line with those of a retrospective study of AIDS autopsies in San Diego between 1982 and 1998:

Pneumocystis carinii pneumonia and Mycobacterium avium complex decreased, whereas bacterial infections increased and the frequency of fungal infection remained unchanged over time. (Eliezer et al. 2000)

After the lungs, such pathogens most often target the brain:

This study suggests that despite the beneficial effects of antiretroviral and anti-opportunistic infection therapy, involvement of the brain by HIV continues to be a frequent autopsy finding. (Eliezer et al. 2000).

Similar to a recent autopsy study from San Diego, these data suggest that despite the beneficial effects of modern antiretroviral combination therapy, involvement of the brain in AIDS subjects continues to be a frequent autopsy finding. (Jellinger et al. 2000)

Subjects with brain alterations at an early stage otherwise seemed almost normal:

Of the cases with early brain alterations, systemic opportunistic infections were present in only 5.9% of the cases, neoplasms in 0.5%, and neoplasms and opportunistic infections in 1.7%. (Eliezer et al. 2000)

A few caveats

The change in incidence over time partly reflects differences between fast-developing infections and slow-developing ones. By definition, people succumb more quickly to the former than to the latter. When antiretroviral therapy was still unavailable those infections were the ones that generally killed people with AIDS. Better control of aggressive infections may have also created a better environment for the growth of less aggressive infections.

But ...

It is harder to explain why the brain should remain a major pathogenic target. It is especially hard to explain why subjects with brain alterations at an early stage otherwise seemed almost normal.

Eggers et al. (2017) pointed out another apparent contradiction: HIV-associated neurocognitive disorders (HAND) are continuing to develop in people whose HIV infection is under control.

Despite the brain infection taking place in the days after primary infection, the development of HAND takes years. As an explanation for this ostensible contradiction, it has been suggested that initially, the brain infection is relatively well controlled, while later, there is a quantitative and qualitative breakdown of immune control in the CNS (Eggers et al. 2017)

Some authors have suggested co-infection by the Hepatitis C virus, but Eggers et al. (2017) ruled this out:

While some authors implicated HCV co-infection in the pathogenesis of HAND, a recent large and well-controlled study found no evidence for worse cognitive function in HCV co-infected patients, at least in the absence of liver dysfunction. (Eggers et al. 2017)

Pathogen "X"

Could we be looking at an unknown pathogen that exists independently of HIV? Over the years some have suggested that HIV is not the only pathogen involved in AIDS. If so, pathogen "X" may cause adverse effects that get blamed on HIV, but its relationship with HIV would be incidental, the only common denominator being the gay lifestyle.

I would propose the following scenario. Pathogen "X" enters its host early in life, just in time to alter that person's psychosexual development. From then on it remains in the background and reaps whatever benefit it gets from its behavior manipulation. Past the age of 40 the host becomes less useful, and the pathogen begins to cause more adverse effects, including neurocognitive disorders that are wrongly attributed to HIV.

Pathogen "X" is most likely a fungus. If we go back to the two retrospective studies, the fungal infections were the ones that seemed the least influenced by the introduction of antiretroviral therapy.


Cochran, G.M., Ewald, P.W., and Cochran, K.D. (2000). Infectious causation of disease: an evolutionary perspective. Perspectives in Biology and Medicine 43: 406-448. 

Eggers, C., G. Arendt, K. Hahn, I.W. Husstedt, M. Maschke, et al. (2017). HIV-1-associated neurocognitive disorder: epidemiology, pathogenesis, diagnosis, and treatment. Journal of Neurology 264: 1715-1727.

Eliezer, M., R.M. DeTeresa, M.E. Mallory, and L.A. Hansen. (2000). Changes in pathological findings at autopsy in AIDS cases for the last 15 years. AIDS 14(1): 69-74.

Ganna, A., K.J.H. Verweij, M.C. Nivard, R. Maier, R. Weddow, et al. (2019). Large-scale GWAS reveals insights into the genetic architecture of same-sex sexual behavior. Science 365(6456) 

Jellinger, K.A., U. Setinek, M. Drlicek, G. Böhm, A. Steurer, and F. Lintner. (2000). Neuropathology and general autopsy findings in AIDS during the last 15 years. Acta Neuropathologica 100(2): 213-220.

Wednesday, September 25, 2019

Differences in the genetic architecture of cognition?

Tableau III, Piet Mondrian (1872-1944). The polygenic score can provide a measure of innate cognitive ability in various human populations. However, it is less valid for African Americans, apparently because of differences in the genetic architecture of cognition.

When IQ is measured in European Americans and African Americans, the two groups differ on average by about 15 points. Is the difference genetic? Or is it due to different environments?

After years of debate, we are coming close to an answer. The weight of evidence is shifting, especially because of two unrelated developments: 

- We can now easily measure ethnic ancestry by means of genetic data. Previously, we had to rely on self-report or indirect measures like skin color.

- We can now measure the genetic component of cognitive ability: the polygenic score. This score is a summation of alleles associated with high educational attainment. Initially a crude measure, it is getting better and better as we identify more and more of these alleles.
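The "summation of alleles" is quite literal: a polygenic score is a weighted sum of an individual's counts of effect alleles, with the weights taken from GWAS effect sizes. A minimal sketch follows; the SNP identifiers and weights are made up for illustration and are not the actual educational-attainment score:

```python
# Polygenic score = sum over SNPs of (effect-allele count) x (GWAS effect size).
# SNP names and effect sizes below are hypothetical.
gwas_weights = {"rs0001": 0.021, "rs0002": -0.013, "rs0003": 0.008}

def polygenic_score(genotype):
    """genotype maps SNP id -> number of effect alleles carried (0, 1, or 2)."""
    return sum(gwas_weights[snp] * count for snp, count in genotype.items())

person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(polygenic_score(person))  # 2*0.021 + 1*(-0.013) + 0*0.008 = 0.029
```

This also makes clear why such scores travel poorly across populations: the weights are estimated in one population, and if allele frequencies or linkage patterns differ elsewhere, the same formula captures less of the true genetic signal.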

Both research tools were used in a recent study. Lasker et al. (2019) applied them to the Philadelphia Neurodevelopmental Cohort, a sample of 9421 individuals from the Philadelphia area who received medical care from the Children's Hospital of Philadelphia network. They ranged in age from 8 to 21 with a mean of 14.2. They were 51.7% female, 55.8% European American, 32.9% African American, and 11.4% other. All of them were genotyped and given a series of cognitive tests.

This dataset had advantages over those of previous studies:

- All participants came from the same geographic area. 

- Heritabilities of cognitive ability were already estimated by another research team, specifically 0.61 for the African American participants and 0.72 for the European American participants.

- Skin, hair, and eye color could be estimated from the genetic data to control for the effects of "colorism" (discrimination favoring lighter-skinned over darker-skinned African Americans)

- Polygenic scores could be calculated from the genetic data

The main disadvantage was the participants’ young age. Before adulthood the brain is plastic and still developing, so the heritability of cognitive ability is lower. 

IQ results

IQ scores were 100 for European Americans, 98 for self-described biracial Americans, and 85 for African Americans. The three groups were respectively 99% European, 80% European, and 19% European.

African Americans only

European admixture significantly correlated with IQ among the African American participants. The correlation remained significant after controlling for either skin color or socioeconomic status. Interestingly, skin and hair color didn't significantly correlate with IQ independently of European admixture, but eye color did. Brown eyes correlated positively with IQ. No explanation was offered by the authors. Did they get the same finding with European Americans?

Biracial Americans only - smarter than expected

As with African Americans, skin color didn't seem to influence intelligence independently of European admixture. On the other hand, "biracial status had a significant effect independently of European ancestry." In other words, racially mixed individuals who identified equally as African American and European American, and not just as African American, tended to be more intelligent than their degree of European admixture would predict.

The term "biracial" as a badge of identity is recent and seems to be most popular among middle-class people: 

Interestingly, many of the respondents here who identify as biracial are middle class, educated in private schools, and raised in predominantly white neighborhoods with mostly white social networks (Rockquemore and Brunsma 2008, p. xxii)

It may be, then, that self-identified "biracial" people have parents who are, on average, of higher quality than other people of the same racial background.

African Americans, Biracial Americans, and European Americans combined

When all three groups were combined, the most important factor was European admixture. Next came socioeconomic status, which correlated with cognitive ability independently of European admixture. Finally, self-identification as a European American had an effect over and above that of European admixture. The last factor suggests that European American culture has a positive influence on cognitive ability.

Polygenic score results

The polygenic scores ran into a problem that others have noted: the genetic architecture of cognition seems to be different in African Americans. This is a problem because researchers have used only Europeans or European Americans to identify genetic variants that are associated with high educational attainment. Those variants did correlate with cognitive ability in the African American sample, but to a much lower degree than in the European American sample. Their validity as a measure of cognitive ability was only 20% of what it was in the European American sample.

The authors used a subset of the same variants to create a polygenic score that would be less sensitive to linkage disequilibrium decay and thus more valid across different human populations. This polygenic score had good validity in both the African- and European-American samples (r = 0.112 and r = 0.227 respectively).

The authors then tried to create an even better polygenic score by excluding variants that are rare in African Americans. There was no effect on the results for either African Americans or European Americans. But what about the reverse? Perhaps cognitive ability is improved by some variants that are rare in European Americans but common in African Americans.


This is an excellent study, on a par with the Minnesota Transracial Adoption Study (Frost 2019). The main problem is the participants’ young age. Had adults been used, there would have been less noise in the data, and the results would have been better.

Another problem is the apparently different genetic architecture of cognition in people of sub-Saharan African origin. Piffer (2019) has noted that polygenic scores underestimate African American IQ. He disagrees, however, with the "different genetic architecture" hypothesis, pointing out that no divergence exists between mean population IQ and the polygenic scores he has calculated for various sub-Saharan African groups (Esan, Gambians, Luhya, Mende, Yoruba). None of them, however, are Igbo, and the Igbo are really the one group that stands out from other West Africans on measures of intellectual and educational attainment (Frost 2015a, 2015b). They also contributed to the gene pool of African Americans: "Many of the enslaved Igbo people in the United States were concentrated in Virginia's lower Tidewater region and at some points in the 18th century they constituted over 30% of the enslaved black population" (Wikipedia 2019).

While the polygenic score is a good measure of raw cognitive ability in most humans, we need to develop a modified version for people whose ancestry comes primarily from sub-Saharan Africa.


Frost, P. (2015a). The Jews of West Africa? Evo and Proud, July 4 

Frost, P. (2015b). No, blacks aren't all alike. Who said they were? Evo and Proud. October 10. 

Frost, P. (2019). IQ of biracial children and adults. Evo and Proud. March 10.

Lasker, J., B.J. Pesta, J.G.R. Fuerst, and E.O.W. Kirkegaard. (2019). Global ancestry and cognitive ability. Psych 1(1) 

Piffer, D. (2019). Evidence for Recent Polygenic Selection on Educational Attainment and Intelligence Inferred from Gwas Hits: A Replication of Previous Findings Using Recent Data. Psych 1(1): 55-75  

Rockquemore, K.A. and D.L. Brunsma. (2008). Beyond Black. Biracial Identity in America. Lanham: Rowman & Littlefield Publishers.

Wikipedia. (2019). Igbo Americans

Wednesday, September 18, 2019

Have we been selected for long-term thinking?

GDP per capita as a function of future orientation (Preis et al. 2012)

To what degree do we value the short term over the long term? The answer varies not only from individual to individual but also from society to society. Hunter-gatherers, for instance, value the short term. Perishable food cannot be stored for future use and, in any case, is not normally obtained in large enough amounts to make storage worthwhile. If a hunter gets more meat than his family can consume, he'll give it away to others in the local band.

There are exceptions, especially at northern latitudes. Meat can be stored in caches during winter and in cold lake waters during summer. With limited opportunities for food gathering, women specialize in technologies that need more cognitive input and longer-term thinking, like garment making, needlework, weaving, leatherworking, pottery, and use of kilns. Finally, men hunt over longer distances and therefore plan over the longer term. Northern hunting peoples thus broke free of the short-term mental straitjacket imposed by hunting and gathering. In time, their descendants would spread south and rise to the challenges of social complexity (Frost 2019).

Those northern hunting peoples were better able to exploit the opportunities created by farming, but the transition from one lifestyle to the other was still far from easy. Farming requires not only longer-term thinking but also less monotony avoidance and higher thresholds for expression of personal violence. In recent times, hunter-gatherers usually refused offers to be settled on farms. They saw farming as akin to slavery.

The change in mindset didn't end with the transition to farming. There were different types of farming, and some required longer-term investment than others. Those types generated stronger selection for future orientation.

Language as a mirror of cultural evolution

Galor et al. (2018) argue that language is a mirror of cultural evolution. It can show a society’s degree of commitment to a long-term mindset, as well as other psychological traits.

The periphrastic future tense

The authors studied the relationship between future orientation and forms of the future tense that express intention and obligation, rather than simply prediction:

Languages differ in the structure of their future tense. In particular, linguists distinguish between languages that are characterized by an inflectional versus periphrastic future tense [...]. Inflectional future tense is associated with verbs that display morphological variation (i.e., a change in the verb form that is associated with the future tense). In contrast, periphrastic future tense is characterized by roundabout or discursive phrases, such as `will', `shall', `want to', `going to' in the English language [...] (Galor et al. 2018, p. 6)

[U]nlike the inflectional future tense, the periphrastic future tense is formed by terms that express a desire, an intention, an obligation, a commitment as well as a movement towards a goal. In particular, in the English language, "shall has developed from a main verb meaning 'to owe', will from a main verb meaning 'to want', and the source of be going to is still transparent" [...]. Moreover, "intention and prediction are most commonly expressed by the periphrastic future, while the synthetic one is more common in generic statements, concessives, and suppositions" [...]. Inflectional futures "also appear systematically (often obligatorily) in sentences which express clear predictions about the future (which are independent of human intentions and planning), whereas less grammaticalized constructions [i.e., periphrastic] often tend to be predominantly used in talk of plans and intentions - a fact which is explainable from the diachronic sources of future tenses" [...] (Galor et al. 2018, p. 6)

Galor et al. (2018, p. 16) used pre-1500 AD data to estimate the return on agricultural investment ("crop return") in the homeland of a language’s speakers. They found a positive correlation between this return on investment and the existence of a periphrastic future tense. They concluded that "a one standard deviation increase in crop return in the language's contemporary homeland is associated with a 6 percentage points increase in the probability that the language is characterized by a periphrastic future tense."

Using the World Values Survey, the authors also found a positive correlation between the existence of a periphrastic future tense and future orientation. The correlation held true both for the people of the world as a whole and for Old World peoples who speak languages originating in the Old World (Galor et al. 2018, p. 23).

Interestingly, the return on agricultural investment did not correlate with other linguistic characteristics, like the existence of the past tense or the perfect tense, the existence of possessive classifications, the existence of coding for evidentiality, the number of consonants, and the number of colors (Galor et al. 2018, pp. 18-19).

Grammatical gender

The authors also looked into the relationship between grammatical gender and the sexual division of labor in a language's homeland:

Further, consider ancient civilizations that had been characterized by a sexual division of labor and consequently by the existence of gender bias. Linguistic traits that had fortified the existing gender biases have plausibly emerged and persisted in these societies over time. In particular, geographical characteristics that had been associated with the adoption of agricultural technology that had contributed to a gender gap in productivity, and thus to the emergence of distinct gender roles in society (e.g., the suitability of land for the usage of the plow […]), may have fostered the emergence and the prevalence of sex-based grammatical gender in the course of human history. (Galor et al. 2018, p. 2)

Galor et al (2018, p. 24) found a negative correlation between grammatical gender and “plow negative” crops (i.e., crops not requiring use of the plow and, hence, requiring less male participation). A one standard deviation increase in the potential caloric yield of plow negative crops was associated with a 13 percentage point decrease in the probability that the language has grammatical gender.  The correlation was reversed in the case of all crops, the caloric yield now being associated with a 17 percentage point increase in the probability that the language has grammatical gender.

Politeness distinctions in pronouns

Finally, Galor et al. (2018) looked into the relationship between politeness distinctions in pronouns and ecological diversity, which they related to the emergence of hierarchical societies.

Linguistic traits that had reinforced existing hierarchical structures and cultural norms had conceivably emerged and persisted in these stratified societies in the course of human history. In particular, politeness distinctions in pronouns (e.g., the differential use of "tu" and "usted" in the Spanish language, "Du" and "Sie" in German, and "tu" and "vous" in French) had conceivably appeared and endured in hierarchical societies. Thus, geographical characteristics, such as ecological diversity that had been conducive to the emergence of hierarchical societies (Fenske, 2014), may have contributed to the emergence of politeness distinctions. (Galor et al. 2018, p. 2)

Galor et al. (2018, p. 32) found a significant relationship between politeness distinctions and ecological diversity in a language's homeland. A one standard deviation increase in ecological diversity corresponded to a 15 percentage point increase in the probability that the language has politeness distinctions.

I'm skeptical about the last finding. Is ecological diversity conducive to hierarchical societies? The authors refer to a study that mostly uses African data. More to the point, the study seeks to link ecological diversity to centralized states. Centralization of state power and social hierarchization are not the same thing. Japan, for instance, had a weak central state for much of its history and yet was very hierarchical, as seen in the politeness distinctions of the Japanese language.


Although the authors refer to work by L.L. Cavalli-Sforza, Peter Richerson, and Robert Boyd on gene-culture coevolution, they avoid discussing the possibility that selection for future orientation, gender specialization, and hierarchical politeness has influenced not only culture and language but also human biology. The coevolution they propose is simply between culture and language. It can be summed up as follows:

- Certain patterns of mind and behavior have been favored to varying degrees in different societies.

- These cultural patterns are transposed into language.

- Language then reinforces those cultural patterns: "In light of the apparent coevolution of cultural and linguistic characteristics in the course of human history, emerging linguistic traits have conceivably reinforced the persistent effect of cultural factors on the process of development" (Galor et al. 2018, p. 1).

Language is not a passive mirror of culture. It can also act upon culture. For instance, the way we perceive the future, and its relative importance to us, may be shaped by the way we speak. This is of course the Sapir-Whorf hypothesis.  In a farming society, the periphrastic future tense might make it easier to envision farming methods and technologies that pay off over the longer term. Similar arguments have been made for grammatical gender and politeness distinctions. The way we speak influences our thoughts and behavior.

Again, the authors leave it to the reader to go one step farther: patterns of mind and behavior may influence the frequencies of alleles in the gene pool.


Fenske, J. (2014). Ecology, trade, and states in pre-colonial Africa. Journal of the European Economic Association 12(3): 612-640. 

Frost, P. (2019). The Original Industrial Revolution. Did Cold Winters Select for Cognitive Ability? Psych 1(1): 166-181 

Galor, O., O. Özak, and A. Sarid. (2018). Geographical Roots of the Coevolution of Cultural and Linguistic Traits (November 7, 2018). Available at SSRN.

Preis, T., H.S. Moat, H.E. Stanley, and S.R. Bishop. (2012). Quantifying the advantage of looking forward. Scientific Reports 2: 350

Wednesday, September 11, 2019

Why is vocabulary shrinking?

Vocabulary decline in adult non-Hispanic White Americans (controlled for years of education completed)

"Are Americans more intelligent than a few decades ago, or less intelligent?" So asks psychologist Jean Twenge in her introduction to a recent paper on vocabulary decline in Americans. The findings are disconcerting, to say the least:

We examined trends over time in vocabulary, a key component of verbal intelligence, in the nationally representative General Social Survey of U.S. adults (n=29,912). Participants answered multiple-choice questions about the definitions of 10 specific words. When controlled for educational attainment, the vocabulary of the average U.S. adult declined between the mid-1970s and the 2010s. Vocabulary declined across all levels of educational attainment (less than high school, high school or 2-year college graduate, bachelor's or graduate degree), with the largest declines among those with a bachelor's or graduate degree. (Twenge et al. 2019)

The last decline was especially large: more than half a standard deviation. In general, vocabulary test scores have fallen by 8.5%. Ethnic change doesn’t seem responsible, since non-Hispanic whites have had almost the same decline: 7.2%.

So what's going on? The authors first consider the explanation raised in their introduction: Americans have become less intelligent despite the increase in education.

First, Americans' vocabularies might be shrinking despite the increase in education. This is plausible given the steep decline in the amount of time high school students spend reading [...] and the decline in SAT verbal scores over time [...]. This explanation could account for the narrowing of abilities between those without high school educations and those with college educations. The difference in vocabulary by education was approximately 3.4 correct answers in 1974-79 but dropped to 2.9 correct answers by 2010-16. However, this explanation would not account for the decline in performance in all educational groups. (Twenge et al. 2019)

Uh, why not? The last sentence makes sense if the explanation is simply that postsecondary education has become less effective. But what if vocabulary has declined because the capacity for learning words and retaining them has also declined? The cause may be genetic. Can we at least ask that question?

Lower admission standards? Mismatch between cause and effect

The authors then consider another explanation: because college admission standards have been lowered, people of lower ability have been going on to postsecondary education in larger numbers; those who don't are increasingly the least able.

If education does not improve vocabulary, but educational attainment increases, those with higher ability will be increasingly selected into the higher education groups, leaving those with lower ability in the lowest educational attainment groups. Thus, the no high school degree group will be left with those of lowest ability, and the college graduate group will have absorbed more with only moderate ability. (Twenge et al. 2019)

That explanation is popular, but it doesn’t really match the findings. The vocabulary decline was steepest during the late 1970s and early 1980s. It then levelled off. A second decline may have begun in 2008, but it’s still too early to say (see Figure 1 reproduced above). Most of the decline doesn't correspond to any previous change in college enrollment by recent high school graduates. The enrollment rate rose slowly from 45.7% in 1959 to 49.4% in 1980. It began to grow faster only in the mid-1980s, breaking through the 60% level in 1991 and the 70% level in 2009 (Bureau of Labor Statistics 2010).

So the alleged cause doesn’t match the presumed effect. The steep increase in college enrollment from the mid-1980s onward could not have caused the steep vocabulary decline during the late 1970s and early 1980s. Keep in mind that most of the GSS respondents had completed their education some years earlier, almost ten years earlier on average. So the average respondent in the late 1970s had to meet college admission standards that existed in the late 1960s.

Most of the decline has been among early boomers

Because the GSS was first administered in 1974, we don't know when the steep vocabulary decline began. But we do know when it ended: in the mid-1980s, among respondents who were born on average thirty years earlier. A genetic cause would imply a rapid deterioration in the gene pool from 1945 to 1955 and a slower deterioration thereafter. I have no idea what that cause could be.

If we're looking for a cultural cause, it would have acted most strongly on the same cohort of "early boomers." Perhaps it was their increasing exposure to TV and their decreasing exposure to high literature. Those cultural changes were already a fait accompli for "late boomers," who experienced a more gradual dumbing down of vocabulary on TV and in print. The post-2008 vocabulary decline, if it’s real, might reflect the growing importance of iPhone texting since the late 2000s.

That cultural explanation has some support from the data and is favorably mentioned by the authors. For one thing, comparison with the results of another test (WAIS) suggests that the decline has been mostly in passive vocabulary, i.e., the words we understand but don’t use spontaneously in speech (Twenge et al. 2019). We’re less proficient in "bookish" language:

Perhaps American culture became less intellectual, either because of or in response to a lowering of verbal ability among those who read books. Authors aim to sell more copies of their books, and thus may adjust their vocabulary level to the skills and preferences of a wider slice of the population. Or, perhaps authors lowered the vocabulary level of their books for some other reason such as an interest in getting out a message without linguistic complexity getting in the way. For example, the Bible has been revised repeatedly to make it more accessible with the King James Version, the most complex and lyrical English language version, being succeeded by the simpler New International Version, Living Bible, and New Revised Standard Version. (Twenge et al. 2019)

The last point rings true. When I was studying Shakespeare in high school, my mother could explain words I had trouble understanding. She had never gone beyond Grade 10, but she could read the Bible in the King James Version, as well as a lot of high-brow literature. This was true for many ordinary adults in the 1970s. Today, regular reading of the Bible is unusual and almost always confined to modern English versions.


Yes, college has become a less interesting place for learning vocabulary, and for learning in general. Yes, a big reason is the growing number of students who don’t really belong there, and the consequent lowering of standards. Yes, America’s cultural and linguistic mix is changing, and for that reason alone the average American would have a smaller English vocabulary.

Nonetheless, those factors fail to explain why non-Hispanic white Americans know fewer words today than they did a half-century ago, especially in their passive vocabulary. Something else is going on, and it seems to be a shift away from high literature and toward simpler audiovisual media: TV, video, text messaging …


Bureau of Labor Statistics. (2010). College enrollment up among 2009 high school grads. TED: The Economics Daily. April 28

Twenge, J.M., W.K. Campbell, and R.A. Sherman. (2019). Declines in vocabulary among American adults within levels of educational attainment, 1974-2016. Intelligence 76: 101377

Wednesday, September 4, 2019

Why podophilia?

Repos dans les récoltes – William-Adolphe Bouguereau (1825-1905)

What is it about women's feet? They are the part of a woman’s body that men most often fetishize. A study on the frequencies of different fetishes concluded: "Feet and objects associated with feet were the most common target of preferences [...] We found podophilia prominent (about half of Feticist groups subscribers) in our sample" (Scorolli et al. 2007). 

That finding is in line with many others:

- Podophilia was common in a sample of male adolescents and young adults with either autistic disorder (AD) or borderline/mild mental retardation (MR): "Partialism (a sexual interest in body parts) was common in the AD group: four individuals got sexually aroused by body parts (three by feet, one by bellies) compared to none of the MR group" (Hellemans et al. 2010). 

- A former escort girl and stripper "reported that [her] most frequent requests were (1) those involving a foot or shoe fetish, (2) those to sell to the male client her underwear, and (3) those to urinate into her underwear before selling it to the client" (Cernovsky 2015). 

- Online searches that include the term "fetish" most often co-occur with the term "foot" (Anon 2007).

- The Austrian psychologist Wilhelm Stekel noted that "the most widespread form of partialism is preference for feet" (Stekel 1952, p. 169).

Female feet have been eroticized even by Victorian writers like George du Maurier (1834-1896):

"That's my foot," she said, kicking off her big slipper and stretching out the limb. "It's the handsomest foot in all Paris. There's only one in all Paris to match it, and here it is," and she laughed heartily (like a merry peal of bells), and stuck out the other.

And in truth they were astonishingly beautiful feet, such as one only sees in pictures and statues—a true inspiration of shape and color, all made up of delicate lengths and subtly modulated curves and noble straightnesses and happy little dimpled arrangements in innocent young pink and white. (Du Maurier 1894, p. 174)

The cause?

There has been a lot of speculation. Ramachandran and Hirstein (1998) attributed podophilia to accidental cross-talk between adjacent regions of the cortex:

In the Penfield homunculus the genitals are adjacent to the foot and, as one might expect, we found that two [amputee] patients reported experiencing sensations in their phantom foot during sexual intercourse. [...] (One wonders whether foot-fetishes in normal individuals may also result from such accidental 'cross wiring'—an idea that is at least more plausible than Freud's view that such fetishes arise because of a purported resemblance between the foot and the penis.)

Actually, Sigmund Freud proposed three hypotheses. He listed them in a footnote and apparently had no strong opinions on the subject. His first hypothesis was that feet are fetishized because they are strong-smelling. His second was that “[t]he foot replaces the penis which is so much missed in the woman.” Finally, he suggested that foot fetishism arises from male desire being redirected away from the female genital area because of “prohibition and repression” (Freud 1920, n19).

The third hypothesis seems to me the most interesting. A young man may focus on a woman’s feet because he cannot look too long at other parts of her body, either out of social discomfort (in the case of her face or her breasts) or because they are concealed by clothing. He therefore looks at a body part that is exposed and freely observable. This is especially true in societies where an unmarried woman is expected to cover herself when seen by a man from outside her family (i.e., neither her father nor her brothers). Only her face, hands, and feet may be seen, and sometimes not even her face. Her feet thus become a focus of male erotic interest and sexual fantasizing. With repeated reinforcement and conditioning, they may even become a primary source of sexual arousal.

The reinforcement and conditioning hypothesis has two problems: 

1. In Western societies, socks and other footwear have been worn indoors and out since the eighteenth century, while women’s arms, legs, and upper chests have been increasingly bared since the early twentieth century. If feet no longer rank among the top three areas of exposed female skin, shouldn’t podophilia be much less common nowadays?

2. Although puberty seems to be key to the development of podophilia, a survey of foot fetishists showed that about half of them remembered feeling attraction to feet at earlier ages:

45 per cent thought that the fetishism was linked to pleasurable experiences during childhood. Many men had their first feelings of sexual pleasure with a member of the family's feet (fathers, uncles, brothers), the experience connected to innocent activities such as tickling or washing feet [...] (Peakman 2013, p. 379)

The mental circuitry thus seems to be already in place by childhood, at an age when sexual fantasizing is still rudimentary at best.


Perhaps some of that circuitry has become hardwired, through a process of gene-culture coevolution. In societies where young unmarried women had to conceal most of their body surface from public view, foot fetishizing may have developed as a safe form of premarital eroticism. That kind of social environment rewarded “good boys” who played by the rules of premarital sexuality, while penalizing “bad boys” who didn’t. The first group would tend to have a certain mix of hardwired sexual predispositions: not only inhibition of overt sexual interest but also displacement of sexual interest into areas that are not socially penalized. Over time, and with each passing generation, those predispositions would have become prevalent in the gene pool.

That sounds weird, but we see such hardwiring in the courtship behavior of other animals, which typically try to attract a potential mate through behavioral patterns drawn from other areas of social interaction, such as between a mother and her infants. In some cases, courtship can incorporate stress-induced behavior. Feelings of stress cause the male to preen himself, and preening thus becomes a regular and expected part of courtship, at which point there is strong selection to make it a hardwired component of the behavioral sequence (Manning 1972, pp. 112-118).

That kind of opportunism seems to characterize much of our sexual behavior. Kissing, for instance, was initially done only between a mother and her infants or as a gesture of respect between a subordinate and his superior. It then became sexualized in some societies but not in others (Frost 2015). Did podophilia follow a similar evolutionary path? Did it begin as a side effect of sexual repression and only later become incorporated into love play? Like kissing, it may have developed as a safe alternative to sexual intercourse. Unlike kissing, it has not reached the same level of social acceptance. Keep in mind that even kissing is frowned upon in many societies.

To test this hypothesis, we need cross-cultural data. Is podophilia more frequent in those societies where, at least until recent times, most of a woman’s body surface was hidden from the gaze of male strangers?


Anon. (2007). The AOL Search Data: Self Identified Fetishers. Accessed September 4, 2019

Cernovsky, Z.Z. (2015). Fetishistic Preferences of Clients as Ranked by a Sex Worker. Journal of Sex & Marital Therapy 42(6): 481-483. 

Du Maurier, G. (1894). Trilby. Harper’s New Monthly Magazine. January 88(524): 168-189.

Freud, S. (1920). Three contributions to the theory of sex. Second edition. New York: Nervous and Mental Disease Publishing Co. 

Frost, P. (2015). Not everyone does it. Evo and Proud, July 18

Hellemans, H., H. Roeyers, W. Leplae, T. Dewaele, and D. Deboutte. (2010). Sexual Behavior in Male Adolescents and Young Adults with Autism Spectrum Disorder and Borderline/Mild Mental Retardation. Sexuality and Disability 28(2): 93-104.

Manning, A. (1972). An Introduction to Animal Behaviour. 2nd edition. London: Edward Arnold.

Peakman, J. (2013). The Pleasure's All Mine. A History of Perverse Sex. London: Reaktion Books

Ramachandran, V.S. and W. Hirstein. (1998). The perception of phantom limbs. The D. O. Hebb lecture. Brain 121: 1603-1630.

Ribeyrol, C. (2015). 'The Feet of Love': Pagan Podophilia from A.C. Swinburne to Isadora Duncan. Miranda 11

Scorolli, C., S. Ghirlanda, M. Enquist, S. Zattoni, and E.A. Jannini. (2007). Relative prevalence of different fetishes. International Journal of Impotence Research 19: 432-437.

Stekel, W. (1952). Sexual aberrations: The phenomena of fetishism in relation to sex (Vol. 1) (Trans., S. Parker). New York: Liveright Publishing Corporation.

Wednesday, August 28, 2019

The Japanese alternative

Japan is robotizing not only manufacturing but also the service sector (Wikicommons - Michael Ocampo)

In my last two posts I argued that South Korea has embraced not only ultra-low fertility but also mass immigration. In this, it has more in common with Western Europe and North America than with neighboring China and Japan.

China is out of step with Western immigration policy for understandable reasons: it is only now exhausting its reserves of cheap labor and, furthermore, has problematic relations with the West. But those reasons hardly apply to Japan—a Western ally with fewer and fewer people of working age. Yet that country has been going its own way on immigration, just as it has in other areas, notably automation and robotization.  In the West, robotics research is a low priority, except for military applications. In Japan, it is a high priority and has the stated aim of staving off immigration:

"Japan's push for automation has historically been driven by political and social resistance to large-scale immigration by non-Japanese, rooted in the idea that there would be a deep cultural incompatibility with such immigrants," says Grant Otsuki, a lecturer in cultural anthropology at Victoria University of Wellington.

"In contrast, robots are generally seen as compatible with tradition and culture, or at least 'neutral', and therefore more acceptable than immigrants." (Townsend 2019)

Robotic beings have a good image in Japan, as shown by a spate of films and TV series in which a shy boy falls in love with a female android: Chobits (2002), Cyborg She (2008), and Q10 (2010). In contrast, we see a darker image in Western movies, such as the Terminator series, Ex Machina (2015), and Blade Runner (1982 and 2017).

Keep in mind that culture is upstream from policy. If you think movies are made only to provide entertainment, you probably also believe that newspapers serve only to cover the news and that advertisements are used only to sell a specific good or service. Culture is an effective way to shape future policies.

South Korea and Japan: different responses to the same demographic crisis

The South Korean response

Although South Korea and Japan face the same demographic crisis, i.e., an aging society and a low birth rate, they have responded in very different ways. South Korea has greatly liberalized its immigration policy, both in law and in enforcement of the law. Since 1997 the country has opened up its labor market to guest workers and has relaxed enforcement to the point that half of all migrants are undocumented (Moon 2010).

Song (undated) sees a link between the beginning of large-scale labor immigration to his country and the IMF bailout of 1997. However, the "Memorandum on the Economic Program," written by the South Korean government in response to the IMF, says nothing specific about immigration. There is only a promise to implement "labor market reform" and take "further steps to improve labor market flexibility" (IMF 1997). Perhaps other promises were made off the record.

To gain support for large-scale immigration, the government began to promote multiculturalism from 2006 onward:

But the South Korean media also began to host fervent discussions of multiculturalism. In 2005-2006, the number of articles on the topic tripled from previous years. The media shift was echoed by a change in policy from the top, initially driven by President Roh Moo-hyun. The campaign then crossed ministerial divisions and party lines, surviving the changeover from the liberal Roh administration of 2003-2008 to the more conservative administration of President Lee Myung-bak. Lee's government sought both to persuade the public to embrace immigrants and to promote integration by educating new foreign-born brides in the intricacies of Korean culture. The Ministry of Gender Equality and Family simultaneously started a campaign to persuade the public to accept multiculturalism. Immigration commissioners and the presidential committee on aging set multiculturalism as a national priority to combat a maturing society. South Korea was to become a "first-class nation, with foreigners" — a phrase echoed throughout government documents and speeches. (Palmer and Park 2018)

Watson (2010) ascribes this new policy to the neo-liberalism that has dominated both the Right and the Left, particularly since the IMF bailout of 1997:

For the conservative government, South Korean nationalism and democracy is fundamentally tied to the doctrine of neo-liberalism. Neo-liberalism refers to the flow of economic migrant labour and mobile global capital. This global environment also requires government policies to attract foreign migrants and workers into South Korea's economy and society.

Multiculturalism is a state-led response to these global changes. The policies of multiculturalism define the present and future economic, security and cultural national strength of South Korea. Critics suggest that, in fact, the GNP regards multiculturalism as an instrumental policy of increasing national state power in this global environment. (Watson, 2010)

The GNP is the Grand National Party. It dominates the political right and resembles mainstream Republicanism in the United States.

The Japanese response

Meanwhile, Japan has been much less willing to open its borders, despite being East Asia's primary destination for foreigners. Its illegal immigrant population has actually declined through stronger law enforcement, and legal immigrants have been mostly overseas Japanese from Latin America. Last year, however, its parliament passed a law to bring in foreign workers for jobs in construction, agriculture, the hotel industry, cleaning, and elder care. Initially, 500,000 were slated to come over the next five years, but the total was cut to 345,000 (Denyer 2018; Nikkei 2018; Shigeta 2018).

Those numbers are still much lower than the 2.4 million foreign workers currently in South Korea, a much smaller country in size and population. In addition, Japan's guest workers will be paid the same as Japanese doing the same work (Denyer 2018). This is in stark contrast to South Korea, which has the largest wage gap between local and immigrant labor in the OECD (Hyun-ju 2015).

Japan is still criticized for not opening up enough. One example is this Washington Post article, whose author warns the U.S. against becoming another Japan:

Now, to be clear, Japan is a wondrous nation, with an ancient, complex culture, welcoming people, innovative industry — a great deal to teach the world. But Japan also is a country that admits few immigrants — and, as a result, it is an aging, shrinking nation. By 2030, more than half the country will be over age 50. By 2050 there will be more than three times as many old people (65 and over) as children (14 and under). Already, deaths substantially outnumber births. Its population of 127 million is forecast to shrink by a third over the next half-century. (Hiatt 2018)

Robotization may make life easier for Japan's growing numbers of elderly but will it pay for their pensions? Mind you, the same sort of question could be asked about low-wage immigration to the U.S.

Why is Japan so different?

A key reason seems to be a high degree of cultural autonomy and a correspondingly high degree of cultural isolation. The term "isolation" might seem strange for a country that does so much importing and exporting. Nonetheless, manufactured goods are not the same as beliefs. The latter are distributed not via shipping containers but through shared language and through shared discourse spaces in academia, entertainment, and the media.

Poor knowledge of English

English has become the language of globalism, and knowledge of English correlates worldwide with public acceptance of core globalist beliefs. In Japan, English is not widely used or understood, even among the well-educated:  

Although English is a compulsory subject in junior high and high school in this country, Japanese still have a hard time achieving even daily conversation levels. According to the most recent EF English Proficiency Index, the English level of Japanese is ranked 35th out of 72 countries. The top three are the Netherlands, Denmark and Sweden, which are all northern European nations. Among Asian countries, Singapore is placed sixth, Malaysia 12th, the Philippines 13th, India 22nd and South Korea 29th. Japan places between Russia and Uruguay. (Tsuboya-Newell 2017)

Sullivan and Schatz (2009) found that attitudes toward learning English correlated negatively with patriotism (defined as positive identification and affective attachment to one's country) and positively with nationalism, internationalism, and pro-U.S. attitudes. Here, "nationalism" is defined as "perceptions of national superiority and support for national dominance"—what Steve Sailer has dubbed "Invade the world, invite the world!" 

Relative isolation of academia

Academia can propagate a new discourse in several ways:

- by inculcating it in young adults.

- by acting as a trusted gatekeeper that serves to distinguish between "correct" and "incorrect" discourse.

- by mobilizing scarce intellectual resources for the development and dissemination of "correct" discourse.

New forms of discourse, like globalism, cannot easily penetrate Japanese colleges and universities by means of overseas-trained leaders. Unlike their counterparts in many other Asian countries, Japanese educational authorities prefer to select future leaders from within, attaching little importance to foreign experience and credentials for promotion within the system (Yonezawa et al. 2018, p. 235). Foreign-born professors are hired mostly to teach English language and literature.

Relative isolation of policy makers

This relative isolation is true for Japanese in general, including policy makers. International organizations, like the IMF, have little input into public policy, in large part because Japan's debt is almost wholly Japanese-owned. This economic independence has been a longstanding characteristic of Japan and enjoys support not only from the political left but also from the political right:

In Japan, unlike many of the social democracies resisting capital movements, the most important political opposition came not from organized labor and a political Left anxious to prevent capital flight and to protect the welfare state; rather, it came from nominally "conservative" politicians; many bureaucratic agencies, including the MOF; and protected, cartelized sectors of the economy, including banks, securities houses, and insurance firms. (Pempel 1999, p. 911)

In the West, globalism coopted first the Right and then the Left. That process is still at an early stage in Japan.


I would like to conclude with three points: 

- Japan will be a nice place to visit during the troubled 2020s. The same decade will see South Korea become more and more like the West, especially the United States—in keeping with stated policy goals.

- English is the language not only of globalism but also of anti-globalism. Just as Japan will move toward globalism more slowly than the West, it will also move away more slowly ... when that time comes. As for South Korea, it will enter a period of polarization, perhaps violent polarization.

- Japan shows that the Western model of modernity is not the only one, or even the best. The Western model is a product of specific circumstances, particularly the presence of a large rentier class that feeds on growth while doing little to make growth sustainable. At home and abroad, our rentier class continually pushes for high rates of growth through expansion of the money supply, through mass immigration, and through rapid exploitation of resources that are either non-renewable or slowly renewable. 

Japan's slow-growth model is problematic in other ways, but it promises to be more sustainable in the long run.


Denyer, S. (2018). Japan passes controversial new immigration bill to attract foreign workers. The Washington Post. December 7  

Hiatt, F. (2018). Anti-immigration Republicans have a decision to make about America's future. The Washington Post, January 2018.

Hyun-ju. (2015). Korea's wage gap between local, foreign workers largest in OECD. The Korea Herald, September 9  

IMF (1997). Memorandum on the Economic Program. December 3.  

Moon, S. (2010). Multicultural and Global Citizenship in the Transnational Age: The Case of South Korea. International Journal of Multicultural Education 12: 1-15. 

Nikkei (2018). Abe vows to bring in more foreign workers. Nikkei Asian Review. June.  

Palmer, J., and G.-Y. Park. (2018). South Koreans learn to love the Other. How to manufacture multiculturalism. Foreign Policy. July 16  

Park, Y-B. (2017). South Korea Carefully Tests the Waters on Immigration, With a Focus on Temporary Workers. Migration Policy Institute, March 1  

Pempel, T.J. (1999). Structural Gaiatsu: International Finance and Political Change in Japan. Comparative Political Studies 32: 907-932.

Shigeta, S. (2018). How Japan came around on foreign workers. Nikkei Asian Review, June.

Song, H-J. (undated). Immigration Policy in South Korea & Japan - A Comparative Perspective Theoretical framework. University of Tsukuba. International and Advanced Japanese Studies. PowerPoint presentation

Sullivan, N. and R.T. Schatz (2009). Effects of Japanese national identification on attitudes toward learning English and self-assessed English proficiency. International Journal of Intercultural Relations 33(6): 486-497

Townsend, R. (2019). Japan's big dilemma: robots or immigrants? Asia Media Centre. March 1

Tsuboya-Newell, I. (2017). Why do Japanese have trouble learning English? The Japan Times, October 29.

Watson, I. (2010). Multiculturalism in South Korea: A Critical Assessment. Journal of Contemporary Asia 40: 337-346.

Yonezawa, A., Y. Kitamura, B. Yamamoto, and T. Tokunaga. (2018). Japanese Education in a Global Age. Sociological Reflections and Future Directions. Springer.