Monday, February 25, 2019

Alzheimer's and African Americans



A village elder of Mogode, Cameroon (Wikicommons - W.E.A. van Beek). African Americans are more than twice as likely to develop Alzheimer's. They also more often have an allele that increases the risk of Alzheimer's in Western societies but not in sub-Saharan Africa. Why is this allele adaptive there but not here?



Alzheimer's disease (AD) is unusually common among African Americans. Demirovic et al. (2003) found it to be almost three times more frequent among African American men than among white non-Hispanic men (14.4% vs. 5.4%). Tang et al. (2001) found it to be about twice as common among African American and Caribbean Hispanic individuals as among white individuals. On the other hand, it is significantly less common among the Yoruba of Nigeria than among age-matched African Americans (Hendrie et al. 2001).

This past year, new light has been shed on these differences. Weuve et al. (2018) analyzed data from ten thousand participants 65 years old and over (64% black, 36% white) who had been followed for up to 18 years. Compared to previous studies, this one had three times as many dementia assessments and dementia cases. It also had a wider range of data: tests of cognitive performance, specific diagnosis of Alzheimer's (as opposed to dementia in general), educational and socioeconomic data, and even genetic data—specifically whether the participant had the APOE e4 allele, a major risk factor for Alzheimer's.

The results confirmed previous findings ... with a few surprises.


Incidence

Alzheimer's was diagnosed in 19.9% of the African American participants, a proportion more than twice that of the Euro American participants (8.2%).


Cognitive performance and cognitive decline

Cognitive performance was lower in the African American participants. "The difference in global cognitive score, -0.83 standard units (95% confidence interval [CI], -0.88 to -0.78), was equivalent to the difference in scores between participants who were 12 years apart in age at baseline."
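As a back-of-envelope reading of that comparison (my own arithmetic, not a figure reported by the authors), a gap of 0.83 standard units spread over 12 years of age implies a cross-sectional age gradient of roughly

0.83 / 12 ≈ 0.07 standard units per year of age at baseline.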

On the other hand, both groups had the same rate of cognitive decline with age. In fact, executive function deteriorated more slowly in African Americans. The authors suggest that the higher rate of dementia in elderly African Americans is due to their cognitive decline beginning at a lower level:

[…] on average, white individuals have "farther to fall" cognitively than black individuals before reaching the functional threshold of clinical dementia, so that even if both groups have the same rate of cognitive decline, blacks have poorer cognitive function and disproportionately develop dementia. (Weuve et al. 2018)


Interaction with education

Differences in educational attainment, i.e., years of education, explained about a third of the cognitive difference between the two groups of participants:

Educational attainment, as measured by years of education, appeared to mediate a substantial fraction but not the totality of the racial differences in baseline cognitive score and AD risk (Table 5). Under the hypothetical scenario in which education was "controlled" such that each black participant's educational level took on the level it would have been had the participant been white, all covariates being equal, black participants' baseline global cognitive scores were an average of 0.45 standard units lower than whites' scores (95% CI, -0.49 to -0.41), a difference smaller than without controlling years of education (-0.69; Table 5), and translating to about 35% of the total effect of race on cognitive performance mediated through years of education. (Weuve et al. 2018)
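The 35% figure follows directly from the two coefficients quoted above: controlling education shrinks the race difference from -0.69 to -0.45 standard units, so the share of the total effect mediated through years of education is

(0.69 - 0.45) / 0.69 ≈ 0.35,

or about 35%.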

While educational attainment statistically mediates about 35% of the cognitive difference between African Americans and Euro Americans, we should keep in mind that educational attainment is itself influenced by genetic factors, which vary among African Americans just as they vary between African Americans and other human populations.


APOE e4 allele

This allele was more common in the African American participants. It contributed to their higher risk of Alzheimer's but not to their lower cognitive score.

Black participants were more likely than white participants to carry an APOE e4 allele (37% vs 26%; Table 1). In analyses restricted to participants with APOE data, racial differences in baseline scores or cognitive decline did not vary by e4 carriership (all P_interaction > 0.16). Furthermore, adjustment for e4 carriership did not materially change estimated racial differences in baseline performance or cognitive decline (eTable 3).

By contrast, the association between race and AD risk varied markedly by APOE e4 carriership (P_interaction = 0.05; Table 4). Among non-carriers, blacks' AD risk was 2.32 times that of whites' (95% CI, 1.50-3.58), but this association was comparatively negligible among e4 carriers (RR, 1.09; 95% CI, 0.60-1.97). (Weuve et al. 2018)
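To make the stratified comparison concrete, here is a minimal sketch (in Python) of how such stratum-specific risk ratios are computed. The counts are hypothetical placeholders chosen only to mimic the qualitative pattern reported above; they are not data from Weuve et al. (2018).

    # Stratum-specific risk ratio of Alzheimer's, black vs. white participants.
    # All counts below are hypothetical placeholders for illustration only.

    def risk_ratio(cases_a, n_a, cases_b, n_b):
        """Cumulative-incidence ratio of group A relative to group B."""
        return (cases_a / n_a) / (cases_b / n_b)

    # (cases, n) for each group within each APOE e4 stratum
    strata = {
        "e4 non-carriers": {"black": (70, 400), "white": (30, 400)},
        "e4 carriers":     {"black": (45, 200), "white": (40, 200)},
    }

    for name, counts in strata.items():
        rr = risk_ratio(*counts["black"], *counts["white"])
        print(f"{name}: RR = {rr:.2f}")   # roughly 2.3 and 1.1 with these counts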


Discussion

This study offers two different explanations: one for why African Americans have a higher incidence of Alzheimer's, and another for why they have a higher incidence of dementia in general. Two explanations are needed because Alzheimer's seems to be qualitatively different from other forms of dementia.

First, African Americans have a higher incidence of Alzheimer's because they more often carry the APOE e4 allele, a risk factor for Alzheimer's. They may also carry other alleles, still unidentified, that similarly favor the development of the disease. This would explain why, among participants without APOE e4, Alzheimer's was still more than twice as common in African Americans as in Euro Americans, whereas the two groups had virtually the same incidence among participants with APOE e4.

Second, African Americans have a higher incidence of dementia in general because they have a lower cognitive reserve. When cognitive performance begins to deteriorate in old age, the ensuing decline starts from a lower level and reaches the threshold of dementia sooner. The rate of decline is nonetheless the same in both African Americans and Euro Americans. While this explanation could apply to most forms of dementia, it is hard to see how it applies to Alzheimer's. Euro Americans have a higher cognitive reserve, and yet the APOE e4 allele is just as likely to produce Alzheimer's in them as in African Americans.

Why does the APOE e4 allele exist? It must have some adaptive value, given that it was carried by 37% of the African American participants and 26% of the Euro American participants. African Americans also seem to have other alleles, not yet identified, that likewise increase the risk of Alzheimer's. Those alleles, too, must have some adaptive value.

This value seems to exist in sub-Saharan Africa but not in North America. When Hendrie et al. (2001) examined Yoruba living in Nigeria, they found no relationship between APOE e4 and Alzheimer’s or dementia in general:

In the Yoruba, we have found no significant association between the possession of the e4 allele and dementia or AD in either the heterozygous or homozygous states. As the frequencies of the 3 major APOE alleles are almost identical in the 2 populations, this variation in the strength of the association between e4 and AD may account for some of the differences in incidence rates between the populations, although it is not likely to explain all of it. It also raises the possibility that some other genetic or environmental factor affects the association of the e4 allele to AD and reduces incidence rates for dementia and AD in Yoruba. (Hendrie et al. 2001)

There has been speculation, notably by Greg Cochran (2018), that Alzheimer's is caused by apoptosis. Because of the blood-brain barrier, antibodies cannot easily enter the brain to fight infection, so neural tissue depends more heavily on other means of defense, such as apoptosis. Such a defense may be more important in sub-Saharan Africa, where the environment carries a higher pathogen load.

If we pursue this hypothesis, APOE e4 and other alleles may enable neurons to self-destruct as a means of containing the spread of pathogens within the brain. In an environment with a lower pathogen load, like North America, this means of defense would be largely unnecessary. The result would resemble an autoimmune disorder, with apoptosis being triggered in neural tissue for no good reason.


References

Chin, A.L., S. Negash, and R. Hamilton. (2011). Diversity and disparity in dementia: the impact of ethnoracial differences in Alzheimer disease. Alzheimer disease and associated disorders. 25(3):187-195.

Cochran, G. (2018). Alzheimers or did I already say that? West Hunter, July 14.

Demirovic, J., R. Prineas, D. Loewenstein, et al. (2003). Prevalence of dementia in three ethnic groups: the South Florida program on aging and health. Ann Epidemiol. 13:472-478.

Hendrie, H.C., A. Ogunniyi, K.S. Hall, et al. (2001). Incidence of dementia and Alzheimer disease in 2 communities: Yoruba residing in Ibadan, Nigeria, and African Americans residing in Indianapolis, Indiana. JAMA. 285:739-47.

Tang, M.X., P. Cross, H. Andrews, et al. (2001). Incidence of AD in African-Americans, Caribbean Hispanics, and Caucasians in northern Manhattan. Neurology 56:49-56.

Weuve, J., L.L. Barnes, C.F. Mendes de Leon, K. Rajan, T. Beck, N.T. Aggarwal, L.E. Hebert, D.A. Bennett, R.S. Wilson, and D.A. Evans. (2018). Cognitive Aging in Black and White Americans: Cognition, Cognitive Decline, and Incidence of Alzheimer Disease Dementia. Epidemiology 29(1): 151-159. 



Thursday, February 14, 2019

The Nurture of Nature



Fleet Street, watercolor by Ernest George (1839-1922). In England, middle-class families used to be so large that they overshot their niche and flooded the ranks of the lower class.



Until about ten years ago, it was widely believed that cultural evolution had taken over from genetic evolution in our species. When farming replaced hunting and gathering, something fundamentally changed in the relationship between us and our surroundings. We no longer had to change genetically to fit our environment. Instead, we could change our environment to make it fit us.

That view has been challenged by a research team led by anthropologist John Hawks. They found that genetic evolution actually sped up around 10,000 years ago, when hunting and gathering gave way to farming; in fact, it sped up over a hundred-fold. Why? Humans were now adapting not only to slow-changing natural environments but also to faster-changing cultural environments: urban living, belief systems, the State monopoly on violence. Far from slowing down, the pace of genetic change had to accelerate (Hawks et al. 2007).

These findings reached a broader public with the publication of The 10,000 Year Explosion: How Civilization Accelerated Human Evolution (Cochran and Harpending 2009). More recently, they have been discussed in a review article by historian John Brooke and anthropologist Clark Spencer Larsen (2014):


Are we essentially the same physical and biological beings as Ice Age hunter-gatherers or the early farming peoples of the warming early Holocene? How has the human body changed in response to nine or ten millennia of dramatic dietary change, a few centuries of public health interventions, and a few decades of toxic environmental exposures? In short, how has history shaped biology? 

[...] But very clearly human evolution did not stop with the rise of modern humanity in the Middle to Late Paleolithic. Climatic forces, dietary shifts, disease exposures, and perhaps the wider stresses and challenges of hierarchical, literate state societies appear to have been exerting selective pressure on human genetics.

In short, we have become participants in our evolution: we create more and more of our surroundings, and these surroundings influence the way we evolve. Culture is not simply a tool we use to control and direct our environment. It is a part of our environment, the most important part, and as such it now controls and directs us.

Brooke and Larsen nonetheless feel attached to older ways of evolutionary thinking, particularly the "essentialism" of pre-Darwinian biology. We see this when they assert that "the essential modeling of the genetic code ended sometime in the Paleolithic." Actually, there was no point in time when our ancestors became essentially "human"—whatever that means. A Paleolithic human 100,000 years ago would have had less in common with you or me than with someone living 100,000 years earlier, or even a million years earlier. Human evolution has been accelerating: the changes of the past 10,000 years exceed those of the previous 100,000 years, which in turn exceed those of the previous million.


Clark’s model

Brooke and Larsen discuss Gregory Clark's work on English demography. Clark found that the English middle class expanded steadily from the twelfth century onward, its descendants not only growing in number but also replacing the lower classes through downward mobility. By the 1800s, its lineages accounted for most of the English population. Parallel to this demographic expansion, English society shifted toward "middle class" culture and behavior: thrift, pleasure deferment, increased future orientation, and unwillingness to use violence to settle personal disputes (Clark, 2007). 

Clark’s work is criticized by Brooke and Larsen on two grounds:

[... ] there is no biological evidence to support an argument for English industrial transformation via natural selection. More importantly, this was a process that—hypothetically—had been at work around the world since the launch of social stratification in the Late Neolithic and the subsequent rise of state societies.

How valid are these criticisms? Let me deal with each of them.


Is social stratification the only precondition of Clark’s model?

First, it is true that many societies around the world are socially stratified, but social stratification is only one of the preconditions of Clark’s model. There are two others:

1. Differences in natural increase between social classes, with higher natural increase being associated with higher social status.

2. Porous class boundaries. The demographic surplus of the middle and upper classes must be free to move down into and replace the lower classes.

These preconditions are not met in most socially stratified societies. Brooke and Larsen are simply wrong when they say: "The poor died with few or no children everywhere in the world, and across vast stretches of human history." In reality, there have been many societies where fewer children were born on average to upper-class families than to lower-class families. A notable example is that of the Roman Empire, particularly during its last few centuries: upper-class Romans widely practiced abortion and contraception (Hopkins 1965). A similar situation seems to have prevailed in the Ottoman Empire. By the end of the eighteenth century, Turks were declining demographically in relation to their subject peoples, perhaps because they tended to congregate in towns and were more vulnerable to the ravages of plague and other diseases (Jelavich and Jelavich 1977, pp. 6-7).

Nor are class boundaries always porous. Social classes often become endogamous castes. This can happen when a social class specializes in "unclean" work, like butchery, preparation of corpses for burial, etc. This was the case with the Burakumin of Japan, the Paekchong of Korea, and the Cagots of France (Frost 2014). Because of their monopoly over a despised occupation, they were free from outside competition and thus had the resources to get married and have enough children to replace themselves. This was not the case with the English lower classes, who faced competition from “surplus” middle-class individuals between the twelfth and nineteenth centuries. Such downward mobility is impossible in caste societies, where “surplus” higher-caste individuals are expected to remain unmarried until they can find an appropriate social situation. 

A caste society thus tends to be evolutionarily stagnant. Lower castes in particular tend to preserve mental and behavioral predispositions that would otherwise be removed from the gene pool in a more fluid social environment.

Why did class boundaries remain porous in England? The reason was probably the greater individualism of English society, particularly its expanding middle class. Sons were helped by their parents, but beyond a certain point they were expected to shift for themselves. My mother's ancestors were merchants on Fleet Street in London. They were successful and had such large families that they overshot their niche. By the nineteenth century, some of them had fallen to the level of shipbuilding laborers, and it was as such that they came to Canada.


Is biological evidence lacking for Clark's model?

Brooke and Larsen are on firmer ground when they say that Clark's model is unsupported by biological evidence. Hard evidence is indeed lacking, but the only possible hard evidence would be ancient DNA. If we could retrieve DNA from the English population between the twelfth and nineteenth centuries, would we see a shift toward alleles associated with different mental and behavioral traits? That work has yet to be done.

Nonetheless, a research team led by Michael Woodley has examined ancient DNA from sites in Europe and parts of southwest and central Asia over a time frame extending from 4,560 to 1,210 years ago. During that time frame, alleles associated with high educational attainment gradually increased in frequency. The authors concluded: "This process likely continued until the Late Modern Era, where it has been noted that among Western populations living between the 15th and early 19th centuries, those with higher social status […] typically produced the most surviving offspring. These in turn tended toward downward social mobility due to intense competition, replacing the reproductively unsuccessful low-status stratum […] eventually leading to the Industrial Revolution in Europe" (Woodley et al. 2017).
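For readers who want a concrete picture of this kind of test, here is a minimal sketch of the general approach: regress a polygenic score on sample age and ask whether more recent samples score higher. This is not the Woodley et al. (2017) pipeline, and the ages and scores below are hypothetical placeholders.

    # Illustrative only: do younger ancient samples carry higher polygenic
    # scores for educational attainment? (hypothetical ages and scores)
    import numpy as np
    from scipy import stats

    ages_bp = np.array([4500, 4100, 3600, 3100, 2600, 2100, 1700, 1300])  # years before present
    poly_score = np.array([-0.42, -0.35, -0.30, -0.18, -0.12, -0.05, 0.02, 0.10])

    # A negative slope on age (higher scores in more recent samples) is the
    # pattern consistent with selection for the associated variants.
    slope, intercept, r, p, se = stats.linregress(ages_bp, poly_score)
    print(f"slope per year BP: {slope:+.2e}, r = {r:.2f}, p = {p:.4f}")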

Again, work remains to be done, particularly on the genetic profile of the English population between the twelfth and nineteenth centuries, but the existing data do seem to validate Clark's model for European societies in general. Indeed, psychologist Heiner Rindermann presents evidence that mean cognitive ability steadily rose throughout Western Europe during late medieval and post-medieval times. Previously, most people failed to develop mentally beyond the stage of preoperational thinking. They could learn language and social norms but their ability to reason was hindered by various impediments like cognitive egocentrism, anthropomorphism, finalism, and animism (Rindermann 2018, p. 49). From the sixteenth century onward, more and more people reached the stage of operational thinking. They could better understand probability and cause and effect and could see things from the perspective of another person, whether real or hypothetical (Rindermann 2018, pp. 86-87).

As the “smart fraction” became more numerous, it may have reached a threshold where intellectuals were no longer isolated individuals but rather communities of people who could interact and exchange ideas. This was one of the hallmarks of the Enlightenment: intellectuals were sufficiently large in number to meet in clubs, “salons,” coffeehouses, and debating societies.



References

Brooke, J.L. and C.S. Larsen. (2014). The Nurture of Nature: Genetics, Epigenetics, and Environment in Human Biohistory. The American Historical Review 119(5): 1500-1513.

Clark, G. (2007). A Farewell to Alms. A Brief Economic History of the World. Princeton University Press: Princeton and Oxford.

Clark, G. (2009a). The indicted and the wealthy: surnames, reproductive success, genetic selection and social class in pre-industrial England.

Clark, G. (2009b). The domestication of man: The social implications of Darwin. ArtefaCTos 2: 64-80. 

Cochran, G. and H. Harpending. (2009). The 10,000 Year Explosion: How Civilization Accelerated Human Evolution. New York: Basic Books. 

Frost, P. (2014). Burakumin, Paekchong, and Cagots. ResearchGate

Hawks, J., E.T. Wang, G.M. Cochran, H.C. Harpending, and R.K. Moyzis. (2007). Recent acceleration of human adaptive evolution. Proceedings of the National Academy of Sciences (USA) 104: 20753-20758.

Hopkins, K. (1965). Contraception in the Roman Empire. Comparative Studies in Society and History 8(1): 124-151.

Jelavich, C. and B. Jelavich. (1977). The Establishment of the Balkan National States, 1804-1920. Seattle: University of Washington Press.

Rindermann, H. (2018). Cognitive Capitalism. Human Capital and the Wellbeing of Nations. Cambridge University Press.

Woodley, M.A., S. Younuskunju, B. Balan, and D. Piffer. (2017). Holocene selection for variants associated with general cognitive ability: comparing ancient and modern genomes. Twin Research and Human Genetics 20(4): 271-280.

Tuesday, February 5, 2019

Did cold seasonal climates select for cognitive ability?




Paleolithic artefacts (Wikicommons). The northern tier of Eurasia saw an explosion of creativity that pre-adapted its inhabitants for later developments.



The new journal Psych will be publishing a special follow-up issue on J. Philippe Rushton and Arthur Jensen's 2005 article: "Thirty Years of Research on Race Differences in Cognitive Ability." The following is the abstract of my contribution. The article will appear later.


The first industrial revolution. Did cold seasonal climates select for cognitive ability?

Peter Frost

Abstract: In their joint article, Rushton and Jensen argued that cognitive ability differs between human populations. But why are such differences expectable? Their answer: as modern humans spread out of Africa and into the northern latitudes of Eurasia, they entered colder and more seasonal climates that selected for the ability to plan ahead, since they had to store food, make clothes, and build shelters for the winter. 

This explanation has a long history going back to Arthur Schopenhauer. More recently, it has been supported by findings from Paleolithic humans and contemporary hunter-gatherers. Tools become more diverse and complex as effective temperature decreases, apparently because food has to be obtained during limited periods of time and over large areas. There is also more storage of food and fuel and greater use of untended traps and snares. Finally, shelters have to be sturdier, and clothing more cold-resistant. The resulting cognitive demands fall on both men and women. Indeed, because women have few opportunities to get food through gathering, they specialize in more cognitively demanding tasks like garment making, needlework, weaving, leatherworking, pottery, and use of kilns. The northern tier of Paleolithic Eurasia thus produced the "first industrial revolution"—an explosion of creativity that pre-adapted its inhabitants for later developments, i.e., agriculture, more complex technology and social organization, and an increasingly future-oriented culture. Over time these humans would spread south, replacing earlier populations that could less easily exploit the possibilities of the new cultural environment. 

As this cultural environment developed further, it selected for further increases in cognitive ability. In fact, mean intelligence seems to have risen during historic times at temperate latitudes in Europe and East Asia. There is thus no unified theory for the evolution of human intelligence. A key stage was adaptation to cold seasonal climates during the Paleolithic, but much happened later.



References

Rushton, J.P. and A.R. Jensen. (2005). Thirty years of research on race differences in cognitive ability. Psychology, Public Policy, and Law 11(2): 235-294.