There has been much comment on a recent finding that human evolution has accelerated over the past 40,000 years, i.e., the period during which our species has spread out of Africa and differentiated into the populations we see today (Hawks et al., 2007). There has been less comment on a related finding: at least 7% of the human genome has changed over the same 40,000 years.
This second finding seems to challenge a truism that has become widespread in academia and even in our political culture. In a speech earlier this year, Hillary Clinton cited genetic research showing that human populations are 99.9 percent the same and “that the differences in how we look -- in our skin color, our eye color, our height -- stem from just one-tenth of 1 percent of our genes.”
Isn’t there a contradiction here? How can human populations be 99.9% genetically identical if at least 7% of the genome has changed since they began moving apart some 40,000 years ago?
First, the 99.9% figure is not the number of genes that are the same. It's the proportion of nucleotides that are the same. A single gene is a long chain of nucleotides, often a very long one, and a single nucleotide mutation can significantly alter how the entire gene works. In theory, each and every human gene could differ by 0.1% from one population to another, and such differences could matter a great deal.
Second, the 99.9% estimate doesn't capture variation above the level of single nucleotides:
The technique originally used … could read the sequence of letters of a genetic code. But it couldn't detect repetitions of some parts of the code, which also occur. Differences in the number of these repetitions, called copy number variants, have since turned out to account for much of the variation in a species' DNA. Another type of variation recently found to be important is called insertion-deletion variants, snippets of code that are either extra or missing in some genomes compared to others. (World Science, 2007)
This higher-level variation has caused geneticist Craig Venter (who led the private effort to sequence the human genome) to revise the 99.9% figure downward:
The findings reveal “human-to-human variation is more than seven-fold greater than earlier estimates, proving that we are in fact very unique individuals at the genetic level,” Venter said. The 99.9 figure might need to be lowered to about 99, he added. (World Science, 2007) (also see the original article: Redon et al., 2006)
So our nucleotide sequences may be closer to 1% different, not 0.1%. And don’t be fooled by small numbers. Whether it’s 1% or 0.1%, the difference is still big in absolute terms. As John Hawks points out: “one-tenth of 1 percent of 3 billion is a heck of a large number -- 3 million nucleotide differences between two random genomes.”
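To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch; the genome length is the familiar rough figure, and the count of about 20,000 protein-coding genes is an illustrative assumption, not something taken from the sources above:

```python
# Back-of-the-envelope arithmetic for the "don't be fooled by small numbers" point.
# The genome length is the familiar rough figure; the gene count is illustrative only.
GENOME_LENGTH = 3_000_000_000   # ~3 billion nucleotides
GENE_COUNT = 20_000             # rough number of protein-coding genes (assumption)

for label, fraction in [("0.1% (the 99.9% figure)", 0.001),
                        ("1%   (Venter's revised figure)", 0.01)]:
    differences = GENOME_LENGTH * fraction
    per_gene = differences / GENE_COUNT
    print(f"{label}: {differences:,.0f} differences "
          f"(~{per_gene:,.0f} per gene, if spread evenly across genes)")

# Output:
# 0.1% (the 99.9% figure): 3,000,000 differences (~150 per gene, if spread evenly across genes)
# 1%   (Venter's revised figure): 30,000,000 differences (~1,500 per gene, if spread evenly across genes)
# (In reality most differences fall outside protein-coding genes; the point is scale.)
```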
Finally, there is a third reason why we should not read too much into any of these estimates. When the 99.9% figure first came out in the 1970s, geneticists had also discovered that nucleotide sequences were 98.9% the same between humans and chimpanzees (King & Wilson, 1975). And yet, humans and chimps exhibit a wide range of anatomical and behavioral differences. How come?
There is of course the aforementioned ‘small percentage fallacy’: a tiny sliver of the genome still amounts to a lot of DNA. More importantly, humans and chimps seem to differ the most in ‘regulatory genes’ whose effects are many times greater than those of ‘structural genes’ (the ones that code for the building block proteins of body tissues). A single regulatory gene has such a disproportionate impact because it can control the expression of many other genes.
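To see why, consider a toy sketch (purely hypothetical genes and numbers, not a model of any real network): a single regulatory change rescales the output of every gene downstream of it, whereas a structural mutation alters only its own product.

```python
# Toy regulatory cascade: one regulator scales the output of many downstream
# structural genes. All gene names and values are hypothetical.

def expression_levels(regulator_activity, baseline):
    """Each downstream gene's output is its baseline scaled by the regulator's activity."""
    return {gene: level * regulator_activity for gene, level in baseline.items()}

# Ten hypothetical structural genes, each with a baseline output of 1.0.
structural = {f"gene_{i}": 1.0 for i in range(10)}

normal = expression_levels(1.0, structural)
mutant = expression_levels(1.5, structural)   # a single mutation in the regulator

affected = sum(1 for g in structural if mutant[g] != normal[g])
print(f"One regulatory change altered {affected} of {len(structural)} downstream genes.")
# By contrast, a mutation in one structural gene would alter only that gene's own product.
```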
These less numerous regulatory genes have gained importance as organisms have grown more and more complex. This has especially been so during human evolution. Whereas humans and chimpanzees are almost identical in the proteins that form their tissues, they differ radically in the way their brains and bodies develop. This point is summarized by King and Wilson (1975, p. 115):
The genetic distance between humans and chimpanzees, based on electrophoretic comparison of proteins encoded by 44 loci is very small, corresponding to the genetic distance between sibling species of fruit flies or mammals. Results obtained with other biochemical methods are consistent with this conclusion. However, the substantial anatomical and behavioral differences between humans and chimpanzees have led to their classification in separate families. … A relatively small number of genetic changes in systems controlling the expression of genes may account for the major organismal differences between humans and chimpanzees.
Interestingly, King and Wilson see this paradox as applying not only to human-chimpanzee genetic differences, but also to genetic differences within our species:
This [human-chimpanzee] distance is 25 to 60 times greater than the genetic distance between human races. In fact, the genetic distance between Caucasian, Black African, and Japanese populations is less than or equal to that between morphologically and behaviorally identical populations of other species. (King & Wilson, 1975, p. 113)
Yet human races are not identical populations, any more than humans and chimpanzees are sibling species. These measures of genetic distance are not comparable because the nature of genetic change can vary dramatically. In one case, there is simply tinkering with an existing body plan through mutations in structural genes. In the other, there is radical developmental change through mutations in regulatory genes.
Since the time of their common ancestor, the chimpanzee lineage has evolved slowly relative to the human lineage, in terms of anatomy and adaptive strategy. According to Simpson:
Pan is the terminus of a conservative lineage, retaining in a general way an anatomical and adaptive facies common to all recent hominoids except Homo. Homo is both anatomically and adaptively the most radically distinctive of all hominoids, divergent to a degree considered familial by all primatologists. (King & Wilson, 1975, p. 113)
This is the context in which the 99.9% statistic was initially presented to the academic community … way back in the 1970s. Even then, researchers thought it misleading and went to great pains to explain why it was misleading. Yet their caveats were to no avail. The 99.9% truism has taken on a life of its own, much like those stories we hear of alligators living in sewers or evil people sticking razor blades in Halloween apples. It seems to meet a deep-seated need to affirm our sameness and to give this affirmation a stamp of scientific approval.
But science it is not.
References
World Science. (2007). Finding said to show "race isn't real" scrapped. http://www.world-science.net/othernews/070904_human-variation.htm
Elliott, P. (2007). Clinton tells grads only minor genetics make them different.
http://www.boston.com/news/nation/articles/2007/06/14/clinton_tells_grads_only_minor_genetics_make_them_different/
Hawks, J. (2007) Disagreeing with Hillary Clinton on human genetic differences.
http://johnhawks.net/weblog/topics/race/differences/clinton_2007_proportion_differences_speech.html
Hawks, J., E.T. Wang, G.M. Cochran, H.C. Harpending, and R.K. Moyzis. (2007). Recent acceleration of human adaptive evolution. Proceedings of the National Academy of Sciences (USA), 104(52), 20753-20758.
King, M-C. and A.C. Wilson. (1975). Evolution at two levels in humans and chimpanzees. Science, 188, 107-116.
Redon, R., S. Ishikawa, K.R. Fitch, L. Feuk, G.H. Perry, T.D. Andrews, H. Fiegler, M.H. Shapero, A.R. Carson, W. Chen, E.K. Cho, S. Dallaire, J.L. Freeman, J.R. González, M. Gratacòs, J. Huang, D. Kalaitzopoulos, D. Komura, J.R. MacDonald, C.R. Marshall, R. Mei, L. Montgomery, K. Nishimura, K. Okamura, F. Shen, M.J. Somerville, J. Tchinda, A. Valsesia, C. Woodwark, F. Yang, J. Zhang, T. Zerjal, J. Zhang, L. Armengol, D.F. Conrad, X. Estivill, C. Tyler-Smith, N.P. Carter, H. Aburatani, C. Lee, K.W. Jones, S.W. Scherer & M.E. Hurles. (2006). Global variation in copy number in the human genome. Nature, 444, 444-454.
Friday, December 21, 2007
Thoughts on the EEA
For the past twenty years, a key concept in evolutionary psychology has been the ‘environment of evolutionary adaptedness’ (EEA). This is the ancestral environment within which our species first evolved and whose selection pressures shaped our current psychology and behavior. Usually, writers situate this environment in the Pleistocene before Homo sapiens began to spread out of Africa some 50,000 years ago.
The EEA concept has increasingly come under fire in recent years, especially with the recent Hawks et al. (2007) study. It now appears that human genetic evolution did not stop 50,000 years ago. Nor has it since slowed down. In fact, it has accelerated by as much as 100-fold. In light of these findings, can the EEA concept be salvaged? Should it?
Interestingly, its earliest proponents, John Tooby and Leda Cosmides, have always been reluctant to narrow it down to a specific place and time:
Although the hominid line is thought to have originated on edges of the African savannahs, the EEA is not a particular place or time. The EEA for a given adaptation is the statistical composite of the enduring selection pressures or cause-and-effect relationships that pushed the alleles underlying an adaptation systematically upward in frequency until they became species-typical or reached a frequency-dependent equilibrium (most adaptations are species-typical; see Hagen, Chapter 5, this volume). Because the coordinated fixation of alleles at different loci takes time, complex adaptations reflect enduring features of the ancestral world. (Tooby & Cosmides, 2005, p. 22)
According to Tooby and Cosmides, there are potentially as many EEAs as there are human adaptations. Therefore, some human characteristics may have originated in very old EEAs and others in more recent ones.
How recent? For Tooby and Cosmides, the limiting factor is complexity. The more complex the adaptation, the more genes it will involve, and the longer the evolutionary time to coordinate all those genes. Therefore, recent human evolution has probably only involved simple traits, certainly nothing as complex as behavior.
The problem with this argument is that complex traits do not arise ex nihilo. They arise from changes to existing traits that may be just slightly less complex. A point mutation can greatly alter the functioning of a trait that involves thousands upon thousands of genes. Keep in mind that genes vary considerably in their effects. At one extreme, a single ‘structural’ gene may code for one protein. At the other, a single ‘regulatory’ gene may control the output of numerous structural genes … or even numerous regulatory genes like itself. As Harpending and Cochran (2002) point out:
Even if 40 or 50 thousand years were too short a time for the evolutionary development of a truly new and highly complex mental adaptation, which is by no means certain, it is certainly long enough for some groups to lose such an adaptation, for some groups to develop a highly exaggerated version of an adaptation, or for changes in the triggers or timing of that adaptation to evolve. That is what we see in domesticated dogs, for example, who have entirely lost certain key behavioral adaptations of wolves such as paternal investment. Other wolf behaviors have been exaggerated or distorted. A border collie's herding is recognizably derived from wolf behaviors, as is a terrier's aggressiveness, but this hardly means that collies, wolves, and terriers are all the same. Paternal investment may be particularly fragile and easily lost in mammals, because parental investment via internal gestation and lactation is engineered into females but not males.
In all fairness, when the EEA concept was first developed, few people were arguing that natural selection has modified human behavior over the last 50,000 years. In fact, the dominant view was the opposite: that natural selection has not shaped any specific human behavioral traits, not now, not over the past fifty thousand years, and not over the past fifty million. Not ever. The mind was a tabula rasa. Even sociobiologists, often castigated as biological determinists, commonly thought that people were simply predisposed to learn adaptively: “natural selection has produced in humans a general motivation to maximize one’s inclusive fitness—i.e., a domain-general psychological mechanism” (Buss, 1991, p. 463).
The EEA was part of a new paradigm, now called evolutionary psychology, that sought to move away from the domain-general approach of sociobiology and to search for specific innate mechanisms within the human mind. Its earliest proponents saw the EEA not as a dogma, but as a guide—as a way of making people look at human nature from a broader evolutionary perspective, and not from the narrower one of modern industrial life.
The EEA concept has served us well. But it is now time to move on.
References
Buss, D.M. (1991). Evolutionary personality psychology. Annual Review of Psychology, 42, 459-491.
Harpending, H., & Cochran, G. (2002). In our genes. Proceedings of the National Academy of Sciences, 99(1), 10-12.
Tooby, J. and L. Cosmides. (2005). Conceptual Foundations of Evolutionary Psychology. In David M. Buss (Ed.) The Handbook of Evolutionary Psychology. (pp. 5-67), Hoboken, NJ: Wiley.
Friday, December 14, 2007
The rising curve
It was long thought that human genetic evolution pretty much ended with the advent of culture. As Paul Ehrlich (2000, p. 63) wrote:
The evolution of that body of extragenetic information—cultural evolution—has been centrally important in making us the unique beasts we are. Cultural evolution rests on a foundation of genetic (or biological) evolution—especially that of our brains and tongues—but can proceed at what by comparison is a lightning pace. … cultural evolution can vastly outpace genetic evolution because it’s not constrained by generation time. Our genes are passed only from one generation to relatives in succeeding generations. In contrast, the units of culture—ideas, basically—are passed among both relatives and nonrelatives not only between generations (in both directions) but also within generations.
So did cultural evolution make genetic evolution obsolete? Paul Ehrlich seemed to draw this conclusion … only to pull himself back. “There are many ways in which culture can alter selection pressures,” he says, noting that genes have co-evolved with changes to diet, farming practices, and shelter (Ehrlich, 2000, p. 64). Indeed, the same properties that make cultural evolution so fast have also been diversifying the adaptive landscape at an unparalleled rate. Whenever our species came up with a cultural innovation—a new technology, domestication of a plant or animal, or the advent of agriculture itself—our environment changed as fundamentally as if we had moved to a new ecosystem.
So which factor has mattered most in determining the pace of human genetic evolution? Has cultural evolution been resolving more and more adaptive problems that were formerly resolved by genetic evolution? Or has genetic evolution been resolving more and more adaptive problems because human environments have been diversifying more and more?
The second factor, apparently. A recent study has concluded that genetic evolution has actually accelerated over the past 40,000 years and even more over the past 10,000-15,000. This is partly because there are many more of us and partly because we are spread over an increasingly diverse range of natural and man-made environments. At least 7% of the human genome appears to have changed since the advent of Homo sapiens. And the rate of change has increased 100-fold since the advent of agriculture (Hawks et al., 2007).
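Why should sheer population size speed things up? A simplified, textbook-style sketch (not the authors' actual model, and with made-up parameter values) shows the scaling: the supply of new beneficial mutations each generation grows with the number of genomes being copied, while each such mutation's chance of fixing stays roughly constant.

```python
# Simplified sketch: why more people can mean faster adaptive evolution.
# Standard diploid approximations with made-up parameter values; this is not
# the model used by Hawks et al. (2007), only an illustration of the scaling.

MU_BENEFICIAL = 1e-8   # chance per genome per generation of a new beneficial mutation (assumption)
S = 0.01               # selective advantage of each such mutation (assumption)

def adaptive_substitutions_per_generation(pop_size, mu=MU_BENEFICIAL, s=S):
    """New beneficial mutations arising per generation (2*N*mu), each with the
    classic ~2s probability of eventually fixing."""
    return 2 * pop_size * mu * 2 * s

for n in (10_000, 1_000_000):   # e.g., a sparse forager population vs. a post-agricultural one
    rate = adaptive_substitutions_per_generation(n)
    print(f"N = {n:>9,}: ~{rate:.6f} adaptive substitutions per generation")

# The rate scales linearly with N: a 100-fold larger population feeds in roughly
# 100 times as many adaptive variants per generation.
```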
These are high numbers. As one of the study’s authors observes:
Personally, I can't believe that nobody noticed how extreme these estimates of recent selection really are. I guess that folks doing genomics just weren't as primed in evolutionary theory to perceive how weird the human estimates looked compared to what is measured in the wild on other species, or even over the span of human evolution!
In the earliest studies, when people were finding that 3 or 4 percent of a sample of genes had signs of recent selection, those numbers were already extremely high. They got even higher, as more and more powerful methods of detecting selection came online. Our current estimate is the highest yet, but even this very high number is perfectly consistent with theoretical predictions coming from human population numbers.
These figures, if anything, err on the low side. They do not capture recent selective pressures that are just emerging above noise in the data. Nor do they capture older selective pressures that have already pushed many alleles to fixation. The real figures won’t become known until we’ve retrieved the human genome that existed 40,000 years ago—something that is certainly within the realm of possibility.
All this underlines a point I made in an earlier post: human evolution is not a straight line. It’s a logarithmic curve with most of the evolutionary change in the recent past. If we met a Homo erectus face to face, or even a Neanderthal (who was probably just an arctic-adapted Homo erectus), we wouldn’t consider it to be human. It would look to us like an overgrown ape. Nor would its behavior reassure us otherwise.
References
Ehrlich, P.R. (2000). Human Natures. Genes, Cultures, and the Human Prospect. Penguin: New York.
Hawks, J., E.T. Wang, G.M. Cochran, H.C. Harpending, and R.K. Moyzis. (2007). Recent acceleration of human adaptive evolution. Proceedings of the National Academy of Sciences (USA), 104(52), 20753-20758.
Friday, December 7, 2007
Facial recognition and skin reflectance
Richard Russell, a postdoc at the Harvard Vision Sciences Laboratory, has just published a study showing that minor differences in skin pigmentation are more critical to facial recognition than facial shape. In this study, “subjects were asked to recognise color images of the faces of their friends. The images were manipulated such that only reflectance or only shape information was useful for recognizing any particular face. Subjects were actually better at recognizing their friends’ faces from reflectance information than from shape information” (Russell & Sinha, 2007).
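As a toy illustration of the reflectance-versus-shape distinction (made-up faces and numbers, not the authors' image-manipulation procedure): when two people's face shapes are very similar, a shape-only cue can misidentify them, while a reflectance cue, here reduced to mean skin tone, still tells them apart.

```python
import numpy as np

# Toy illustration of reflectance vs. shape cues in face recognition.
# Faces, landmark distances, and skin tones below are made up; this is NOT the
# image-manipulation procedure used by Russell & Sinha (2007).

faces = {
    # name: (mean skin tone as RGB, "shape" as three landmark distances, arbitrary units)
    "Alice": (np.array([224.0, 186.0, 168.0]), np.array([6.39, 4.01, 3.41])),
    "Bob":   (np.array([198.0, 152.0, 130.0]), np.array([6.50, 3.90, 3.30])),
}

def closest_match(probe, cue):
    """Return the stored face whose chosen cue (0 = reflectance, 1 = shape) is nearest."""
    return min(faces, key=lambda name: np.linalg.norm(probe[cue] - faces[name][cue]))

# A new, slightly noisy measurement of Bob: his skin tone barely shifts, but his
# face shape happens to measure closer to Alice's very similar shape.
probe = (np.array([200.0, 150.0, 128.0]), np.array([6.40, 4.00, 3.40]))

print("Match by reflectance only:", closest_match(probe, cue=0))   # Bob
print("Match by shape only:      ", closest_match(probe, cue=1))   # Alice (misidentified)
```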
Mihai Moldovan has another paper showing that ruddiness acts as a signal of male dominance in humans, especially in a context of male-male competition (see my earlier post on this topic). I’ll have more to say when the paper comes out.
Reference
Russell, R., & Sinha, P. (2007). Real-world face recognition: The importance of surface reflectance properties. Perception, 36(9), 1368-1374.
Friday, November 30, 2007
Convergent evolution?
In an earlier post, I discussed how mtDNA evidence now shows that the Neanderthals ranged at least as far east as Lake Baikal. This finding is significant because there no longer seems to have been any geographical or ecological barrier to Neanderthal occupation throughout non-tropical Eurasia.
This point has been commented on by Michelle M. (Mica) Glantz, an associate professor in anthropology at Colorado State University. In an interview with anthropologist John Hawks, she argues that the European Neanderthals may simply have been one of several interbreeding Homo erectus populations that inhabited Eurasia. They may have been more Arctic-adapted and more specialized in various ways but they were not genetically isolated, at least not fully, from other Homo erectus populations.
If we keep moving the Neandertal boundary eastward, then wouldn't Neandertals cease being a recognizable entity that is really separate from other archaic groups in the Old World during the Middle Paleolithic? In other words, who isn't a Neandertal in this case? Certainly we do not have enough similarly aged specimens from China and other points east to make thorough comparisons, but really the specimens we do have are usually not included in any of our analyses that are concerned with European Neandertals. Exceptions to this, like Rosenberg et al. (2006) study of Jinniushan, show that Asian specimens often look like they are part of the same cline as European Neandertals.
This discovery is one of several challenges to the traditional view, which sees the Neanderthals as being not only distinct from other archaic humans but also intermediate on the line of descent from Homo erectus to modern humans. Indeed, the Neanderthals increasingly look like an evolutionary dead end. Modern humans seem to owe most, if not all, of their ancestry to a demic expansion that started in East Africa some 80,000 years ago and began to spread out of Africa some 50,000 years ago. There may have been some intermixture with archaic humans already present in Europe and Asia, but even this scenario is looking more and more problematic. We can now compare mtDNA from late European Neanderthals with mtDNA from early modern Europeans and there is no measurable gene flow from the former to the latter (Caramelli et al., 2003). Perhaps some minor intermixture did occur here and there, enough to provide the modern European gene pool with a few advantageous Neanderthal genes. It could not have been greater, however, than the surreptitious insertion of certain viral and bacterial genes into the human genome.
Why, then, did scientists place the Neanderthals above Homo erectus, even to the point of classifying them as a subspecies of Homo sapiens? Because Neanderthal brains were so big, like ours. This resemblance now appears to be just a case of convergent evolution. When humans first spread out of Africa, some 1.8 to 1.7 million years ago, their brains increased in size wherever they entered non-tropical environments, apparently because such environments were mentally more demanding (need to cope with seasonal variation in temperature, to identify and/or create shelters, to hunt over larger and riskier territory, etc.). This in situ evolution seems to have progressed the furthest in populations that we call ‘Neanderthal.’
Then, 50,000 years ago, humans again spread out of Africa, probably because a change in their neural wiring gave them an edge over archaic humans already established in Eurasia. This second wave continued to evolve as it spread into different environments with different adaptive landscapes—a subject that will be the focus of an upcoming PNAS article by Greg Cochran, Henry Harpending, John Hawks, Robert Moyzis, and Eric Wang.
The second wave out of Africa is called Homo sapiens whereas the first wave is called Homo erectus. But these are just names that simplify reality. There was and has been considerable variation and evolution within both ‘species.’
References
Caramelli, D., Lalueza-Fox, C., Vernesi, C., Lari, M., Casoli, A., Mallegni, F., Chiarelli, B., Dupanloup, I., Bertranpetit, J., Barbujani, G., & Bertorelle, G. (2003). Evidence for a genetic discontinuity between Neandertals and 24,000-year-old anatomically modern Europeans. Proceedings of the National Academy of Sciences USA, 100, 6593-6597.
Rosenberg, K.R., Zuné, L., & Ruff, C.B. (2006). Body size, proportions, and encephalization in a Middle Pleistocene archaic human from northern China. Proceedings of the National Academy of Sciences USA, 103, 3552-3556. doi:10.1073/pnas.0508681103
Saturday, November 24, 2007
Sex linkage of human skin, hair, and eye color
Much of my writing has focused on sexual selection of women and how it may have structured certain pigmentary traits in our species—specifically by lightening skin color and by diversifying hair and eye color. Since most of the relevant alleles are not sex-linked, this selection would have spilled over on to men as well.
Yet, if this selection acted primarily on women, shouldn’t it have tended to favor sex-linked alleles that confine these pigmentary traits to females? All things being equal, wouldn’t such alleles have come to replace those that are not sex-linked? Indeed, Mother Nature loves organisms that don’t waste their energy on things they don’t need. We see this principle in the loss of pigmentation by organisms that live solely in dark caves. It’s not because albino skin is now more useful. It’s because pigmented skin is now useless and may be dispensed with.
Human skin color does show sex linkage. From puberty on, women are lighter-skinned than men in all human populations. This sexual dimorphism seems to be greater in populations that are medium in skin color, perhaps because floor and ceiling effects constrain its expression in populations that are either very dark or very light-skinned (Frost, 2006, pp. 54-60; Frost, 2007; Jablonski & Chaplin, 2000; Madrigal & Kelly, 2006). In women, lightness of skin correlates with thickness of subcutaneous fat, apparently because of a common hormonal causation and not because of a mechanical effect of fat on skin color (Mazess, 1967). It also correlates with digit ratio, which in turn correlates with prenatal estrogenization (Manning et al., 2004). It is this exposure to estrogen before birth that seems to “program” the lightening of female skin after puberty.
Hair color too shows some sex linkage. Hair is darker in girls than in boys before puberty and then lighter afterwards (Keiter, 1952; Leguebe & Twiesselmann, 1976; Olivier, 1960, p. 74; Steggerda, 1941). In a still unpublished British study, digit ratios were found to be higher in blond participants than in darker-haired ones. This finding, if true, suggests increased prenatal estrogenization among people with blond hair.
For eye color, we have no studies that track variation by sex and age. A study of Icelandic and Dutch adults found green eyes to be much more prevalent in women than in men (by at least a factor of two). Blue eyes were less prevalent and brown eyes somewhat more prevalent. The participants, however, seem to have been very heterogeneous for age. Many had been recruited for a prostate cancer study among the men or for a breast cancer study among the women (Sulem et al., 2007). Razib discusses this topic on ‘Brown eyed girl’ at GNXP. In the above unpublished British study, digit ratios were found to be higher in light-eyed participants than in brown-eyed ones. This finding, if true, suggests increased prenatal estrogenization among people with non-brown eyes.
References
Frost, P. (2007). Comment on Human skin-color sexual dimorphism: A test of the sexual selection hypothesis, American Journal of Physical Anthropology, 133, 779-781.
Frost, P. (2006). European hair and eye color - A case of frequency-dependent sexual selection? Evolution and Human Behavior, 27, 85-103.
Frost, P. (2005). Fair Women, Dark Men. The Forgotten Roots of Color Prejudice. Cybereditions: Christchurch (New Zealand).
Jablonski, N.G., and Chaplin, G. (2000). The evolution of human skin coloration. Journal of Human Evolution, 39, 57-106.
Keiter, F. (1952). Über „Nachdunkeln” und Vererbung der Haarfarben. Z. Morph. Anthrop. 44, 115-126.
Leguebe, A., & Twiesselmann, F. (1976). Variations de la couleur des cheveux avec l’âge. Z. Morph. Anthrop. 67, 168-180.
Madrigal, L., and Kelly, W. (2006). Human skin-color sexual dimorphism: A test of the sexual selection hypothesis. American Journal of Physical Anthropology, 132, 470-482.
Manning, J.T., Bundred, P.E., and Mather, F.M. (2004). Second to fourth digit ratio, sexual selection, and skin colour. Evolution and Human Behavior, 25, 38-50.
Mazess, R.B. (1967). Skin color in Bahamian Negroes. Human Biology, 39, 145‑154.
Olivier, G. (1960). Pratique anthropologique. Paris: Vigot Frères.
Sulem, P., D.F. Gudbjartsson, S.N. Stacey, A. Helgason, T. Rafnar, K.P. Magnusson, A. Manolescu, A. Karason, A. Palsson, G. Thorleifsson, M. Jakobsdottir, S. Steinberg, S. Pálsson, F. Jonasson, B. Sigurgeirsson, K. Thorisdottir, R. Ragnarsson, K.R. Benediktsdottir, K.K. Aben, L.A. Kiemeney, J.H. Olafsson, J. Gulcher, A. Kong, U. Thorsteinsdottir, and K. Stefansson. (2007). Genetic determinants of hair, eye and skin pigmentation in Europeans. Nature Genetics, published online 21 October 2007. doi:10.1038/ng.2007.13
Steggerda, M. (1941). Change in hair color with age. Journal of Heredity, 32, 402-403.
Friday, November 16, 2007
Natural selection in proto-industrial Europe
Do human populations vary statistically in their mental abilities and predispositions? And if so, why? The focus of such questions has lately shifted from the ‘macro’ to the ‘micro’ level: from differences that may have developed in prehistory between large continental populations to those that may have arisen in historic times between much smaller groups.
In part, the focus is now on the higher intellectual performance of Ashkenazi Jews (Cochran et al., 2006; Murray, 2007). It is also on the attitudinal and behavioural changes that ultimately sparked England’s Industrial Revolution. In a newly released book, Gregory Clark (2007) argues that natural selection gradually raised the English population to a threshold that made this economic sea-change possible, specifically by selecting for middle-class values of non-violence, thrift and foresight. The question remains, of course, as to why this threshold was first reached in England.
This new focus reflects a growing recognition that natural selection does not need aeons of time to change a population significantly (Eberle et al., 2006; Harpending & Cochran, 2002; Voight et al., 2006; Wang et al., 2006). Change can occur over a dozen generations, certainly during the last six millennia of history. This possibility is all the likelier given that these millennia have seen humans specialize in a wide range of occupations, some more mentally demanding than others. Charles Murray, for one, has argued that selection for intelligence was historically weaker in farming and stronger in sales, finance and trade.
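How much change can a dozen generations produce? A rough sketch with the textbook breeder's equation and hypothetical parameter values (nothing here is taken from the studies cited above):

```python
# Breeder's equation sketch: response per generation R = h2 * S.
# Heritability and selection differential below are hypothetical round numbers.

H2 = 0.4            # narrow-sense heritability of the trait (assumption)
SEL_DIFF = 0.2      # selection differential, in phenotypic standard deviations (assumption)
GENERATIONS = 12    # "a dozen generations", roughly three centuries

shift = sum(H2 * SEL_DIFF for _ in range(GENERATIONS))
print(f"Cumulative shift after {GENERATIONS} generations: {shift:.2f} SD")
# With these made-up values the population mean moves by about one standard
# deviation -- a noticeable change, produced by only modest selection each generation.
```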
Murray may be right. But the reason, I believe, lies not so much in the occupation itself as in its relations of production.
In the Middle Ages and earlier, farmers had little scope for economic achievement and just as little for the intelligence that contributes to achievement. Most farmers were peasants who produced enough for themselves, plus a surplus for the landowner. A peasant could produce a larger surplus, but what then? Sell it on the local market? The possibilities there were slim because most people grew their own food. Sell it on several markets both near and far? That would mean dealing with a lot of surly highwaymen. And what would stop the landowner from seizing the entire surplus? After all, it was his land and his peasant.
The situation changes when farmers own their land and sell their produce over a wide geographical area. Consider the "Yankee" farmers who spread westward out of New England in the 18th and 19th centuries. They contributed very disproportionately to American inventiveness, literature, education and philanthropy. Although they lived primarily from farming, they did not at all have the characteristics we associate with the word "peasant".
Conversely, trade and finance have not always been synonymous with high achievement. In the Middle Ages, the slow growth economy allowed little room for business expansion within one's immediate locality, and expansion further afield was hindered by brigandage and bad roads. Furthermore, the static economic environment created few novel situations that required true intelligence. How strong is selection for intelligence among people who deal with the same clients, perform the same transactions and charge the same prices year in and year out?
This point has a bearing on the reported IQ differences between Ashkenazi and Sephardic Jews. Charles Murray, like others, believes that the Ashkenazim were more strongly selected for intelligence because more of them worked in sales, finance and trade during the Middle Ages. Now, we have no good data on the occupations of medieval Ashkenazim and Sephardim. But the earliest censuses (18th century for Polish Jews and 19th century for Algerian Jews) show little difference, with the bulk of both groups working in crafts.
There was, however, one major demographic difference. While the Sephardim grew slowly in numbers up to the 20th century, the Ashkenazim increased from about 500,000 in 1650 to 10 million in 1900. The same period saw strong population growth among Europeans in general. This boom used to be attributed to falling death rates alone, but demographers now recognize that rising birth rates were also key, in some countries more so. England, in particular, saw a rise in fertility that contributed two and a half times as much to the increase in growth rates as did the fall in mortality, largely through a younger age of first marriage. This was how England overtook France in total population.
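A quick calculation of what those Ashkenazi figures imply, using only the numbers cited in the paragraph above:

```python
import math

# Implied long-run growth rate of the Ashkenazi population, 1650-1900,
# using only the figures cited above: 500,000 growing to 10 million over 250 years.
START, END, YEARS = 500_000, 10_000_000, 1900 - 1650

annual_rate = (END / START) ** (1 / YEARS) - 1
doubling_time = math.log(2) / math.log(1 + annual_rate)

print(f"Implied annual growth rate: {annual_rate:.2%}")          # about 1.2% per year
print(f"Implied doubling time:      {doubling_time:.0f} years")  # roughly 58 years
```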
The baby boom was particularly strong among one class of people: semi-rural artisans who produced for the larger, more elastic markets that developed with the expanding network of roads, canals and, later, railways. Their family workshops were the main means for mass-producing textiles, light metalwork, pottery, leather goods and wood furnishings before the advent of factory capitalism. Unlike the craft guilds of earlier periods, they operated in a dynamic economic environment that had few controls over prices, markets or entry into the workforce. "They were not specialized craftsmen in life-trades with skills developed through long years of apprenticeship; they were semi-skilled family labour teams which set up in a line of business very quickly, adapting to shifts in market demand" (Seccombe, 1992, p. 182). Their workforce was their household. In more successful households, the workers would marry earlier and have as many children as possible. In less successful ones, they would postpone marriage or never marry.
In Western Europe, these cottage industries were located in areas like Ulster, Lancashire, Yorkshire, Brittany, Flanders, Alsace, Westphalia, Saxony, the Zurich uplands, the Piedmont and Lombardy. In Eastern Europe, they were concentrated among Ashkenazi Jews. Selection for intelligence among the Ashkenazim may thus have been part of a larger European-wide selection for intelligence among cottage industry workers. These entrepreneurial artisans had optimal conditions for selection: 1) tight linkage between success on an intelligence-demanding task and economic achievement; 2) broad scope for economic achievement; 3) tight linkage between economic achievement and reproductive success; and 4) broad scope for reproductive success. Such artisans were a minority in Western Europe. Among the Ashkenazim, they appear to have been the majority.
In the late 19th century, cottage industries gave way to factories and the tight linkage between economic achievement and reproductive success came undone. Entrepreneurs could now expand production by hiring more workers. Henry Ford, for instance, produced millions of his Model T but had only one child.
Thus, it is not the type of occupation that drives selection for intelligence. It is the relations of production. In particular, do people own their means of production? Do they operate in a large, elastic market that rewards progressively higher levels of ability with commensurate increases in production? Finally, do they meet the increased demand for labour by increasing the size of their families?
References
Clark, G. (2007). A Farewell to Alms: A Brief Economic History of the World. Princeton (NJ): Princeton University Press.
Cochran, G., Hardy, J., & Harpending, H. (2006). Natural history of Ashkenazi intelligence. Journal of Biosocial Science, 38, 659-693.
Eberle, M.A., M.J. Rieder, L. Kruglyak, D.A. Nickerson. (2006). Allele frequency matching between SNPs reveals an excess of linkage disequilibrium in genic regions of the human genome. PLoS Genet 2(9), e142
Harpending, H., & Cochran, G. (2002). In our genes. Proceedings of the National Academy of Sciences, 99(1), 10-12.
Murray, C. (2007). Jewish Genius. Commentary, April.
Seccombe, W. 1992. A Millennium of Family Change. Feudalism to Capitalism in Northwestern Europe, London: Verso.
Voight, B.F., Kudaravalli, S., Wen, X., & Pritchard, J.K. (2006). A map of recent positive selection in the human genome. PLoS Biol, 4(3), e72
Wang, E.T., Kodama, G., Baldi, P., & Moyzis, R.K. (2006). Global landscape of recent inferred Darwinian selection for Homo sapiens, Proc. Natl. Acad. Sci USA, 103, 135-140.
In part, the focus is now on the higher intellectual performance of Ashkenazi Jews (Cochran et al., 2006; Murray, 2007). It is also on the attitudinal and behavioural changes that ultimately sparked England’s Industrial Revolution. In a newly released book, Gregory Clark (2007) argues that natural selection gradually raised the English population to a threshold that made this economic sea-change possible, specifically by selecting for middle-class values of non-violence, thrift and foresight. The question remains, of course, as to why this threshold was first reached in England.
This new focus reflects a growing recognition that natural selection does not need aeons of time to change a population significantly (Eberle et al., 2006; Harpending & Cochran, 2002; Voight et al., 2006; Wang et al., 2006). Change can occur over a dozen generations, certainly during the last six millennia of history. This possibility is all the likelier given that these millennia have seen humans specialize in a wide range of occupations, some more mentally demanding than others. Charles Murray, for one, has argued that selection for intelligence was historically weaker in farming and stronger in sales, finance and trade.
He may be right. But the reason, I believe, lies not so much in the occupation itself as in its relations of production.
In the Middle Ages and earlier, farmers had little scope for economic achievement and just as little for the intelligence that contributes to achievement. Most farmers were peasants who produced enough for themselves, plus a surplus for the landowner. A peasant could produce a larger surplus, but what then? Sell it on the local market? The possibilities there were slim because most people grew their own food. Sell it on several markets both near and far? That would mean dealing with a lot of surly highwaymen. And what would stop the landowner from seizing the entire surplus? After all, it was his land and his peasant.
The situation changes when farmers own their land and sell their produce over a wide geographical area. Consider the "Yankee" farmers who spread westward out of New England in the 18th and 19th centuries. They contributed very disproportionately to American inventiveness, literature, education and philanthropy. Although they lived primarily from farming, they did not at all have the characteristics we associate with the word "peasant".
Conversely, trade and finance have not always been synonymous with high achievement. In the Middle Ages, the slow growth economy allowed little room for business expansion within one's immediate locality, and expansion further afield was hindered by brigandage and bad roads. Furthermore, the static economic environment created few novel situations that required true intelligence. How strong is selection for intelligence among people who deal with the same clients, perform the same transactions and charge the same prices year in and year out?
This point has a bearing on the reported IQ differences between Ashkenazi and Sephardic Jews. Charles Murray, like others, believes that the Ashkenazim were more strongly selected for intelligence because more of them worked in sales, finance and trade during the Middle Ages. Now, we have no good data on the occupations of medieval Ashkenazim and Sephardim. But the earliest censuses (18th century for Polish Jews and 19th century for Algerian Jews) show little difference, with the bulk of both groups working in crafts.
There was, however, one major demographic difference. While the Sephardim grew slowly in numbers up to the 20th century, the Ashkenazim increased from about 500,000 in 1650 to 10 million in 1900. The same period saw strong population growth among Europeans in general. This boom used to be attributed to falling death rates alone, but demographers now recognize that rising birth rates were also key, in some countries more so. England, in particular, saw a rise in fertility that contributed two and a half times as much to the increase in growth rates as did the fall in mortality, largely through a younger age of first marriage. This was how England overtook France in total population.
The baby boom was particularly strong among one class of people: semi-rural artisans who produced for the larger, more elastic markets that developed with the expanding network of roads, canals and, later, railways. Their family workshops were the main means for mass-producing textiles, light metalwork, pottery, leather goods and wood furnishings before the advent of factory capitalism. Unlike the craft guilds of earlier periods, they operated in a dynamic economic environment that had few controls over prices, markets or entry into the workforce. "They were not specialized craftsmen in life-trades with skills developed through long years of apprenticeship; they were semi-skilled family labour teams which set up in a line of business very quickly, adapting to shifts in market demand" (Seccombe, 1992, p. 182). Their workforce was their household. In more successful households, the workers would marry earlier and have as many children as possible. In less successful ones, they would postpone marriage or never marry.
In Western Europe, these cottage industries were located in areas like Ulster, Lancashire, Yorkshire, Brittany, Flanders, Alsace, Westphalia, Saxony, the Zurich uplands, the Piedmont and Lombardy. In Eastern Europe, they were concentrated among Ashkenazi Jews. Selection for intelligence among the Ashkenazim may thus have been part of a larger, Europe-wide selection for intelligence among cottage industry workers. These entrepreneurial artisans had optimal conditions for selection: 1) tight linkage between success on an intelligence-demanding task and economic achievement; 2) broad scope for economic achievement; 3) tight linkage between economic achievement and reproductive success; and 4) broad scope for reproductive success. Such artisans were a minority in Western Europe. Among the Ashkenazim, they appear to have been the majority.
In the late 19th century, cottage industries gave way to factories and the tight linkage between economic achievement and reproductive success came undone. Entrepreneurs could now expand production by hiring more workers. Henry Ford, for instance, produced millions of his Model T but had only one child.
Thus, it is not the type of occupation that drives selection for intelligence. It is the relations of production. In particular, do people own their means of production? Do they operate in a large, elastic market that rewards progressively higher levels of ability with commensurate increases in production? Finally, do they meet the increased demand for labour by increasing the size of their families?
References
Clark, G. (2007). A Farewell to Alms: A Brief Economic History of the World. Princeton (NJ): Princeton University Press.
Cochran, G., Hardy, J., & Harpending, H. (2006). Natural history of Ashkenazi intelligence. Journal of Biosocial Science, 38, 659-693.
Eberle, M.A., M.J. Rieder, L. Kruglyak, D.A. Nickerson. (2006). Allele frequency matching between SNPs reveals an excess of linkage disequilibrium in genic regions of the human genome. PLoS Genet 2(9), e142
Harpending, H., & Cochran, G. (2002). In our genes. Proceedings of the National Academy of Sciences, 99(1), 10-12.
Murray, C. (2007). Jewish Genius. Commentary, April.
Seccombe, W. (1992). A Millennium of Family Change: Feudalism to Capitalism in Northwestern Europe. London: Verso.
Voight, B.F., Kudaravalli, S., Wen, X., & Pritchard, J.K. (2006). A map of recent positive selection in the human genome. PLoS Biology, 4(3), e72.
Wang, E.T., Kodama, G., Baldi, P., & Moyzis, R.K. (2006). Global landscape of recent inferred Darwinian selection for Homo sapiens, Proc. Natl. Acad. Sci USA, 103, 135-140.
Friday, November 9, 2007
Neanderthal Redheads?
Recently, a Spanish team analyzed the bones of two Neanderthals and recovered their MC1R gene—the one that controls hair color and, in the case of red hair, lightens skin color. Their findings? First, both Neanderthals had an MC1R variant unlike any found in modern humans today. This pours cold water on the idea that the unique hair colors of modern Europeans are due to Neanderthal intermixture. Second, the MC1R variant looks like a ‘loss-of-function’ allele, much like the ones that now produce red hair.
According to the lead author, “In Neanderthals, there was probably the whole range of hair colour we see today in modern European populations, from dark to blond right through to red” (Rincon, 2007). Well, maybe yes, but maybe no. Europeans have a diverse palette of hair colors because their MC1R gene has at least 11 functionally different alleles—and not because a new allele replaced the original African one. In fact, the ‘African’ allele is still common in Europe.
Will the Neanderthals be shown to have many MC1R variants, just like modern Europeans? The lead author seems to think so, as does Dr Clive Finlayson, director of the Gibraltar Museum, who says that Neanderthal and modern European hair color may reflect a common “propensity towards the reduction of melanin in populations away from the tropics. … a good example of parallel, or convergent evolution - a similar evolutionary response to the same situation” (Rincon, 2007). Melanin was reduced because there was less natural selection for dark skin (i.e., for protection against solar UV) and more for light skin (i.e., for vitamin D synthesis).
Neither selective pressure, however, would have diversified hair color, at least not that of modern Europeans:
1. When there is less selection for dark skin, ‘loss-of-function’ variants will proliferate at any gene associated with skin color. But such proliferation needs about a million years to produce the hair-color variability that Europeans now display, including c. 80,000 years to produce just the current prevalence of red hair (Harding et al., 2000; Templeton, 2002).
2. When there is more selection for light skin, the original allele at any one gene will be replaced by an allele that optimally reduces skin pigmentation. But the overall number of alleles will remain the same.
Moreover, if we examine the many homozygous and heterozygous combinations of hair color (MC1R) alleles, most have little visible effect on skin pigmentation, except for the ones that produce red hair (Duffy et al., 2004). It is difficult to see how either relaxed selection for dark skin or increased selection for light skin could have given rise to most of these alleles, especially over such a short span of evolutionary time.
It was probably another selective pressure that diversified European hair color. Sexual selection is especially likely because it is known to produce bright color traits, especially polymorphic ones.
Sexual selection is also indicated by the geographic distribution of hair-color diversity. During the last ice age, particularly on the steppe-tundra of northern and eastern Europe, a unique demographic environment intensified sexual selection of women by reducing the supply of men and by limiting polygyny. Hunters die in proportion to the distances they cover, and hunting distance was at a maximum on these open plains with their dispersed and highly mobile herds. Since women depended on men for sustenance, there being little food to gather on the tundra, only the ablest hunter could provide for multiple wives. Result: too many women competing for too few men. In a saturated, competitive market, success hinges on visual merchandising; therefore, a woman with a bright, novel color would have attracted attention and edged out otherwise equal but bland rivals. It is this sexual selection for both brightness and novelty that may have multiplied the number of hair color alleles among early modern Europeans.
This demographic environment was unknown to the Neanderthals. Unlike early modern Europeans, they never colonized the steppe-tundra of northern and eastern Europe under ice age conditions:
When temperatures declined during the Early Pleniglacial (OIS 4), the Neanderthals were apparently unable to cope with periglacial loess-steppe environments on the East European Plain. Much of the latter seems to have been abandoned by Neanderthals at this time, although some areas (notably the southwest regions) were reoccupied during the milder Middle Pleniglacial (OIS 3). By contrast, modern humans successfully colonized the periglacial loess-steppe during the terminal phases of OIS 3 and the subsequent Last Glacial Maximum (OIS 2). (Hoffecker, 2002, p. 136)
The Neanderthals also seem to have been characterized by relatively limited movements and small territories in comparison to recent hunter-gatherers in northern latitudes. (Hoffecker, 2002, p. 135).
These archaic humans seem to have occupied a niche that excluded hunting of herds over expanses of steppe-tundra. Among the many environments that modern humans colonized, this seems to be the one that most intensified sexual selection of women—by reducing both the supply of men and their demand for mates.
So I doubt that the Neanderthals had a wide range of hair colors. They probably all sported the same reddish coat. Coat? Yes, you read right. They were as furry as bears. How else did they survive in subzero temperatures without tailored clothing? (Hoffecker, 2002, pp. 107, 109, 135, 252). Moreover, both needles and the human body louse (which lives in clothing) seem to date back no earlier than 50,000 years ago, i.e., the transition from Neanderthals to modern humans (Harris, 2006).
References
Duffy, D.L., Box, N.F., Chen, W., Palmer, J.S., Montgomery, G.W., James, M.R., Hayward, N.K., Martin, N.G., and Sturm, R.A. (2004). Interactive effects of MC1R and OCA2 on melanoma risk phenotypes. Human Molecular Genetics, 13, 447-461.
Harding, R.M., Healy, E., Ray, A.J., Ellis, N.S., Flanagan, N., Todd, C., Dixon, C., Sajantila, A., Jackson, I.J., Birch-Machin, M.A., and Rees, J.L. (2000). Evidence for variable selective pressures at MC1R. American Journal of Human Genetics, 66, 1351-1361.
Harris, J. R. (2006). Parental selection: A third selection process in the evolution of human hairlessness and skin color. Medical Hypotheses, 66, 1053-1059.
Hoffecker, J.F. (2002). Desolate Landscapes. Ice-Age Settlement in Eastern Europe. New Brunswick: Rutgers University Press.
Lalueza-Fox, C., Römpler, H., Caramelli, D., Stäubert, C., Catalano, G., Hughes, D., Rohland, N., Pilli, E., Longo, L., Condemi, S., de la Rasilla, M., Fortea, J., Rosas, A., Stoneking, M., Schöneberg, T., Bertranpetit, J., Hofreiter, M. (2007). A Melanocortin 1 receptor allele suggests varying pigmentation among Neanderthals. Science. doi:10.1126/science.1147417.
Rincon, P. (2007). Neanderthals 'were flame-haired'. BBC.co.uk. October 25, 2007.
Templeton, A.R. (2002). Out of Africa again and again. Nature, 416, 45-51.
Wednesday, October 31, 2007
Not so elementary ...
Rummaging around my late mother’s home, I came across a “study in the problem of race” called The Clash of Colour. It dated back to the 1920s and reflected a kind of soft anti-racism that now seems quaint … and impossible.
The author, Basil Mathews, discusses the injustice of a world where whites control nine-tenths of the world’s habitable surface. He denounces the hypocrisy of demanding self-determination for Eastern Europeans but not for Africans and Asians. Turning his attention to African Americans, he speaks even more caustically about Jim Crow laws and lack of equal opportunity. Finally, near the end of the book, he quotes a resolution that the World’s Student Christian Federation adopted in 1922:
We, representing Christian students from all parts of the world, believe in the fundamental equality of all the races and nations of mankind and consider it as part of our Christian vocation to express this reality in all our relationships. (Mathews, 1925, p. 149)
Yet, strangely enough, the author’s antiracism co-exists with a belief in racial differences:
When we talk of the unity of man, we do not mean the uniformity of man. Race is real. It seems certain that—as Dr McDougall says—
“Racial qualities both physical and mental are extremely stable and persistent, and if the experience of each generation is in any manner or degree transmitted as modifications of the racial qualities, it is only in very slight degree, so as to produce any moulding effect only very slowly and in the course of generations.
I would submit the principle that, although differences of racial mental qualities are relatively small, so small as to be indistinguishable with certainty in individuals, they are yet of great importance for the life of nations, because they exert throughout many generations a constant bias upon the development of their culture and their institutions.” (Mathews, 1925, p. 151)
These sentiments sound disturbingly similar to those of Dr. James Watson, co-discoverer of the DNA double helix, who recently wrote, “there is no firm reason to anticipate that the intellectual capacities of peoples geographically separated in their evolution should prove to have evolved identically. Our wanting to reserve equal powers of reason as some universal heritage of humanity will not be enough to make it so.” For these words, Dr. Watson was roundly condemned and forced out of his position at the Cold Spring Harbor Laboratory.
James Watson published his book in 2007. Basil Mathews published his in 1925. Between these two dates, antiracism changed. What passed for progressive opinion in the 1920s is now inadmissible.
What happened? There was, to be sure, the world’s revulsion against Nazism. But that isn’t the whole story. Even in comparison to the 1970s, today’s antiracism has become much more radical.
Recently, President Bush pushed hard for an immigration reform that would have reduced white Americans to minority status by the year 2050 while pushing the population total to half a billion. Such a proposal would have been unthinkable thirty years ago. It certainly would not have come from a self-professed conservative. Today, in 2007, such demographic change evokes scarcely a murmur of protest from the left to the right of the political spectrum. America’s elites have converted almost entirely to the anti-racist worldview. And this ‘consensus’ is defended not by debate but by an absence of debate—by a systematic silencing of any dissidence, such as Dr. Watson's.
It’s unhealthy for any belief, however noble, to exist in an echo chamber of constant approval. This is, after all, what totalitarianism is about—the rise to hegemony of one opinion to the detriment of all others. Such is the state of antiracism today. It is no longer the voice of reason that speaks to the screams of bigotry and intolerance. By a strange role reversal, it has become the very thing it used to oppose.
Antiracism, I fear, is painting itself into a corner from which it cannot extricate itself and which it will have to defend with increasingly totalitarian methods. Is this re-enactment of history really necessary? Can we not learn from the past? Must we follow the same trajectory that other hegemonic beliefs have followed with the same tragic consequences?
References
Mathews, B. (1925). The Clash of Colour. A Study in the Problem of Race. London: Edinburgh House Press.
Watson, J.D. (2007). Avoid Boring People: Lessons from a Life in Science. New York: Knopf.
Monday, October 15, 2007
Why are Europeans whiter than North Asians?
In Europe, especially in the north and east, skin is unusually white, almost at the physiological limit of depigmentation, eyes are not only brown but also blue, gray, hazel or green, and hair is not only black but also brown, flaxen, golden or red. Are these color traits directly or indirectly due to selection for light skin at northern latitudes? If so, why are they absent in populations indigenous to similar latitudes in northern Asia and North America?
As one reader of this blog has argued, skies are more overcast in Europe than at similar latitudes in northern Asia and North America. Thus, ancestral Europeans would have experienced less selection for dark skin to protect against skin cancer and sunburn and more selection for light skin to increase synthesis of vitamin D. Since genes for hair and eye color have some effect on skin color, relaxation of selection for dark skin should have allowed defective alleles to proliferate at all pigmentation loci, including those for hair color and eye color.
Actually, at any given latitude, solar UV radiation is just as intense at ground level in Europe as it is in northern Asia and North America. (Jablonski & Chaplin, 2000; see also charts on: http://pages.globetrotter.net/peter_frost61z/European-skin-color.htm). At these latitudes, UV radiation is already weak, so a significant further reduction in solar UV requires continually overcast skies, such as exist only on the coastal fringe of northwestern Europe.
Moreover, it is doubtful that relaxed selection for dark skin could have diversified hair and eye color by allowing defective alleles to proliferate. Two papers have shown that such a scenario would have needed close to a million years to produce the hair-color and eye-color variability that Europeans now display, with the redhead alleles alone being c. 80,000 years old (Harding et al., 2000; Templeton, 2002). Yet modern humans have been in Europe for only 35,000 years or so.
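To see why the timing matters, it helps to restate those spans in generations. A minimal sketch follows; the only added assumption is a generation length of about 25 years, while the year figures are the ones cited above.

# Converting the cited time spans into generations.
# The 25-year generation length is an assumption made for illustration only.
generation_length = 25  # years per generation

spans = {
    "modern humans in Europe": 35000,
    "estimated age of the redhead (MC1R) alleles": 80000,
    "time needed to accumulate the observed hair/eye-color diversity": 1000000,
}

for label, years in spans.items():
    print(f"{label}: about {years // generation_length:,} generations")

Roughly 1,400 generations of residence in Europe, against some 3,200 needed just for the red-hair alleles and about 40,000 for the full range of variants: that is the core of the time problem.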
Instead of relaxed selection for dark skin, perhaps there was increased selection for light skin, notably to boost synthesis of vitamin D. This hypothesis solves the time problem but does not explain the increase in the number of MC1R and OCA2 alleles. Natural selection would have simply favored one allele at the expense of all others, i.e., whichever one optimally reduced skin pigmentation.
There are other problems with either hypothesis, or with any that attribute these color traits to weaker solar UV:
1) If we examine the many homozygous and heterozygous combinations of MC1R or OCA2 alleles, most have little visible effect on skin pigmentation, except for the ones that produce red hair or blue eyes (Duffy et al., 2004; Sturm & Frudakis, 2004).
2) If we consider the estimated time of origin of these color traits, at least two of them seem to have appeared long after modern humans had entered Europe's northern latitudes about 35,000 years ago. The whitening of European skin, through allelic changes at AIM1, is dated to about 11,000 years ago (Soejima et al., 2005). No less recent are allelic changes at other skin color loci and at the eye color gene OCA2 (Voight et al., 2006). Did natural selection wait over 20,000 years before acting?
Are there other forces of natural selection that might explain the 'European exception'? Loomis (1970) and Murray (1934) have argued that Europeans are lighter-skinned than indigenous populations at similar latitudes in northern Asia and North America because the latter obtain sufficient vitamin D in their diet from marine fish. This argument may hold true for the Inuit but not for the majority of indigenous populations that live within the zone of minimal UV radiation, essentially above 47° N (Jablonski & Chaplin, 2000). Most, in fact, live far from sea coastlines.
References
Duffy, D.L., Box, N.F., Chen, W., Palmer, J.S., Montgomery, G.W., James, M.R., Hayward, N.K., Martin, N.G., & Sturm, R.A. (2004). Interactive effects of MC1R and OCA2 on melanoma risk phenotypes. Human Molecular Genetics, 13, 447-461.
Harding, R.M., Healy, E., Ray, A.J., Ellis, N.S., Flanagan, N., Todd, C., Dixon, C., Sajantila, A., Jackson, I.J., Birch‑Machin, M.A., & Rees, J.L. (2000). Evidence for variable selective pressures at MC1R. American Journal of Human Genetics, 66, 1351‑1361.
Jablonski, N.G., & Chaplin, G. (2000). The evolution of human skin coloration. Journal of Human Evolution, 39, 57-106.
Loomis, W.F. (1970). Rickets. Scientific American, 223, 77-91.
Murray, F.G. (1934). Pigmentation, sunlight, and nutritional disease. American Anthropologist, 36, 438-445.
Soejima, M., Tachida, H., Ishida, T., Sano, A., & Koda, Y. (2005). Evidence for recent positive selection at the human AIM1 locus in a European population. Molecular Biology and Evolution, 23, 179-188.
Sturm, R.A., & Frudakis, T.N. (2004). Eye colour: portals into pigmentation genes and ancestry. Trends in Genetics, 20, 327-332.
Templeton, A.R. (2002). Out of Africa again and again. Nature, 416, 45-51.
Voight, B.F., Kudaravalli, S., Wen, X., & Pritchard, J.K. (2006). A map of recent positive selection in the human genome. PLoS Biology, 4(3), e72. doi:10.1371/journal.pbio.0040072
Monday, October 8, 2007
Male skin color and ruddiness
Several years ago, my main research interest was the difference in skin pigmentation between women and men. In a nutshell, women are paler in complexion and men browner and ruddier because the latter have more melanin and hemoglobin in their skin. This sex difference dominated skin color variability in earlier social environments; therefore, skin color may have become a visual cue for gender-specific responses (e.g., sexual attraction, gender identification, conflict readiness, social distancing, etc.).
In a rating study, I showed female subjects several pairs of male facial photos, and in each pair one of the faces had been made slightly darker than the other. The darker face was more likely to be preferred by the women in the estrogen-dominant phase of their menstrual cycle (i.e., the first two-thirds) than by those in the progesterone-dominant phase (i.e., the last third). This cyclic change in preference was absent in women on oral contraceptives and in women who were assessing pairs of female faces (Frost, 1994).
At no point in the cycle was the darker male face more popular than the lighter one. It was simply less often disliked during the estrogen-dominant phase. As I saw it, higher estrogen levels seemed to be disabling a negative response to darker individuals. This negative response might be a social-distancing mechanism that keeps conflict readiness at a higher level during social interaction with males.
My study left some questions unanswered. What component of male skin color was triggering this response? Was it ruddiness (hemoglobin) or brownness (melanin)? And exactly what feelings were being triggered?
Some recent findings suggest that the trigger may be male ruddiness and the feelings something akin to intimidation. In the 2004 Olympic Games, opponents in boxing, taekwondo, Greco-Roman wrestling, and freestyle wrestling were randomly assigned red or blue athletic uniforms. For all four competitions, the ones who wore red uniforms were significantly likelier to win. This phenomenon was investigated by Ioan et al. (2007), who asked participants to name the color of words on a computer screen and measured the response time. The men took significantly longer than the women to respond when the words were red. Reducing luminosity increased response time for both men and women, but the gender gap remained. The authors concluded:
Our data suggests that “seeing red” distracts men through a psychological rather than a perceptual mechanism. Such a mechanism would associate red with aggression or dominance and may have a long evolutionary history, as indicated by behavioural evidence from nonhuman primates and other species.
With respect to our species, they state:
In humans, the adult male is ruddier in complexion than the adult female and male hormones greatly increase blood circulation in the skin’s outer layers. Testosterone influences erythropoiesis during male puberty and a decline of testosterone with aging increases the risk of anemia. Furthermore, men with hypogonadism or those taking anti-androgenic drugs frequently have anemia. These data are consistent with a testosterone-dependent ruddiness of the male complexion, as seen in many other species where red coloration acts as a signal of male dominance.
It would be interesting to repeat the above study with female subjects at different phases of the menstrual cycle. I suspect that response time would be longer among subjects in the progesterone-dominant phase than among those in the estrogen-dominant phase. In other words, the gender gap may be due to estrogen disabling a conflict-readiness mechanism that uses ruddiness as a visual cue for male identity.
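A minimal sketch of how that comparison could be run, using entirely made-up response times; the numbers below are hypothetical and serve only to show the direction of the predicted effect.

from statistics import mean

# Hypothetical color-naming response times (in ms) for red words,
# split by menstrual-cycle phase. All values are invented for illustration.
rt_estrogen_phase = [612, 598, 640, 605, 587]
rt_progesterone_phase = [655, 671, 630, 662, 649]

difference = mean(rt_progesterone_phase) - mean(rt_estrogen_phase)
print(f"mean difference (progesterone - estrogen): {difference:.0f} ms")
# The prediction is that this difference should be positive: red would be
# a stronger distractor when progesterone dominates.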
References
Frost, P. (1994). Preference for darker faces in photographs at different phases of the menstrual cycle: Preliminary assessment of evidence for a hormonal relationship. Perceptual and Motor Skills, 79, 507-514.
Ioan, S., Sandulache, M., Avramescu, S., Ilie, A., & Neacsu, A. (2007). Red is a distractor for men in competition. Evolution and Human Behavior, 28, 285-293.
Saturday, September 29, 2007
Sexual selection and Arctic environments
My 2006 paper is often criticized on one point. If Arctic environments did intensify sexual selection of women among ancestral modern humans, and if this selection did create inter alia the color traits of Europeans (diversification of eye and hair color, extreme depigmentation of skin color), then why are these traits absent among the native peoples of northern Asia and North America? Surely they too are products of Arctic environments.
Yes, they are. But it was not Arctic environments per se that intensified sexual selection of women. It was essentially two changes to the sexual division of labor that, among hunter-gatherers, generally correlate with distance from the equator. First, hunting distance increases with decreasing numbers of game animals per square kilometer, thereby increasing male mortality. Second, food gathering decreases with longer winters, thereby increasing women’s reliance on men for provisioning and increasing the costs of polygyny for men. It is this combination of higher male mortality and limited polygyny that intensifies sexual selection of women.
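The first of these two changes can be put in rough quantitative terms. If game animals occur at a density of d animals per square kilometer, the area a band must search to encounter a fixed number of animals scales as 1/d, so the search radius grows roughly as one over the square root of d. The densities and the encounter target in the sketch below are purely illustrative, not data.

import math

# Illustrative only: how far a band must range to encounter a fixed number
# of game animals as prey density falls. All input values are hypothetical.
animals_needed = 100  # encounter target per season (hypothetical)

for density in [10.0, 1.0, 0.1, 0.01]:      # animals per km^2 (hypothetical)
    area = animals_needed / density          # km^2 to be searched
    radius = math.sqrt(area / math.pi)       # km, treating the search area as a circle
    print(f"density {density:g}/km^2 -> search radius of roughly {radius:.0f} km")

A hundredfold drop in prey density thus implies a roughly tenfold increase in the distances men must cover, which is the logic behind the link between open steppe-tundra and male mortality.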
These points are made by Hoffecker (2002, pp. 7-8):
Hunter-gatherer diet is strongly influenced by latitude and temperature. To begin with, energy demands increase significantly in cold climates and caloric intake in arctic environments may be as much as 30 percent higher than it is in tropical regions. The percentage of meat and fish in the diet of recent hunter-gatherers increases as temperature, moisture, and primary productivity decline, and equals or exceeds 80 percent among most peoples who live in areas with an effective temperature of 10 degrees C or less. …
The high protein-fat diet and hunting and fishing subsistence of hunter-gatherers in northern environments has major implications for foraging strategy. Although cold maritime settings often provide rich concentrations of aquatic resources that require limited mobility, hunter-gatherers in northern continental environments who subsist on terrestrial mammals must forage across large areas in order to secure highly dispersed and mobile prey. Among peoples who rely primarily on nonaquatic foods, there is a correlation between temperature and the average distance of residential moves and a related correlation between the percentage of hunted food in the diet and territory size. Another consequence of low temperatures and a high meat diet is that males procure most or all food resources, generating a more pronounced sexual division of labor.
Hunting distance and male food provisioning of women seem to be at a maximum in a special kind of Arctic environment: ‘continental’ steppe-tundra, where almost all food is in the form of highly dispersed and mobile game animals. Today, this environment is confined to the northern fringes of continental Eurasia and North America. During the last ice age, it lay further south and covered more territory. This was especially so in Europe. The Scandinavian icecap had pushed the steppe-tundra zone far to the south and on to the broad plains stretching from southwestern France through northern Germany and into eastern Europe. This combination of treeless tundra and temperate latitudes created an environment quite unlike the northern barrens we know today. As Jochim (1983, p. 214) notes: “The low-latitude tundras and park-tundras of glacial Europe were richer than any modern northern counterparts.” Long, intense sunlight favored a lush growth of mosses, lichens, and low shrubs that fed herds of large herbivores, mainly wild reindeer (a.k.a. caribou) but also mammoth, woolly rhinoceros, horse, bison, red deer, roe deer, aurochs, ibex, chamois, saiga antelope, muskox, giant deer, wild ass, elk, and wild boar (Butzer, 1964, p. 138).
Though substantial, this kind of biomass is a volatile food source. Caribou herds in Alaska fluctuate considerably in any one area, in part because they cover long distances in the space of one year but also because they go through long-term demographic cycles of expansion and contraction (Burch, 1972). Among caribou-dependent Inuit, “at least 1 period of hunger or starvation is part of the normal annual cycle” (Burch, 1972, p. 350). In good times, caribou herds do provide a bountiful food source, but at the price of continual camp moves and extensive reconnoitering on foot. This is the real man-killer in Arctic groups that have not yet domesticated reindeer, as Krupnik (1985, p. 126) notes when explaining the Chukchi’s low ratio of men to women:
The herdsmen guarded the herd on foot. There were no herd dogs, and reindeer were not used for transport during the summer months, so that the men had to travel with the herds over the tundra with a minimum of portable possessions. All of this must have sharply intensified the physical burdens on adult, able-bodied men, and caused a higher mortality rate and consequently a proportional decrease of their numbers in the population.
Thus, on the steppe-tundra of the last ice age, the population of human hunters was probably as volatile as its resource base, all the more so if one also considers the climatic oscillations during this period. Moreover, because this population was dispersed over a wide area, its density was not necessarily high enough for long-term viability. Hoffecker (2002, pp. 8-10) writes:
The high mobility requirements of northern continental environments not only incur added time and energy costs, but also carry potential social and reproductive costs for dispersed populations in such environments. Populations must maintain a minimum threshold density in order to remain viable and avoid extinction, and it is estimated that the “minimum equilibrium size” for a mating network of modern hunter-gatherers is between 175 and 475 individuals. The degree of dispersal of these individuals across the landscape (typically grouped into bands containing roughly 25 individuals) cannot exceed their ability to sustain a social network through at least periodic contact and aggregation.
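A quick calculation spells out what those figures mean on the ground. The band size and the 175-475 range come from the quoted passage; the band territory used at the end is a purely hypothetical value, added only to suggest the scale of dispersal involved.

import math

band_size = 25                        # individuals per band (from the passage above)
network_min, network_max = 175, 475   # minimum viable mating network (ibid.)

bands_min = math.ceil(network_min / band_size)   # 7 bands
bands_max = math.ceil(network_max / band_size)   # 19 bands
print(f"bands needed for a viable mating network: {bands_min} to {bands_max}")

# Hypothetical band territory, for illustration only (not from the source):
territory_per_band = 5000  # km^2
print(f"implied network area: {bands_min * territory_per_band:,} to "
      f"{bands_max * territory_per_band:,} km^2")

Even under generous assumptions, at least half a dozen bands had to stay within reach of one another across tens of thousands of square kilometers, which is why such a thinly spread population was so vulnerable to local extinction.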
All of these factors hindered sustained human occupation. When temperatures fell during the last glacial maximum (19,000-18,000 BP), depopulation seems to have occurred throughout the steppe-tundra zone, but more so in some areas than in others. Least affected were the warmer and moister areas in Western Europe and the Carpathian basin, where continuous occupation is well attested throughout the glacial maximum (Hoffecker 2002, p. 194). Most affected were the colder and drier areas in northern Asia, where the steppe-tundra zone lay close to the Arctic Circle and far from the Atlantic. Goebel (1999) writes:
During the last glacial maximum (19 to 18 kya), Siberia was devoid of human populations, except perhaps in small refuges like the southern Yenisei or Transbaikal region. … The central Asian steppe also lacks archaeological sites spanning the last glacial maximum, suggesting that increased aridity, lower temperatures, and a lack of woody plants severely limited human settlement in this region as well.
Bioproductivity was clearly lower in northern Asia. This factor, on top of other factors affecting the entire steppe-tundra zone (volatile resource base, critically low density of human population), made any human presence vulnerable to extinction during periods of environmental stress. There likely were repeated cycles of colonization, extinction, and recolonization.
In conclusion, substantial and continuous human settlement seems to have been confined to the European end of the steppe-tundra zone. Only there did all conditions fall into place for sustained sexual selection of women.
References
Burch, Jr., E.S. (1972). The caribou/wild reindeer as a human resource. American Antiquity, 37, 339-368.
Butzer, K.W. (1964). Environment and Archaeology. Chicago: Aldine.
Frost, P. (2006). European hair and eye color - A case of frequency-dependent sexual selection? Evolution and Human Behavior, 27, 85-103
Goebel, T. (1999). Pleistocene human colonization of Siberia and peopling of the Americas: An ecological approach. Evolutionary Anthropology, 8, 208‑227.
Hoffecker, J.F. (2002). Desolate Landscapes. Ice-Age Settlement in Eastern Europe. New Brunswick: Rutgers University Press.
Jochim, M.A. (1983). Palaeolithic cave art in ecological perspective. In Hunter-Gatherer Economy in Prehistory. A European Perspective. G. Bailey (Ed.). Cambridge: Cambridge University Press.
Krupnik, I.I. (1985). The male‑female ratio in certain traditional populations of the Siberian Arctic. Inuit Studies, 9, 115‑140.
Yes, they are. But it was not Arctic environments per se that intensified sexual selection of women. It was essentially two changes to the sexual division of labor that, among hunter-gatherers, generally correlate with distance from the equator. First, hunting distance increases with decreasing numbers of game animals per square kilometer, thereby increasing male mortality. Second, food gathering decreases with longer winters, thereby increasing women’s reliance on men for provisioning and increasing the costs of polygyny for men. It is this combination of higher male mortality and limited polygyny that intensifies sexual selection of women.
These points are made by Hoffecker (2002, pp. 7-8):
Hunter-gatherer diet is strongly influenced by latitude and temperature. To begin with, energy demands increase significantly in cold climates and caloric intake in arctic environments may be as much as 30 percent higher than it is in tropical regions. The percentage of meat and fish in the diet of recent hunter-gatherers increases as temperature, moisture, and primary productivity decline, and equals or exceeds 80 percent among most peoples who live in areas with an effective temperature of 10 degrees C or less. …
The high protein-fat diet and hunting and fishing subsistence of hunter-gatherers in northern environments has major implications for foraging strategy. Although cold maritime settings often provide rich concentrations of aquatic resources that require limited mobility, hunter-gatherers in northern continental environments who subsist on terrestrial mammals must forage across large areas in order to secure highly dispersed and mobile prey. Among peoples who rely primarily on nonaquatic foods, there is a correlation between temperature and the average distance of residential moves and a related correlation between the percentage of hunted food in the diet and territory size. Another consequence of low temperatures and a high meat diet is that males procure most or all food resources, generating a more pronounced sexual division of labor.
Hunting distance and male food provisioning of women seem to be at a maximum in a special kind of Arctic environment: ‘continental’ steppe-tundra, where almost all food is in the form of highly dispersed and mobile game animals. Today, this environment is confined to the northern fringes of continental Eurasia and North America. During the last ice age, it lay further south and covered more territory. This was especially so in Europe. The Scandinavian icecap had pushed the steppe-tundra zone far to the south and on to the broad plains stretching from southwestern France through northern Germany and into eastern Europe. This combination of treeless tundra and temperate latitudes created an environment quite unlike the northern barrens we know today. As Jochim (1983, p. 214) notes: “The low-latitude tundras and park-tundras of glacial Europe were richer than any modern northern counterparts.” Long intense sunlight favored a lush growth of mosses, lichens, and low shrubs that fed herds of large herbivores, mainly wild reindeer (a.k.a. caribou) but also mammoth, woolly rhinoceros, horse, bison, red deer, roe deer aurochs, ibex, chamois, saiga antelope, muskox, giant deer, wild ass, elk, and wild boar (Butzer, 1964, p. 138).
Though substantial, this kind of biomass is a volatile food source. Caribou herds in Alaska fluctuate considerably in any one area, in part because they cover long distances in the space of one year but also because they go through long-term demographic cycles of expansion and contraction (Burch, 1972). Among caribou-dependent Inuit, “at least 1 period of hunger or starvation is part of the normal annual cycle” (Burch, 1972, p. 350). In good times, caribou herds do provide a bountiful food source, but at the price of continual camp moves and extensive reconnoitering on foot. This is the real man-killer in Arctic groups that have not yet domesticated reindeer, as Krupnik (1985, p. 126) notes when explaining the Chukchi’s low ratio of men to women:
The herdsmen guarded the herd on foot. There were no herd dogs, and reindeer were not used for transport during the summer months, so that the men had to travel with the herds over the tundra with a minimum of portable possessions. All of this must have sharply intensified the physical burdens on adult, able-bodied men, and caused a higher mortality rate and consequently a proportional decrease of their numbers in the population.
Thus, on the steppe-tundra of the last ice age, the population of human hunters was probably as volatile as its resource base, all the more so if one also considers the climatic oscillations during this period. Moreover, because this population was dispersed over a wide area, its density was not necessarily high enough for long-term viability. Hoffecker (2002, pp. 8-10) writes:
The high mobility requirements of northern continental environments not only incur added time and energy costs, but also carry potential social and reproductive costs for dispersed populations in such environments. Populations must maintain a minimum threshold density in order to remain viable and avoid extinction, and it is estimated that the “minimum equilibrium size” for a mating network of modern hunter-gatherers is between 175 and 475 individuals. The degree of dispersal of these individuals across the landscape (typically grouped into bands containing roughly 25 individuals) cannot exceed their ability to sustain a social network through at least periodic contact and aggregation.
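The arithmetic in this passage is worth making concrete. Below is a minimal sketch in Python that uses only the figures from the quote, i.e., bands of roughly 25 individuals and a minimum mating network of 175 to 475 people; the per-band territory figure is a hypothetical assumption added purely for illustration and does not come from Hoffecker.

```python
import math

# Figures taken from the Hoffecker passage above.
BAND_SIZE = 25                        # individuals per band
MIN_NETWORK, MAX_NETWORK = 175, 475   # "minimum equilibrium size" of a mating network

for network_size in (MIN_NETWORK, MAX_NETWORK):
    bands_needed = math.ceil(network_size / BAND_SIZE)
    print(f"A network of {network_size} people needs at least {bands_needed} bands "
          f"in periodic contact.")

# Hypothetical illustration only: if each band ranged over, say, 2,000 square km
# of steppe-tundra, the 19 bands of the larger network would be spread across
# roughly 38,000 square km, a circle on the order of 220 km wide that members
# would have to cross in order to aggregate.
HYPOTHETICAL_TERRITORY_KM2 = 2_000
total_area = 19 * HYPOTHETICAL_TERRITORY_KM2
diameter = 2 * math.sqrt(total_area / math.pi)
print(f"About {total_area:,} square km in total, roughly {diameter:.0f} km across.")
```

Even under these generous assumptions, the point of the passage stands: the network remains viable only if widely scattered bands can keep meeting, and anything that stretches the distances between them threatens the whole population.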
All of these factors hindered sustained human occupation. When temperatures fell during the last glacial maximum (19,000-18,000 BP), depopulation seems to have occurred throughout the steppe-tundra zone, but more so in some areas than in others. Least affected were the warmer and moister areas in Western Europe and the Carpathian basin, where continuous occupation is well attested throughout the glacial maximum (Hoffecker 2002, p. 194). Most affected were the colder and drier areas in northern Asia, where the steppe-tundra zone lay close to the Arctic Circle and far from the Atlantic. Goebel (1999) writes:
During the last glacial maximum (19 to 18 kya), Siberia was devoid of human populations, except perhaps in small refuges like the southern Yenisei or Transbaikal region. … The central Asian steppe also lacks archaeological sites spanning the last glacial maximum, suggesting that increased aridity, lower temperatures, and a lack of woody plants severely limited human settlement in this region as well.
Bioproductivity was clearly lower in northern Asia. This, on top of the factors affecting the entire steppe-tundra zone (a volatile resource base and a critically low human population density), made any human presence vulnerable to extinction during periods of environmental stress. There were likely repeated cycles of colonization, extinction, and recolonization.
In conclusion, substantial and continuous human settlement seems to have been confined to the European end of the steppe-tundra zone. Only there did all conditions fall into place for sustained sexual selection of women.
References
Burch, Jr., E.S. (1972). The caribou/wild reindeer as a human resource. American Antiquity, 37, 339-368.
Butzer, K.W. (1964). Environment and Archaeology. Chicago: Aldine.
Frost, P. (2006). European hair and eye color - A case of frequency-dependent sexual selection? Evolution and Human Behavior, 27, 85-103.
Goebel, T. (1999). Pleistocene human colonization of Siberia and peopling of the Americas: An ecological approach. Evolutionary Anthropology, 8, 208‑227.
Hoffecker, J.F. (2002). Desolate Landscapes. Ice-Age Settlement in Eastern Europe. New Brunswick: Rutgers University Press.
Jochim, M.A. (1983). Palaeolithic cave art in ecological perspective. In Hunter-Gatherer Economy in Prehistory. A European Perspective. G. Bailey (Ed.). Cambridge: Cambridge University Press.
Krupnik, I.I. (1985). The male‑female ratio in certain traditional populations of the Siberian Arctic. Inuit Studies, 9, 115‑140.
Friday, September 21, 2007
Dear Ann Coulter
While killing time in a bookstore, I came across Godless: The Church of Liberalism by Ann Coulter. According to the blurb: “Liberals' absolute devotion to Darwinism, Coulter shows, has nothing to do with evolution's scientific validity and everything to do with its refusal to admit the possibility of God as a guiding force.”
It’s news to me that liberals are devoted to Darwinism. I’ve seen just as much Darwin-bashing on the political left as I’ve seen on the political right. More so, in fact.
But another thing bothered me about Ann’s book: her equation of religion with conservatism, especially in the realm of social issues. She is not alone in this regard. Most social conservatives seem to feel that religion is their one and only mainstay. It’s almost as if they feel that their values must be accepted on faith alone and cannot be defended by rational argument.
This point was made recently by columnist Heather Mac Donald:
… So in the American Conservative piece I wanted to offer some resistance to the assumption of conservative religious unanimity. I tried to point out that conservatism has no necessary relation to religious belief, and that rational thought, not revelation, is all that is required to arrive at the fundamental conservative principles of personal responsibility and the rule of law. I find it depressing that every organ of conservative opinion reflexively cheers on creationism and intelligent design, while delivering snide pot shots at the Enlightenment. Which of the astounding fruits of empiricism would these Enlightenment-bashers dispense with: the conquest of cholera and other infectious diseases, emergency room medicine, jet travel, or the internet, to name just a handful of the millions of human triumphs that we take for granted?
Is Heather Mac Donald less of a social conservative because she is not religious? Conversely, is the political right more socially conservative when it talks the talk of religious folk? Heather addressed these questions in her recent article “What is Left? What is Right?”:
Skeptical conservatives--one of the Right's less celebrated subcultures--are conservatives because of their skepticism, not in spite of it. They ground their ideas in rational thinking and (nonreligious) moral argument. And the conservative movement is crippling itself by leaning too heavily on religion to the exclusion of these temperamentally compatible allies. Conservative atheists and agnostics support traditional American values. They believe in personal responsibility, self-reliance, and deferred gratification as the bedrock virtues of a prosperous society. They view marriage between a man and a woman as the surest way to raise stable, law-abiding children. They deplore the encroachments of the welfare state on matters best left to private effort.
They also find themselves mystified by the religiosity of the rhetoric that seems to define so much of conservatism today. Our Republican president says that he bases "a lot of [his] foreign policy decisions" on his belief in "the Almighty" and in the Almighty's "great gifts" to mankind. What is one to make of such a statement? According to believers, the Almighty's actions are only intermittently scrutable; using them as a guide for policy, then, would seem reckless.
Over thirty years ago, social conservatives hitched their wagon to religion through groups like the Moral Majority. They succeeded electorally, being key to the election of several right-of-center governments. Indeed, they—and not economic conservatives—have been the main voting base for such governments. Yet they have lost out to economic conservatives in shaping public policy. How come?
It’s just that most people out there don’t believe in the Bible. To win them over, you have to come up with something better than: “Because the Bible says so!” Faith-based arguments simply don’t cut it when the time comes to present talking points and influence policymaking.
Ann, it’s not because of Darwin that social conservatives today have so little impact. The fault lies more in the dubious alliances they’ve made in order to elect governments more responsive to their concerns. It also lies in not having arguments that make sense to secular people. Before lashing out at Darwin, you should take a cold hard look at this faith-based strategy. It isn’t working.
Friday, September 14, 2007
When did Europeans become 'white'?
It is often assumed that Europeans have always looked much like they do now. Even Neanderthals are often depicted as white folk who need a shave and a haircut. Yet, clearly, Europeans have not always been European. At some point in time, their ancestors came from somewhere else and looked like people from somewhere else.
The current thinking is that modern humans arrived in Europe about 35,000 years ago by way of the Middle East and ultimately from Africa. When did these proto-Europeans begin to look like their present-day descendants? Probably long after. The current phenotype seems to reflect later in situ changes to morphology and coloration that are still far from uniform among Europeans.
In a study of prehistoric European skeletons, Holliday (1997) found that early Upper Paleolithic skeletons had ‘tropical’ body proportions and clustered with recent Africans, whereas late Upper Paleolithic and Mesolithic skeletons clustered with recent Europeans. Holliday (1997) places the shift to a more European phenotype at around 20,000 BP, during the last ice age and well after the arrival of modern humans.
Even later are the changes to European skin color, eye color, and hair color. Whitening of the skin, through allelic changes at the AIM1 gene, is dated to about 11,000 years ago (Soejima et al., 2005). Allelic changes at other skin color loci are similarly dated to the late Upper Paleolithic or even the Holocene (Voight et al., 2006). No less recent is diversification of eye color alleles at the OCA2 gene (Voight et al., 2006). Diversification of hair color alleles at the MC1R gene has yet to be reliably dated but is likely contemporaneous. This whitening of the skin and diversification of eye and hair color clearly constitute an in situ evolution within Europe, being most prominent within a zone centered on the East Baltic and covering the north and the east. Within this zone, skin is unusually white, almost at the physiological limit of depigmentation, eyes are not only brown but also blue, gray, hazel or green, and hair is not only black but also brown, flaxen, golden or red. As one moves further east and south, these color traits disappear and merge into the standard human pattern of dark skin, brown eyes, and black hair.
If I had unlimited funding, I would like to retrieve nuclear DNA from European skeletal remains that date from the arrival of modern humans (c. 35,000 BP) to the near present (c. 5,000 BP). I would then study changes in genes for skin color, eye color, hair color, hair length and form, and facial morphology. My hunch is that all of these changes took place within a narrow time-frame, most likely between the glacial maximum (c. 20,000-15,000 BP, when Europe was largely depopulated) and the end of the last ice age (c. 10,000 BP).
These changes were driven not so much by adaptation to the natural environment as by intense female-female competition for mates. Men were in short supply among early Europeans, especially among those who pursued reindeer herds across the northern and eastern plains. This was because of 1) very long hunting distances, which greatly increased death rates among young men; and 2) limited opportunities for food gathering, which made women dependent on men for provisioning and thus ruled out polygyny for all but the ablest of hunters. With too many women competing for too few men, conditions were optimal for sexual selection of women. Men were able to translate their subtlest preferences into mate choice. Intense sexual selection is particularly indicated by a shift to brightly colored traits, especially color polymorphisms—such as those of eye and hair color (Frost, 2006).
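To make this mate-market logic explicit, here is a toy sketch in Python. The mortality and polygyny figures are hypothetical assumptions chosen only to show how the two factors combine into a surplus of women; they are not estimates from the archaeological or ethnographic record.

```python
# Toy model: excess male mortality plus a low ceiling on polygyny leaves some
# women without a provisioning mate. All numbers are illustrative assumptions.

def surplus_women(male_mortality, polygyny_rate):
    """Fraction of adult women left without a provisioning mate.

    Start from 100 women and 100 men reaching adulthood; remove men lost to
    hunting mortality; let a fraction of the surviving men support a second wife.
    """
    women = 100.0
    men = 100.0 * (1.0 - male_mortality)
    wives_supported = men * (1.0 + polygyny_rate)   # each polygynous man adds one wife
    return max(0.0, (women - wives_supported) / women)

for mortality in (0.10, 0.25, 0.40):    # hypothetical excess male death rates
    for polygyny in (0.0, 0.05):        # fraction of men able to support a second wife
        frac = surplus_women(mortality, polygyny)
        print(f"male mortality {mortality:.0%}, polygyny {polygyny:.0%}: "
              f"{frac:.0%} of women without a provider")
```

Under these illustrative numbers, the surplus of unprovisioned women ranges from a few percent to well over a third, which is the kind of imbalance that would let male mate choice drive sexual selection of women.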
How did Europeans look previously? They probably looked distinctly non-European. Holliday, as already discussed, noted the ‘tropical’ and even African appearance of early modern humans in Europe. Other anthropologists have noted the same, particularly in relation to a pair of skeletons discovered in 1901 at Grimaldi, northern Italy. The skeletons were initially dated to the beginnings of modern human occupation in Europe, c. 30,000 BP. Associated artifacts have since been radiocarbon dated to 14,000-19,000 BP but may come from more recent layers of occupation (Bisson et al., 1996).
These skeletons exhibit an array of dental and morphological characteristics normally found in sub-Saharan Africans. As Boule and Vallois (1957, p. 285) report:
When we compare the dimensions of the bones of their limbs, we see that the leg was very long in proportion to the thigh, the forearm very long in proportion to the whole arm; and that the lower limb was exceedingly long relative to the upper limb. Now these proportions reproduce, but in greatly exaggerated degree, the characters presented by the modern Negro. …
The skulls likewise look non-European. The face is wide but not high. The nose is broad and flat. The upper jaw projects forward whereas the chin is weakly developed. The well-preserved dentition is not at all European. Among currently living populations, the ones who most closely resemble the Grimaldi humans seem to be the Khoisan peoples of southern Africa. Boule and Vallois (1957, pp. 290-291) write:
For our part, we have been greatly struck by the resemblances these Grimaldi Negroids bear to the group of South African tribes, the Bushmen and the Hottentots. Comparisons which we have been able to make with the material at our disposal, in particular with the skeleton of the Hottentot Venus, have led us to note, for instance, the same dolichocephalic character, the same prognathism, the same flattening of the nose, the same development of the breadth of the face, the same form of jaw, and the same great size of teeth. The only differences are to be found in the stature and perhaps in the height of the skull.
We know less about their soft-tissue characteristics. Alongside the skeletons were a number of female statuettes with big breasts, protruding bellies, full hips, and large buttocks. The hair seems to be short and matted (Boule & Vallois, 1957, p. 311).
These Grimaldi humans may have been ancestral to later European populations:
Verneau has investigated the survivals of the Grimaldi race at different prehistoric periods. He has first of all compared this type with the Cro-Magnon, which succeeded it in place. ‘At first sight’, he says, ‘the two races appear to differ greatly from each other; but on examining them in detail, we see that there is no reason why they should not have had some ties of kinship.’ Verneau even declares that the Grimaldi Negroids ‘may have been the ancestors of the hunters of the Reindeer Age’. (Boule & Vallois, 1957, p. 291).
Interestingly, Grimaldi-like humans are reported to have persisted in parts of Europe as late as the Neolithic:
Verneau likewise discovered, in both prehistoric and modern races, survivals or reappearances of the Grimaldi types.
‘In Brittany, as well as in Switzerland and in the north of Italy, there lived in the Polished Stone period, in the Bronze Age and during the early Iron Age, a certain number of individuals who differed in certain characters from their contemporaries’, in particular in the dolichocephalic character of their skull, in possessing a prognathism that was sometimes extreme, and a large grooved nose. This is a matter of partial atavism which in certain cases, as in the Neolithic Breton skull from Conguel, may attain to complete atavism. Two Neolithic individuals from Chamblandes in Switzerland are Negroid not only as regards their skulls but also in the proportions of their limbs. Several Ligurian and Lombard tombs of the Metal Ages have also yielded evidences of a Negroid element.
Since the publication of Verneau’s memoir, discoveries of other Negroid skeletons in Neolithic levels in Illyria and the Balkans have been announced. The prehistoric statues, dating from the Copper Age, from Sultan Selo in Bulgaria are also thought to portray Negroids. In 1928 René Bailly found in one of the caverns of Moniat, near Dinant in Belgium, a human skeleton of whose age it is difficult to be certain, but which seems definitely prehistoric. It is remarkable for its Negroid characters, which give it a resemblance to the skeletons from both Grimaldi and Asselar.
It is not only in prehistoric times that the Grimaldi race seems to have made its influence felt. Verneau has been able to see, now in modern skulls and now in living subjects, in the Italian areas of Piedmont, Lombardy, Emilia, Tuscany, and the Rhone Valley, numerous characters of the old fossil race (Boule & Vallois, 1957, pp. 291-292).
Although the concept of atavism or ‘throwback’ is no longer widely accepted, there may have been some human groups in Europe that still looked African long after most had moved away from this phenotype. Indeed, if sexual selection were the cause, the phenotypic transformation should have occurred unevenly, beginning among populations on the former steppe-tundra of northern and eastern Europe, and then percolating outward through gene flow. In some peripheral regions, the transformation may still have been incomplete at the dawn of history.
Observations similar to those of Boule and Vallois have appeared elsewhere in the literature. Angel (1972) noted that 14% of skeletal samples from early Neolithic Greece displayed apparently Negroid traits, in contrast to later periods.
To be sure, there is a lot of pooh-poohing in the literature about the Grimaldi skeletons. Some say that the skeletal restoration must have been defective, or that the pressure of overlying layers had distorted the skull and the jaw, or that the apparently Negroid traits are of the sort that occur sporadically in Europeans.
For Carleton Coon (1962, p. 577), Europeans were ‘Caucasoid’ throughout the entire Upper Paleolithic:
There was, in fact, only one Upper Paleolithic European race. It was Caucasoid and it inhabits Europe today. We know this not only from skeletons but also from the representations of the human body in Upper Paleolithic art.
With reference to the Grimaldi skeletons, specifically their dentition, Coon (1962, p. 584) states:
These are dental characteristics of the Negro, but not exclusively. They are also seen on a number of teeth from Krapina and on those of Neanderthals, and are also present, as we have just mentioned, in the Mount Carmel population. An upper canine from the Magdalenian maxilla of Farincourt has the same features. The Grimaldi child was no more Negroid than the Palestinians of Skhul and many living Europeans of the Mediterranean region.
These statements are true, more or less. The dentition of sub-Saharan Africans does conserve many archaic characteristics that are absent in other modern humans but are present in Neanderthals, in the Skhul-Qafzeh hominins, and in other hominids (Irish, 1998). But the Grimaldi skeletons are clearly modern human. And while it is true that individual Negroid traits occur sporadically in living Europeans, it would be unusual, very unusual, for all of them to co-occur in a single living European. Finally, as Coon himself points out, other European skeletons from the Upper Paleolithic also show these traits.
Carleton Coon believed in the multiregional model of human evolution. He felt that European modern humans evolved out of European Neanderthals. So they could not have come from elsewhere. The Grimaldi skeletons, and others like them, must be an aberration … unless one accepts the ‘Out-of-Africa’ model. From the standpoint of this second model, we have one more piece in a puzzle linking modern Europeans to a demographic expansion that began to spread out of Africa some 50,000 years ago and that reached Europe about 35,000 years ago.
According to the Out-of-Africa model, the first modern humans in Europe could not have looked ‘white’. Indeed, they probably did not for at least the next 15,000 years. Their physical appearance seems to have changed, and radically so, within a later and relatively narrow time-frame, probably the second half of the last ice age.
The cause? It does not seem to have been natural selection, i.e., gradual adaptation to the ecological conditions of Europe. As I have discussed elsewhere, the cause is more consistent with an intensification of sexual selection of women, as a result of the unusually strong female-female competition for mates that prevailed among herd-hunting peoples in ice-age Europe. Since most genes are not sex-linked, this selection spilled over onto members of both sexes, thus modifying the appearance of the entire population (Frost, 2006).
References
Angel, J.L. (1972). Review of Blacks in Antiquity. American Anthropologist, 74, 159-160.
Bisson, M.S., Tisnerat, N., & White, R. (1996). Radiocarbon dates from the Upper Paleolithic of the Barma Grande. Current Anthropology, 37, 156–162.
Boule, M. & Vallois, H.V. (1957). Fossil Men. New York: Dryden Press.
Coon, C.S. (1962). The Origin of Races. New York: Alfred A. Knopf.
Frost, P. (2006). European hair and eye color - A case of frequency-dependent sexual selection? Evolution and Human Behavior, 27, 85-103. http://www.sciencedirect.com/science/journal/10905138
Holliday, T.W. (1997). Body proportions in Late Pleistocene Europe and modern human origins. Journal of Human Evolution, 32, 423-447.
Irish, J.D. (1998). Ancestral dental traits in recent Sub-Saharan Africans and the origins of modern humans. Journal of Human Evolution, 34, 81-98.
Soejima, M., Tachida, H., Ishida, T., Sano, A., & Koda, Y. (2005). Evidence for recent positive selection at the human AIM1 locus in a European population. Molecular Biology and Evolution, 23, 179-188.
Voight, B.F., Kudaravalli, S., Wen, X., & Pritchard, J.K. (2006). A map of recent positive selection in the human genome. PLoS Biology, 4(3), e72. doi:10.1371/journal.pbio.0040072
Friday, September 7, 2007
Whither I?
I have almost finished writing a manuscript that will follow up on an earlier article published in 2006: "European hair and eye color - A case of frequency-dependent sexual selection?" Evolution and Human Behavior 27: 85-103. At this point, I should start planning my research sabbatical.
The research project itself is pretty much decided on. I wish to replicate a study I had published in 1994: "Preference for darker faces in photographs at different phases of the menstrual cycle: Preliminary assessment of evidence for a hormonal relationship", Perceptual and Motor Skills 79: 507-514. In this earlier study, I showed women several pairs of male facial photos, one of which had been made slightly darker than the other. Preference for the darker face was significantly higher among subjects in the estrogen-dominant phase of their menstrual cycle (i.e., the first two-thirds) than among those in the progesterone-dominant phase (i.e., the last third). This cyclic effect was absent in women on oral contraceptives and in women who were assessing pairs of female faces.
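For readers curious about what the comparison amounts to statistically, here is a hypothetical sketch in Python. The counts are invented for illustration and are not the published data; the point is only that the design reduces to comparing the proportion of darker-face choices across cycle phases.

```python
# Hypothetical illustration of the kind of comparison made in the 1994 study:
# does the proportion of women preferring the darker face differ between the
# estrogen-dominant and progesterone-dominant phases? Counts below are invented.
from scipy.stats import chi2_contingency

estrogen_phase = [55, 45]       # hypothetical counts: [prefer darker, prefer lighter]
progesterone_phase = [38, 62]   # hypothetical counts: [prefer darker, prefer lighter]

chi2, p, dof, expected = chi2_contingency([estrogen_phase, progesterone_phase])
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")   # a small p suggests a phase effect
```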
This study is poorly known, even among people who are interested in this sort of thing. It was pre-Internet (and is still unavailable online) and came out in a second-tier journal. Perhaps more importantly, I was working on my own, with nobody to help pitch my findings to a wider audience.
A team of psychologists at St. Andrews University (Scotland) replicated most of my findings in 2005 before learning about my earlier study (Jones, B.C., Perrett, D.I., Little, A.C., et al. 2005. "Menstrual cycle, pregnancy and oral contraceptive use alter attraction to apparent health in faces", Proc. R. Soc. B 272:347-354). As part of a broader research effort—the effects of “apparent health” on mate choice—they presented women with pairs of male faces that slightly differed in shape, color, and texture. Their subjects’ preferences showed the same variation with menstrual cycle that I had observed. Unfortunately, they did not try to identify which physical difference was driving this cyclic change in preference. It was probably the difference in skin color, but it might also have been the difference in skin texture or facial shape.
I hope to replicate my earlier study, but this time with a much larger pool of subjects and better photos (color and not B/W). I also hope to determine which skin pigment is driving this cyclic change. Is it male ruddiness (i.e., hemoglobin) or male brownness (i.e., melanin)? And just what component of ‘preference’ is being affected? Just what are the psychological effects of minor variation in skin color within the context of male-female interaction?
The obvious research location would be St. Andrews University. In fact, I had earlier applied to do research there and been accepted. But that was three years ago. A number of personal problems have since intervened, notably the settlement of my late mother’s estate. Now, with these problems out of the way, I have to reassess things.
Going to St. Andrews offers several advantages:
1. The investigators there are already familiar with this line of research.
2. Their research center has state-of-the-art equipment and is staffed with some of the best minds in cognitive psychology.
3. Language would not be a problem. I could easily find my way around after a short period of adaptation. The British are helpful people and make good research associates.
But there are disadvantages:
1. Academics at St. Andrews are busy people who are already overwhelmed with their own research work. I remember a piece of advice I was given during my doctoral studies: “It’s better to choose a supervisor who is less prestigious but who will spend time with you and be willing to go to bat for you, especially for things like grant proposals and job openings.”
2. The investigators there are working within a paradigm that differs somewhat from my own. They see skin color as an index of health (pale skin = bad health, dark skin = good health, cf. Hamilton and Zuk hypothesis). My feeling is that other mental algorithms are involved. This is something of a fault-line between “adaptationist” and “non-adaptationist” views of mate choice and sexual selection. If I’m not careful, I could be stigmatized as a “non-adaptationist.”
3. The offer from St. Andrews did not come with a scholarship. I would have to pay my own expenses in a country that has one of the highest costs of living in the world.
4. The word “skin color” automatically calls to mind issues of race and ethnicity. This is especially so in Great Britain. I could easily be caricatured as a “race scientist” or some such animal.
Wednesday, August 29, 2007
Sexual Selection and Human Phenotypic Variation
Does sexual selection explain many phenotypic differences among human populations? This hypothesis was first put forward by Charles Darwin and has been given a new twist by Henry Harpending. The following is a question raised about its falsifiability on the hbd mailing list and my reply.
*********************************************************
I've been thinking about Prof. Henry Harpending's paper "Human Diversity and its History" (see online copy at http://harpend.dsl.xmission.com/Documents/prize.pdf). You probably are familiar with its ideas already but I will recap and you can thus check whether I understand it correctly.
Essentially Harpending argues that human phenotype differences have their origin in sexual selection and that this process is inherently conservative. That's 'conservative' in contrast to 'normal' gene flow processes. So external phenotype characteristics are more likely to stay in place than "neutral" or "unseen" (and thus not sexually selected) characteristics. Thus phenotypes are more likely to reflect the actual human migration history than analysis of genotypes or other neutral measures, e.g., blood types, etc.
This hypothesis, if true, means that the last generation or so of 20th-century physical anthropologists (e.g., Hooton, Coon, Howells, Birdsell) who took metrical analysis of phenotype differences to the max were "barking up the right tree", and a subsequent generation who focused on measuring phenotypically neutral traits (esp. blood, etc.) may have been "barking up the wrong tree".
I'm slowly coming to my point so forgive me for dragging it out.
As best as I can tell the Harpending hypothesis here doesn't outline any prospective falsification tests. I was wondering if you had heard of, or had any ideas that could be used to test this elegant hypothesis?
I'm interested in this as Harpending's hypothesis would seem to me to be in a position to bolster Joseph Birdsell's trihybrid hypothesis of the historical origins of the Australian aboriginals, although of course, if correct, it would have universal applicability.
Tim Gillin
***********************************************************
There are two tests:
1. Look at human populations with intense male-male competition for mates. This kind of mate competition only partly selects for physical traits preferred by the sex in short supply (females). It also selects for physical traits that help males intimidate or fight off other males (e.g., increased body size, higher bone density, larger muscle mass, higher testosterone levels, etc.).
2. Look at human populations with intense female-female competition for mates. This kind of mate competition selects more for those physical traits preferred by the sex in short supply (males). Such traits will stimulate mate-choice algorithms or any mental algorithm that monitors the visual environment. The selected traits thus tend to be vividly colored. At high intensities of mate competition, "color polymorphisms" will develop: novel colors will have a slight edge over less novel ones, so that vividly colored phenotypes will not only proliferate but also diversify. When any one phenotype becomes too common, the selective pressure shifts to others that are rarer and more novel.
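A minimal simulation sketch in Python of this rare-color advantage follows. The fitness rule and its parameters are illustrative assumptions, not values taken from any cited study; the point is simply that mating success which declines as a color variant becomes common keeps several variants in the population rather than letting one go to fixation.

```python
import random

def next_generation(freqs, strength=0.5, pop=10_000):
    """One generation in which a color variant's mating success falls as it becomes common."""
    weights = {v: (1.0 - f) ** strength for v, f in freqs.items()}   # rarer = higher success
    total = sum(freqs[v] * weights[v] for v in freqs)
    expected = {v: freqs[v] * weights[v] / total for v in freqs}     # post-selection frequencies
    draws = random.choices(list(expected), weights=list(expected.values()), k=pop)  # adds drift
    return {v: draws.count(v) / pop for v in freqs}

freqs = {"brown": 0.95, "blue": 0.04, "green": 0.01}   # start with one overwhelmingly common variant
for _ in range(200):
    freqs = next_generation(freqs)
print({v: round(f, 2) for v, f in freqs.items()})      # frequencies end up roughly even
```

Whenever one color becomes common, its mating success drops below that of the rarer colors, which then gain ground: the 'novel colors have a slight edge' dynamic described in point 2.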
In most species, the first scenario is much more common. Sexual selection is usually about too many males competing for too few females. This is because reproduction predisposes females to invest more in their offspring, particularly during pregnancy and early infant care. During these times of life, females are unavailable for reproduction and drop out of the "mate market." Unless males can match female reproductive investment, they can best serve their reproductive interests by inseminating other females. So, at any given moment, too many males will be competing for too few females.
Unlike the situation in most species, human males have the potential to match female reproductive investment, in part because their offspring are dependent for a longer time and in part because humans have colonized temperate and arctic environments where women are less able to provide for themselves through food gathering. This 'paternal investment' is less important among the agricultural peoples of sub-Saharan Africa and New Guinea, where year-round agriculture enables women to be self-sufficient. Women even become net providers of food. The costs of polygyny thus become negative and men best serve their reproductive interests by acquiring as many wives as possible.
So we can test the Harpending hypothesis by comparing low polygyny/high paternal investment populations with high polygyny/low paternal investment populations. This kind of comparison was done when Winkler and Christiansen (1993) studied two Namibian peoples, the !Kung (hunter-gatherers and weakly polygynous) and the Kavango (agriculturalists and highly polygynous). The latter were found to have markedly higher levels of both total testosterone and DHT, as well as a much more robust physique. The authors suggested that this hormonal difference may account for the !Kung’s neotenous appearance, i.e., sparse body hair, small stature, pedomorphic morphology, and light yellowish skin.
Reference
Winkler, E-M., & Christiansen, K. (1993). Sex hormone levels and body hair growth in !Kung San and Kavango men from Namibia. American Journal of Physical Anthropology, 92, 155-164.