Saturday, June 27, 2015

Young, male, and single


 
The Babylonian Marriage Market, by Edwin Long (1829-1891). There are too many young men on the mate market, particularly in the White American community.

 

It sucks being young, male, and single. Don't think so? Go to the Interactive Singles Map of the United States and see how it looks for the 20 to 39 age group. Almost everywhere single men outnumber single women.

And the real picture is worse. For one thing, the imbalance is greater among singles without children. This is not a trivial factor, since single mothers are "single" only in the sense of being available for sexual relations. They are still raising offspring from a previous relationship and many are not interested in having more children.

Then there's polygamy, or "polyamory" to use the preferred term, where a minority of men controls sexual access to a larger number of women. If we compare the 1940-1949 and 1970-1979 cohorts of American adults, we find an increase in the median number of lifetime partners from 2.6 to 5.3 among women and from 6.7 to 8.8 among men (Liu et al., 2015). Because partner counts are more unevenly distributed among men than among women, young women are more likely to be sexually active than young men. This can be crudely seen in infection rates for chlamydia, the most common sexually transmitted disease. Hispanic Americans still show the traditional pattern of greater sexual activity among men than among women, the rates being 7.24% for men and 4.42% for women. White Americans display the reverse: 1.38% for men and 2.52% for women (Miller et al., 2004).

Finally, there’s a racial angle. This sex ratio is more skewed among White Americans than among African Americans, mainly because the latter have a lower sex ratio at birth and a higher death rate among young men.

It's hard to avoid concluding that a lot of young white men are shut out of the marriage market ... or any kind of heterosexual relationship. This wife shortage was once thought to be temporary: baby-boomer men were divorcing and marrying younger women from the smaller "baby bust" cohort, and with time they would become too old to compete with young men, at which point the problem would resolve itself.

Today, the crest of the baby boom is entering the seventh decade of life, yet the updated Interactive Singles Map shows no change in the gender imbalance. So what gives? It appears that demographers have focused too much on the baby-boomer effect and not enough on other factors that matter just as much and, more importantly, show no signs of going away. These factors can be summarized as follows.

Re-entry of older men into the mate market

We have a mate market where 20- to 50-year-old men are competing for 20- to 40-year-old women. That in itself is nothing new. But something else is.

The baby boom eclipsed an equally important but longer-term trend: more and more men are living past the age of 40. With or without the baby boom, we will still see large numbers of older men getting divorced and marrying younger women. The cause isn't just liberal divorce laws. It's also the fact that older men now make up a far larger share of the population.

Sure, we will also see younger men pairing up with "cougars" but there are limits to that option, as noted in a New Zealand study:

The male partner may want to partner up with someone younger or have children, which may not be possible with an older woman (for physical reasons or because she chooses not to have (more) children). The younger male partner may not want to become a step-father to existing children. Research has shown that childbearing can be the ultimate deal breaker in this kind of relationship. (Lawton and Callister, 2010)

Persistence of the imbalanced sex ratio at birth

About 105 males are born for every 100 females among people of European origin. This sex ratio used to fall to parity during childhood because of higher infant mortality among boys. It then declined even further in early adulthood because of war, industrial accidents, and other hazards. This isn't the distant past. Talk with women who came of age in the postwar era, and they will tell you about their fears of remaining single past the age of thirty. By that age, few single men were left to go around.

Well, things have changed. The skewed sex ratio at birth is now persisting well into adulthood, thanks to modern medicine and the relative peace that has prevailed since 1945. Women begin to outnumber men only in the 35-39 age group in the United States and in the 40-44 age group in the United Kingdom.

Equalization of male and female same-sex preference

Historically, same-sex preference was more common among men than among women. This gender gap appears to be closing, according to a recent study:

The percent distributions were quite similar for men and women; however, a higher percentage of men identified as gay (1.8%) compared with women who identified as gay/lesbian (1.4%), and a higher percentage of women identified as bisexual (0.9%) compared with men (0.4%). (CDCP, 2014, p. 5) 

Disparities in outmarriage

At present, there are more White American women outmarrying than White American men, particularly in younger age groups. This disparity is mainly in marriages with African American men, there being no gender difference in marriages with Hispanic Americans and the reverse gender difference in marriages with Asian Americans (Jacobs and Labov, 2002; Passel et al., 2010). Overall, this factor further skews the ratio of young single men to young single women in the White American community. 

This disparity isn't new. What is new is its extent, for both legal and common-law marriages. Some idea of that extent may be gleaned from statistics on children born to White American women, specifically the proportion fathered by a non-White partner. For the U.S. as a whole, the proportion in 2013 was between 11% and 20% (the uncertainty is due to 190,000 births for which the father's race was not stated). By comparison, the proportion in 1990 was between 5% and 13% (Centers for Disease Control and Prevention, 2013; see also Silviosilver, 2015).
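The 11%-20% range above is simply interval arithmetic on the incomplete records: the lower bound assumes every father of unstated race was White, the upper bound assumes every one was non-White. A minimal sketch of that calculation (the input counts below are illustrative round numbers, not the actual CDC tallies):

```python
def paternity_share_bounds(nonwhite_father, father_unstated, total_births):
    """Bound the share of births with a non-White father when some
    records omit the father's race.

    Lower bound: assume all unstated fathers were White.
    Upper bound: assume all unstated fathers were non-White.
    """
    low = nonwhite_father / total_births
    high = (nonwhite_father + father_unstated) / total_births
    return low, high

# Illustrative figures only, not the actual 2013 counts:
low, high = paternity_share_bounds(nonwhite_father=235_000,
                                   father_unstated=190_000,
                                   total_births=2_130_000)
print(f"{low:.0%}-{high:.0%}")  # prints "11%-20%"
```

The width of the interval is just the unstated records divided by total births, which is why the range narrows as reporting improves.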

Whenever this issue comes up for discussion, there are often reassurances that the disparity will disappear in a post-racial world that has been cleansed of "White privilege." I'm not so sure. The European female phenotype seems to be very popular, and this was so even when white folks were geopolitical weaklings. Today, the term “white slavery” is merely a synonym for prostitution, but it originally meant the enslavement of fair-skinned women for sale to clients in North Africa, the Middle East, and South Asia.  At the height of this trade, between 1500 and 1650, over 10,000 Eastern Europeans were enslaved each year for export (Kolodziejczyk, 2006; Skirda, 2010). The overwhelming majority were young women and pre-pubertal boys who were valued for their physical appearance. And yet they were powerless.

No, I don't think this kind of preference will disappear as whites lose "privilege."

Exit strategies

So more and more young men are being left on the shelf, particularly in White America. How do they cope? Mostly by turning to porn from Internet websites, videocassettes, or magazines. Love dolls are another option and may grow in popularity as they become more human-like, not only physically but also in their ability to talk and interact.

Another option is outmarriage. In the past, this trend largely involved older men marrying East Asian or Hispanic women, but plenty of young men now outmarry via Internet dating sites. Despite the surplus of single women in the African American community close to home, the much stronger tendency is to look abroad, generally to women in Eastern Europe, South America, or East Asia.

Then there's gender reassignment, which means either entering the other side of the mate market or tapping into the lesbian market. It's a viable strategy, all the more so because many white boys can be turned into hot trans women. I'm not saying that young men consciously think along those lines, but gender reassignment is functioning that way.

Finally, there's "game." My attitude toward game is like my attitude toward gender reassignment. Both are attempts to push the envelope of phenotypic plasticity beyond its usual limits, and neither can fully achieve the desired result. A lot of boys aren't wired for game, and there are good reasons why, just as there are good reasons why some people are born male. Male shyness isn't a pathology. It's an adaptation to a social environment that values monogamy and high paternal investment while stigmatizing sexual adventurism. Our war on male shyness reflects our perverse desire to create a society of Don Juans and single mothers.

But if game works, why not? Whatever floats your boat.

Conclusion

Ideally, this gender imbalance should be dealt with at the societal level, but I see little chance of that happening in the near future. If anything, public policy decisions will probably worsen the current imbalance. Changes to public policy generally result from a long process that begins when people speak up and articulate their concerns, yet it's unlikely that even this first step will be taken any time soon. Young single men prefer to remain silent and invent nonexistent girlfriends. They also tend to be marginal in the main areas of discourse creation, like print and online journalism, TV, film, and radio production, book writing, etc. Leaf through any magazine, and you'll probably see more stuff about the problems of single women.

So this imbalance will likely continue to be addressed at the individual level through individual strategies.

References 

Centers for Disease Control and Prevention. (2014). Sexual Orientation in the 2013 National Health Interview Survey: A Quality Assessment, Vital and Health Statistics, 2(169), December
http://www.cdc.gov/nchs/data/series/sr_02/sr02_169.pdf 

Centers for Disease Control and Prevention. (2013). Vital Statistics Online
http://www.cdc.gov/nchs/data_access/Vitalstatsonline.htm  (for discussion, see Silviosilver, 2015 http://www.unz.com/pfrost/the-last-push-back-against-liberalism/#comment-896920) 

Jacobs, J.A. and T.B. Labov. (2002). Gender differentials in intermarriage among sixteen race and ethnic groups, Sociological Forum, 17, 621-646.
http://link.springer.com/article/10.1023/A:1021029507937 

Kolodziejczyk, D. (2006). Slave hunting and slave redemption as a business enterprise: The northern Black Sea region in the sixteenth to seventeenth centuries, Oriente Moderno, 86, 1, The Ottomans and Trade, pp. 149-159.
http://www.jstor.org/discover/10.2307/25818051?sid=21105312761261&uid=3737720&uid=3739448&uid=2&uid=4 

Lawton, Z. and P. Callister. (2010). Older Women-Younger Men Relationships: the Social Phenomenon of 'Cougars'. A Research Note, Institute of Policy Studies Working Paper 10/02
http://ips.ac.nz/publications/files/be0acfcb7d0.pdf 

Liu, G., S. Hariri, H. Bradley, S.L. Gottlieb, J.S. Leichliter, and L.E. Markowitz. (2015). Trends and patterns of sexual behaviors among adolescents and adults aged 14 to 59 years, United States, Sexually Transmitted Diseases, 42, 20-26.
http://journals.lww.com/stdjournal/Abstract/2015/01000/Trends_and_Patterns_of_Sexual_Behaviors_Among.6.aspx 

Miller, W.C., C.A. Ford, M. Morris, M.S. Handcock, J.L. Schmitz, M.M. Hobbs, M.S. Cohen, K.M. Harris, and J.R. Udry. (2004). Prevalence of chlamydial and gonococcal infections among young adults in the United States, JAMA, 291, 2229-2236.
http://jama.jamanetwork.com/article.aspx?articleid=198722

Passel, J.S., W. Wang, and P. Taylor. (2010). One-in-seven new U.S. marriages is interracial or interethnic, Pew Research Center, Social & Demographic Trends,
http://www.pewsocialtrends.org/2010/06/04/ii-overview-2/

Skirda, A. (2010). La traite des Slaves. L'esclavage des Blancs du VIIIe au XVIIIe siècle, Paris, Les Éditions de Paris Max Chaleil. 

Soma, J. (2013). Interactive Singles Map
http://jonathansoma.com/singles/
 

Saturday, June 20, 2015

Gender reassignment of children. Does it really help?


"Flower boy" (on the right) - In 70-80% of cases, gender confusion will clear up on its own (Wikicommons: Recoplado).

 

I remember feeling some attraction to girls in Grade 2, but it really wasn't until Grade 8 that everything fell into place. I'm talking about puberty. Before high school, I was a boy and not a young man.

I didn't consider myself abnormal. Yes, many boys in Grade 8 had deeper voices, as well as signs of facial hair, but just as many did not, and a few would not have been "sexually functional." As for the earlier grades, certainly before Grade 7, most of us could have passed for little girls—just change the clothing, the hairstyle, and voilà!

Today, puberty is starting earlier. Ontario schools will begin explaining it in ... Grade 4. This falling age is largely due to the changing ethnic and racial origins of the student population, as well as things like overeating (in the case of girls) and perhaps our more sexualized culture.

Nonetheless, a lot of boys remain pre-pubertal throughout most of primary school, and some may have trouble coming to terms with their male identity. They experience what is called “gender confusion.” This is hardly surprising. Testosterone levels are low before puberty, and some boys, especially the ones who have been less androgenized in the womb, may genuinely feel like a girl. I also suspect that modern culture makes things worse by creating expectations that even adult males have trouble meeting. Go to any fitness center and you'll see plenty of young men trying to bring their bodies into line with the "rippled look."

Gender confusion, known medically as gender identity disorder, affects children of both sexes, but boys much more so, at least in North America. One clinic reported a ratio of 6.6 boys for each girl, the imbalance being attributed partly to greater intolerance of feminine behavior in boys (Zucker et al., 1997). The disorder seems to be partly heritable, although the twin data raise the same problem of perspective as the referral statistics (Heylens et al., 2012). To what degree does the heritable component reside in how these children objectively behave, rather than in whether their behavior alarms another person, usually a parent? In practice, it's the latter: it's whatever behavior makes a parent bring the child to a clinician's office.

Gender reassignment

We now come to the issue of medical treatment, specifically "gender reassignment." This treatment has recently been condemned by Dr. Paul McHugh, the former psychiatrist-in-chief for Johns Hopkins Hospital:

Then there is the subgroup of very young, often prepubescent children who notice distinct sex roles in the culture and, exploring how they fit in, begin imitating the opposite sex. Misguided doctors at medical centers including Boston's Children's Hospital have begun trying to treat this behavior by administering puberty-delaying hormones to render later sex-change surgeries less onerous—even though the drugs stunt the children's growth and risk causing sterility. (McHugh, 2015)

Is treatment really necessary? McHugh points out: "When children who reported transgender feelings were tracked without medical or surgical treatment at both Vanderbilt University and London's Portman Clinic, 70%-80% of them spontaneously lost those feelings."

McHugh has been accused by the transgender community of misrepresenting the facts:

McHugh also mischaracterizes the treatment of gender nonconforming children. As McHugh states, most gender nonconforming children do not identify as transgender in adulthood.  However, those who receive puberty blocking drugs do not do so until puberty, when trans identity is likely to persist. These drugs allow adolescents and their parents to work with doctors to achieve the best outcome. This approach was demonstrated to be successful in research in the Netherlands before being adopted widely in the U.S. (WPATH, 2015) 

The above text is disingenuous in two ways. First, puberty-blocking drugs are not administered until puberty for an obvious reason: they would be ineffective earlier. The decision to use them, however, is made earlier, often much earlier. Second, these drugs keep hormone levels from rising, thus maintaining the boy or girl in the same hormonal state and possibly in the same state of gender confusion. Logically, one should wait a few years to see what effect puberty might have.

Is the use of these drugs legitimate? We’re talking about a radical intervention in the normal process of maturation, and this intervention begins before the age of consent, i.e., 16 years of age in most Western countries. Moreover, the eventual gender reassignment will never be complete. Although it’s possible to turn a male into a semblance of a female, such a “female” can never bear children. This isn’t a minor point, given that many male transsexuals wish to maintain a male heterosexual orientation, even to the point of marrying and becoming fathers.

For all these reasons, use of these drugs should be delayed until adulthood, when consent becomes morally defensible, when the risks of sterility are lower, and when the gender confusion may have proven transitory.

A boy is not a little man

The transgender community talks a good game about "gender fluidity." Ironically, such fluidity is reduced by gender reassignment, which imposes a relatively unchanging adult dichotomy on pre-pubertal individuals who are going through rapid physical and psychological change. This brings us to a second irony. The transgender community complains about how it was once medically pathologized. Yet here it is pathologizing cases of gender confusion that are not unusual among young children and that are consistent with normal child development.

We should remember that both sexes begin with a body plan that is more female than male. This plan is modified at two points of the life cycle: first, in the womb, when the body’s tissues are primed by a surge of androgens or estrogens; and then at puberty, when boys and girls diverge in the levels of their circulating sex hormones, which in turn trigger profound changes in growth and development.

This truth was known to our ancestors. As late as the early 20th century, people accepted that little boys are more akin to little girls than to grown men. This was why both sexes would be dressed in female clothing until school age, and a mother would often boast that her little boy was as pretty as a girl.

[…] infants and small children had for hundreds of years been dressed alike, in frocks, so that family portraits from previous centuries made it difficult to tell the young boys from the girls. “Breeching,” as a rite of passage, was a sartorial definition of maleness and incipient adulthood, as, in later periods, was the all-important move from short pants to long. Gender differentiation grew increasingly desirable to parents as time went on. By the closing years of the twentieth century the sight of little boys in frilly dresses has become unusual and somewhat risible; a childhood photograph of macho author Ernest Hemingway, aged almost two, in a white dress and large hat festooned with flowers, was itself the focus of much amused critical commentary when reproduced in a best-selling biography—especially when it was disclosed that Hemingway’s mother had labelled the photograph of her son “summer girl.”  (Garber, 1997, pp. 1-2)

Hemingway hated those baby pictures, as well as the stories about how his mother would call him “Ernestine” and tell strangers that he and his sister were twin girls. During her declining years, he threatened to cut off his financial support if she ever gave an interview about his childhood (Onion, 2013; Winer, 2008). He saw her as the typical Victorian mother who sought to momify and symbolically castrate her male offspring. With other writers of his time, particularly psychologists and advice columnists, he helped bring about a reform of sexual conventions that, among other things, would sweep away the custom of cross-dressing little boys.

(See here for an early childhood photo of Hemingway and here for similar photos of H.P. Lovecraft.)

I remember how I felt seeing such photos when doing research on my family tree. What the?? Today, I feel differently: this cross-dressing strikes me as being healthy, even beautiful in its own way. It avoids the problem of imposing male identity too early in life and thereby forcing slower-developing boys to choose between the identity imposed by society and the one generated by their own mental state—which may still be insufficiently male. It is this situation, and the resulting gender confusion, that is now putting many boys at risk of gender reassignment. Yet there’s nothing wrong with most of them. They just need more time to grow up.

As an extreme example, let’s take the case of "pseudohermaphrodites"—males who look female at birth because their penis resembles a clitoris and because their testes remain inside the body. They are typically raised as girls until puberty, at which time the penis grows in size, the testes descend into the scrotum, and they become like men physically and psychologically. When 18 pseudohermaphrodites were studied in the Dominican Republic, it was found that 16 of them had made the transition from girlhood to manhood with no evidence of psychosexual maladjustment (Imperato-Mcginley et al., 1979). A similar situation often arose among Canada’s Inuit whenever a newborn received the name of a deceased relative. If the child was a boy and the relative a woman, it would be raised as a girl until puberty and as a man thereafter. Such individuals became not only husbands and fathers but also respected shamans (Saladin d'Anglure, 2005).

In short, gender confusion in childhood poses no threat to normal child development. Indeed, whether we acknowledge it or not, all boys start off being more like little girls than the men they will become. This “early girlhood” may actually play a key role in their psychosexual development, and our ancestors might have had good reasons to believe that boyhood begins later. But that raises a troubling question: by trying to masculinize this early phase of life, have we opened the door to unknown consequences?

So if you have a young boy who’s confused about his gender identity, the chances are very good that he’ll successfully transition to manhood ... as long as he’s not given puberty-blocking drugs. This is not a medical condition that needs treatment.

References 

Garber, M.B. (1997). Vested Interests: Cross-Dressing and Cultural Anxiety, Psychology Press.
https://books.google.ca/books?id=eeASHasS0oUC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false 

Heylens, G., G. De Cuypere, K.J. Zucker, C. Schelfaut, E.Elaut, H. Vanden Bossche, E. De Baere, and G. T’Sjoen. (2012). Gender identity disorder in twins: A review of the case report literature, The Journal of Sexual Medicine, 9, 751-757.
http://onlinelibrary.wiley.com/doi/10.1111/j.1743-6109.2011.02567.x/abstract 

Imperato-Mcginley, J., R.E. Petersen, T. Gautier, and E. Sturia. (1979). Male pseudohermaphroditism secondary to 5a-reductase deficiency—A model for the role of androgens in both the development of the male phenotype and the evolution of a male gender identity, Journal of Steroid Biochemistry, 11, 637-645.
http://www.sciencedirect.com/science/article/pii/0022473179900931 

McHugh, P. (2015). Transgender surgery isn't the solution, The Wall Street Journal, June 12
http://www.wsj.com/articles/paul-mchugh-transgender-surgery-isnt-the-solution-1402615120 

Onion, R. (2013). Pages from Hemingway’s baby books, Slate, July 23
http://www.slate.com/blogs/the_vault/2013/07/23/hemingway_scrapbooks_grace_hemingway_s_records_of_son_ernest_hemingway_s.html  

Saladin d'Anglure, B. (2005). The 'Third Gender' of the Inuit, Diogenes, 52, 134-144.
http://dio.sagepub.com/content/52/4/134.short 

Winer, A. (2008). Why Hemingway used to wear women’s clothing, Mental_floss, December 18
http://mentalfloss.com/article/20396/why-hemingway-used-wear-womens-clothing 

WPATH (2015). Wall Street Journal Editorial Critiques Transgender Health July 2, 2014
http://www.wpath.org/site_page.cfm?pk_association_webpage_menu=1635&pk_association_webpage=4905 

Zucker, K.J., S.J. Bradley, and M. Sanikhani. (1997). Sex differences in referral rates of children with gender identity disorder: some hypotheses, Journal of Abnormal Child Psychology, 25, 217-227.
http://link.springer.com/article/10.1023/A:1025748032640#page-1

Saturday, June 13, 2015

Feeling the other's pain


 
In the Reign of Terror, by Jessie Macgregor (1891). We don’t respond equally to signs of emotional distress in other people (Wikicommons)



We like to think that all people feel empathy to the same degree. In reality, it varies a lot from one person to the next, like most mental traits. We are half-aware of this when we distinguish between "normal people" and "psychopaths," the latter having an abnormally low capacity for empathy. The distinction is arbitrary, like the one between "tall" and "short." As with stature, empathy varies continuously among the individuals of a population, with psychopaths being the ones we find beyond an arbitrary cut-off point and who probably have many other things wrong with them. By focusing on the normal/abnormal dichotomy, we lose sight of the variation that occurs among so-called normal individuals. We probably meet people every day who have a low capacity for empathy and who nonetheless look and act normal. Because they seem normal, we assume they are as empathetic as we are. They aren’t.

Like most mental traits, empathy is heritable, its heritability being estimated at 68% (Chakrabarti and Baron-Cohen, 2013). It has two distinct components: cognitive empathy and affective empathy. Some researchers identify a third component, pro-social behavior, but its relationship to the other two seems tangential.

Cognitive empathy appears to be the evolutionarily older component of the two. It is the capacity to understand how another person is feeling and then predict how different actions will affect that person’s emotional state. But this capacity can be used for selfish purposes. Examples are legion: the con artist; many telemarketers; the rapist who knows how to charm his victims ...

Affective empathy is the younger component, having developed out of cognitive empathy. It is the capacity not just to understand another person's emotional state but also to identify with it. A person with high affective empathy will try to help someone in distress not because such help is personally advantageous or legally required, but because he or she is actually feeling the same distress.

Affective empathy may have initially evolved as a means to facilitate relations between a mother and her children. Later, and to varying degrees, it became extended to other human relationships. This evolutionary trajectory is perceptible in young children:

Children do not display empathic concern toward all people equally. Instead, they show bias toward individuals and members of groups with which they identify. For instance, young children of 2 years of age display more concern-related behaviors toward their mother than toward unfamiliar people. Moreover, children (aged 3-9 years) view social categories as marking patterns of interpersonal obligations. They view people as responsible only to their own group members, and consider within-group harm as wrong regardless of explicit rules, but they view the wrongness of between-group harm as contingent on the presence of such rules. (Decety and Cowell, 2014)

Similarly, MRI studies show that adults are much more likely to experience emotional distress when they see loved ones in pain than when they see strangers in pain. A stranger in distress will evoke a response only to the degree that the observer has a high capacity for affective empathy. The higher the capacity the more it will encompass not only loved ones but also less related individuals, including total strangers and nonhumans:

Humans can feel empathic concern for a wide range of 'others', including for nonhuman animals, such as pets (in the Western culture) or tamagotchi (in Japan). This is especially the case when signs of vulnerability and need are noticeable. In support of this, neural regions involved in perceiving the distress of other humans, such as the anterior cingulate cortex and insula, are similarly activated when witnessing the distress of domesticated animals (Decety and Cowell, 2014)

While we associate affective empathy with morality, the two are not the same, and there are situations where they come into conflict. In most societies, kinship is the main organizing principle of social relations, and morality affirms this principle by spelling out the duties to one's parents, one's kin, and one's ethny. The importance of kinship may be seen in the Ten Commandments, which we wrongly assume to be universal in application. We are told we must not kill, steal, lie, or commit adultery if the victims are "thy neighbor," which is explained as meaning "the children of thy people" (Leviticus 19:18). High-empathy individuals may thus subvert morality if they view all human distress as being equal in value. At best, they will neglect loved ones in order to help an indefinitely large number of needy strangers. At worst, strangers may develop strategies to exploit high-empathy individuals, i.e., to milk them for all they are worth.

Mapping empathy in the human brain

Empathy appears to arise from specific mechanisms in the brain, and not from a more general property, like general intelligence. It is produced by a sequence of mental events, beginning with "mirror neurons" that fire in tandem with the observed behavior of another person, thereby generating a mental model of this behavior. Copies of the model are sent elsewhere in the brain to decode the nature and purpose of the behavior and to predict the sensory consequences for the observed person. Affective empathy goes further by feeding these predicted consequences into the observer's emotional state (Carr et al., 2003).

Recent MRI research has confirmed that empathy is associated with increased development of certain regions within the brain. Individuals who score high on cognitive empathy have denser gray matter in the midcingulate cortex and the adjacent dorsomedial prefrontal cortex, whereas individuals who score high on affective empathy have denser gray matter in the insula cortex (Eres et al., 2015). A high capacity for affective empathy is also associated with a larger amygdala, which seems to control the way we respond to facial expressions of fear and other signs of emotional distress (Marsh et al., 2014).

Can these brain regions be used to measure our capacity for affective empathy? Two studies, one American and one English, have found that "conservatives" tend to have a larger right amygdala (Kanai et al., 2011; Schreiber et al., 2013). This has been spun, perhaps predictably, as proof that the political right is fear-driven (Hibbing et al., 2014). A likelier explanation is that "conservatives" are disproportionately drawn from populations that have, on average, a higher capacity for affective empathy.

Do human populations vary in their capacity for affective empathy?

Is it possible, then, that this capacity varies among human populations, just as it varies among individuals? I have argued that affective empathy is more adaptive in larger, more complex societies where kinship obligations can no longer restrain behavior that seriously interferes with the ability of individuals to live together peacefully and constructively (Frost, 2015). Whereas affective empathy was originally expressed mainly between a mother and her children, it has become progressively extended in some populations to a wider range of interactions. This evolutionary change may be compared to the capacity to digest milk sugar: initially, this capacity was limited to early childhood, but in dairy cattle cultures it has become extended into adulthood.

I have also argued that this evolutionary change has gone the farthest in Europeans north and west of the Hajnal Line (Frost, 2014a). In these populations, kinship has been a weaker force in organizing social relations, at least since the early Middle Ages and perhaps since prehistoric times. There has thus been selection for mechanisms, like affective empathy, that can regulate social interaction between unrelated individuals. This selection may have intensified during two time periods:

- An initial period corresponding to the emergence of complex hunter/fisher/gatherers during the Mesolithic along the shores of the North Sea and the Baltic. Unlike other hunter-gatherers, who typically lived in small bands, these people were able to form large coastal communities by exploiting abundant marine resources. Such communities were beset, however, by the problem of enforcing rule compliance among unrelated people, the result being strong selection for rule-compliant individuals who share certain predispositions, namely affective empathy, proneness to guilt, and willingness to obey moral rules and to expel anyone who does not (Frost, 2013a; Frost, 2013b).

- A second period corresponding to the spread of Christianity among Northwest Europeans, particularly with the outbreeding, population growth, and increase in manorialism that followed the Dark Ages (hbd chick, 2014). The result was a "fruitful encounter" between the two: on the one hand, Christianity, with its emphasis on internalized morality, struck a responsive chord in these populations; on the other hand, the latter modified Christianity, increasing its emphasis on faith, compassion, and original sin (Frost, 2014b).

Conclusion 

Recent research has brought much insight into the nature of empathy, which should no longer be viewed as being simply a noble precept. We now understand it as the outcome of a sequence of events in specific regions of the brain. We have also learned that individuals vary in their capacity for empathy and that most of this variability is heritable, as is the case with most mental traits. Moreover, empathy has two components—cognitive and affective—and the strength of one in relation to the other likewise varies. Although we often consider affective empathy to be desirable, it can have perverse and even pathological effects in some contexts.


References

Carr, L., M. Iacoboni, M-C. Dubeau, J.C. Mazziotta, and G.L. Lenzi. (2003). Neural mechanisms of empathy in humans: A relay from neural systems for imitation to limbic areas, Proceedings of the National Academy of Sciences (USA), 100, 5497-5502.
http://www.ucp.pt/site/resources/documents/ICS/GNC/ArtigosGNC/AlexandreCastroCaldas/7_CaIaDuMaLe03.pdf

Chakrabarti, B. and S. Baron-Cohen. (2013). Understanding the genetics of empathy and the autistic spectrum, in S. Baron-Cohen, H. Tager-Flusberg, and M. Lombardo (eds). Understanding Other Minds: Perspectives from Developmental Social Neuroscience, Oxford: Oxford University Press.
http://books.google.ca/books?hl=fr&lr=&id=eTdLAAAAQBAJ&oi=fnd&pg=PA326&ots=fHpygaxaMQ&sig=_sJsVgdoe0hc-fFbzaW3GMEslZU#v=onepage&q&f=false

Decety, J. and J. Cowell. (2014). The complex relation between morality and empathy, Trends in Cognitive Sciences, 18, 337-339
http://spihub.org/site/resource_files/publications/spi_wp_135_decety.pdf 

Eres, R., J. Decety, W.R. Louis, and P. Molenberghs. (2015). Individual differences in local gray matter density are associated with differences in affective and cognitive empathy, NeuroImage, 117, 305-310.
http://www.sciencedirect.com/science/article/pii/S1053811915004206 

Frost, P. (2013a). The origins of Northwest European guilt culture, Evo and Proud, December 7
http://evoandproud.blogspot.ca/2013/12/the-origins-of-northwest-european-guilt.html 

Frost, P. (2013b). Origins of Northwest European guilt culture, Part II, Evo and Proud, December 14
http://evoandproud.blogspot.ca/2013/12/origins-of-northwest-european-guilt.html 

Frost, P. (2014a). Compliance with Moral Norms: a Partly Heritable Trait? Evo and Proud, April 12
http://evoandproud.blogspot.ca/2014/04/compliance-with-moral-norms-partly.html

Frost, P. (2014b). A fruitful encounter, Evo and Proud, September 26
http://evoandproud.blogspot.ca/2014/09/a-fruitful-encounter.html 

Frost, P. (2015). Two paths, The Unz Review, January 24
http://www.unz.com/pfrost/two-paths/ 

hbd chick (2014). Medieval manorialism’s selection pressures, hbd chick, November 19
https://hbdchick.wordpress.com/2014/11/19/medieval-manorialisms-selection-pressures/ 

Hibbing, J.R., K.B. Smith, and J.R. Alford. (2014). Differences in negativity bias underlie variations in political ideology, Behavioral and Brain Sciences, 37, 297-350
http://www.geoffreywetherell.com/Hibbing%20et%20al%20paper%20and%20commentaries%20(1).pdf 

Kanai, R., T. Feilden, C. Firth, and G. Rees. (2011). Political orientations are correlated with brain structure in young adults, Current Biology, 21, 677 - 680.
http://www.cell.com/current-biology/abstract/S0960-9822(11)00289-2 

Keysers, C. and V. Gazzola. (2014). Dissociating the ability and propensity for empathy, Trends in Cognitive Sciences, 18, 163-166.
http://www.cell.com/trends/cognitive-sciences/pdf/S1364-6613(13)00296-9.pdf 

Marsh, A.A., S.A. Stoycos, K.M. Brethel-Haurwitz, P. Robinson, J.W. VanMeter, and E.M. Cardinale. (2014). Neural and cognitive characteristics of extraordinary altruists, Proceedings of the National Academy of Sciences, 111, 15036-15041.
http://www.pnas.org/content/111/42/15036.short 

Schreiber, D., G. Fonzo, A.N. Simmons, C.T. Dawes, T. Flagan, et al. (2013). Red brain, blue brain: Evaluative processes differ in Democrats and Republicans, PLoS ONE, 8(2), e52970.
http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0052970  

Saturday, June 6, 2015

Imagining the future, imagining death

On Star Trek, African Americans were underrepresented among guest actors; guest roles went just as often to part-Asian actresses like France Nuyen (Wikicommons)

Only six years separate the production of Logan's Run (1976) from that of Blade Runner (1982), yet those intervening years form a watershed in how science fiction imagined the future. The first movie depicts the year 2274. The setting is futuristic, and the people so beautiful that one significant detail may go unnoticed. Eventually, the penny drops—everyone is white! The future looks very different in the second movie. We’re only in the year 2019, and whites are already a minority in Los Angeles; indeed, if we exclude the replicants, there don't seem to be many left.

This change in our imagined future is especially noticeable if we compare pre-1980 movies with post-1980 remakes. In The Time Machine (1960), the future is inhabited by two races: the Eloi and the Morlocks. Both are descended from present-day humans, but only the Eloi still look human. Not only that, they have fair skin and blonde hair. It's the year 802701, and those folks are still around! The Eloi look a lot different in the 2002 remake: they are now a dark-skinned people of mixed Afro-Asian descent, in contrast to the pale Morlocks. This physical difference is absent from the original film and the book itself, which repeatedly describes the Eloi as fair-skinned: "[I was] surrounded by an eddying mass of bright, soft-colored robes and shining white limbs" (Wells, 1898, p. 24); "I would watch for [Weena’s] tiny figure of white and gold" (Wells, 1898, p.41); "I looked at little Weena sleeping beside me, her face white and starlike under the stars" (Wells, 1898, p. 57). In the remake, the only people who look approximately white are the Über-Morlocks ... and they feed on human flesh. A fair-skinned viewer would be torn between two conflicting responses: a desire to identify with the Über-Morlocks as People Who Look Like Me and a desire to hate them as morally worthless. This situation is almost the reverse of the original story line: the Time Traveller is misled by the familiar appearance of the Eloi and develops affection for them, even love, only to realize that they are as different from him as the hideous Morlocks.

Even before 1980, we see some awareness in sci-fi that whites would, one day, no longer have societies of their own. Star Trek (1966-1969) led the way in this direction; nonetheless, the ship's crew looks overwhelmingly white, partly because the American population was still overwhelmingly white during those years and partly because of the small pool of African American actors. Very few of the latter appear in guest roles, which went just as often to part-Asian actresses like France Nuyen, born to a Vietnamese father and a Roma mother (Elaan of Troyius), or Barbara Luna, of mixed Filipino and European descent (Mirror, Mirror). This was the 1960s, when antiracism was still taking shape and seems to have been driven partly by a desire to see exotic-looking women.

All the same, those years saw a general tendency to raise the visibility of African Americans on both the big screen and the little screen. Sci-fi was no exception, particularly by the 1980s. In the Alien series (1979, 1986, 1992), the casts are multiracial, although whites still predominate. Just as significantly, the taboo against a non-white killing a white is broken, albeit in a seemingly acceptable way:

In Alien itself the representative of the company is an android named Ash (something white) - he is a white man who is not human. This is revealed when an African-American crew member pulls off Ash's head: the black man reveals the nothingness of the white man and destroys him by depriving him of his brain, the site of his spirit. The crew bring this severed head back to temporary electronic life to find out how the alien can be destroyed. He tells them that it is indestructible and one of the crew realizes that he admires it. 'I admire its purity', he says, adding in a cut to an extreme, intensifying close-up, 'unclouded by conscience, remorse or delusions of morality.' Purity and absence of affect, the essence of the aspiration of whiteness, said in a state of half-life by a white man who has never really been alive anyway. (Dyer, 2000)

It is really only with Blade Runner (1982) that popular culture began to acknowledge the imminence of white demise. We think of the 1980s as the Reagan Era, a time when White America pushed back after a long retreat during the previous two decades. In reality, the retreat picked up speed. The endgame was already apparent to anyone who gave it much thought, like Blade Runner's scriptwriters. Thus, in the year 2019, we see whites inhabiting a world that is no longer theirs, with some like Sebastian living alone within the decaying shell of their past—the grand but neglected building where most of the action takes place. The least pathetic white is Rachael, a replicant. She also seems the least WASP-looking with her dark hair and her family photos, which suggest a southern European, Armenian, or Jewish origin. The photos themselves are a lie—like the loner Deckard she has no real collective identity, but she does have an imagined one.

We now come to a common theme of love stories: how a fallen man is redeemed by the love of a woman. Here, the fallen man is Deckard—a remnant of a White America in terminal decline. The woman is Rachael, who wants to give him a future of love, marriage, and family, even though this prospect is no more viable than her own imaginary past.

Rachael offers the possibility of developing true emotions [...] The two dark 'whites' [Rachael and Gaff] offer something definite, real, physical to the nothingness of the indifferently fair white man. In the first version, Deckard and Rachael escape, the film ending with a lyrical (if naff) flight away from Los Angeles and perhaps Earth: the dark woman's discovery of true feeling (she weeps) redeems the fair - truly white - man's emptiness. This ending is absent from the 'director's cut'; the dark woman cannot redeem the fair man (Dyer, 2000)

Blade Runner is a film noir with no happy ending in the traditional sense. Even if the two of them did escape to build a life together, it's hard to see how this new life could evolve into anything more than two deracinated individuals with no past and no clear future. Can Rachael have children? Doubtful. It's also doubtful whether Deckard would want to settle down and become a family man. What would he do to support a family? Go back to hunting replicants?

The film does not address these questions. Nor should it. Whether you are for or against, white demise is something to be addressed collectively, and not at the level of individuals. This point is made in the writings of Richard Dyer and other postmodernists who welcome a future of collective death and feel that whites should come to terms with it:

Whites often seem to have a special relation with death, to yearn for it but also to bring it to others. [...] I have been wary of dwelling on the fearfulness - sometimes horrible, sometimes bleak - of the white association with death. To do so risks making whites look tragic and sad and thus comes perilously close to a 'me-too', 'we're oppressed', 'poor us' position that seems to equalise suffering, to ignore that active role of whites in promulgating inequality and suffering. It could easily be taken as giving us a let-out from acknowledging the privilege and effortless power of even the most lowly of those designated as white. Yet, if the white association with death is the logical outcome of the way in which whites have had power, then perhaps recognition of our deathliness may be the one thing that will make us relinquish it.

This sounds ominous. It strangely resembles what some people wrote in the 19th century about the disappearing American Indian and the disappearing Australian Aborigines. It was all for the best, some argued. As "savages" declined in numbers and disappeared, their lands would be resettled and better societies created. Today, whites are being seen in this light. Their departure from existence will purportedly bring an end to inequality and suffering, thus making the world a better place.

So goes the narrative, and few seem to be challenging it, no matter how outrageous it becomes.

Conclusion 

Imagined reality often foretells the real thing—not because the imaginers have a special knack for prediction, but because they end up playing an active role in shaping the future. The death of White America was already being imagined over three decades ago by people who, ultimately, had become reconciled to that fate and even looked forward to it. Moreover, this endgame seems to have struck a responsive chord among the public. As Dyer (2000) argues, "the death of whiteness is, as far as white identity goes, the cultural dominant of our times, that we really do feel we're played out."
 

References

Dyer, R. (2000). Whites are nothing: Whiteness, representation and death, in I. Santaolalla (ed.). "New" Exoticisms: Changing Patterns in the Construction of Otherness, (pp. 135-155), Rodopi.
https://books.google.ca/books?id=ew0q5AxMfkEC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false 

King, C.R. and D.J. Leonard. (2004). Is neo white? Reading race, watching the trilogy, in M. Kapell and W.G. Doty (eds). Jacking in to the Matrix Franchise: Cultural Reception and Interpretation, (pp. 32-46), A&C Black.
https://books.google.ca/books?id=ETf0her6UDgC&printsec=frontcover&hl=fr&source=gbs_ge_summary_r&cad=0#v=onepage&q&f=false 

Wells, H.G. (1898). The Time Machine, online edition
http://www.literaturepage.com/read/thetimemachine.html