Tableau III, Piet Mondrian (1872-1944). The polygenic score can provide a measure of innate cognitive ability in various human populations. However, it is less valid for African Americans, apparently because of differences in the genetic architecture of cognition.
When IQ is measured in
European Americans and African Americans, the two groups differ on average by
about 15 points. Is the difference genetic? Or is it due to different
environments?
After years of debate, we are coming
close to an answer. The weight of evidence is shifting, especially because of
two unrelated developments:
- We can now easily measure
ethnic ancestry by means of genetic data. Previously, we had to rely on
self-report or indirect measures like skin color.
- We can now measure the genetic component of cognitive ability: the polygenic score. This score is a weighted sum of the alleles a person carries that are associated with high educational attainment (a minimal sketch follows this list). Initially a crude measure, it is becoming more accurate as more of these alleles are identified.
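Here is that sketch: a toy polygenic score computed from three hypothetical variants. Everything in it is made up; real scores sum over thousands of variants, with weights estimated from a GWAS of educational attainment.

```python
import numpy as np

# Hypothetical per-allele weights for three education-associated variants.
# In practice these come from GWAS effect-size estimates.
effect_sizes = np.array([0.021, -0.013, 0.008])

# One person's genotype: the count of effect alleles (0, 1, or 2) at each variant.
genotype = np.array([2, 0, 1])

# The polygenic score is the weighted sum of effect-allele counts.
polygenic_score = np.dot(genotype, effect_sizes)
print(f"{polygenic_score:.3f}")  # 0.050
```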
Both research tools were used
in a recent study. Lasker et al. (2019) applied them to the
Philadelphia Neurodevelopmental Cohort, a sample of 9421 individuals from the
Philadelphia area who received medical care from the Children's Hospital of
Philadelphia network. They ranged in age from 8 to 21 with a mean of 14.2. They
were 51.7% female, 55.8% European American, 32.9% African American, and 11.4% other.
All of them were genotyped and given a series of cognitive tests.
This dataset had advantages
over those of previous studies:
- All participants came from
the same geographic area.
- Heritabilities of cognitive
ability were already estimated by another research team, specifically 0.61 for
the African American participants and 0.72 for the European American
participants.
- Skin, hair, and eye color could be estimated from the genetic data to control for the effects of "colorism" (discrimination favoring lighter-skinned over darker-skinned African Americans).
- Polygenic scores could be calculated from the genetic data.
The main disadvantage was the participants’
young age. Before adulthood the brain is plastic and still developing, so the
heritability of cognitive ability is lower.
IQ results
Mean IQ scores were 100 for European Americans, 98 for self-described biracial Americans, and 85 for African Americans. The three groups were, respectively, 99%, 80%, and 19% European by ancestry.
African Americans only
European admixture significantly
correlated with IQ among the African American participants. The correlation
remained significant after controlling for either skin color or socioeconomic
status. Interestingly, skin and hair color didn't significantly correlate with
IQ independently of European admixture, but eye color did. Brown eyes
correlated positively with IQ. The authors offered no explanation. Did they find the same association among European Americans?
Biracial Americans only - smarter than expected
As with African Americans,
skin color didn't seem to influence intelligence independently of European
admixture. On the other hand, "biracial status had a significant effect
independently of European ancestry." In other words, racially mixed
individuals who identified equally as African American and European American,
and not just as African American, tended to be more intelligent than their degree of European admixture alone would predict.
The term "biracial"
as a badge of identity is recent and seems to be most popular among middle-class
people:
Interestingly, many of the
respondents here who identify as biracial are middle class, educated in private
schools, and raised in predominantly white neighborhoods with mostly white
social networks (Rockquemore and Brunsma 2008, p. xxii)
It may be, then, that
self-identified "biracial" people have parents who are, on average,
of higher quality than other people of the same racial background.
African Americans, Biracial Americans, and European Americans combined
When all three groups were combined,
the most important factor was European admixture. Next came socioeconomic
status, which correlated with cognitive ability independently of European
admixture. Finally, self-identification as a European American had an effect
over and above that of European admixture. The last factor suggests that
European American culture has a positive influence on cognitive ability.
Polygenic score results
The polygenic scores ran into
a problem that others have noted: the genetic architecture of cognition seems
to be different in African Americans. This is a problem because researchers have
used only Europeans or European Americans to identify genetic variants that are
associated with high educational attainment. Those variants did correlate with
cognitive ability in the African American sample, but to a much lower degree
than in the European American sample. Their validity as a measure of cognitive
ability was only 20% of what it was in the European American sample.
The authors used a subset of
the same variants to create a polygenic score that would be less sensitive to
linkage disequilibrium decay and thus more valid across different human
populations. This polygenic score had good validity in both the African American and European American samples (r = 0.112 and r = 0.227, respectively).
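The "validity" being compared here is just the correlation between the polygenic score and measured cognitive ability, computed within each sample. A sketch with simulated data may make that concrete; the target correlations are the ones reported above, but the sample sizes and the data themselves are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_sample(n, true_r):
    """Simulate polygenic and cognitive scores with a target correlation."""
    pgs = rng.standard_normal(n)
    noise = rng.standard_normal(n)
    cognitive = true_r * pgs + np.sqrt(1 - true_r**2) * noise
    return pgs, cognitive

# Illustrative sample sizes; correlations taken from the study's figures.
for label, n, r in [("African American", 3000, 0.112),
                    ("European American", 5000, 0.227)]:
    pgs, cog = simulate_sample(n, r)
    print(f"{label}: r = {np.corrcoef(pgs, cog)[0, 1]:.3f}")
```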
The authors then tried to
create an even better polygenic score by excluding variants that are rare in
African Americans. There was no effect on the results for either African
Americans or European Americans. But what about the reverse? Perhaps cognitive
ability is improved by some variants that are rare in European Americans but
common in African Americans.
Conclusion
This is an excellent study, on
a par with the Minnesota Transracial Adoption Study (Frost 2019). The main
problem is the participants’ young age. Had adults been used, there would have been less noise in the data and the results would have been clearer.
Another problem is the
apparently different genetic architecture of cognition in people of sub-Saharan
African origin. Piffer (2019) has noted that polygenic scores underestimate
African American IQ. He disagrees, however, with the "different genetic
architecture" hypothesis, pointing out that no divergence exists between
mean population IQ and the polygenic scores he has calculated for various
sub-Saharan African groups (Esan, Gambians, Luhya, Mende, Yoruba). None of them,
however, are Igbo, and the Igbo are really the one group that stands out from
other West Africans on measures of intellectual and educational attainment
(Frost 2015a, 2015b). They also contributed to the gene pool of African
Americans: "Many of the enslaved Igbo people in the United States were
concentrated in Virginia's lower Tidewater region and at some points in the
18th century they constituted over 30% of the enslaved black population"
(Wikipedia 2019).
While the polygenic score is a
good measure of raw cognitive ability in most humans, we need to develop a
modified version for people whose ancestry comes primarily from sub-Saharan
Africa.
Lasker, J., B.J. Pesta, J.G.R. Fuerst, and E.O.W. Kirkegaard. (2019). Global ancestry and cognitive ability. Psych 1(1). https://www.mdpi.com/2624-8611/1/1/34

Piffer, D. (2019). Evidence for Recent Polygenic Selection on Educational Attainment and Intelligence Inferred from GWAS Hits: A Replication of Previous Findings Using Recent Data. Psych 1(1): 55-75. https://www.mdpi.com/2624-8611/1/1/5
GDP per capita as a function of future orientation (Preis et al. 2012)
To
what degree do we value the short term over the long term? The answer varies
not only from individual to individual but also from society to society.
Hunter-gatherers, for instance, value the short term. Perishable food cannot be
stored for future use and, in any case, is not normally obtained in large
enough amounts to make storage worthwhile. If a hunter gets more meat than his
family can consume, he'll give it away to others in the local band.
There
are exceptions, especially at northern latitudes. Meat can be stored in caches
during winter and in cold lake waters during summer. With limited opportunities
for food gathering, women specialize in technologies that need more cognitive
input and longer-term thinking, like garment making, needlework, weaving,
leatherworking, pottery, and use of kilns. Finally, men hunt over longer
distances and therefore plan over the longer term. Northern hunting peoples
thus broke free of the short-term mental straitjacket imposed by hunting and
gathering. In time, their descendants would spread south and rise to the
challenges of social complexity (Frost 2019).
Those
northern hunting peoples were better able to exploit the opportunities created
by farming, but the transition from one lifestyle to the other was still far
from easy. Farming requires not only longer-term thinking but also less monotony
avoidance and higher thresholds for expression of personal violence. In recent
times, hunter-gatherers usually refused offers to be settled on farms. They saw
farming as akin to slavery.
The
change in mindset didn't end with the transition to farming. There were
different types of farming, and some required longer-term investment than
others. Those types generated stronger selection for future orientation.
Language as a mirror of cultural evolution
Galor
et al. (2018) argue that language is a mirror of cultural evolution. It can
show a society’s degree of commitment to a long-term mindset, as well as other
psychological traits.
The periphrastic future tense
The
authors studied the relationship between future orientation and forms of the
future tense that express intention and obligation, rather than simply
prediction:
Languages
differ in the structure of their future tense. In particular, linguists
distinguish between languages that are characterized by an inflectional versus
periphrastic future tense [...]. Inflectional future tense is associated with
verbs that display morphological variation (i.e., a change in the verb form
that is associated with the future tense). In contrast, periphrastic future
tense is characterized by roundabout or discursive phrases, such as 'will', 'shall', 'want to', 'going to' in the English language [...] (Galor et al.
2018, p. 6)
[U]nlike
the inflectional future tense, the periphrastic future tense is formed by terms
that express a desire, an intention, an obligation, a commitment as well as a
movement towards a goal. In particular, in the English language, "shall
has developed from a main verb meaning 'to owe', will from a main verb meaning
'to want', and the source of be going to is still transparent" [...].
Moreover, "intention and prediction are most commonly expressed by the
periphrastic future, while the synthetic one is more common in generic
statements, concessives, and suppositions" [...]. Inflectional futures
"also appear systematically (often obligatorily) in sentences which
express clear predictions about the future (which are independent of human
intentions and planning), whereas less grammaticalized constructions [i.e., periphrastic]
often tend to be predominantly used in talk of plans and intentions - a fact
which is explainable from the diachronic sources of future tenses" [...]
(Galor et al. 2018, p. 6)
Galor
et al. (2018, p. 16) used pre-1500 AD data to estimate the return on
agricultural investment ("crop return") in the homeland of a language’s
speakers. They found a positive correlation between this return on investment
and the existence of a periphrastic future tense. They concluded that "a
one standard deviation increase in crop return in the language's contemporary
homeland is associated with a 6 percentage points increase in the probability
that the language is characterized by a periphrastic future tense."
Using
the World Values Survey, the authors also found a positive correlation between
the existence of a periphrastic future tense and future orientation. The
correlation held true both for the people of the world as a whole and for Old
World peoples who speak languages originating in the Old World (Galor et al.
2018, p. 23).
Interestingly,
the return on agricultural investment did not correlate with other linguistic
characteristics, like the existence of the past tense or the perfect tense, the
existence of possessive classifications, the existence of coding for
evidentiality, the number of consonants, and the number of color terms (Galor et al.
2018, pp. 18-19).
Grammatical gender
The
authors also looked into the relationship between grammatical gender and the
sexual division of labor in a language's homeland:
Further,
consider ancient civilizations that had been characterized by a sexual division
of labor and consequently by the existence of gender bias. Linguistic traits
that had fortified the existing gender biases have plausibly emerged and
persisted in these societies over time. In particular, geographical
characteristics that had been associated with the adoption of agricultural
technology that had contributed to a gender gap in productivity, and thus to
the emergence of distinct gender roles in society (e.g., the suitability of
land for the usage of the plow […]), may have fostered the emergence and the
prevalence of sex-based grammatical gender in the course of human history.
(Galor et al. 2018, p. 2)
Galor et al. (2018, p. 24) found a negative correlation between grammatical gender and "plow-negative" crops (i.e., crops not requiring use of the plow and, hence, requiring less male participation). A one standard deviation increase in the potential caloric yield of plow-negative crops was associated with a 13 percentage point decrease in the probability that the language has grammatical gender. The correlation was reversed in the case of all crops, the caloric yield now being associated with a 17 percentage point increase in the probability that the language has grammatical gender.
Politeness distinctions in pronouns
Finally,
Galor et al. (2018) looked into the relationship between politeness
distinctions in pronouns and ecological diversity, which they related to the
emergence of hierarchical societies.
Linguistic
traits that had reinforced existing hierarchical structures and cultural norms
had conceivably emerged and persisted in these stratified societies in the
course of human history. In particular, politeness distinctions in pronouns
(e.g., the differential use of "tu" and "usted" in the Spanish
language, "Du" and "Sie" in German, and "tu" and
"vous" in French) had conceivably appeared and endured in
hierarchical societies. Thus, geographical characteristics, such as ecological
diversity that had been conducive to the emergence of hierarchical societies
(Fenske, 2014), may have contributed to the emergence of politeness
distinctions. (Galor et al. 2018, p. 2)
Galor
et al. (2018, p. 32) found a significant relationship between politeness
distinctions and ecological diversity in a language's homeland. A one standard
deviation increase in ecological diversity corresponded to a 15 percentage
point increase in the probability that the language has politeness
distinctions.
I'm
skeptical about the last finding. Is ecological diversity conducive to
hierarchical societies? The authors refer to a study that mostly uses African
data. More to the point, the study seeks to link ecological diversity to
centralized states. Centralization of state power and social hierarchization
are not the same thing. Japan, for instance, had a weak central state for much
of its history and yet was very hierarchical, as seen in the politeness
distinctions of the Japanese language.
Conclusion
Although
the authors refer to work by L.L. Cavalli-Sforza, Peter Richerson, and Robert
Boyd on gene-culture coevolution, they avoid discussing the possibility that
selection for future orientation, gender specialization, and hierarchical
politeness has influenced not only culture and language but also human biology.
The coevolution they propose is simply between culture and language. It can be
summed up as follows:
- Certain patterns of mind and behavior have been favored to varying degrees in different societies.
- These cultural patterns are transposed into language.
- Language then reinforces those cultural patterns: "In light of the apparent coevolution of cultural and linguistic characteristics in the course of human history, emerging linguistic traits have conceivably reinforced the persistent effect of cultural factors on the process of development" (Galor et al. 2018, p. 1).
Language
is not a passive mirror of culture. It can also act upon culture. For instance,
the way we perceive the future, and its relative importance to us, may be
shaped by the way we speak. This is of course the Sapir-Whorf hypothesis. In a farming society, the periphrastic future
tense might make it easier to envision farming methods and technologies that
pay off over the longer term. Similar arguments have been made for grammatical
gender and politeness distinctions. The way we speak influences our thoughts
and behavior.
Again,
the authors leave it to the reader to go one step farther: patterns of mind and
behavior may influence the frequencies of alleles in the gene pool.
Frost,
P. (2019). The Original Industrial Revolution. Did Cold Winters Select for
Cognitive Ability? Psych 1(1):
166-181 https://doi.org/10.3390/psych1010012
Galor,
O., O. Özak, and A. Sarid. (2018). Geographical
Roots of the Coevolution of Cultural and Linguistic Traits (November 7,
2018). Available at SSRN: https://ssrn.com/abstract=3284239 or http://dx.doi.org/10.2139/ssrn.3284239
Preis,
T., H.S. Moat, H.E. Stanley, and S.R. Bishop. (2012). Quantifying the advantage
of looking forward. Scientific Reports
2: 350 https://www.nature.com/articles/srep00350
Vocabulary decline in adult non-Hispanic White Americans (controlled for years of education completed)
"Are
Americans more intelligent than a few decades ago, or less intelligent?"
So asks psychologist Jean Twenge in her introduction to a recent paper on
vocabulary decline in Americans. The findings are disconcerting, to say the
least:
We
examined trends over time in vocabulary, a key component of verbal
intelligence, in the nationally representative General Social Survey of U.S.
adults (n=29,912). Participants answered multiple-choice questions about the
definitions of 10 specific words. When controlled for educational attainment,
the vocabulary of the average U.S. adult declined between the mid-1970s and the
2010s. Vocabulary declined across all levels of educational attainment (less
than high school, high school or 2-year college graduate, bachelor's or
graduate degree), with the largest declines among those with a bachelor's or
graduate degree. (Twenge et al. 2019)
The
last decline was especially large: more than half a standard deviation. In
general, vocabulary test scores have fallen by 8.5%. Ethnic change doesn’t seem
responsible, since non-Hispanic whites have had almost the same decline: 7.2%.
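For scale, the GSS vocabulary measure (Wordsum) is scored out of 10, so the percentages refer to the mean number of correct answers. A worked sketch of the arithmetic, assuming an illustrative starting mean of 6.0 correct and an SD of about 2.1 (both assumptions of mine, not figures from the paper):

```python
# Illustrative arithmetic for the reported 8.5% decline in Wordsum scores.
mean_1970s = 6.0      # assumed mean correct answers out of 10
sd = 2.1              # assumed standard deviation of the 10-item test
decline_pct = 0.085   # overall decline reported by Twenge et al.

mean_2010s = mean_1970s * (1 - decline_pct)
drop_in_words = mean_1970s - mean_2010s
print(f"Mean falls from {mean_1970s:.2f} to {mean_2010s:.2f} words correct")
print(f"Drop: {drop_in_words:.2f} words = {drop_in_words / sd:.2f} SD")
```

On these assumed numbers the overall drop is about a quarter of a standard deviation, which is consistent with the subgroup decline among degree holders exceeding half a standard deviation.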
So what's going on? The authors first considered the explanation raised in their introduction: Americans have become less intelligent despite the increase in education.
First,
Americans' vocabularies might be shrinking despite the increase in education.
This is plausible given the steep decline in the amount of time high school
students spend reading [...] and the decline in SAT verbal scores over time
[...]. This explanation could account for the narrowing of abilities between
those without high school educations and those with college educations. The
difference in vocabulary by education was approximately 3.4 correct answers in
1974-79 but dropped to 2.9 correct answers by 2010-16. However, this
explanation would not account for the decline in performance in all educational
groups. (Twenge et al. 2019)
Uh,
why not? The last sentence makes sense if the explanation is simply that
postsecondary education has become less effective. But what if vocabulary has
declined because the capacity for learning words and retaining them has also declined?
The cause may be genetic. Can we at least ask that question?
Lower admission standards? Mismatch between cause and effect
The
authors then consider another explanation: because college admission standards
have been lowered, people of lower ability have been going on to postsecondary
education in larger numbers; those who don't are increasingly the least able.
If
education does not improve vocabulary, but educational attainment increases,
those with higher ability will be increasingly selected into the higher
education groups, leaving those with lower ability in the lowest educational
attainment groups. Thus, the no high school degree group will be left with
those of lowest ability, and the college graduate group will have absorbed more
with only moderate ability. (Twenge et al. 2019)
That
explanation is popular, but it doesn’t really match the findings. The vocabulary
decline was steepest during the late 1970s and early 1980s. It then levelled
off. A second decline may have begun in 2008, but it’s still too early to say
(see Figure 1 reproduced above). Most of the decline doesn't correspond to any previous
change in college enrollment by recent high school graduates. The enrollment rate
rose slowly from 45.7% in 1959 to 49.4% in 1980. It began to grow faster only
in the mid-1980s, breaking through the 60% level in 1991 and the 70% level in
2009 (Bureau of Labor Statistics 2010).
So
the alleged cause doesn’t match the presumed effect. The steep increase in college
enrollment from the mid-1980s onward could not have caused the steep vocabulary
decline during the late 1970s and early 1980s. Keep in mind that most of the GSS
respondents had completed their education some years earlier, almost ten years
earlier on average. So the average respondent in the late 1970s had to meet
college admission standards that existed in the late 1960s.
Most of the decline has been among early boomers
Because
the GSS was first administered in 1974, we don't know when the steep vocabulary
decline began. But we do know when it ended: in the mid-1980s, among
respondents who were born on average thirty years earlier. A genetic cause
would imply a rapid deterioration in the gene pool from 1945 to 1955 and a
slower deterioration thereafter. I have no idea what that cause could be.
If
we're looking for a cultural cause, it would have acted most strongly on the
same cohort of "early boomers." Perhaps it was their increasing exposure
to TV and their decreasing exposure to high literature. Those cultural changes
were already a fait accompli for "late boomers," who experienced a more
gradual dumbing down of vocabulary on TV and in print. The post-2008 vocabulary
decline, if it’s real, might reflect the growing importance of iPhone texting since
the late 2000s.
That
cultural explanation has some support from the data and is favorably mentioned
by the authors. For one thing, comparison with the results of another test
(WAIS) suggests that the decline has been mostly in passive vocabulary, i.e.,
the words we understand but don’t use spontaneously in speech (Twenge et al.
2019). We’re less proficient in "bookish" language:
Perhaps
American culture became less intellectual, either because of or in response to
a lowering of verbal ability among those who read books. Authors aim to sell
more copies of their books, and thus may adjust their vocabulary level to the
skills and preferences of a wider slice of the population. Or, perhaps authors
lowered the vocabulary level of their books for some other reason such as an
interest in getting out a message without linguistic complexity getting in the
way. For example, the Bible has been revised repeatedly to make it more
accessible with the King James Version, the most complex and lyrical English
language version, being succeeded by the simpler New International Version,
Living Bible, and New Revised Standard Version. (Twenge et al. 2019)
The
last point rings true. When I was studying Shakespeare in high school my mother
could explain words I had trouble understanding. She had never gone beyond
Grade 10, but she could read the Bible in the King James Version, as well as a
lot of high-brow literature. This was true for many ordinary adults in the
1970s. Today, regular reading of the Bible is unusual and almost always
confined to modern English versions.
Conclusion
Yes,
college has become a less interesting place for learning vocabulary, and for learning
in general. Yes, a big reason is the growing number of students who don’t
really belong there, and the consequent lowering of standards. Yes, America’s
cultural and linguistic mix is changing, and for that reason alone the average
American would have a smaller English vocabulary.
Nonetheless,
those factors fail to explain why non-Hispanic white Americans know fewer words
today than they did a half-century ago, especially in their passive vocabulary.
Something else is going on, and it seems to be a shift away from high
literature and toward simpler audiovisual media: TV, video, text messaging …
Twenge,
J.M., W.K. Campbell, and R.A. Sherman. (2019). Declines in vocabulary among
American adults within levels of educational attainment, 1974-2016. Intelligence 76: 101377 https://www.gwern.net/docs/iq/2019-twenge.pdf
Before the bath – William-Adolphe Bouguereau (1825-1905)
What
is it about women's feet? They are the part of a woman’s body that men most
often fetishize. A study on the frequencies of different fetishes concluded:
"Feet and objects associated with feet were the most common target of
preferences [...] We found podophilia prominent (about half of Feticist groups
subscribers) in our sample" (Scorolli et al. 2007).
That
finding is in line with many others:
- Podophilia was common in a sample of male adolescents and young adults with either autistic disorder (AD) or borderline/mild mental retardation (MR): "Partialism (a sexual interest in body parts) was common in the AD group: four individuals got sexually aroused by body parts (three by feet, one by bellies) compared to none of the MR group" (Hellemans et al. 2010).
- A former escort girl and stripper "reported that [her] most frequent requests were (1) those involving a foot or shoe fetish, (2) those to sell to the male client her underwear, and (3) those to urinate into her underwear before selling it to the client" (Cernovsky 2015).
- Online searches that include the term "fetish" most often co-occur with the term "foot" (Anon 2007).
- The Austrian psychologist Wilhelm Stekel noted that "the most widespread form of partialism is preference for feet" (Stekel 1952, p. 169).
Female
feet have been eroticized even by Victorian writers like George du Maurier
(1834-1896):
"That's
my foot," she said, kicking off her big slipper and stretching out the
limb. "It's the handsomest foot in all Paris. There's only one in all
Paris to match it, and here it is," and she laughed heartily (like a merry
peal of bells), and stuck out the other.
And
in truth they were astonishingly beautiful feet, such as one only sees in
pictures and statues—a true inspiration of shape and color, all made up of
delicate lengths and subtly modulated curves and noble straightnesses and happy
little dimpled arrangements in innocent young pink and white. (Du Maurier 1894,
p. 174)
The cause?
There
has been a lot of speculation. Ramachandran and Hirstein (1998) attributed
podophilia to accidental cross-talk between adjacent regions of the cortex:
In
the Penfield homunculus the genitals are adjacent to the foot and, as one might
expect, we found that two [amputee] patients reported experiencing sensations
in their phantom foot during sexual intercourse. [...] (One wonders whether
foot-fetishes in normal individuals may also result from such accidental 'cross
wiring'—an idea that is at least more plausible than Freud's view that such
fetishes arise because of a purported resemblance between the foot and the
penis.)
Actually,
Sigmund Freud proposed three hypotheses. He listed them in a footnote and
apparently had no strong opinions on the subject. His first hypothesis was that
feet are fetishized because they are strong-smelling. His second was that
“[t]he foot replaces the penis which is so much missed in the woman.” Finally,
he suggested that foot fetishism arises from male desire being redirected away
from the female genital area because of “prohibition and repression” (Freud
1920, n19).
The third hypothesis seems to me the most interesting. A young man cannot look too long at other parts of a woman's body, whether from social discomfort (in the case of her face or her breasts) or because they are concealed by clothing, so his gaze settles on a body part that is exposed and freely observable. The constraint is strongest in societies where an unmarried woman is expected to cover herself when seen by a man from outside her family (i.e., neither her father nor her brothers). Only her face, hands, and feet may be seen, and sometimes not even her face. Her feet thus become a focus of male erotic interest and sexual fantasizing. With repeated reinforcement and conditioning, they may even become a primary source of sexual arousal.
The
reinforcement and conditioning hypothesis has two problems:
1). In Western societies, socks and other footwear have been worn indoors and out since the eighteenth century, and women’s arms, legs, and upper chests have been increasingly bared since the early twentieth century. If feet no longer rank among the top three areas of exposed female skin, shouldn’t podophilia be much less common nowadays?
2). Although puberty seems to be key to the development of podophilia, a survey of foot fetishists showed that about half of them remembered feeling attraction to feet at earlier ages:
45
per cent thought that the fetishism was linked to pleasurable experiences
during childhood. Many men had their first feelings of sexual pleasure with a
member of the family's feet (fathers, uncles, brothers), the experience
connected to innocent activities such as tickling or washing feet [...]
(Peakman 2013, p. 379)
The
mental circuitry thus seems to be already in place by childhood, at an age when
sexual fantasizing is still rudimentary at best.
Hardwiring?
Perhaps
some of that circuitry has become hardwired, through a process of gene-culture
coevolution. In societies where young unmarried women had to conceal most of
their body surface from public view, foot fetishizing may have developed as a
safe form of premarital eroticism. That kind of social environment rewarded
“good boys” who played by the rules of premarital sexuality, while penalizing
“bad boys” who didn’t. The first group would tend to have a certain mix of
hardwired sexual predispositions: not only inhibition of overt sexual interest
but also displacement of sexual interest into areas that are not socially
penalized. Over time, and with each passing generation, those predispositions
would have become prevalent in the gene pool.
That
sounds weird, but we see such hardwiring in the courtship behavior of other animals, which typically try to attract a potential mate through behavioral
patterns drawn from other areas of social interaction, such as between a mother
and her infants. In some cases, courtship can incorporate stress-induced
behavior. Feelings of stress cause the male to preen himself, and preening thus
becomes a regular and expected part of courtship, at which point there is
strong selection to make it a hardwired component of the behavioral sequence (Manning
1972, pp. 112-118).
That
kind of opportunism seems to characterize much of our sexual behavior. Kissing,
for instance, was initially done only between a mother and her infants or as a
gesture of respect between a subordinate and his superior. It then became
sexualized in some societies but not in others (Frost 2015). Did podophilia
follow a similar evolutionary path? Did it begin as a side effect of sexual repression and later become incorporated into love play? Like kissing, it may
have developed as a safe alternative to sexual intercourse. Unlike kissing, it
has not reached the same level of social acceptance. Keep in mind that even
kissing is frowned upon in many societies.
To
test this hypothesis, we need cross-cultural data. Is podophilia more frequent
in those societies where, at least until recent times, most of a woman’s body
surface was hidden from the gaze of male strangers?
Hellemans,
H., H. Roeyers, W. Leplae, T. Dewaele, and D. Deboutte. (2010). Sexual Behavior
in Male Adolescents and Young Adults with Autism Spectrum Disorder and
Borderline/Mild Mental Retardation. Sexuality
and Disability 28(2): 93-104. https://biblio.ugent.be/publication/1092003/file/6745031
Manning,
A. (1972). An Introduction to Animal
Behaviour. 2nd edition. London: Edward Arnold.
Scorolli, C., S. Ghirlanda, M.
Enquist, S. Zattoni, and E.A. Jannini. (2007). Relative prevalence of different
fetishes. International Journal of Impotence
Research 19: 432-437. https://www.nature.com/articles/3901547?ref=dod-jptc.org
Stekel,
W. (1952). Sexual aberrations: The
phenomena of fetishism in relation to sex (Vol. 1) (Trans., S. Parker). New
York: Liveright Publishing Corporation.