Thursday, 4 June 2009

A pseudo-epidemic?

In the late 19th century, a major concern was the poor health of industrial populations, particularly in England but also in other Western countries. The cause? For the medical profession, it seemed to be lack of sunlight. In densely packed tenements under the pall of factory smoke, not enough sunlight was getting through to kill bacteria in the air and on exposed surfaces. This argument seemed clinched by the high prevalence of rickets in children. Rickets develops when not enough calcium is absorbed from the gut. Since this absorption is aided by vitamin D, which the skin synthesizes in response to UV-B light, the conclusion was that people were not getting enough sunlight.

This consensus spread from the medical community to public health authorities, as evidenced at the 1913 congress of Québec health services: "Sunlight must illuminate all classrooms for several hours in the day; the sun is a factor of gaiety, and it is the best natural disinfectant" (Labarre, 1913). Another speaker denounced the dimly lit homes of Canadian farmers:

Natural light, in these homes, hardly has more than pure air its rightful place. It is in many cases intercepted by blinds, shutters, thick curtains … In a word, our people have not learned or do not sufficiently understand these laws of hygiene, fundamental laws of capital importance, which consist in wise and frequent ventilation and in salutary lighting of houses by letting the sun's beneficial rays flood in abundantly. (Savard, 1913)

A movement thus took shape to bring sunlight into every area of life. The means were many and varied: public beaches, fresh-air camps, summer resorts, outdoor youth movements, relocation of the working class to the suburbs, early closure of businesses to let workers walk home in the sun, and more and bigger windows on buildings.

The ‘sunshine movement’ became culturally dominant during the 1920s, fueled in part by the Spanish flu epidemic of 1918. The mid-decade was the tipping point. A 1925 novel The Great Gatsby features a woman with "sun-strained eyes," a "slender golden arm," a "brown hand," a "golden shoulder," and a "face the same brown tint as the fingerless glove on her knee" (Fitzgerald, 1992, pp. 15, 47, 57, 84, 185). In 1926, a Connecticut radio station announced that "a coat of tan seems to be the latest style in natural coloring at this season of the year. [It has] been increasing in favor during the last few years" (Nickerson, 1926). In 1929, a fashion magazine, The Delineator, affirmed that all women would appear incompletely beautiful if not made entirely brown or at least golden by the sun (Cole, 1929). The same year, the readers of Vogue were told, "The 1929 girl must be tanned" and "A golden tan is the index of chic" (Vogue, 1929).

This was a big change, as recalled later in a 1938 poem by Patience Strong (Strong, 1938, p. 37). It begins with a crowded beach “full of lovely girls in scant attire – stretched out full length upon the sand beneath the Sun’s fierce fire.” Then amidst the throng, she sees a lone girl:

Her pretty little parasol she carried with an air;
she wore long gloves – a shady hat – and how the
folks did stare! Protected from the sun, her skin
looked smooth and soft as silk; her cheeks were pink
as roses, and her throat as pale as milk.

And suddenly like magic she had disappeared from
view. She had vanished like a vision that dissolves
into the blue. “Come back! Come back!” I cried to
her. But she had passed away;
and then I knew that I had seen the Ghost of Yesterday.

Much of our modern culture can be traced to the sunshine movement. Without it, we would have no public beaches or winter trips to the Caribbean. Early afternoon would be a time for staying indoors. We would have a more densely built urban environment with less sprawl and taller buildings closer to streets. Demographics, too, would look different. The suburbs not having the same allure, the old-stock population would have remained in the inner city, with the suburbs being home to newer groups (as is the pattern in France). Perhaps even sexual morality might have taken another path. After all, it was the sunshine movement that increasingly exposed the human body to public view, notably on beaches and in the street. Public space thus became sexualized to a degree hitherto unthinkable.

Ironically, this cultural revolution may have all begun through a misunderstanding. Doubts have already been expressed about whether lack of sunlight explains the poor health of industrial towns and cities in late 19th century Britain. Malnutrition and poor sanitation were likelier causes. Now there is reason to doubt whether this factor explains the rickets epidemic of the same period.

Today, rickets is most common not where sunlight is weak but where sunlight is quite strong: the Middle East and South Asia. The cause is dietary, specifically low consumption of calcium and high consumption of foods rich in phytic acid, such as unleavened bread or chapatti (Berlyne et al., 1973; Harinarayan et al., 2007). Phytic acid strongly binds to calcium and makes it unusable, with the result that less calcium is available to the body. It is this calcium depletion, and not lack of vitamin D, that causes rickets in the Middle East and South Asia.

In the Western world, phytic acid is present in industrially processed cereals, particularly the high-fiber ones that have become popular in recent years (Sandberg, 1991). Before the industrial age, it was much less present in Western diets:

In the archaeological record, rickets is rare or absent in preagricultural human skeletons, while the prevalence increases during medieval urbanization and then explodes during industrialism. In the year 1900, an estimated 80-90 per cent of Northern European children were affected. This can hardly be explained only in terms of decreasing exposure to sunlight and decreased length of breast-feeding. An additional possible cause is a secular trend of increasing intake of phytate since cereal intake increased during the Middle Ages and since old methods of reducing the phytate content such as malting, soaking, scalding, fermentation, germination and sourdough baking may have been lost during the agrarian revolution and industrialism by the emergence of large-scale cereal processing. The mentioned methods reduce the amount of phytic acid by use of phytases, enzymes which are also present in cereals. These enzymes are easily destroyed during industrial cereal processing. (Paleolithic Diet Symposium)

We thus have the apparent paradox of rickets in the face of normal vitamin D levels. This was shown in a case study from the 1970s of rickets in a Bedouin woman:

Vitamin D was present in normal amounts in the plasma of our patient so this excludes the premise that she was deprived of vitamin D. Bedouin women are sunburned over the anterior half of their head and forearms. They go about their tasks at home unveiled. Vitamin D levels would be expected to be normal from the area of skin available for irradiation and the intensity of sunlight in this area. (Berlyne et al., 1973)

She might still have been vitamin-D deficient. Recommended vitamin D levels have since been raised and now range between 75 nmol/L and 150 nmol/L. These new levels, however, are based on data from a North American population that is consuming ever higher levels of phytic acid, particularly with the popularity of high-fiber diets. It’s also doubtful whether such levels can be attained even with considerable sun exposure. Binkley et al. (2007) studied the vitamin D status of 93 healthy young adults from Hawaii. They had an average of 22.4 hours per week of unprotected sun exposure, 40% reported never having used sunscreen, and all were visibly tanned. Yet their mean vitamin D level was 79 nmol/L and 51% had levels below 75 nmol/L.

This study may surprise those who’ve heard that 15 minutes of sunshine every other day will provide more than enough vitamin D. Well, that figure is just a back-of-the-envelope calculation. It makes a lot of assumptions about things we don’t fully know. The truth is that we still know little about the different feedback loops that maintain vitamin D in the human body, especially at the levels that now seem necessary.

This study also calls into question the media-fueled perception that North Americans are facing severe vitamin D deficiency because of sun avoidance and excessive sunscreen use. Such a perception is at odds with the rising incidence of skin cancer, particularly among 20- to 30-year-olds. The trend actually seems to be pointing in the other direction: people are exposing themselves more to the sun, not less.

All this is not to say that vitamin D cannot help people who lack calcium because they consume too much phytic acid. Of course it can. Modern diets have created a new adaptive equilibrium that requires higher levels of vitamin D. We could, however, get the same health outcome by changing industrial processing of cereals, specifically by eliminating the heat treatments that inactivate phytases and by allowing these enzymes to reduce the phytic acid content.


Berlyne, G.M., Ari, J.B., Nord, E., & Shainkin, R. (1973). Bedouin osteomalacia due to calcium deprivation caused by high phytic acid content of unleavened bread. The American Journal of Clinical Nutrition, 26, 910-911.

Binkley, N., Novotny, R., Krueger, D., et al. (2007). Low vitamin D status despite abundant sun exposure. Journal of Clinical Endocrinology & Metabolism, 92, 2130-2135.

Cole, C.C. (1929). La Revue Moderne, July 1929, p. 16.

Fitzgerald, F.S. (1992). The Great Gatsby, New York: Collier Books.

Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.

Labarre, M.J.P. (1913). De l’hygiène scolaire et de son influence sur le physique et le moral des écoliers, Bulletin Sanitaire, Conseil d’hygiène de la province de Québec, 13, 86-98.

Nickerson, E.C. (1926). Nature's Cosmetics, Bulletin sanitaire, 26(5), 134-140.

Sandberg, A.S. (1991). The effect of food processing on phytate hydrolysis and availability of iron and zinc. Advances in Experimental Medicine and Biology, 289, 499-508.

Savard, A. (1913). Ce que doit être l’Organisation Municipale pour la lutte contre la Tuberculose, Bulletin Sanitaire, Conseil d’hygiène de la province de Québec, 13, 129-150.

Strong, P. (1938). The Sunny Side, London: Muller.

Vogue (1929). June 22, pp. 99, 100.


Tod said...

"The highest individual serum 25(OH)D concentration obtained from sunshine was 225 nmol/L ; in a farmer in Puerto Rico"(Vieth 99)

This was at 18º 15' North of the Equator: "Sunrise/Sunset Average: 6:54 am to 6:21 pm. Because of the latitude of Puerto Rico (18º 15' North of the Equator), the sun is high overhead all year; there are no great variations from day to day between the times of sunrise and sunset."

What is the highest serum 25(OH)D level attained at the latitude of Berne, Switzerland ?

"Of the 30 outdoor workers in whom we measured 25(OH)D in late summer, 3 had levels above 200 nmol/liter (i.e. 211, 205, and 203 nmol/liter); their sun exposure occurred in Nebraska, Kansas, and North Dakota, at 41.2°, 39.0°, and 46.8°N latitude, respectively".

The modest difference between the world-record serum 25(OH)D concentration obtained from sunshine and that of a minority of those working outdoors at 46.8°N latitude shows that, even among people of the same skin color and actual UVB exposure, there is in fact variation in serum 25(OH)D concentrations. One would assume that the extent to which an individual's physiology and metabolism limits their serum level in abundant sunshine is a characteristic that is not only variable but heritable.

In a 2005 presentation, Vieth says that humans have not genetically changed in 100,000 years. Although he mentions the lightening of skin at northern latitudes, for him white-skinned northern Europeans are still adapted to Africa's massive overabundance of UVB, which is his explanation for vitamin D deficits at higher latitudes, where UVB is weaker and absent in some months.

The outdoor worker at 46.8°N latitude whose serum 25(OH)D, measured in late summer, was 203 nmol/liter had a higher-than-average limit on vitamin D synthesis. What I think this shows is a basic weakness in the contention that Europeans still deal with vitamin D in a way that hasn't changed in 100,000 years.

Thinking through the implications of this putative nullifying of evolution, specific to the system of vitamin D control, makes it impossible to accept. It would require a belief that the known variation in vitamin D levels among people of the same UVR exposure and skin colour is in no way influenced by heritable characteristics, or that the system is somehow inaccessible to natural selection.

Natural selection would have made people like Mr. 203 nmol/liter at 46.8°N the majority if it were trying to maximize vitamin D production. He is somewhat unusual for his high limit on vitamin D synthesis, not for his white skin.

Another way of looking at it is as the continuation of what Dr. Vieth describes (Ref 1b) as:

"a system better designed to cope with an abundance of supply, not a lack of it."
"remarkably inefficient."
"no way to correct for deficiency."

There must have been stabilizing selection operating in northern Europe that winnowed out the rare mutants who lacked any kind of limit on their vitamin D levels; otherwise, people with no defence against ever-increasing serum 25(OH)D concentrations would have become the majority. The lack of ill effects in north Europeans who spend their lives at the equator proves purifying selection has continued to operate against any maximizing of vitamin D levels for 30,000 years. This is despite the lack of UVB for several months in the winter.

"Based on the average rate of decline observed in our subjects, it can be estimated that in individuals for whom summer sun exposure is the principal source of vitamin D, a late summer 25(OH)D level of approximately 127 nmol/liter is needed to avoid levels falling to less than 75 nmol/liter by late winter."

Huge amounts of vitamin D are available in the northern European summer to store for the UVB-less months, when levels are dependent on the vitamin D the body has stored. If, after humans entered northern Europe, purifying selection treated vitamin D as a substance worth paying a price to limit, it is reasonable to assume that higher levels have a very big downside.
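The quoted summer-to-winter decline can be turned into a rough number. The following is purely a back-of-the-envelope sketch, not the cited study's method: it assumes a first-order (exponential) decline and a roughly six-month UVB-poor period, both of which are assumptions on my part, with only the 127 and 75 nmol/L figures taken from the quote above.

```python
import math

# Back-of-the-envelope sketch (assumptions: first-order decline,
# ~6 UVB-poor months; the two levels are the figures quoted above).
start_level = 127.0   # nmol/L, late-summer 25(OH)D
floor_level = 75.0    # nmol/L, late-winter floor to stay above
months = 6.0          # assumed length of the UVB-poor period

# Implied first-order decay constant and half-life.
k = math.log(start_level / floor_level) / months   # per month
half_life = math.log(2) / k                        # months

print(f"implied decay constant: {k:.3f} per month")
print(f"implied half-life: {half_life:.1f} months")
```

On those assumptions the implied half-life comes out at roughly eight months, far longer than the half-life usually quoted for circulating 25(OH)D (on the order of weeks), which fits the point above that stored vitamin D keeps topping up serum levels through the winter.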

Tod said...


I don't think that phytic acid affects people in the West now unless they are following a very unusual diet. Westerners eat the kinds of foods that don't have much of it:

"Gluten grain flours digest their own phytic acid very quickly when soaked, due to the presence of the enzyme phytase. Because of this, bread is very low in phytic acid. Buckwheat flour also has a high phytase activity. The more intact the grain, the slower it breaks down its own phytic acid upon soaking. Some grains, like rice, don't have much phytase activity so they degrade phytic acid slowly. Other grains, like oats and kasha, are toasted before you buy them, which kills the phytase." (3b)

or that counteract it:
"The paper "Dietary Fibre and Mineral Bioavailability" listed another method of increasing mineral absorption from whole grains that I wasn't aware of. Certain foods can increase the absorption of minerals from whole grains high in phytic acid. These include: foods rich in vitamin C such as fruit or potatoes; meat including fish; and dairy." (3a)

There is also a lot of Vitamin D fortification in Western countries.

Trevor Marshall is very skeptical of the claims being made for vitamin D.
"Vitamin D supplementation of food and baby formula has spread throughout the world, even to the less economically developed countries. It is thus very difficult to find a population which can be studied in order to ascertain whatthe level of natural metabolic homeostasis for 25-D might actually be. Two studies do provide a glimpse, however. The first founda “high prevalence of Vitamin D deficiency in Chilean healthy postmenopausal women(40).” The average level of serum 25-D sampled from 90 “healthy ambulatory women” showed that 27% of premenopausal, and 60% ofpostmenopausal women, had 25-D levels under 50 nmol/L. A study(41)showing “Hypovitaminosis D is common in bothveiled and nonveiled Bangladeshi women” found a 25-D level less than 40 nmol/L in approximately 80% of the healthy young women. These studies show a wide variation in levels of 25-D being generated by populations whose diets have probably not yet been significantly altered by ‘The Sunshine Vitamin,’ indicating that the unsupplemented metabolic homeostasis is probably in the range 23-60 nmol/L, and that it falls with advancing age." (4a)


1a) Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety, 1999, Reinhold Vieth, University of Toronto.

1b) Vieth, in 'Vitamin D', Second Edition, Editors D. Feldman, F. Glorieux: The Pharmacology of Vitamin D, Including Fortification Strategies.

Prospects for Vitamin D Nutrition, Reinhold Vieth 2005 [Video presentation].

Effects of Above Average Summer Sun Exposure on Serum 25-Hydroxyvitamin D and Calcium Absorption.

3a) Dietary Fibre and Mineral Bioavailability.

3b) A few thoughts on Minerals, Milling, Grains and Tubers.

Vitamin D discovery outpaces FDA decision-making

4a) Vitamin D: the alternative hypothesis; Paul J. Albert, Amy D. Proal, Trevor G. Marshall.

Peter Frost said...


Thanks, I'm checking through your references. In 1991, Sandberg said that heat inactivates phytases during industrial processing of cereals. Is this no longer the case? Do we have data on phytic acid content in present-day cereal products (e.g., bread, high-fiber cereals, etc.)?

Tod said...

"The influence of phytic acid on calcium and magnesium absorption would seem of minor importance.[...]The high phytic acid content of cereals and legumes also appears to be of little or no importance in relation to calcium nutrition in infants. Davidsson et al. (10) reported a relatively high apparent calcium absorption (~60%) in young infants (aged 7–17 wk) fed a high phytate (0.76%) complementary food based on soy flour and whole wheat. Calcium absorption was similar from a low phytate (0.07%) porridge based on wheat flour and milk powder (11). As with magnesium, there is a strong homeostatic control of calcium via both intestinal absorption and urinary excretion, and any calcium deficiency is more likely to be due to a low intake rather than a low absorption."
(Ref 1)

"Phytic acid is a strong chelator of important minerals such as calcium, magnesium, iron, and zinc, and can therefore contribute to mineral deficiencies in people whose diets rely on these foods for their mineral intake, such as those in developing countries.[6][7] It also acts as an acid, chelating the vitamin niacin, which is basic, causing the condition known as pellagra. [8] In this way, it is an anti-nutrient, despite its theraputic effects (see below) which simultaneously make it a (phyto)-nutrient.[1] For people with a particularly low intake of essential minerals, especially young children and those in developing countries, this effect can be undesirable".(Ref 2)

Phytic acid content in present-day Western food products is countered by other foods and calcium sources in the diet, like dairy products. Ordinary bread is low in phytate.

Are low vitamin D levels causing disease, or are they actually low as a result of disease, as Trevor Marshall contends? Marshall's argument is complex; he seems to attribute most disease, including auto-immune conditions, to low-level infections, much like Cochran and Ewald.

1) Influence of Vegetable Protein Sources on Trace Element and Mineral Bioavailability, Richard F. Hurrell.

2) Phytic acid, From Wikipedia.

Mike said...

The data on vitamin D and cancer prevention are as extensive as the data on smoking causing cancer. It is time for everyone to know the facts. Take a look at the site; it also offers a good newsletter and has recently launched a new micropill formulation of vitamin D.

Tod said...

Actually, Mike, the data on vitamin D are replete with putative benefits of higher levels, just as the case for antioxidant supplements once seemed. See Mortality in Randomized Trials of Antioxidant Supplements:
"Conclusions: Treatment with beta carotene, vitamin A, and vitamin E may increase mortality. The potential roles of vitamin C and selenium on mortality need further study."

Lung cancer from smoking is new. Moreover, it has been (largely) inaccessible to natural selection, since those old enough to die from it are usually past the prime age for reproducing or for indirectly influencing their reproductive success.

Let's say people had been smoking for 30,000 years (which is how long modern humans have been living in northern Europe and wearing clothes). Even with the above caveat (that getting cancer from smoking rarely affects the number of one's descendants), I can assure you that after 30,000 years smoking would cause hardly any lung cancer today.

Hence there are excellent evolutionary reasons to believe that if vitamin D is so very good for you at the higher levels, any problem caused by low serum levels would have been taken care of by now.

Compare the latitude of the US to how far north France, Germany, and Britain are. Is it credible that, after evolving to optimize vitamin D at latitudes in the 50s, people would suffer from any insufficiency at far lower latitudes? Even Toronto gets far more UV for more of the year than northern Europe.

If vitamin D is so very good for you, and northern Europeans have evolved where it is absent for half the year, why do they shut off synthesis once the limit of 10,000 IU in 20 minutes (Vieth) is reached? They ought to have evolved to make all they can while the sun shines and then store it for the winter.

The vitamin D level is limited no matter how much time you spend outdoors (even at the equator you won't push your levels up that much). Now why would something that good be restricted? The reason has to be that it is not an unalloyed benefit outside a certain range.

"Two studies showed that in response to a given set of ultraviolet light treatment sessions, the absolute rise in serum 25(OH)D concentration was inversely related to the basal 25(OH)D concentration. In the study by Mawer et al (34), the increase in 25(OH)D in subjects with initial 25(OH)D concentrations <25 nmol/L was double the increase seen in subjects with initial concentrations >50 nmol/L. Snell et al (27) showed that in subjects with initial 25(OH)D concentrations <10 nmol/L, ultraviolet treatments increased 25(OH)D by 30 nmol/L, but in those with initial 25(OH)D concentrations approaching 50 nmol/L, the increase was negligible."(Vieth 99)

Tod said...


That sounds to me as if 25(OH)D concentrations approaching 50 nmol/L are not so low for optimum health.

Ingested vitamin D perhaps overwhelms the natural homeostasis with a massive dosage when it raises levels; low dosages should be absorbed better than they seem to be.

"The assembled data from many vitamin D supplementation studies reveal a curve for vitamin D dose versus serum 25-hydroxyvitamin D [25(OH)D] response that is surprisingly flat up to 250 µg (10000 IU) vitamin D/d. To ensure that serum 25(OH)D concentrations exceed 100 nmol/L, a total vitamin D supply of 100 µg (4000 IU)/d is required." (Vieth 99)

"Potential adverse effects of high intakes of calcium and vitamin D need to be elucidated".
Calcium and vitamin D intakes are positively associated with brain lesions in depressed and non-depressed elders

Vitamin D and aging.

"Taken together, aging shows a U-shaped dependency on hormonal forms of vitamin D suggesting that there is an optimal concentration of vitamin D in delaying aging phenomena. Our recent study shows that calcidiol is an active hormone. Since serum calcidiol but not calcitriol is fluctuating in physiological situations, calcidiol might determine the biological output of vitamin D action. Due to its high serum concentration and better uptake of calcidiol-DBP by the target cells through the cubilin-megalin system, calcidiol seems to be an important circulating hormone. Therefore, serum calcidiol might be associated with an increased risk of aging-related chronic diseases"

I notice Mike's site is offering free vitamin D for kids. See Infant Vitamin D Supplementation and Allergic Conditions in Adulthood:
"To conclude, our findings suggest an association between large-dose vitamin D supplementation in infancy and an increased risk of atopy, allergic rhinitis, and asthma later in life. Further study is required to determine whether these observations could imply that vitamin D supplementation in infancy may have long-term effects on immune regulation, or if they reflect some unmeasured determinants of vitamin D supplementation."

Tod said...

The paper - "Dietary Fibre and Mineral Bioavailability" cited by Stephan at Whole Health Source (linked to above as Ref 3a) explains why Dunnigan found that consuming unleavened bread while avoiding meat makes a remarkable difference to the likelyhood of rickets despite the vitamin D content of meat being very low.

"The UVR-diet model derived from the Glasgow Asian population indicates that the prevalence, seventy and age distribution of privational rickets and osteomalacia are precisely delineated by the practice of varying degrees of vegetarianism or near-vegetarianism. High-fibre and -phytate diets are potentially rachitogenic in children, particularly at the pubertal growth spurt, but appear relatively innocuous in adults. Exposure to sufficient UVR neutralizes the rachitogenic or osteomalacic potential of the vegetarian diet in the presence of an adequate intake of dietary Ca...The rarity of privational osteomalacia in white vegetarians in the UK appears related to their pursuit of a healthy lifestyle with high levels of outdoor exposure. " (An epidemiological model of privational rickets and osteomalacia, Dunnigan - 2007).

The turn-of-the-century rickets was indeed more to do with malnutrition; milk (calcium) was really for infants then. Phytates do have an effect on vitamin D, but that is still being corrected by the diet in the West. True, a minority of people have shifted towards the phytate-rich and calcium-poor vegan diet, with more whole grains, soy milk, etc., and less red meat, so maybe some of those, who also avoid the sun, have lowered their 'D' levels.

Vitamin D 'insufficiency' is surely 'a pseudo-epidemic.' Those recommending higher serum 25(OH)D concentrations are wrong, just as the experts were wrong about the effect of antioxidants.

"The truth is that we still know little about the different feedback loops that maintain vitamin D in the human body".

I agree with that 100%.
"Vitamin D is a prohormone with several active metabolites that act as hormones. Vitamin D is metabolized by the liver to 25(OH)D, which is then converted by the kidneys to 1,25(OH)2D (1,25-dihydroxycholecalciferol, calcitriol , or active vitamin D hormone). 25(OH)D, the major circulating form, has some metabolic activity, but 1,25(OH)2D is the most metabolically active".

Tod said...

"Serum levels of bioactive vitamin D hormone (1,25(OH2)D) are usually normal in cases of vitamin D overdose".(Merck Manual)

Clearly the feedback loops cast doubt on the interpretation of low serum 25(OH)D (the only metabolite usually tested for) as a cause of disease.

Serum Vitamin D Levels and Markers of Severity of Childhood Asthma in Costa Rica

"Our results suggest that vitamin D insufficiency is relatively frequent in an equatorial population of children with asthma. In these children, lower vitamin D levels are associated with increased markers of allergy and asthma severity".

Costa Rica is 9-11 degrees above the equator. It gets more UVB than almost anywhere else on earth. Do Costa Ricans make their kids cover up and avoid the sun?
Costa Rican Independence Day

How could children in Costa Rica possibly develop low 25-hydroxyvitamin D levels (the major circulating form of vitamin D) except as the result of disease (Trevor Marshall's hypothesis)?

The treatment for asthma is inhaled corticosteroids, which damp down inflammation and affect the immune system. According to Marshall, large doses of vitamin D can also affect the immune system. The obvious question: does the immune system go haywire in cases of autoimmunity, or is it trying to fight some kind of infection?

Interview with evolutionary biologist Paul Ewald.
"The same evolutionary forces that would cause a serious disease to be weeded from the population would also cause those people whose immune systems are prone to self-destruction to be eliminated from the population.[...] even researchers who previously dismissed the possibility of infection are accepting the possibility that “autoimmune” disease could be triggered by infection. This is some progress, but it’s not enough. Especially since the concept of autoimmunity encourages doctors to prescribe immunosuppressive steroids"

People who take Vitamin D to cope with 'autoimmune' disorders might be suppressing their immune system and allowing a low level infection to progress.

Tod said...

"Melanin content does not alter the amount of vitamin D that can be produced. Thus, individuals with higher skin melanin content will simply require more time in sunlight to produce the same amount of vitamin D as individuals with lower melanin content" (1),(2).

"Imagine you're a space alien looking down on Earth. You have these humans who evolved in
the Horn of Africa, as nudists living around the equator. They would have been getting lots of
vitamin D through their skin. Then they suddenly . . . move north and put on lots of clothes and
block out most of their capacity to make vitamin D," said Reinhold Vieth, a University of
Toronto vitamin D researcher. "For me it's a no-brainer. We're not getting enough.[...]

Dr. Vieth has approached the matter by asking: What vitamin D level would humans have if they were still living outside, in the wild, near the equator, with its attendant year-round bright sunshine? "Picture the natural human as a nudist in environments south of Florida," he says.

He estimates humans in a state of nature probably had about 125 to 150 nanomoles/litre of vitamin D in their blood all year long — levels now achieved for only a few months a year by the minority of adult Canadians who spend a lot of time in the sun, such as lifeguards or farmers.

For the rest of the population, vitamin D levels tend to be lower, and crash in winter. In testing office workers in Toronto in winter, Dr. Vieth found the average was only about 40 nanomoles/L, or about one-quarter to one-third of what humans would have in the wild." (3)

How can these statements be reconciled?

The answer is that the only advantage to vitamin D synthesis conferred by white skin lies in the reduced time taken until that synthesis ceases.

"Ultraviolet exposure beyond the minimal erythemal dose does not increase vitamin D production further. The ultraviolet-induced production of vitamin D precursors is counterbalanced by degradation of vitamin D and its precursors. The concentration of previtamin D in the skin reaches an equilibrium in white skin within 20 min of ultraviolet exposure (41). Although it can take 3–6 times longer for pigmented skin to reach the equilibrium concentration of dermal previtamin D, skin pigmentation does not affect the amount of vitamin D that can be obtained through sunshine exposure." ( ref 5)

It might be objected that in ice age northern Europe (winter temperatures averaged -20 to -30 °C in exposed conditions, with little natural protection) people wore more clothing and endured winter months when vitamin D synthesis from UVB was absent (as it is now). However, that is largely irrelevant: the greatest amount of clothing would surely be worn in the UVB-less period of winter, whereas summer was when UVB was intense enough to synthesize vitamin D, and at that time far more skin would be exposed.

The lack of year-round vitamin D synthesis in northern Europe would mean there was a period of several months when stored vitamin D (synthesized during the summer) was vitally necessary.
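The winter-stores point can be made concrete with a decay sketch. Circulating 25(OH)D has a half-life commonly reported on the order of a few weeks; the 3-week half-life and 150 nmol/L summer peak below are assumptions for illustration, and the simple exponential ignores slow release from adipose stores, which stretches the real curve out considerably.

```python
def serum_25ohd(start_nmol, weeks, half_life_weeks=3.0):
    """Exponential decay of serum 25(OH)D, assuming no new synthesis or intake."""
    return start_nmol * 0.5 ** (weeks / half_life_weeks)

# Assume a summer peak at the top of Vieth's "wild" range, then a UVB-less
# northern winter with no dietary source.
summer_peak = 150.0  # nmol/L
for weeks in (0, 4, 8, 12, 16):
    print(f"week {weeks:2d}: {serum_25ohd(summer_peak, weeks):6.1f} nmol/L")
```

Even under these crude assumptions, the curve makes clear why a high summer peak, fat-stored reserves, and dietary sources would all have mattered over a months-long UVB-less winter.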

Tod said...


It might be presumed that white skin enables increased synthesis of vitamin D during the summer and hence greater stores for the winter. However, there is more vitamin D made in summer UVB by 'white' skin than even the most pigmented black skin given a couple of hours' exposure.(5) The only situation that offers any advantage to the vitamin D synthesis characteristics of 'white' skin is one where the north Europeans were only exposed to summer sun for 20 minutes a day. Surely unrealistic for hunter-gatherers.

If any people should have adaptations for maximizing vitamin D, it is those who evolved in northern Europe.

"This cross-sectional study included 741 premenopausal white women, mostly of French descent.[...]
Circulating 25(OH)D concentrations in premenopausal women are strongly related to DBP polymorphisms. Whether DBP rare allele carriers have a different risk of vitamin D–related diseases and whether such carriers can benefit more or less from dietary interventions, vitamin D supplementation, or sun exposure need to be clarified." (ref. 6)

Shouldn't these 25(OH)D-lowering polymorphisms be vanishingly rare in Europeans if anything that lowered vitamin D levels was being selected against?


1) Quote from 'Vitamin D', Wikipedia.

2) Matsuoka LY, Wortsman J, Haddad JG, Kolm P, Hollis BW (April 1991). "Racial pigmentation and the cutaneous synthesis of vitamin D". Arch Dermatol 127(4): 536–8.

"We conclude that while racial pigmentation has a photoprotective effect, it does not prevent the generation of normal levels of active vitamin D metabolites."

3) Rob Stein, "Vitamin D Deficiency Called Major Health Risk", Washington Post, May 21, 2004.

4) Martin Mittelstaedt, "Vitamin D casts cancer prevention in new light", Globe and Mail, 2007.

5) Vieth R. "Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety". Department of Laboratory Medicine and Pathobiology, University of Toronto.

6) Sinotte M, Diorio C, Bérubé S, Pollak M, Brisson J. "Genetic polymorphisms of the vitamin D binding protein and plasma concentrations of 25-hydroxyvitamin D in premenopausal women".


Tod said...

Correction, the first paragraph of the previous comment should read:-

It might be presumed that white skin enables increased synthesis of vitamin D during the summer and hence greater stores for the winter. However there is no more Vitamin D made in north European summer UVB by 'white' skin than even the most pigmented black skin (given a couple of hours exposure).(5) The only situation that offers any advantage to the Vitamin D synthesis characteristics of 'white' skin is where the north Europeans were only exposed to summer sun for 20 minutes a day; a very unrealistic scenario for hunter gatherers or farmers.

Tod said...

Even under a parasol Vitamin D would be made:-

"Dr. Turnbull, working with Dr. Kimlin in Australia, showed that UVB light in the shade is strong enough to activate vitamin D production in the skin. Think of UVB as a ping-pong ball. It bounces off lots of things. When you go into the sun—if the sun is high enough in the sky—UVB light comes through the atmosphere and then starts bouncing around. It bounces at you from the ground, buildings, cars, and even the bottom of clouds. Sitting under a shade tree delivered about half as much UVB as sitting in the direct sun. Furthermore, the damaging UVA radiation under direct sun was three times more than under the shade tree. Sitting in the shade in the summer (and the winter in subtropical and tropical latitudes) is a good way to get vitamin D." (Vitamin D Council)

'Damaging UVA' is what produces a deep tan. Someone who avoided direct sunlight - like the lady with a parasol - would be very pale but would be making vitamin D all the same.

From the Vitamin D council website:-
"When researchers went to an Italian nursing home, they found that 99 of 104 residents had no detectable vitamin D in their blood,"

Bad news for these people?

All of the 104 residents were over 98 years old! I hate to think what would have happened if they'd been keeping their vitamin D levels 'normal'.

Peter Frost said...


Thanks for your references! You've brought a lot of information to my attention.

What is your opinion of Stamp's finding that white, South Asian, and black subjects had the same increase in plasma 25(OH)D in response to UV irradiation? Does skin color make a difference when the time of irradiation is controlled? In his study, the subjects had whole-body radiation for 1 minute on the front and 1 minute on the back the first day, with increases of 1 minute per day for both surfaces (Stamp, T.C.B. (1975) Factors in human vitamin D nutrition and in the production and cure of classical rickets, Proc. Nutr. Soc., 34, 119)

Tod said...

The UVB irradiation Stamp used was perhaps too powerful to be equivalent to African sun, or to support conclusions about skin synthesis. Stamp himself suggests: 'these changes may represent maximum stimulation of vitamin D synthesis, overcoming a partial pigment barrier'.
'Irradiation at a distance of 450 mm from the light source using a Theraktin lamp [...] giving a strong emission at 290 nm.'

The lamp was emitting energy in the wavelength band ideal for vitamin D synthesis: "It is recognized that ultraviolet radiation with energies between 255 and 315 nm (UVB) is effective in photolyzing 7-DHC to previtamin D3" (Holick).

One possible reason for the lack of difference between black and white subjects is that the intensity of the UVB made the whites' skin synthesis shut off within a couple of minutes.

Vieth (1999) found that UVB lamps produced considerably higher 25(OH)D concentrations than sun.

"The effects of artificial ultraviolet light treatment sessions on 25(OH)D concentrations are summarized in Table 2Go. The highest individual 25(OH)D concentration attained was 274 nmol/L (38). The main problem in interpreting the data was that the exact dose of ultraviolet light was ambiguous because there is variability in the surface area of skin exposed and in the frequency and duration of exposure. Had the ultraviolet treatment sessions continued, one would expect that for those given full-body exposure, serum 25(OH)D concentrations would plateau at mean values comparable with those of the farmers and lifeguards shown in Table" (1)

Stanley drew the correct conclusion:
'It is more sensible to conclude that black skin evolved for protection against tropical sunburn than to suggest that white skin evolved to ensure vitamin D nutrition in temperate climates.'

1)Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety.
Reinhold Vieth.

Tod said...

I can no longer access the text of Dunnigan. I think the following excerpts are from the same 1997 paper.(1)

"The discovery of late rickets and osteomalacia in the Glasgow Muslim community in 1961 (Dunnigan et al. 1962) was followed by a study of 7 d weighed dietary intakes in rachitic and normal Muslim schoolchildren and in a control group of white schoolchildren (Dunnigan & Smith, 1965). Surprisingly, the dietary vitamin D intakes of rachitic Asian children, normal Asian children and Glasgow white children were similar. The higher fibre and phytate intakes of the Asian children were not considered etiologically significant.
Studies of daylight outdoor exposure showed no significant differences between the summer and non-summer exposures of rachitic and normal Muslim schoolchildren or between Muslim and white schoolchildren (Dunnigan, 1977).

These patterns of daylight outdoor exposure did not conform to the Muslim ‘purdah’ stereotype, although sunbathing was unknown in the Asian community. It was also evident that many Glasgow white schoolchildren went out relatively little, even in fine weather, in a form of ‘cultural purdah’. Similar patterns of apparently adequate daylight outdoor exposure were noted in Asian women with privational osteomalacia wearing Western dress in London (Compston, 1979). These observations did not support the hypothesis that Asian rickets and osteomalacia resulted from deficient exposure to UVR or from deficient dietary vitamin D intake relative to white women and children in whom privational rickets and osteomalacia were unknown outside infancy and old age.

The suggestion that Asian rickets in the UK might be related to the consumption of unleavened bread was supported by Mellanby’s (1949) earlier identification of an anticalcifying factor in oatmeal, subsequently shown to be phytic acid, and by evidence of ‘sunshine’ rickets in Iranian village children consuming large quantities of unleavened bread (tanok) with abundant exposure to UVR (Reinhold, 1972)."[...]

"The UVR-diet model derived from the Glasgow Asian population indicates that the prevalence, seventy and age distribution of privational rickets and osteomalacia are precisely delineated by the practice of varying degrees of vegetarianism or near-vegetarianism. High-fibre and -phytate diets are potentially rachitogenic in children, particularly at the pubertal growth spurt, but appear relatively innocuous in adults. Exposure to sufficient UVR neutralizes the rachitogenic or osteomalacic potential of the vegetarian diet in the presence of an adequate intake of dietary Ca (Fig. 1). The rarity of privational osteomalacia in white vegetarians in the UK appears related to their pursuit of a healthy lifestyle with high levels of outdoor exposure. Nevertheless, vegetarian diets may lead to privational rickets and osteomalacia, regardless of race, if exposure to WR is sufficiently restricted (Chick et al. 1923; Dent & Smith, 1969; Fogelman ef al. 1979)." (1)

I wondered why you seemed so convinced that the aetiology of rickets in Victorian England involved bread. An awareness of how much was eaten back then, I suppose. Anyway, you were right, with a correction: bakers' bread did induce rickets, but through being adulterated with alum (aluminium potassium sulfate).(2)


1) Dunnigan MG, Henderson JB. "An epidemiological model of privational rickets and osteomalacia". Proc Nutr Soc. 1997 Nov;56(3):939–56.

2) "Commentary: John Snow and alum-induced rickets from adulterated London bread: an overlooked contribution to metabolic bone disease". International Journal of Epidemiology 2003;32:340–341.

Dennis Mangan said...

"Hence there are excellent evolutionary reasons to believe that if vitamin D is so very good for you at the higher levels, any problem caused by low serum levels would have been taken care of by now."

No. Before the 20th century, most people farmed or fished or worked outdoors. They didn't work in offices and avoid the sun on doctors' advice.

Tod said...

"Before the 20th century, most people farmed or fished or worked outdoors. They didn't work in offices and avoid the sun on doctors' advice."

They would have to avoid the shade as well as the sun (see above).

Whatever the putatively deleterious effects of sun avoidance would be, black Africans would suffer them several times worse just going about normally. Those living in northern climes with UVB absent for months would be visibly ill - if it were true.

A scare tactic? Note: "Serum 25-hydroxyvitamin D3 levels are elevated in South Indian patients with ischemic heart disease."

Peter Frost said...


Before the 1920s, most people routinely avoided the sun, and this was as true for farming people as it was for urban dwellers.

In the late 1980s, I interviewed elderly French-Canadian farmers who remembered how things were before the 'sunshine movement' took hold during the interwar years. In their youth, they practiced sun avoidance to a degree that would now seem unthinkable and even ridiculous. Before going out into the fields, they would put on broad-brimmed hats and clothes that fully covered their arms and legs. They even had special gloves with holes to let their fingers through. They also scheduled outdoor work as early or as late in the day as possible. In fact, much of their farmwork took place before sunrise. Midday was a time for staying indoors and having the main meal.

Frankly, when I look around myself, I see very few people avoiding the sun. And this impression seems borne out by skin cancer statistics. If people are avoiding the sun, why is the incidence increasing?

Dennis Mangan said...

Skin cancer is caused by burning, so the fact that some people get it doesn't mean that the majority get adequate solar exposure. Also, as far as I can see after a cursory look around, it's melanoma the rates of which are increasing, and melanoma isn't necessarily caused by sun exposure. (See this map for instance, and tell me whether you see a correlation between sunlight and melanoma.)

As for your French-Canadians, what were their non-skin cancer, osteoporosis, heart disease, etc., rates? Perhaps terrible. And how long historically had this practice endured? One generation, two? Also, what about other cultures - I'd think that we'd need more data before generalizing the French Canadian case to the entire Western world. I myself do see plenty of sun avoidance, anyone who works in an office is a good candidate. The daughter of a friend of mine got rickets because her pediatrician warned the mother to avoid the sun and use sunscreen.

Tod said...

Childhood sun exposure as a risk factor for melanoma: a systematic review of epidemiologic studies.
"CONCLUSIONS: Ecological studies provided better-quality evidence than case-control studies for examining the effects of exposure to sunlight during specific age periods. Exposure to high levels of sunlight in childhood is a strong determinant of melanoma risk, but sun exposure in adulthood also plays a role."

" Diet may also be a
factor in melanoma as it is in other cancers"
Melanoma risk in relation to height, weight, and exercise (United States). Cancer
Causes Control 12, 599-606 (2001).

Case-control study of malignant melanoma in
Washington State. II. Diet, alcohol, and
obesity. Am J Epidemiol 139, 869-80 (1994)..

It's my impression that very few young people avoid the sun or wear broad-brimmed hats. Exposing just the face and neck gives 5000 IU a day. When I see office workers in their lunch hour, they're going out of their way for some sun.
Ultraviolet Exposure Scenarios: Risks of Erythema from Recommendations on Cutaneous Vitamin D Synthesis.
"We have calculated the exposure required to gain a number of proposed oral-equivalent doses of vitamin D, as functions of latitude, season, skin type and skin area exposed, together with the associated risk of erythema, expressed in minimum erythema doses. The model results show that the current recommended daily intake of 400IU is readily achievable through casual sun exposure in the midday lunch hour, with no risk of erythema, for all latitudes some of the year and for all the year at some (low) latitudes. At the higher proposed vitamin D dose of 1000 IU lunchtime sun exposure is still a viable route to the vitamin, but requires the commitment to expose greater areas of skin, or is effective for a shorter period of the year. The highest vitamin D requirement considered was 4000 IU per day."

Anyway I doubt that many office workers stay indoors at the weekend.

"The daughter of a friend of mine got rickets because her pediatrician warned the mother to avoid the sun and use sunscreen."

The mother wasn't a vegan by any chance?

The putative destruction of folic acid by UVA may explain the doctor's advice. I myself think it's nothing like the factor Jablonski suggests. Sunbeds produce a lot of UVA, and those using them would have suffered deficiency, birth defects in babies, etc., on the way to their odd orange tan.