Thursday, July 9, 2009

African Americans and vitamin D

Vitamin D insufficiency is more prevalent among African Americans (blacks) than other Americans and, in North America, most young, healthy blacks do not achieve optimal 25-hydroxyvitamin D [25(OH)D] concentrations at any time of year. This is primarily due to the fact that pigmentation reduces vitamin D production in the skin. Also, from about puberty and onward, median vitamin D intakes of American blacks are below recommended intakes in every age group, with or without the inclusion of vitamin D from supplements. (Harris, 2006)

It’s well known that African Americans have low levels of vitamin D in their blood. In fact, this seems to be generally true for humans of tropical origin. In a study from Hawaii, vitamin D status was assessed in healthy, visibly tanned young adults who averaged 22.4 hours per week of unprotected sun exposure. Yet 51% had levels below the current recommended minimum of 75 nmol/L (Binkley et al., 2007). In a study from south India, levels below 50 nmol/L were found in 44% of the men and 70% of the women. The subjects are described as “agricultural workers starting their day at 0800 and working outdoors until 1700 with their face, chest, back, legs, arms, and forearms exposed to sunlight” (Harinarayan et al., 2007). In a study from Saudi Arabia, levels below 25 nmol/L were found in respectively 35%, 45%, 53%, and 50% of normal male university students of Saudi, Jordanian, Egyptian, and other origins (Sedrani, 1984).

These low levels are usually blamed on the darker skin of tropical humans, i.e., melanin blocks the UV-B component of sunlight, which the skin needs to make vitamin D. Actually, dark skin is not a serious constraint on vitamin D production. While it is true that a single UV-B exposure of moderate intensity will produce less vitamin D in black skin than in white skin, this difference narrows with longer exposure times, since white skin cuts back vitamin D production after only 20 minutes in the sun (Holick, 1995). Even in England, where sunlight is relatively weak, Asian, West Indian, and European adolescents show similar increases in vitamin D levels during the spring and summer (Ellis et al., 1977).

Another possible reason why tropical humans make less vitamin D is that there is no need to build up a reserve for the winter, when this vitamin cannot be produced. In contrast, such a reserve is necessary in the temperate zone. This seasonal variation is shown by a study of Nebraskan men after a summer of landscaping, construction, farming, and recreation. Their mean vitamin D level was initially 122 nmol/L. By late winter, it had fallen to 74 nmol/L (Barger-Lux & Heaney, 2002). Tropical humans may thus produce less of this vitamin because their skin doesn’t have to ‘make hay while the sun shines.’ This adaptation would then persist in those groups, like African Americans, that now inhabit the temperate zone.

Whatever the reason for this lower rate of production, tropical humans seem to compensate by converting more vitamin D into its active form. Although a single UV-B exposure produces less vitamin D3 in black subjects than in whites, the difference narrows after liver hydroxylation to 25-OHD and disappears after kidney hydroxylation to 1,25-(OH)2D. The active form of vitamin D is thus kept at a constant level, regardless of skin color (Matsuoka et al., 1991, 1995).

Robins (2009) notes that nearly half of all African Americans are classified as vitamin-D deficient and yet show no signs of calcium deficiency, which would be a logical result of vitamin D deficiency. Indeed, they “have a lower prevalence of osteoporosis, a lower incidence of fractures and a higher bone mineral density than white Americans, who generally exhibit a much more favourable vitamin D status.” He also cites a survey of 232 black (East African) immigrant children in Melbourne, Australia, among whom 87% had levels below 50 nmol/L and 44% below 25 nmol/L. None had rickets—the usual sign of vitamin-D deficiency in children (McGillivray et al., 2007).

In short, low vitamin D levels seem to be normal for African Americans and nothing to worry about. Such contrary evidence, however, doesn’t deter the vitamin D worrywarts:

Despite their low 25(OH)D levels, blacks have lower rates of osteoporotic fractures. This may result in part from bone-protective adaptations that include an intestinal resistance to the actions of 1,25(OH)2D and a skeletal resistance to the actions of parathyroid hormone (PTH). However, these mechanisms may not fully mitigate the harmful skeletal effects of low 25(OH)D and elevated PTH in blacks, at least among older individuals. Furthermore, it is becoming increasingly apparent that vitamin D protects against other chronic conditions, including cardiovascular disease, diabetes, and some cancers, all of which are as prevalent or more prevalent among blacks than whites. Clinicians and educators should be encouraged to promote improved vitamin D status among blacks (and others) because of the low risk and low cost of vitamin D supplementation and its potentially broad health benefits. (Harris, 2006)


The National Institutes of Health is now studying the benefits of giving African Americans mega-doses of vitamin D, in the hope of bringing their disease rates down to those of other Americans. "We're excited about the potential of vitamin D to reduce this health gap," says the study co-leader. "But it is important to get answers from clinical trials before recommending megadoses of this supplement." (see article)

Yes, it might be best to get a few answers first. Unfortunately, there are millions of people out there who are now taking mega-doses of vitamin D every day. The mass experiment has already begun and the results should be ready in a decade or so, particularly among African Americans.

But why wait? The same experiment was performed from the mid-1980s to 2009 on an African American. The results are now in …

Was MJ done in by the D men?

A local journalist recalled interviewing Michael Jackson three years ago and noted that this man, then in his mid-40s, already had the withered look of an old man.

What was responsible? His repeated plastic surgeries? His starvation diet? His abuse of painkillers and tranquillizers? These are the usual suspects. In the shadows, however, lurks another suspect who will never be questioned.

Michael Jackson had probably been taking mega-doses of vitamin D. This regimen would have started when he began bleaching his skin in the mid-1980s to even out blotchy pigmentation due to vitiligo. Since this bleaching made his skin highly sensitive to UV light, his dermatologist told him to avoid the sun and use a parasol. At that point, his medical entourage would have recommended vitamin D supplements. How high a dose? We’ll probably never know, but there are certainly many doctors who recommend mega-doses for people who get no sun exposure.

Such a recommendation would have dovetailed nicely with Michael’s fondness for vitamins. A 2005 news release mentions vitamin therapy as part of his health program:

“He’s getting vitamin nutrients and supplements,” the source said.

This source would not elaborate on the type of supplements or the way in which they are being administered.

There is also an interview with his former producer Tarak Ben Ammar:

C'était un hypocondriaque et on savait jamais vraiment s'il était malade car il a été entouré de médecins charlatans qui vivaient de cette maladie, qui lui facturaient des milliers et des milliers de dollars de médicaments, de vitamines…

[He was a hypochondriac and one never really knew whether he was sick because he was surrounded by charlatan doctors who lived from this sickness, who billed him for thousands and thousands of dollars of medication, of vitamins …]

It’s known that Michael Jackson was receiving injections of the ‘Myers cocktail’ (a mix of vitamins and nutrients), but this mix doesn’t normally contain vitamin D. He was probably taking the vitamin in tablet form.

What effects can we expect from long-term use of vitamin D at high doses? Keep in mind that we are really talking about a hormone, not a vitamin. This hormone interacts with the chromosomes and will gradually shorten their telomeres if concentrations are either too low or too high. Tuohimaa (2009) argues that optimal levels may lie in the range of 40-60 nmol/L. This is well below the current recommended minimum of 75 nmol/L. Furthermore, compliance with this optimal range may matter even more for populations of tropical origin, like African Americans, since their bodies have not adapted to the wide seasonal variation of non-tropical humans.

If this optimal range is continually exceeded, the long-term effects may look like those of aging:

Recent studies using genetically modified mice, such as FGF23-/- and Klotho-/- mice that exhibit altered mineral homeostasis due to a high vitamin D activity showed features of premature aging that include retarded growth, osteoporosis, atherosclerosis, ectopic calcification, immunological deficiency, skin and general organ atrophy, hypogonadism and short lifespan.

… after the Second World War in Europe especially in Germany and DDR, children received extremely high oral doses of vitamin D and suffered hypercalcemia, early aging, cardiovascular complications and early death suggesting that hypervitaminosis D can accelerate aging.
(Tuohimaa, 2009)

Have we opened a Pandora’s box? Far from being a panacea, vitamin D could be an angel of death that will make millions of people old before their time.

Poor Michael. He looked to his doctors for eternal youth and they gave him premature old age.

References

Barger-Lux, J., & Heaney, R.P. (2002). Effects of above average summer sun exposure on serum 25-hydroxyvitamin D and calcium absorption, The Journal of Clinical Endocrinology & Metabolism, 87, 4952-4956.

Binkley, N., Novotny, R., Krueger, D., et al. (2007). Low vitamin D status despite abundant sun exposure, Journal of Clinical Endocrinology & Metabolism, 92, 2130-2135.

Ellis, G., Woodhead, J.S., & Cooke, W.T. (1977). Serum-25-hydroxyvitamin-D concentrations in adolescent boys, Lancet, 1, 825-828.

Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.

Harris, S.S. (2006). Vitamin D and African Americans, Journal of Nutrition, 136, 1126-1129.

Holick, M.F. (1995). Noncalcemic actions of 1,25-dihydroxyvitamin D3 and clinical applications, Bone, 17, 107S-111S.

Matsuoka, L.Y., Wortsman, J., Chen, T.C., & Holick, M.F. (1995). Compensation for the interracial variance in the cutaneous synthesis of vitamin D, Journal of Laboratory and Clinical Medicine, 126, 452-457.

Matsuoka, L.Y., Wortsman, J., Haddad, J.G., Kolm, P., & Hollis, B.W. (1991). Racial pigmentation and the cutaneous synthesis of vitamin D. Archives of Dermatology, 127, 536-538.

McGillivray, G., Skull, S.A., Davie, G., Kofoed, S., Frydenberg, L., Rice, J., Cooke, R., & Carapetis, J.R. (2007). High prevalence of asymptomatic vitamin-D and iron deficiency in East African immigrant children and adolescents living in a temperate climate. Archives of Disease in Childhood, 92, 1088-1093.

Robins, A.H. (2009). The evolution of light skin color: role of vitamin D disputed, American Journal of Physical Anthropology, early view.

Sedrani, S.H. (1984). Low 25-hydroxyvitamin D and normal serum calcium concentrations in Saudi Arabia: Riyadh region, Annals of Nutrition & Metabolism, 28, 181-185.

Tuohimaa, P. (2009). Vitamin D and aging, Journal of Steroid Biochemistry and Molecular Biology, 114, 78-84.

Thursday, July 2, 2009

Why are Europeans white?

Why are Europeans so pale-skinned? The most popular explanation is the vitamin-D hypothesis. Originally developed by Murray (1934) and Loomis (1967), it has been most recently presented by Chaplin and Jablonski (2009). It can be summarized as follows:

1. To absorb calcium and phosphorus from food passing through the gut, humans need vitamin D. This vitamin is either produced in the skin through the action of UV-B light or obtained from certain food sources, notably fatty fish.

2. Humans are often vitamin-D deficient, even in tropical regions where UV-B exposure is intense and continual. This deficiency has led to high frequencies of rickets in many populations, particularly western Europeans and North Americans during the great rickets epidemic from c. 1600 to the mid-20th century. This epidemic occurred in areas where human skin was already producing sub-optimal levels of vitamin D because of the naturally weak sunlight at northern latitudes. These levels then fell even further wherever the Industrial Revolution had reduced sun exposure through air pollution, tall buildings, and indoor factory life.

3. If ancestral humans were often sub-optimal for vitamin D, natural selection should have favored lighter skin color, as a way to produce more of this vitamin by allowing more UV-B into the skin. Such selection, however, would have been counterbalanced in the tropical zone by selection for darker skin, to prevent sunburn and skin cancer.

4. This equilibrium would have ceased once ancestral humans had left the tropical zone. On the one hand, selection for darker skin would have relaxed, there being less sunburn and skin cancer. On the other, selection for lighter skin would have increased, there being less UV-B for vitamin-D production.

Ancestral humans thus began to lighten in skin color once they had entered Europe’s northern latitudes. This selection pressure eventually drove European skin color almost to the limit of depigmentation.


Were ancestral Europeans deficient for vitamin D?

There are several problems with the vitamin-D hypothesis. First, if lack of this vitamin created the selection pressure that led to white European skin, why are Europeans genetically polymorphic in their ability to maintain blood levels of vitamin D? At least two alleles reduce the effectiveness of the vitamin-D binding protein, and their homozygotes account for 9% and 18% of French Canadians (Sinotte et al., 2009). If lack of this vitamin had been so chronic, natural selection would surely have weeded out these alleles. And why does European skin limit vitamin-D production after only 20 minutes of UV-B exposure (Holick, 1995)? Why is such a limiting mechanism necessary?

There is also little evidence that ancestral Europeans suffered from vitamin-D deficiency. Before the 17th century, we have only sporadic evidence of rickets in skeletal remains and even these cases may be false positives, as Wells (1975) notes:

It is likely that these low frequencies of rickets should be even lower because some of the authors quoted above have based their diagnoses on such features as plagiocrany (asymmetry of the skull), which may occur merely from cradling habits and other causes (Wells, 1967a) or on irregularities of the teeth, which probably result from many adverse factors in foetal life as well as in infancy.

On this point, Chaplin and Jablonski (2009) affirm: “Despite taphonomic biases, it [rickets] has been recognized in early archeological and Neolithic materials at the rate of 1-2.7% (a reasonably high selective value).” In fact, Wells (1975) reports no cases from Paleolithic Europe and only sporadic cases from Neolithic Europe. The range of 1-2.7% seems to apply to “a gradual, albeit slow, increase of the disease during the European Middle Ages” (Wells, 1975). Wells (1975) cites a series of Hungarian remains that indicate an increase in frequency from 0.7 to 2.5% between the 10th and 13th centuries. As Wells notes, even this low incidence is probably inflated by false positives.

Why is skin white only among Europeans?

The vitamin-D hypothesis raises a second problem. Why is white skin an outlier among the skin tones of indigenous human populations north of 45° N? Skin is much darker among people who are native to these latitudes in Asia and North America and who receive similar levels of UV-B at ground level. Murray (1934) attributes their darker skin to a diet rich in vitamin D:

One of the chief difficulties up to now in accounting for the origin of the white or unpigmented race has been the existence of the darkly pigmented Eskimo in these same dark sunless Arctic regions which we have been discussing as the probable original habitat of the white race. The unravelling of the causes of rickets has fully explained this anomaly. The Eskimo though deeply pigmented and living in a dark habitat, nevertheless is notoriously free from rickets. This is due to his subsisting almost exclusively on a fish oil and meat diet. Cod liver oil, as has been stated, is fully as efficient as sunlight in preventing rickets. Now the daily diet of the Eskimo calculated in antirachitic units of cod liver oil equals several times the minimum amount of cod liver oil needed to prevent rickets. Because of his diet of antirachitic fats, it has been unnecessary for the Eskimo to evolve a white skin in the sunless frigid zone. He has not needed to have his skin bleached by countless centuries of evolution to admit more antirachitic sunlight. He probably has the same pigmented skin with which he arrived in the far north ages ago.

This argument fails to explain why skin is equally dark among inland natives of northern Asia and North America who consume little fatty fish and yet show no signs of rickets. One might also point out that fatty fish has long been a major food source for the coastal inhabitants of northwestern Europe. According to carbon isotope analysis of 7,000-6,000 year old human remains from Denmark, the diet must have been 70-95% of marine origin (Tauber, 1981). Yet Danes are very pale-skinned.

Some have suggested that sufficient vitamin D could have been obtained from the meat of land animals, if eaten in sufficient quantities (Sweet, 2002). This has led to a revised version of the vitamin-D hypothesis: ancestral Europeans lightened in color when they made the transition from hunting and gathering to agriculture 8,000 to 5,000 years ago, and not when they first arrived some 35,000 years ago.

Do we know when Europeans became white? This change has been roughly dated at two gene loci. At SLC45A2 (AIM1), Soejima et al. (2005) have come up with a date of ~ 11,000 BP. At SLC24A5, Norton and Hammer (2007) suggest a date somewhere between 12,000 and 3,000 BP. These are rough estimates but it looks like Europeans did not turn white until long after their arrival in Europe. As a Science journalist commented: “the implication is that our European ancestors were brown-skinned for tens of thousands of years” (Gibbons, 2007). Thus, the original version of the vitamin-D hypothesis no longer seems plausible.

Of course, the revised vitamin-D hypothesis is still plausible, i.e., Europeans became pale-skinned after giving up hunting and gathering for agriculture. But this scenario does raise problems. For one thing, it would mean that many Europeans turned white at the threshold of history. In the case of Norway, agriculture did not arrive until 2400 BC and fatty fish, rich in vitamin D, have always been a mainstay of the diet (Prescott, 1996).

References

Chaplin, G., & Jablonski, N.G. (2009). Vitamin D and the evolution of human depigmentation, American Journal of Physical Anthropology, early view

Gibbons, A. (2007). American Association of Physical Anthropologists meeting: European skin turned pale only recently, gene suggests, Science, 316(5823), 364. doi:10.1126/science.316.5823.364a
http://www.sciencemag.org/cgi/content/summary/316/5823/364a

Holick, M.F. (1995). Noncalcemic actions of 1,25-dihydroxyvitamin D3 and clinical applications, Bone, 17, 107S-111S.

Loomis, W.F. (1967). Skin-pigment regulation of vitamin-D biosynthesis in Man, Science, 157, 501-506.

Murray, F.G. (1934). Pigmentation, sunlight, and nutritional disease, American Anthropologist, 36, 438-445.

Norton, H.L. & Hammer, M.F. (2007). Sequence variation in the pigmentation candidate gene SLC24A5 and evidence for independent evolution of light skin in European and East Asian populations. Program of the 77th Annual Meeting of the American Association of Physical Anthropologists, p. 179.

Prescott, C. (1996). Was there really a Neolithic in Norway? Antiquity, 70, 77-87.

Robins, A.H. (2009). The evolution of light skin color: role of vitamin D disputed, American Journal of Physical Anthropology, early view.

Sinotte, M., Diorio, C., Bérubé, S., Pollak, M., & Brisson, J. (2009). Genetic polymorphisms of the vitamin D binding protein and plasma concentrations of 25-hydroxyvitamin D in premenopausal women, American Journal of Clinical Nutrition, 89, 634-640.

Soejima, M., Tachida, H., Ishida, T., Sano, A., & Koda, Y. (2005). Evidence for recent positive selection at the human AIM1 locus in a European population. Molecular Biology and Evolution, 23, 179-188.

Sweet, F.W. (2002). The paleo-etiology of human skin tone.
http://backintyme.com/essays/?p=4

Tauber, H. (1981). 13C evidence for dietary habits of prehistoric man in Denmark, Nature, 292, 332-333.

Wells, C. (1975). Prehistoric and historical changes in nutritional diseases and associated conditions, Progress in Food and Nutrition Science, 1(11), 729-779.

Thursday, June 25, 2009

What caused the rickets epidemic?

The English, like other Western nations, were once plagued by rickets—a softening of the bones leading to fractures and deformity, particularly in children. Initially rare, it became much more frequent after 1600 and had reached epidemic levels by the turn of the 20th century (Gibbs, 1994; Harrison, 1966; Holick, 2006; Rajakumar, 2003). A survey at Great Ormond Street Hospital found symptoms of rickets in one out of three children under two years of age, and another survey, at Clydeside in 1884, found symptoms in every child examined (Gibbs, 1994). Elsewhere during the late 19th century, in Boston and Leiden (Netherlands), autopsy studies showed rickets in 80-90% of all children (Holick, 2006).

The currently accepted explanation was developed in the late 1880s by Dr. Theobald Palm. For Palm, rickets seemed to correlate with lack of sun. The illness was more common in northwestern Europe, particularly England, where sunlight was naturally weaker. It was also more common in urban areas where “a perennial pall of smoke, and ... high houses cut off from narrow streets a large proportion of the rays which struggle through the gloom” (Hardy, 2003). His hypothesis was strengthened in 1919 by the finding that ultraviolet light can cure rickets by releasing a chemical, eventually identified as vitamin D, that provides us with calcium and phosphorus from food passing through the gut (Gibbs, 1994). Such health benefits, together with the anti-microbial action of UV light, led to the ‘sunshine movement’ of the 1920s—a vast effort to boost sun exposure by redesigning our clothes, our streets, parks, and buildings, and even our notions of fun and recreation. This movement gave us much of the look and feel of modern life.

By the mid-20th century, rickets was again rare, thus vindicating not only UV therapy but also the view that lack of sun had been the cause (Harrison, 1966). This view nonetheless remains unproven. Although vitamin D does help the body absorb more calcium and phosphorus, no one knows for sure whether the epidemic was due to low levels of this vitamin. That kind of blood test did not exist yet. People may have developed rickets because something was immobilizing calcium or phosphorus in their bodies. They would have then required more vitamin D. This possibility is hinted at by Harrison (1966):


… a number of cases are on record of children with marked rickets who have received an amount of vitamin D ordinarily sufficient to prevent rickets and who do not have manifestations of intestinal malabsorption. When these children are given increased amounts of vitamin D, several thousand units per day, the biochemical manifestations of vitamin D effect result, and roentgenograms show healing of the rickets. The basis for this increased requirement is not known.

At the height of the epidemic, one physician did suggest that something was immobilizing phosphorus in people with rickets. Dr. John Snow (1857) observed that the illness was most frequent in London and the south of England where industrial bakeries used alum to make bread look whiter. It was rare in the north where bread was normally home-baked. He reasoned that this alum combined with phosphorus in the body to form insoluble aluminum phosphate, thus depleting the reserves of phosphorus needed for strong bones.

Snow pointed out that London bakeries would add about one and a half ounces of alum per four pounds of loaf. Since manual laborers met 70% of their energy requirements by eating bread, they would have been ingesting 20 g of alum daily or 4 g of aluminum hydroxide (Dunnigan, 2003). A recent case study describes an infant who developed rickets after consuming 2 g of aluminum hydroxide (via antacids) per day over five to six weeks (Pattaragarn & Alon, 2001). There have been many other reports of antacid-induced rickets (Boutsen et al., 1996; Cooke et al., 1978; Pivnick et al., 1995; Shetty et al., 1998; Spencer & Kramer, 1983).
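
As a rough check on these figures, here is a back-of-envelope sketch in Python. The daily energy requirement (about 3,000 kcal for a manual laborer) and the energy density of bread (about 2.5 kcal per gram) are assumptions of mine, not figures from Snow or Dunnigan; the point is only that the cited 20 g of alum per day is arithmetically plausible.

# Rough check of the alum-intake figures cited above (Snow, 1857; Dunnigan, 2003).
# Assumed values, not from the sources: ~3,000 kcal/day for a manual laborer
# and ~2.5 kcal per gram of bread.
OZ_TO_G = 28.35
LB_TO_G = 453.6

alum_per_loaf_g = 1.5 * OZ_TO_G                    # ~42.5 g of alum per loaf
loaf_weight_g = 4 * LB_TO_G                        # a four-pound loaf, ~1,814 g
alum_fraction = alum_per_loaf_g / loaf_weight_g    # ~2.3% of the bread by weight

daily_energy_kcal = 3000                           # assumption
bread_share = 0.70                                 # "70% of their energy requirements"
bread_kcal_per_g = 2.5                             # assumption

bread_per_day_g = daily_energy_kcal * bread_share / bread_kcal_per_g
alum_per_day_g = bread_per_day_g * alum_fraction

print(f"bread eaten per day: {bread_per_day_g:.0f} g")    # ~840 g
print(f"alum ingested per day: {alum_per_day_g:.1f} g")   # ~20 g/day, as cited
# The 4 g of aluminum hydroxide is Dunnigan's (2003) figure, not computed here.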

Snow’s hypothesis was forgotten and has been dusted off only in recent years (Dunnigan, 2003; Hardy, 2003; Paneth, 2003). This renewed interest is partly due to the realization that rickets can be caused not only by lack of vitamin D but also by ingested substances that make phosphorus or calcium unusable. Alum is one, as seen in reports of rickets induced by antacids. Another is phytic acid in cereal grains (Sandberg, 1991). The acid binds to calcium and makes it unavailable to the body, as shown when dogs develop rickets on an oatmeal diet (Harrison & Mellanby, 1939). It is this calcium depletion, via foods like unleavened bread or chapatti, that now causes rickets in the Middle East and South Asia (Berlyne et al., 1973; Harinarayan et al., 2007).

We may never disprove the view that lack of sun caused the rickets epidemic of a century ago. But we can point to some inconsistencies. First, rickets was much less frequent in northern England and absent in northwest Scotland—the area of Great Britain with the weakest solar UV (Gibbs, 1994). Second, it was not really a disease of cities with dark narrow streets and smoke-filled skies, as Snow (1857) himself observed.


The usual causes to which rickets are attributed are of a somewhat general nature, such as vitiated air, want of exercise and nourishing food, and a scrofulous taint. These explanations, however, did not satisfy me, as I had previously seen a good deal of practice in some of the towns in the north of England, where the over-crowding and the other evils above mentioned were as great as in London, whilst the distortion of the legs in young children was hardly present; moreover, I noticed that the most healthy-looking and best-nourished children often suffered most from curvature of the bones of the legs, owing to their greater weight; and I afterwards found that this complaint was quite common in the villages around London as well as in the metropolis itself.

Lack of sun also fails to explain why the epidemic initially broke out within a small geographic area. Indeed, the evidence points to a highly localized origin, essentially southwest England in the early 17th century. Rickets was completely new to observers at the time, including Francis Glisson, later president of the College of Physicians. In 1650, he wrote:


The disease became first known as near as we could gather from the relation of others, after sedulous inquiry, about thirty years since, in the counties of Dorset and Somerset … since which time the observation of it hath been derived unto all the southern and western parts of the Kingdom. (Gibbs, 1994)

Gibbs (1994) attributes this rapid spread to the growth of England’s home-based textile industry, whose products had become the country’s main export by 1600. “Whole families worked from before dawn until after dusk in their homes and, whether the children were too young to work or old enough to assist in home production, they would have lived their lives predominantly indoors.” This explanation is hard to accept because textile cottage industries developed primarily in the Midlands and around London. The southwest trailed the rest of England in this regard. In addition, family workshops were normally off-limits to children below the age of apprenticeship. Such children would have been left with elderly relatives or told to play outside.

But there may have been an indirect link with the growth of England’s textile industry, namely the parallel growth in the use of alum to fix the colors of cloth. Until the ban on alum imports in 1667, when newly exploited Yorkshire shales became the main source, England imported this substance from the Italian Papal States (Balston, 1998; Jenkins, 1971). The port of entry would have been Bristol, on the Somerset border in southwest England. This may have been where English bakers first learned to whiten bread with alum.

When did bakers stop using alum? The practice seems to have died out after the turn of the 20th century with tougher enforcement of food adulteration statutes in the United Kingdom and elsewhere (Kassim, 2001). By the mid-20th century, “the use of alum in bread was only occasionally encountered” (Hart, 1952). Removing this additive probably did much to eliminate rickets. Just as important, perhaps, was the shrinking place of bread in working-class diets as affluence increased.

But this is not what the history books say. As Steve Sailer tells us, history is written by those who like to write, and much more has been written about the sunshine movement and its presumed benefits for humanity.

References

Balston, J. (1998). “In defence of alum – 2. England”, In: The Whatmans and Wove Paper: Its invention and development in the West, West Farleigh.


Berlyne, G.M., Ari, J.B., Nord, E., & Shainkin, R. (1973). Bedouin osteomalacia due to calcium deprivation caused by high phytic acid content of unleavened bread, The American Journal of Clinical Nutrition, 26, 910-911.

Boutsen, Y. Devogelaer, J.P., Malghem, J., Noël, H., & Nagant de Deuxchaisnes, C. (1996). Antacid-induced osteomalacia, Clinical Rheumatology, 15, 75-80.

Cooke, N., Teitelbaum, S., & Aviol, L.V. (1978). Antacid-induced osteomalacia and nephrolithiasis, Archives of Internal Medicine, 138, 1007-1009.

Dunnigan, M. (2003). Commentary: John Snow and alum-induced rickets from adulterated London bread: an overlooked contribution to metabolic bone disease, International Journal of Epidemiology, 32, 340-341.

Gibbs, D. (1994). Rickets and the crippled child: an historical perspective, Journal of the Royal Society of Medicine, 87, 729-732.

Hardy, A. (2003). Commentary: Bread and alum, syphilis and sunlight: rickets in the nineteenth century, International Journal of Epidemiology, 32, 337-340

Harrison, D.C., & Mellanby, E. (1939). Phytic acid and the rickets-producing action of cereals, Biochemical Journal, 33, 1660-1680.

Harrison, H.E. (1966). The disappearance of rickets, American Journal of Public Health, 56, 734-737.

Hart, F.L. (1952). Food adulteration in the early twentieth century, Food Drug Cosmetic Law Journal, 7, 485-509.

Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.

Holick, M.F. (2006). Resurrection of vitamin D deficiency and rickets, The Journal of Clinical Investigation, 116, 2062-2072.

Jenkins, R. (1971). “The alum trade in the fifteenth and sixteenth centuries, and the beginnings of the alum industry in England,” in: Links in the history of engineering and technology from Tudor times: the collected papers of Rhys Jenkins, pp. 193-203, Newcomen Society (Great Britain), Published by Ayer Publishing.

Kassim, L. (2001). The co-operative movement and food adulteration in the nineteenth century, Manchester Region History Review, 15, 9-18.

Paneth, N. (2003). Commentary: Snow on rickets, International Journal of Epidemiology, 32, 341-343.

Pattaragarn, A., & Alon, U.S. (2001). Antacid-induced rickets in infancy, Clinical Pediatrics, 40, 389-393.

Pivnick, E.K., Kerr, N.C., Kaufman, R.A., Jones, D.P., & Chesney, R.W. (1995). Rickets secondary to phosphate depletion: a sequela of antacid use in infancy. Clinical Pediatrics, 34, 73-78.

Rajakumar, K. (2003). Vitamin D, cod-liver oil, sunlight, and rickets: a historical perspective. Pediatrics, 112, 132-135.

Sandberg, A.S. (1991). The effect of food processing on phytate hydrolysis and availability of iron and zinc. Advances in Experimental Medical Biology, 289, 499-508.

Shetty, A.K., Thomas, T., Rao, J., & Vargas, A. (1998). Rickets and secondary craniosynostosis associated with long-term antacid use in an infant. Archives of Pediatrics & Adolescent Medicine, 152, 1243-1245.

Snow J. (1857). On the adulteration of bread as a cause of rickets. Lancet, ii:4–5. (Reprinted in International Journal of Epidemiology (2003), 32, 336–337.)

Spencer, H., & Kramer, L. (1983). Antacid-induced calcium loss, Archives of Internal Medicine, 143, 657-659.

Thursday, June 18, 2009

Vitamin D and homeostasis

In my previous posts, I argued that a homeostatic mechanism keeps the level of vitamin D in our bloodstream within a certain range. When UV-B light is always intense, as in the tropics, the level seems to be 50-75 nmol/L in young adults and progressively lower in older age groups. The more sunlight varies seasonally, the more the body will produce vitamin D in summer in order to maintain at least 50 nmol/L in winter—a level well below the recommended minimum of 75 nmol/L and even further below the 150 nmol/L now being advocated by vitamin-D proponents.

This homeostatic mechanism breaks down if we daily ingest 10,000 IU of vitamin D or more (Vieth, 1999). It seems that the human body has never naturally encountered such intakes, at least not on a continual basis.

In a recent review article, Robins (2009) presents evidence for a second homeostatic mechanism. Even when the level of vitamin D varies in the bloodstream, the second mechanism ensures that these divergent levels will translate into the same concentration of the biologically active 1,25-(OH)2D metabolite.

Matsuoka et al. (1991) demonstrated that after single-dose, whole-body UVB exposure black subjects had distinctly lower serum vitamin D3 levels than whites, but differences between the two groups narrowed after liver hydroxylation to 25-OHD and disappeared after kidney hydroxylation to 1,25-(OH)2D. These findings suggest that there is a compensatory mechanism whereby, in the presence of vitamin D3 suppression by melanin, the liver and kidney hydroxylating enzymes are activated in tandem to ensure that the concentration of the biologically active 1,25-(OH)2D metabolite is normalized and kept constant regardless of ethnic pigmentation (Matsuoka et al., 1991, 1995).

Robins (2009) goes on to note that nearly half of all African Americans are vitamin-D deficient but show no signs of calcium deficiency. Indeed, they “have a lower prevalence of osteoporosis, a lower incidence of fractures and a higher bone mineral density than white Americans, who generally exhibit a much more favourable vitamin D status.” He also cites a survey of 232 black (East African) immigrant children in Melbourne, Australia, among whom 87% had levels below 50 nmol/L and 44% below 25 nmol/L. None had rickets (McGillivray et al., 2007).

References

Matsuoka, L.Y., Wortsman, J., Chen, T.C., & Holick, M.F. (1995). Compensation for the interracial variance in the cutaneous synthesis of vitamin D, Journal of Laboratory and Clinical Medicine, 126, 452-457.

Matsuoka, L.Y., Wortsman, J., Haddad, J.G., Kolm, P., & Hollis, B.W. (1991). Racial pigmentation and the cutaneous synthesis of vitamin D. Archives of Dermatology, 127, 536-538.

McGillivray, G., Skull, S.A., Davie, G., Kofoed, S., Frydenberg, L., Rice, J., Cooke, R., & Carapetis, J.R. (2007). High prevalence of asymptomatic vitamin-D and iron deficiency in East African immigrant children and adolescents living in a temperate climate. Archives of Disease in Childhood, 92, 1088-1093.

Robins, A.H. (2009). The evolution of light skin color: role of vitamin D disputed, American Journal of Physical Anthropology, early view.

Vieth, R. (1999). Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety, American Journal of Clinical Nutrition, 69, 842-856.

Thursday, June 11, 2009

Mad dogs and ....

How can vitamin-D deficiency exist despite lengthy sun exposure? This apparent paradox was raised in my last post. The medical community now recommends bloodstream vitamin D levels of at least 75-150 nmol/L, yet these levels are not reached by many tanned, outdoorsy people.

In a study from Hawaii, vitamin D status was assessed in 93 healthy young adults who were visibly tanned and averaged 22.4 hours per week of unprotected sun exposure, with 40% reporting no use of sunscreen. Yet their mean vitamin D level was 79 nmol/L and 51% had levels below the recommended minimum of 75 nmol/L (Binkley et al., 2007).

These results are consistent with those of a study from Nebraska. The subjects were thirty healthy men who had just completed a summer of outdoor activity, e.g., landscaping, construction, farming, and recreation. One subject used sunscreen regularly and sixteen others sometimes or rarely. Their mean vitamin D level was initially 122 nmol/L. By late winter, it had fallen to 74 nmol/L (Barger-Lux & Heaney, 2002).

A study from south India found levels below 50 nmol/L in 44% of the men and 70% of the women. The subjects are described as “agricultural workers starting their day at 0800 and working outdoors until 1700 with their face, chest, back, legs, arms, and forearms exposed to sunlight.” (Harinarayan et al., 2007).

These studies lead to two conclusions. First, sun exposure seems to produce vitamin D according to a law of diminishing returns: the more we expose ourselves to the sun, the less the vitamin D in our bloodstream increases. Perhaps frequent sun exposure results in less being produced in the skin and more being broken down in the liver. This might explain why intense sun exposure leads to a lower vitamin D level in Hawaiian subjects than in Nebraskans. In the latter group, vitamin D production may be ‘calibrated’ to provide a reserve for the winter months.
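
Purely as an illustration of what such a diminishing-returns pattern would look like, here is a toy saturating curve in Python. The ceiling and half-saturation values are invented for the example and are not fitted to the Hawaiian or Nebraskan data.

# Schematic only: a saturating curve in which each extra hour of weekly sun
# exposure raises serum 25(OH)D by less than the hour before it.
# C_MAX and K are arbitrary illustrative values, not estimates from any study.
C_MAX = 130.0   # hypothetical ceiling, in nmol/L
K = 10.0        # hours per week at which half the ceiling is reached

def serum_25ohd(hours_per_week):
    """Toy diminishing-returns relationship between sun exposure and 25(OH)D."""
    return C_MAX * hours_per_week / (K + hours_per_week)

for hours in (5, 10, 20, 40):
    print(f"{hours:>2} h/week -> {serum_25ohd(hours):5.1f} nmol/L")
# Doubling exposure from 20 to 40 h/week adds far less than the first 20 hours did.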

Second, to stay above the recommended minimum of 75-150 nmol/L, we must take supplements in the form of vitamin pills or fortified foods. Sun exposure is not enough. Yet even dietary supplementation seems to be countered by some unknown mechanism within the body:


… what effect does a 400 IU/d dose of vitamin D for an extended time (months) have in adults? The answer is little or nothing. At this dose in an adult, the circulating 25(OH)D concentration usually remains unchanged or declines. This was first shown in both adolescent girls and young women. … mothers who were vitamin D deficient at the beginning of their pregnancies were still deficient at term after receiving supplements of 800-1600 IU vitamin D/d throughout their pregnancies. (Hollis, 2005)

The assembled data from many vitamin D supplementation studies reveal a curve for vitamin D dose versus serum 25-hydroxyvitamin D [25(OH)D] response that is surprisingly flat up to 250 μg (10000 IU) vitamin D/d. To ensure that serum 25(OH)D concentrations exceed 100 nmol/L, a total vitamin D supply of 100 μg (4000 IU)/d is required.
(Vieth, 1999)
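
Since the quotations above switch between international units and micrograms, it helps to recall the standard conversion for vitamin D: 1 μg = 40 IU. A quick Python sketch converting the doses mentioned in this post:

# Standard vitamin D unit conversion: 1 microgram = 40 IU.
IU_PER_MICROGRAM = 40

for dose_iu in (400, 800, 1600, 4000, 10000):   # doses mentioned in this post
    print(f"{dose_iu:>6} IU/day = {dose_iu / IU_PER_MICROGRAM:>5.0f} micrograms/day")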


Only mega-doses can overcome what seems to be a homeostatic mechanism that keeps bloodstream vitamin D within a certain range. Indeed, this range falls below the one that is now recommended. Curious, isn't it? Why would natural selection design us the wrong way?

Perhaps ancestral humans got additional vitamin D from some other source, such as the food they ate. In the diets of hunter/gatherers and early agriculturalists, fatty fish are clearly the best source, as seen when we rank the vitamin D content (IU per gram) of different foods (Loomis, 1967):

Halibut liver oil : 2,000-4,000
Cod liver oil : 60-300
Milk : 0.1
Butter : 0.0-4.0
Cream : 0.5
Egg yolk : 1.5-5.0
Calf liver : 0.0
Olive oil : 0.0
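
To put these numbers in perspective, the Python sketch below uses the midpoint of each range to estimate how many grams of each food would be needed to supply 1,000 IU of vitamin D. The 1,000 IU target is an arbitrary figure chosen for illustration, not a recommendation.

# Grams of each food needed to supply 1,000 IU of vitamin D, using the midpoint
# of each range in Loomis (1967). Calf liver and olive oil are omitted because
# their listed content is zero. The 1,000 IU target is illustrative only.
iu_per_gram = {
    "halibut liver oil": (2000 + 4000) / 2,
    "cod liver oil": (60 + 300) / 2,
    "egg yolk": (1.5 + 5.0) / 2,
    "butter": (0.0 + 4.0) / 2,
    "cream": 0.5,
    "milk": 0.1,
}

TARGET_IU = 1000
for food, potency in iu_per_gram.items():
    print(f"{food:<18}: {TARGET_IU / potency:>8.1f} g")

On these midpoint figures, a spoonful of fish liver oil supplies what kilograms of milk or cream cannot.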

Yet fatty fish were unavailable to many ancestral humans, if not most. And again, when vitamin D enters the blood from our diet, it seems to be limited by the same homeostatic mechanism that limits entry of vitamin D from sun-exposed skin.

It looks like natural selection has aimed for an optimal vitamin D level substantially lower than the recommended minimum of 75-150 nmol/L. This in turn implies some kind of disadvantage above the optimal level. Indeed, Adams and Lee (1997) found evidence of vitamin D toxicity at levels as low as 140 nmol/L. But this evidence is ridiculed by Vieth (1999):

The report of Adams and Lee, together with its accompanying editorial, suggest that serum 25(OH)D concentrations as low as 140 nmol/L are harmful. This is alarmist. Are we to start avoiding the sun for fear of raising urine calcium or increasing bone resorption?

These side effects may or may not be serious. But there are others. High vitamin D intake is associated with brain lesions in elderly subjects, possibly as a result of vascular calcification (Payne et al., 2008). Genetically modified mice with high vitamin D levels show signs of premature aging: retarded growth, osteoporosis, atherosclerosis, ectopic calcification, immunological deficiency, skin and general organ atrophy, hypogonadism, and short lifespan (Tuohimaa, 2009). Vitamin D supplementation during infancy is associated with asthma and allergic conditions in adulthood (Hyppönen et al., 2004).

In this, vitamin-D proponents are guilty of some hypocrisy. They denounced the previous recommended level as being just enough to prevent rickets, arguing that it ignored less visible harm that disappears only at higher intakes. Yet the current recommended level likewise ignores the possibility that less visible harm appears well below the threshold of outright vitamin D poisoning.

This being said, the pro-vitamin-D crowd may still be partly right. The optimal level might now exceed the one the human body naturally tends to maintain. With the shift to industrial processing of cereals, we today consume more phytic acid, which makes calcium unusable and thus increases the body’s need for vitamin D. We have, so to speak, entered a new adaptive landscape and our bodies have not had time to adapt.

Or they may be completely wrong. Frankly, I’m not reassured by the pro-vitamin-D literature. It strikes me as being rife with loosely interpreted facts, like the correlation between cancer rates and distance from the equator (and hence insufficient vitamin D). Cancer rates also correlate with the presence of manufacturing, which is concentrated at temperate latitudes for a number of historical and cultural reasons, notably the absence of slavery and plantation economies.

Then there’s this gem:

The concentrations of 25(OH)D observed today are arbitrary and based on contemporary cultural norms (clothing, sun avoidance, food choices, and legislation) and the range of vitamin D intakes being compared may not encompass what is natural or optimal for humans as a species. (Vieth, 1999)

Actually, cultural norms are much more heliophilic today than during most of our past. In a wide range of traditional societies, people avoided the sun as much as possible, especially during the hours of peak UV (Frost, 2005, pp. 60-62). Midday was a time for staying in the shade, having the main meal, and taking a nap. Nor is there reason to believe that sun avoidance and clothing were absent among early modern humans. Upper Paleolithic sites have yielded plenty of eyed needles, awls, and other tools for making tight-fitting, tailored clothes (Hoffecker, 2002).

Heliophilia is the historical outlier, not heliophobia. It was the sunshine movement of the 1920s that first persuaded people to cast off hats, cut down shade trees, and lie on beaches for hours on end. This cultural revolution was still recent when Noël Coward wrote his 1931 piece ‘Mad Dogs and Englishmen’:

In tropical climes there are certain times of day
When all the citizens retire to tear their clothes off and perspire.
It's one of the rules that the greatest fools obey,
Because the sun is much too sultry
And one must avoid its ultry-violet ray.
The natives grieve when the white men leave their huts,
Because they're obviously, definitely nuts!

Mad dogs and Englishmen go out in the midday sun,
The Japanese don’t care to, the Chinese wouldn’t dare to,
Hindus and Argentines sleep firmly from twelve to one,
But Englishmen detest a siesta.
In the Philippines there are lovely screens to protect you from the glare.
In the Malay States there are hats like plates which the Britishers won't wear.
At twelve noon the natives swoon and no further work is done,
But mad dogs and Englishmen go out in the midday sun.

References

Adams, J.S., & Lee, G. (1997). Gains in bone mineral density with resolution of vitamin D intoxication. Annals of Internal Medicine, 127, 203-206.

Barger-Lux, J., & Heaney, R.P. (2002). Effects of above average summer sun exposure on serum 25-hydroxyvitamin D and calcium absorption, The Journal of Clinical Endocrinology & Metabolism, 87, 4952-4956.

Binkley, N., Novotny, R., Krueger, D., et al. (2007). Low vitamin D status despite abundant sun exposure, Journal of Clinical Endocrinology & Metabolism, 92, 2130-2135.

Frost, P. (2005). Fair Women, Dark Men. The Forgotten Roots of Color Prejudice. Cybereditions: Christchurch (New Zealand).

Harinarayan, C.V., Ramalakshmi, T., Prasad, U.V., Sudhakar, D., Srinivasarao, P.V.L.N., Sarma, K.V.S., & Kumar, E.G.T. (2007). High prevalence of low dietary calcium, high phytate consumption, and vitamin D deficiency in healthy south Indians, American Journal of Clinical Nutrition, 85, 1062-1067.

Hoffecker, J.F. (2002). Desolate Landscapes. Ice-Age Settlement in Eastern Europe. New Brunswick: Rutgers University Press.

Hollis, B.W. (2005). Circulating 25-Hydroxyvitamin D levels indicative of vitamin D sufficiency: implications for establishing a new effective dietary intake, Journal of Nutrition, 135, 317-322.

Hyppönen, E., Sovio, U., Wjst, M., Patel, S., Pekkanen, J., Hartikainen, A-L., & Järvelin, M-R. (2004). Infant Vitamin D Supplementation and Allergic Conditions in Adulthood. Northern Finland Birth Cohort 1966, Annals of the New York Academy of Sciences, 1037, 84–95.

Loomis, W.F. (1967). Skin-pigment regulation of vitamin-D biosynthesis in man, Science, 157, 501-506.

Payne, M.E., Anderson, J.J.B., & Steffens, D.C. (2008). Calcium and vitamin D intakes may be positively associated with brain lesions in depressed and non-depressed elders, Nutrition Research, 28, 285-292.

Tuohimaa, P. (2009). Vitamin D and aging, Journal of Steroid Biochemistry and Molecular Biology, 114(1-2), 78-84.

Vieth, R. (1999). Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety, American Journal of Clinical Nutrition, 69, 842-856.