Most of the world is understandably freaked out about disease and pestilence. A lot of people are on the brink of wearing those creepy, beak-masked hazmat suits that are de rigueur in movies about 17th century Europe.
Others are just trying to shore up their immune systems, getting all their immune cells recruited, trained, and ready for battle.
One of the ways they're attempting to do so is by popping vitamin D capsules. They're taking so many of these golden, translucent spheres that sushi restaurants might as well perform a public service and replace the fish eggs on their cut rolls with vitamin D capsules.
Based on this high consumption of capsules, you'd think vitamin D blood levels of the population would be soaring – that their immune systems would be super powerful – but they're not.
A good number of people can't get their vitamin D blood levels to budge, no matter how much vitamin D they take. They're floundering down in the 20 nanograms per milliliter range when they should be surfing a crest of 30, 40, or 50 ng/ml.
So what the hell is going on? Physiologically, we know the answer (which I'll get to in just a little bit), but how we actually find ourselves in this situation requires a little bit of detective work. To see what I'm talking about, we have to go back to the 1970s.
In 1972, President Nixon signed a grain deal with the Soviet Union. That, along with a streak of bad weather in the Midwest, caused a horrible grain shortage. Commodity prices soared and consumers got pissed. Nixon then instructed his suitably named Secretary of Agriculture, Earl Butz, to fix the problem.
His solution was farm subsidies. Farmers got paid to produce as much grain as they could and dump it on the market, regardless of the price it fetched. Soon, American farmers were growing an additional 500 calories per American per day and buying luxurious tractors outfitted with seats made of soft Corinthian leather.
Cereal manufacturers took advantage of this manna from government heaven by creating a multitude of new breakfast foods. Consider that in the years between 1970 and 1998, the number of breakfast cereals in the U.S. more than doubled, snap, crackle, and popping their way from 160 varieties to around 340. Carnation Instant Breakfast also hit its stride in the early '70s, as did fancy sweetened yogurts.
What do these products all have in common? They allowed Americans to swap out their bacon and eggs, Eggo waffles, and Danishes for something the cereal industry was pitching as a healthy alternative.
But more importantly, at least for the purposes of my story, these foods required lots and lots of milk. Milk with which to drown those Lucky Charms, milk with which to mix all that Instant Breakfast (which is itself made from milk), and milk with which to ferment and make all that yogurt.
Milk, of course, contains a lot of calcium. People started ingesting and absorbing tons of it. Add to that the supplemental calcium that osteoporosis-fearing women started mainlining on the advice of their doctors, and you've got a country that's cuckoo for both Cocoa Puffs and calcium.
The problem is that calcium and magnesium exist in a very tight ratio in the body. If that ratio isn't respected, if something mucks it up – like a dramatic increase in the consumption of calcium-rich foods without a concurrent increase in magnesium consumption – you get unfortunate consequences.
For one thing, you severely cripple the body's ability to transport, synthesize, and activate vitamin D.
Doctors and scientists see it all the time: Individuals with a high calcium to magnesium intake are at a higher risk of magnesium deficiency, and the activities of the three major enzymes that determine vitamin D concentrations are all magnesium dependent.
The end result is a vitamin D deficiency, or at the very least, a vitamin D insufficiency.
So maybe you're thinking, no problem, I'll just double or triple-up my intake of vitamin D supplements. Not so fast.
The more vitamin D you take, the further you tap into magnesium stores, leaving you an increasingly insufficient amount to activate the enzymes responsible for determining vitamin D levels.
So why hasn't the magnesium intake of Americans kept up with calcium intake? The problem, as explained above, is not only that we're taking in a lot more calcium, but also that the standard U.S. diet only contains 50% of the conservatively estimated RDA for magnesium.
That's understandable, given that magnesium status is low in populations that consume a lot of processed foods that are high in fat, sugar, and refined grains.
Between 1977 and 2012, calcium consumption increased at 2 to 2.5 times the rate of magnesium intake, resulting in calcium-to-magnesium intake ratios of more than 3.0.
That's not good. The ideal ratio is around 2.0 to 2.2, and anything higher than 2.8 can lead to problems, among them being the inability to metabolize vitamin D and a greater risk of developing type 2 diabetes, metabolic syndrome, chronic pulmonary disease, and possibly some cancers, along with impairing many of the 300 enzymatic reactions that involve magnesium.
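To make the arithmetic concrete, here's a minimal sketch of how you'd compute and sanity-check a dietary calcium-to-magnesium ratio. The intake numbers are hypothetical; the bands (roughly 2.0 to 2.2 ideal, above 2.8 problematic) are the ones just described.

```python
def ca_mg_ratio(calcium_mg: float, magnesium_mg: float) -> float:
    """Dietary calcium-to-magnesium ratio (both intakes in mg/day)."""
    return calcium_mg / magnesium_mg

# Hypothetical American-style day: dairy-heavy diet plus a calcium
# supplement, but only about half the magnesium RDA actually eaten.
ratio = ca_mg_ratio(calcium_mg=1200, magnesium_mg=200)
print(f"Ca:Mg ratio = {ratio:.1f}")  # prints "Ca:Mg ratio = 6.0"

# Rough zones from this article: ~2.0-2.2 ideal, above ~2.8 problematic.
if ratio > 2.8:
    print("High ratio: vitamin D metabolism may be impaired")
elif 2.0 <= ratio <= 2.2:
    print("Ratio in the ideal range")
```

In other words, a fairly ordinary dairy-plus-supplements day can blow past the 2.8 line by a wide margin.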
Additionally, having an optimal calcium/magnesium ratio prevents calcium from turning your normally pliable coronary blood vessels into calcified tunnels through which your cardiac surgeon has to go spelunking to save your life.
Two clinical studies on magnesium-deficient patients found that magnesium infusion alone brought up levels of 25(OH)D (a prehormone that's produced in the liver by hydroxylation of vitamin D3) and 1,25(OH)2D (the active form of vitamin D in the body) a little bit, whereas magnesium infusion plus oral vitamin D (as 25(OH)D) substantially increased serum levels of 25(OH)D and 1,25(OH)2D.
In two other studies involving patients suffering from vitamin D rickets, magnesium supplementation decreased resistance to vitamin D treatment, whereas intramuscular infusion of high amounts of vitamin D alone (up to 600,000 IU) didn't do squat.
It's pretty clear-cut. Without sufficient amounts of magnesium, vitamin D doesn't even get up and put its pants on to go to work in bolstering your immunity.
If you were to go to your family doctor – the guy you thought was one or two days away from retiring twenty years ago but is still practicing – and asked him about magnesium deficiency, he'd likely check to see if you're suffering from any of the classic symptoms of a deficiency, things like muscle twitches, depression, fatigue, high blood pressure, or atrial fibrillation.
You're seemingly healthy, so you'd answer no to all his questions. But that doesn't mean you're not deficient.
Next, just to placate you, he'd add a magnesium test to your blood panel. It would likely come back normal, but even that's not reliable, because less than 1% of the magnesium in your body circulates in serum. The rest is stored in bone and soft tissue.
Even so, you don't have to be clinically deficient in magnesium for there to be a problem. Just having an insufficiency is enough to make activating vitamin D a challenge.
The safer course is to just assume you have a magnesium insufficiency, particularly if you subsist on America's magnesium-poor food or are an athlete (one of the ways magnesium leaves the body is through sweat).
While blood tests aren't a reliable indicator of magnesium status, they work just fine in determining vitamin D status. A level of 20 ng/ml is considered acceptable by most members of the medical profession, while 30 is considered optimal.
Die-hard, kamikaze bio-hackers and non-traditionally schooled nutrition types like to push vitamin D levels higher, up to 50 or so. Either way, you can't get there without adequate levels of magnesium and, as I pointed out, magnesium appears to be in short supply in the American diet.
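The blood-level bands just described can be expressed as a simple lookup. The cutoffs are the ones this article uses (20 acceptable, 30 optimal, 50 biohacker territory), not a clinical standard:

```python
def vitamin_d_status(ng_per_ml: float) -> str:
    """Classify a serum 25(OH)D reading using this article's rough bands."""
    if ng_per_ml < 20:
        return "deficient/insufficient"
    elif ng_per_ml < 30:
        return "acceptable"
    elif ng_per_ml < 50:
        return "optimal"
    else:
        return "biohacker territory"

print(vitamin_d_status(22))  # prints "acceptable"
print(vitamin_d_status(35))  # prints "optimal"
```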
It used to be that we made enough of our own vitamin D through exposure to sunlight, as every skin cell in the body contains the machinery to convert sunlight to a vitamin D precursor, which then undergoes two hydroxylations before it becomes metabolically active.
But no more. Given the public's increasing awareness of the sun's role in developing skin cancer, people have been doing the full Zuckerberg and wearing so much sunscreen that they look like they did a face plant into a bucket of Crisco. It's a great strategy for preventing skin damage but it effectively blocks the natural production of vitamin D.
So the crux of the situation is this:
Most of us probably ingest too much calcium and too little magnesium, creating a situation where vitamin D isn't fully metabolized, leading to immune systems that aren't functioning nearly as well as they could be.
Here's where I piss in your cornflakes a bit. Magnesium doesn't just help you elevate vitamin D levels into a desirable range; it also tries to prevent those levels from getting too high. If someone's 25(OH)D levels climb much above 30 ng/ml, particularly over 50, magnesium has the opposite effect.
It actually limits how much vitamin D gets activated by increasing the production of vitamin D-deactivating enzymes.
That's a good thing, though. Since vitamin D is fat soluble rather than water soluble, it accumulates in the body, so this braking mechanism protects you from overdosing and developing toxicity issues.
Alright, enough corn-fusion. You likely want to elevate vitamin D levels, so you need to take between 2,000 and 5,000 IU of it a day (those of you who spend any time at all in the sun can opt for the lower end of that range).
Secondly, in order to ensure that the vitamin D you're taking gets put to work, you most likely need to take a magnesium supplement. Biotest makes a nice option. It's called Elite Pro™ Minerals.
Each serving contains 400 mg. of magnesium, in addition to several other minerals that athletes tend to be deficient in. Here's the per-serving rundown:
- Magnesium (as glycinate chelate): 400 mg.
- Zinc (as arginate chelate): 30 mg.
- Selenium (as glycinate complex): 200 mcg.
- Chromium (as nicotinate-glycinate chelate): 200 mcg.
- Vanadium (as nicotinate-glycinate chelate): 100 mcg.
Take one serving a day with or without meals so that protein synthesis, hormone production, energy production, carbohydrate utilization, and, perhaps most importantly, vitamin D absorption and activation, can take place.
You don't need to take the magnesium at the same time as the vitamin D. What you DO need to take with your vitamin D is some dietary fat, as fat increases its absorption, and magnesium can't activate vitamin D that was never absorbed.
The general consensus is that right around 11 grams of fat works best, but the study that advertised that number gave patients a prodigious 50,000 IU of vitamin D in a once-a-month dosage. That's likely more than any sane person would take in one sitting.
An easier strategy would be to just take your vitamin D with meals, which, under normal circumstances, should contain enough fat to ferry the vitamin through the intestinal lining.
Okay, I threw a lot of info at you, but all you need to retain is this:
- Most Americans take in too much calcium and too little magnesium, and that skewed ratio cripples the enzymes that transport and activate vitamin D.
- Take 2,000 to 5,000 IU of vitamin D a day, with a meal that contains some fat.
- Take a magnesium supplement (around 400 mg a day) so the vitamin D you take actually gets put to work.
References:
- Dai Q et al. Magnesium status and supplementation influence vitamin D status and metabolism: results from a randomized trial. Am J Clin Nutr. 2018 Dec 1;108(6):1249-1258. PubMed.
- Dawson-Hughes B et al. Meal conditions affect the absorption of supplemental vitamin D3 but not the plasma 25-hydroxyvitamin D response to supplementation. J Bone Miner Res. 2013 Aug;28(8):1778-83. PubMed.
- Rosanoff A et al. Essential Nutrient Interactions: Does Low or Suboptimal Magnesium Status Interact with Vitamin D and/or Calcium Status? Adv Nutr. 2016 Jan 15;7(1):25-43. PubMed.