As a purveyor of health and wellness education, I am no stranger to the controversies that surround seemingly benign topics. While the pro- and anti-vitamin camps rarely fight as bitterly as those debating vaccination, vitamin supplementation is surprisingly controversial.
In its early history, vitamin supplementation in the United States began as a public health initiative to combat a range of prevalent diseases caused by nutritional deficiencies. According to the book “Dietary Reference Intakes: Guiding Principles for Nutrition Labeling and Fortification” (2003), prior to the availability of powder- and pill-form supplements, the Committee on Food and Nutrition (now the Food and Nutrition Board) began fortifying foods with added vitamins and minerals. The earliest instance of fortification came in 1924, with the addition of iodide to salt. By the early 1930s, vitamin D was a standardized and regulated addition to milk, and the fortification of flour and bread soon followed. Around the same time as flour fortification (the early 1940s), multivitamin and mineral supplements became available and widely used.
Now, if you have even a tenuous grasp of U.S. history, you will recognize this time frame as the period spanning the aftermath of World War I through the early years of World War II. While WWI exposed nutritional deficiencies in potential military recruits, the problem grew larger after the U.S. entered World War II, when many recruits were turned away due to poor health, specifically diseases caused by nutritional deficiencies. Nutrition therefore became an issue of national security. Fortification and supplementation initiatives proved so effective at lowering deficiency-related disease that they have rendered conditions such as goiter, pellagra, and rickets* essentially obsolete.
For those of you who are a little less historically inclined, let us take a look at some more contemporary data. According to a 2013 Gallup Poll, 50% of Americans take a daily vitamin supplement, while the CRN Consumer Survey places the number of Americans taking a dietary supplement at a nearly constant 68% annually from 2011 to 2015. (Note: To be fair, the surveys measure slightly different things: daily multivitamin use versus all dietary supplementation.) The really interesting thing about the Gallup Poll is that it shows older individuals are far more likely to use vitamin supplements. We could, of course, read this as an indicator that habits established in the 1940s are carrying over into the present day, but that would require assumptions beyond the scope of the data. A shame, because I think I might be on to something there. Regardless, vitamin supplements are very popular, but as we should all know by now, popularity does not equate with necessity or benefit.
The main argument against vitamins is that they are unnecessary supplements that provide little benefit and can even be harmful. (For the record, yes, it is absolutely possible to overdose on vitamins.) At least, these were the findings released in a three-part series of journal articles appearing in the Annals of Internal Medicine in late 2013. In short, one of the largest literature reviews of vitamin supplementation, covering 26 previous studies, determined that, for the majority of the population, vitamin supplements were a waste of money. In particular, they did nothing to reduce the risk of disease or early death. In December 2013, an editorial appeared in the same journal, titled “Enough Is Enough: Stop Wasting Money on Vitamin and Mineral Supplements,” co-authored by five doctors, including three Johns Hopkins professors: Drs. Eliseo Guallar, Lawrence J. Appel, and Edgar R. Miller III. In the editorial, the authors recommended against supplementation for the general population, while allowing for it in some cases. For example, women of childbearing age should routinely supplement with folic acid to prevent neural tube defects in any potential offspring.
So, let us get down to the real question. Should you take a daily multivitamin or similar supplement? The short answer is: probably not. For the vast majority of individuals, vitamin and mineral supplementation is a waste of money. As one of my former professors liked to say, “vitamins are a recipe for expensive urine.” Do not even get me started on super-doses of vitamins. (Note: For those interested in the types of vitamins, as well as upper and lower intake thresholds, I will discuss those issues at some point.) Ideally, your main source of all vitamins and minerals should be an appropriately varied diet, emphasizing vegetables and fruits. Residents of the United States, in general, do not suffer from the deficiencies of previous generations, thanks in part to the extensive range of available fortified foods, as well as the sheer volume and variety of foods accessible to the average person. Supplement with vitamins as a last resort, rather than a first resort.

At this point, it is important for me to emphasize that if your physician has prescribed or recommended vitamin or mineral supplements, take them. If you have a particular condition or take a medication that increases your likelihood of deficiency, ask your physician about taking a supplement. Likewise, if you or your child has a very limited diet (e.g., you can count the foods you eat on two hands or fewer, all your food is beige barring ketchup, or you have sensory issues that influence food choices), ask your physician about taking a good multivitamin to fill in dietary nutritional gaps.
As mentioned before, I am not a doctor, and my conclusions are based on general population guidelines rather than advice that is tailored to you and your particular situation. If in doubt, ask a qualified physician familiar with your health history and current situation.
*Rickets caused by nutritional deficiency reemerged in the United States starting around 2003.