Vitamania! Are You Being Hoodwinked by Nutritional Supplements?
Excerpt adapted from NUTRITIONISM by Gyorgy Scrinis. Copyright (c) 2013 Columbia University Press. Used by arrangement with the publisher. All rights reserved.
From the 1930s, the supplement industry used the belief that vitamin supplements provided a form of “health insurance” to justify and promote their use. The industry promoted this vitamin insurance policy—or vitaminsurance—as a safety net for a range of nutritional threats, such as the occasional lapses in good eating habits by people with busy modern lives, or the invisible deficiencies in modern foods and diets. In recent decades, this vitaminsurance has taken the form of the multinutrient supplement designed to meet a broad spectrum of nutrient requirements.
The rationale of proponents of supplement use has also shifted over the past century: from correcting vitamin deficiencies, to reducing the risk of chronic disease in the good-and-bad era, to enhancing health and bodily functioning. Many nutrition experts now endorse the taking of supplements as a nutritional insurance policy. For example, in his 2005 book Eat, Drink, and Be Healthy, leading nutritional epidemiologist Walter Willett of the Harvard School of Public Health promotes multivitamin, vitamin D, and calcium supplements, which he justifies as “a cheap and effective genuine ‘life nutritional insurance’ policy.”
In the late 1960s and early 1970s, Nobel Prize–winning chemist Linus Pauling—who had no recognized expertise in nutrition science—promoted megadoses of vitamin C as a cure for the common cold, based on his review of the scientific literature rather than his own nutrition studies. Medical experts and institutions such as the AMA belittled Pauling’s ideas, arguing that the studies cited by Pauling failed to support his contentions. But Pauling’s theories received much publicity in the popular media, and the idea of the preventive and curative power of vitamin C became widely recognized and accepted by the lay public at the time. Sales of vitamin C supplements skyrocketed. A belief in the preventive or curative power of vitamin C for the common cold is still held by many people today.
In the quantifying era of nutritionism, the promotion of megadoses of vitamin supplements—and claims for a range of health benefits beyond the avoidance of nutrient deficiencies—was associated with alternative health practitioners and the supplements industry, but in some respects these alternative ideas have now assumed a degree of mainstream respectability. Very high doses of vitamin D, as well as omega-3 fat supplements—beyond those required to protect against more immediate deficiency diseases—are now promoted by some nutrition experts, who consider long-term inadequate intake of these nutrients to be a contributing factor in a range of chronic diseases. The discourse of functional nutritionism similarly promises people an enhanced state of health and bodily functioning if they consume an optimal amount of these functional nutrients.
Having unleashed vitamins, nutrition experts have had trouble containing the public’s expectations. From the 1920s these experts advised the public to think in terms of the vitamins and nutrients their bodies required, but to rely on food to obtain adequate quantities of them. Yet many people decided to take control of their own vitamin intake to address their own perceived nutritional requirements and deficiencies, and to consume their vitamins directly in the form of fortified foods and supplements. By the 1980s, an apparent gap had also opened up between nutrition experts’ use of supplements and the advice they gave about them to the public. A survey of dietitians in Washington State in 1981, for example, revealed that 60 percent of them used some kind of nutritional supplement. As public health nutritionists Joan Gussow and Paul Thomas commented at the time, “Even as nutrition professionals seek to teach Americans to ‘eat well’ and avoid ‘pills,’ they themselves are swallowing supplementary nutrients. ‘Do as I say,’ nutritionists seem to be telling the public, ‘not as I do’; and few of them dare to defend their own pill practices before their fellows.”
Food regulators have also struggled to contain the commercial exploitation of vitamania by the food and supplement industries. From the 1940s, the U.S. Food and Drug Administration (FDA) had attempted to impose greater regulation over the vitamin supplement industry, and particularly to limit the health claims in advertising. Yet each time, the FDA was met with resistance not only from the industry but also from a public that demanded access to these supplements. Rima Apple has observed that the social, scientific, and regulatory struggles around vitamins continued through the second half of the twentieth century:
Decade after decade ... the same scenario is repeated: scientists report on the beneficial effects of vitamins; the media and particularly manufacturers publicize the claims; skeptical scientists declare that the American consumer is being hoodwinked; government agencies propose regulations to control the advertising, labeling, and sale of vitamin pills; and concerned consumers assert their right to take vitamins without government interference.
Vitamania—and the earlier obsession with calorie counting—signaled the rise of a wider nutritional consciousness in members of the lay public in the first half of the twentieth century. The nutricentric person had arrived. Everyday food discourses had begun to be suffused with nutri-speak. The public was growing familiar with the language of nutrients, learning to select and eat foods with some of this nutritional knowledge in mind, and becoming more open to the nutritional marketing of the food and supplement industries.
Despite the public’s growing familiarity with the science and the language of nutrients, much of the government’s dietary advice to the public during this nutritional era primarily referred to foods. The first food guide published by the USDA in 1917 contained five food groups: milk and meat, cereals, vegetables and fruits, fats and fatty foods, and sugars and sugary foods. These food-based dietary guides went through several revisions throughout the first half of the twentieth century. The Basic Seven guide, for example, released in 1943, included three groups of different fruits and vegetables, along with milk/dairy, meat/beans, bread/grains, and butter/margarine groups. Nutritionists intended that their recommended number of servings of each food group would ensure a nutritionally adequate diet, based on the quantities specified in the first RDAs released in 1941.
By the 1950s, many nutrition experts considered that all of the important nutrients in foods had been discovered, that their roles in the body were well understood, and that most of the important questions concerning nutrition had been answered. As evidence of this nutritional hubris, in 1946 Oxford University in the United Kingdom was offered a large sum of money to establish an Institute of Human Nutrition but reportedly turned down the offer because its directors believed that after ten years “there would be no human nutrition problems to study.” However, by the early 1960s, nutrition scientists had begun to rethink their enthusiastic promotion of meat and dairy products, following a rise in the incidence of heart disease and its claimed association with saturated fat intake. This reevaluation signaled the transition to the new paradigm and era of good-and-bad nutritionism.
Even if some of the specific nutritional hypotheses of the era of quantifying nutritionism have since been discredited, a number of its features live on, such as the myth of nutritional precision, caloric reductionism, and the perception of nutrient scarcity. The history of vitamania also illustrates how easily the food and supplement industries could exploit scientific knowledge as a strategy to market their products, and how willing members of the lay public were to embrace these scientific claims and scientifically marketed products.