Whatever Happened to American Longevity?
Life expectancy is a pretty simple concept: it's an estimate of how long the average person lives. Anyone can understand that. So how is this for a compelling data point: if you look at life expectancy in nations around the globe, you'll find that over the past 20 years, the U.S. has sunk from No. 11 to No. 42. In other words, a baby born in 2004 in any one of 41 other countries can expect to live longer than his or her American counterpart.
This may come as a surprise. Sure, we all know the health care system in the U.S. is broken, but life expectancy isn't just tied to medicine -- it's also related to quality of life in a larger sense. (I can live in a nation with the best health care system in the world, but if it's in the throes of civil war, my life expectancy will be short). As we all know, the American standard of living is the envy of the world. After all, we're the richest country on the globe. So what gives?
While some of us are rich, the average American is not. And while the rich are living longer, the poor are dying younger. Factor in the profit motive that drives U.S. healthcare, and you will begin to understand why American medicine has done little to heal the gap between rich and poor. Over the past twenty-five years, we have poured money into healthcare, but have paid relatively little attention to public health.
This may seem a bold claim, but last month the Congressional Budget Office (CBO) issued a report that provides the numbers: "In 1980," the CBO found, "life expectancy at birth was 2.8 years more for the highest socioeconomic group than for the lowest. By 2000, that gap had risen to 4.5 years."
The report notes that "the 1.7-year increase in the gap" between socioeconomic groups "amounts to more than half of the increase in overall average life expectancy at birth between 1980 and 2000." In other words, while average life expectancy in the U.S. has risen, much of that rise reflects gains at the top: the least advantaged have shared little in the improvement.
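The CBO's arithmetic can be spelled out in a few lines. The gap figures below are the report's; the inference about the overall rise is simply what "more than half" implies:

```python
# CBO figures quoted above: the life-expectancy gap (at birth) between
# the highest and lowest socioeconomic groups.
gap_1980 = 2.8  # years
gap_2000 = 4.5  # years

gap_increase = gap_2000 - gap_1980  # the "1.7-year increase in the gap"

# The report says this 1.7 years is "more than half" of the overall rise
# in average life expectancy from 1980 to 2000, which implies the overall
# rise was less than twice the gap increase:
overall_rise_upper_bound = 2 * gap_increase

print(f"gap increase: {gap_increase:.1f} years")
print(f"overall rise must be under: {overall_rise_upper_bound:.1f} years")
```

That is, the whole country's average gained less than 3.4 years over two decades, and more than half of even that modest gain traces to the widening advantage of the top socioeconomic group.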
Citizens of countries that don't tolerate as much inequality enjoy longer lives. According to numbers from the Census Bureau and the National Center for Health Statistics, a baby born in the United States in 2004 will live an average of 77.9 years. In the U.K., an '04 baby can expect to live 78.7 years; in Germany, 79 years; in Norway, 79.7 years; in Canada, 80.3 years; in Australia, Sweden, and Switzerland, 80.6 years; and in Japan, a newborn can expect to live 81.4 years.
Somehow or other, when they hear these figures, most Americans just shrug. Indeed, "it is remarkable how complacent the public and the medical profession are in their acceptance of" our low ranking when it comes to life expectancy, "especially in light of trends in national spending on health," Dr. Steven Schroeder, a professor in the Department of Medicine at the University of California, San Francisco, wrote in the New England Journal of Medicine last year.
"One reason for the complacency may be the rationalization that the United States is more ethnically heterogeneous than the nations at the top of the rankings, such as Japan, Switzerland, and Iceland. But," Schroeder pointed out, "even when comparisons are limited to white Americans, our performance is dismal. And even if the health status of white Americans matched that in the leading nations, it would still be incumbent on us to improve the health of the entire nation."
In the OECD countries that outrank us, the gaps between rich and poor are not as great and, not coincidentally, all have universal health insurance. (As Maggie wrote in an earlier post on Health Beat, in countries that are mainly middle-class, there tends to be more social solidarity. People identify with each other, and are more willing to pool their resources to pay for healthcare for everyone.)
But having access to health care is only a small part of health. Schroeder identifies five factors that determine health and longevity: "social circumstances, genetics, environmental exposures, behavioral patterns and health care." Of these five, he points out, when it comes to reducing early deaths, "medical care has a relatively minor role." Indeed, "inadequate health care accounts for only 10% of premature deaths, yet it receives by far the greatest share of resources and attention."
Socioeconomic status is the strongest predictor of health, above and beyond access to health care. This is because socioeconomic status encompasses access to health care along with a variety of other factors. Even when the poor have insurance, they are less likely to have access to cutting-edge medical discoveries; they're more likely to smoke, more likely to be obese, more likely to live in unsafe or unhealthy environments. They also tend to be less educated, meaning that they are less able to manage chronic diseases.
These facts are reflected in life expectancy. African-Americans are more likely to live in poverty than other Americans: as a result, black men can expect to live six years less than white men, and black women four years less than white women. Education, another critical component of socioeconomic status, also contributes to the story. The CBO reports that "the gap in life expectancy at age 25 between individuals with a high school education or less and individuals with any college education increased by about 30 percent" from 1990 to 2000. "The gap widened because of increases in life expectancy for the better educated group," the report notes. "Life expectancy for those with less education did not increase over that period."
This trend is clear: since 1980, affluent members of society have made gains while the have-nots have, at best, run in place, and, at worst, lost ground. Another recent study, published in PLoS Medicine, takes a broader look at the problem by going all the way back to 1960 to see how life expectancies have differed across U.S. counties. (Counties were used because they are the smallest geographic units for which death rates are collected, thus allowing for a precise comparison of subgroups.) The authors, who hail from Harvard, UCSF, and the University of Washington, discovered that "beginning in the early 1980s and continuing through 1999, those who were already disadvantaged did not benefit from the gains in life expectancy experienced by the advantaged, and some became even worse off."
1980 was a watershed year. Indeed, the study reports that from 1960 to 1980, life expectancy increased everywhere. But "beginning in the early 1980s the differences in death rates among different counties began to increase. The worst-off counties no longer experienced a fall in death rates, and in a substantial number of counties, mortality actually increased, especially for women..."
So what was so special -- or rather, harmful -- about the 1980s?
1980 was the year that a conservative agenda firmly replaced the "War on Poverty" that LBJ had begun in the 1960s. For the next 28 years, the trend would continue as corporate welfare and tax cuts for the wealthy replaced programs for the poor and middle-class.
As the authors of a 2006 PLoS Medicine study note, "in the 1980s there was a general cutting back of welfare state provisions in America, which included cuts to public health and antipoverty programs, tax relief for the wealthy, and worsening inequity in the access to and quality of health care." By contrast, in the 1960s, "civil rights legislation and the establishment of Medicare set out to reduce socioeconomic and racial/ethnic inequalities and improve access to health care."
But after 1980, the '06 PLoS Medicine study shows, rates of premature mortality across socioeconomic groups began to diverge, helping to roll back the gains of the 1960s and 1970s. In a stunning conclusion, the study's authors reported that if all people in the U.S. population had experienced the same health gains as the most advantaged [i.e., whites in the highest income group], without the problems of the 1980s, "14 percent of the premature deaths among whites and 30 percent of the premature deaths among people of color would have been prevented."
In sum, the stronger social safety net of the 1960s helped to increase longevity for all Americans; its erosion in the 1980s widened the gap between the haves and the have-nots. Indeed, given that socioeconomic status is the strongest predictor of health, it's noteworthy that the lowest quintile of earners in the U.S. saw its income fall by 15 percent between 1979 and 1993, while the highest 20 percent saw their income grow by 18 percent over the same period. The poverty rate in the U.S. was cut nearly in half in the 1960s; from 1980 to 1989, it inched down by just one percentage point.
Clearly, the decline of American longevity is related to an increase in American inequality. But it would be short-sighted to stop our analysis here. It's also worth asking, where have we been spending our health care dollars?
"To the extent that the United States has a health strategy, its focus is on the development of new medical technologies and support for basic biomedical research," Schroeder observes. "We already lead the world in the per capita use of most diagnostic and therapeutic medical technologies, and we have recently doubled the budget for the National Institutes of Health. But these popular achievements are unlikely to improve our relative performance" when it comes to longevity.
If we want to cut the number of premature deaths, we might put more emphasis on smoking cessation clinics. "Smoking causes 440,000 deaths a year in the United States," notes Schroeder, who directs the Smoking Cessation Leadership Center at UCSF. "Smoking shortens smokers' lives by 10 to 15 years, and those last few years can be a miserable combination of severe breathlessness and pain." Some 44.5 million Americans still smoke. "Smoking among pregnant women is a major contributor to premature births and infant mortality. Smoking is increasingly concentrated in the lower socioeconomic classes and among those with mental illness or problems with substance abuse," Schroeder explains. "Understanding why they smoke and how to help them quit should be a key national research priority. Given the effects of smoking on health, the relative inattention to tobacco by those federal and state agencies charged with protecting the public health is baffling and disappointing."
Kaiser Permanente of northern California has shown that it can be done. When Kaiser implemented a multisystem approach to help smokers quit, Schroeder reports that "the smoking rate dropped from 12.2% to 9.2% in just 3 years. Of the current 44.5 million smokers, 70% claim they would like to quit. Assuming that one half of those 31 million potential nonsmokers will die because of smoking, that translates into 15.5 million potentially preventable premature deaths. Merely increasing the baseline quit rate from the current 2.5% of smokers to 10% -- a rate seen in placebo groups in most published trials of the new cessation drugs -- would prevent 1,170,000 premature deaths. No other medical or public health intervention approaches this degree of impact. And we already have the tools to accomplish it."
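Schroeder's back-of-the-envelope figures hold up; here is the same arithmetic as a short script. The inputs (44.5 million smokers, 70 percent who want to quit, the assumption that half of them would otherwise die of smoking, and the 2.5 to 10 percent quit rates) are all his; only the variable names are ours:

```python
# Figures from Schroeder, quoted above.
smokers = 44.5e6             # current U.S. smokers
would_quit = 0.70 * smokers  # ~31 million say they want to quit

# Schroeder assumes half of these potential quitters will otherwise
# die of smoking-related causes -- his ~15.5 million figure:
preventable_deaths = 0.5 * would_quit

# Raising the annual quit rate from 2.5% to 10% reaches an additional
# 7.5% of the would-be quitters; again, half of their deaths averted --
# his ~1,170,000 figure:
extra_quitters = (0.10 - 0.025) * would_quit
deaths_prevented = 0.5 * extra_quitters

print(f"potential quitters:  {would_quit / 1e6:.1f} million")
print(f"preventable deaths:  {preventable_deaths / 1e6:.1f} million")
print(f"prevented by higher quit rate: {deaths_prevented:,.0f}")
```

Small rounding differences aside, the script lands on the same numbers Schroeder cites, which is the point: no exotic modeling is needed to see the scale of the opportunity.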
The poor also are more likely to be obese, "in part because of inadequate local food choices and recreational opportunities," says Schroeder. Fattening foods are cheaper than fresh fruit, vegetables and fish, particularly if you are shopping in inner cities. Gyms are too expensive for low-income families; exercising outdoors can be dangerous, and in inner cities, public schools often lack playgrounds and gymnasiums.
"Psychosocial stress" also leads poorer Americans to engage in "other behaviors that reduce life expectancy such as drug use and alcoholism," Schroeder notes. And even when they avoid these behaviors, "people in lower classes are less healthy and die earlier than others." A polluted environment, combined with uncertainty and worry, takes a toll.
Rather than focusing solely on medicine and medical care, Schroeder is committed to strategies that would improve public health. In the U.S. there is a sharp division between the two, with public health always the poor relation.
"It's harder, because there's stigma attached to it," Schroeder explains. "There's a sense among some that if a large portion of the nation's population is obese or sedentary, drinks or smokes too much, or uses illegal drugs, that's their own fault or their own business."
"We often get a double-standard question," he continues. "Critics who object to investing more in programs that could help drug addicts and alcoholics ask: Well, don't many of these people relapse?"
"Yes, of course," Schroeder responds. "But is it worth treating pancreatic cancer, which has a 5 percent survival rate, at most? Yes. So the odds of successfully treating drug abuse or alcoholism are actually better than in many of the serious illnesses that society, without question, wants us to treat."
Schroeder is right: When allocating health care dollars, we eagerly spend far more on cutting-edge drugs that might give a cancer patient an extra five months than on drug rehab clinics that could make the difference between dying at 28 and living to 68.
Again, 1980 marks a turning point, notes Marcia Angell, a Senior Lecturer at Harvard Medical School and the former editor-in-chief of NEJM. Between 1960 and 1980, "prescription drug sales were fairly static as a percent of US gross domestic product, but from 1980 to 2000, they tripled."
This wasn't just happenstance, says Angell. A major catalyst of the pharma boom was the Bayh-Dole Act of 1980, a law that "enabled universities and small businesses to patent discoveries emanating from research sponsored by the National Institutes of Health, the major distributor of tax dollars for medical research, and then to grant exclusive licenses to drug companies." In other words, the Bayh-Dole Act commoditized medical research.
Before 1980, "taxpayer-financed discoveries were in the public domain, available to any company that wanted to use them," says Angell. As a result, long-term, collaborative tinkering could help to create new and effective medications. But Bayh-Dole made research proprietary and profitable.
After Bayh-Dole, drug research seemed to be less about making real medical progress, and more about doing the bare minimum to create a patentable product. And so began the age of me-too drugs, which do little to promote health and instead exist to increase market share. In a Boston Globe op-ed last year, Angell observed that, "according to FDA classifications, fully 80 percent of drugs that entered the market during this decade are unlikely to be better than existing ones for the same condition."
Why are we willing to devote 13 or 14 percent of our $2.2 trillion health care budget to prescription drugs, while refusing to help the quarter of the population that still smokes?
"It is arguable that the status quo is an accurate expression of the national political will -- a relentless search for better health among the middle and upper classes," Schroeder acknowledges. [our emphasis] "This pursuit is also evident in how we consistently outspend all other countries in the use of alternative medicines and cosmetic surgeries and in how frequently health 'cures' and 'scares' are featured in the popular media. The result is that only when the middle class feels threatened by external menaces (e.g., secondhand tobacco smoke, bioterrorism, and airplane exposure to multidrug-resistant tuberculosis) will it embrace public health measures. In contrast, our investment in improving population health -- whether judged on the basis of support for research, insurance coverage, or government-sponsored public health activities -- is anemic."
We're hopeful that this will change. In going to medical conferences over the past year, Maggie has met an impressive number of very, very bright 20-somethings who are devoting their careers to public health. And they understand that "medicine" and "public health" are not separate disciplines.