The tax cut proposals first announced by President Donald Trump this April are, simply put, a fraud. They are about greed and politics, not economic growth or true tax reform. The policy has been proved wrong twice already. As a broadly faithful rerun of the Reaganomics of the early 1980s, Trump’s policies, should they be passed, will end as Reagan’s “voodoo economics” did: a failure by virtually any measure. George W. Bush’s sharp tax cuts in the early 2000s also failed to generate a strong economy; the expansion that followed was driven by speculation rather than robust investment.
Reaganomics was the name given to the self-serving claims of the wealthy that a huge tax cut for them would pay for itself—by creating incentives to invest and work that would lead to renewed economic growth. Tax revenues would simply rise with a growing economy to plug the revenue hole. Some call it trickle-down economics. John Kenneth Galbraith put it scornfully in The Culture of Contentment: if you give a horse enough to eat, some of the kernels will fall to the ground for the sparrows.
In the summer of 1981, Congress, including a Democratic-controlled House, passed the Reagan tax proposal, which reduced the top income tax rate from 70 percent to 50 percent. Among other measures, it included a tax break for businesses. Reagan’s economists predicted the budget deficit—the size of which, under Jimmy Carter, was a major campaign issue for Reagan—would fall to $45 billion for the coming fiscal year as the economy recovered. Fiscal 1982 ended with the deficit not at $45 billion but at $140 billion.
The economy began to recover when Federal Reserve chairman Paul Volcker cut interest rates, and to a degree the tax cuts created some Keynesian stimulus. In the real world, growth could not magically rise to the heights needed. Reagan now started raising taxes, including the regressive Social Security tax, to stop the flow of red ink.
It was not enough. When Reagan took office, American debt was $997 billion; when he left, it had reached $2.85 trillion. America became a net debtor nation for the first time in modern history.
Perhaps the debt would have been worth it, but Reaganomics was an abject failure as an economic program. Inflation came down, but that was the only achievement. Growth returned, but average wages, however measured, stopped growing altogether, whereas they had been growing steadily since the early 1950s. Income inequality started to rise rapidly early in Reagan’s first term and didn’t stop. The greatest of ironies is that capital investment was tepid and productivity, the source of economic growth, grew only moderately, the opposite of the takeoff that was promised.
George W. Bush reinstated the Reaganomics philosophy with his own major tax cuts, again mostly funneled to the wealthy. The recovery and expansion that followed created the fewest jobs of any expansion since World War II. What economic growth there was had been fueled by the speculation in residential housing that led to the crash of 2008, the year Bush left office.
Trump’s proposed tax cuts are far more radical than those of Bush, and more on a par with Reagan’s. They would be another unjustifiable gift to the wealthy. Some progressives are saying tax cuts never stimulate growth, but this is untrue. They can be a potent Keynesian stimulus from time to time, as were the Kennedy-Johnson tax cuts of 1964. Reaganomics was steroidal Keynesianism—not at all what the master economist had ordered.
Unlike Reagan, Trump will try to impose tax cuts not during a severe recession, but in a fairly strong economy that has been growing for several years and in which the unemployment rate is below 5 percent. Reagan took office with the economy sliding into its worst recession since the Great Depression, and the unemployment rate eventually reached nearly 11 percent. There was slack that warranted fiscal stimulus.
There is still some slack in the economy. Effective stimulus should take the form of public investment or more robust social programs. A modest but not huge tax cut could be temporarily productive.
Trump’s initial tax cut proposal would cut top tax rates from 39.6 percent to 35 percent, but the major tax cut is reserved for business: from 35 percent to 15 percent. The estate tax and the alternative minimum tax would be eliminated, changes that would benefit only the rich. The wealthy would benefit most, though the details of Trump’s slapdash program are too skimpy to estimate accurately by how much.
Analyses suggest the federal government would lose $5 trillion to $7 trillion in tax revenues over 10 years under the initial plan. I think fears of deficits are often exaggerated, but an additional $700 billion a year of federal borrowing would unsettle the financial markets.
Unperturbed by the lessons of history, Trump’s advisers believe that if economic growth is raised by one percentage point, to 3 percent a year, enough federal revenue will be generated to fill the hole. Trump’s wish is their command. Americans should be aware that a sustained growth rate of 3 percent a year after a long expansion is almost impossible to achieve. It would require a productivity boom, the kind Reagan never got.
Still more disturbing, such a plan would leave no room to invest in infrastructure, which almost everyone agrees America badly needs. It would require heartless cuts in the safety net and in economic development programs, cuts Republicans covet, which would lead to more inequality. The damage could be compounded, if this wish list somehow improbably gets enacted, by limiting the deduction for state and local taxes, a measure that would hurt states that provide generous social policies and economic development programs.
Deregulation also had a part in the failure of Reaganomics. Reagan defanged the Department of Labor, for example, which then couldn’t enforce fair labor standards. Trump is doing the same.
Trump has conned some of the electorate into thinking he knows how to channel their anger constructively by being vindictive himself. There is almost nothing constructive about his early tax plan. These details will change. But the plan reflects a mean-spirited, self-involved, and uninformed mind at work.
Workers beware. Trump’s reckless mimicry of Reaganomics is likely to lead to stagnating wages and growing inequality for another four years, and no recharging of the dynamics of the American economy will ensue.
Despite the practical failures of free-market economics, too many mainstream economists have continued to embrace simplistic ideas about how the economy works. Such ideas are often rooted more in ideology than in evidence. These beliefs and the policies that follow led directly to the 2008 financial crisis and the Great Recession. They also centrally contributed to the nation’s subpar performance beginning in the late 1970s, and to our widening inequality. They continue to endanger America’s economic health.
The mainstream of the profession claims to qualify oversimplified free-market ideas. But when it comes to key policy choices, the premise that markets are efficient usually trumps a more complex analysis. Thus, most mainstream economists are usually for less regulation even when more is required. They argue for reducing deficits even when expanded public outlay is indicated. They favor letting markets set wages without many safeguards for workers, even when the result proves neither equitable nor efficient. The consensus in the profession is that widening inequality must be the result of deficiencies in the skills of the workforce, rather than the result of structural disadvantages inadequately addressed by government.
To be sure, there are dissenting economists. A few even win Nobel prizes. But in the academy, free-market ideas are still the dominant ones.
The neoclassical insights at the core of standard economic thinking were once exciting intellectual breakthroughs. These ideas could still be useful, if adapted to the times, with their limitations understood, and tempered by other kinds of economic thinking. But the profession has largely turned its key ideas into faux-scientific rules of thumb that in fact reflect (and reinforce) the conservative political attitudes of the time. Disguised in technical terms, these ideas have increasingly become mainstream justifications for a reduced role for government in the economy.
The central propositions of free-market economics boil down to these:
The Invisible Hand. The premise of Adam Smith’s invisible hand is that buyers and sellers, free of any government interference and merely following their self-interest, will arrive at an optimal distribution of goods and services at the “right” price, as if guided by an unseen hand.
Mainstream economists often say they don’t literally believe in the invisible hand. They concede that many assumptions must be made for free markets to produce optimal outcomes. These include transparent access to information and product prices, no undue power for oligopolistic corporations to set prices or control distribution, highly rational buyers and sellers pursuing their self-interest, etc.
But in fact, for the economic mainstream, the invisible hand is the default principle whether or not these assumptions are met. Why, for example, do so many economists oppose increases in the minimum wage? Over the past 10 to 15 years, empirical evidence has shown that increases in the minimum wage in many communities resulted in no more than a trivial number of lost jobs and may actually have created jobs, as demand for goods and services rose with higher purchasing power. In the real world, a hike in the minimum wage did not perform according to the invisible hand, yet economists assumed it would.
People Get What They Deserve. If labor markets worked according to Adam Smith’s principles, you could explain inequality not as a market failure, but as an efficient market mechanism. Some economists do worry about the social costs of unequal wages. But most economists believe a rise in inequality is a signal of the economy’s technological progress. The claim that unequal education and skills explain unequal wages is an invisible-hand argument: if people with more education are better qualified, the market will justifiably pay them better. This premise allows mainstream economists to ignore the role of power shifts in labor market institutions and the fact that educational opportunity itself increasingly reflects hardened class lines—who your parents were, principally—more than the acquisition of skills. In a nation of fewer opportunities, whom you know and what social skills you have, not learned skills, become the entrée to a job. This is all beyond the grasp of the invisible hand.
Sure, investing in education matters. But recognizing that unequal opportunity begins at birth—or earlier—and devising compensatory policies is more important. Labor markets often compound these disadvantages; they don’t compensate for them.
Say’s Law and Austerity Economics. A close cousin of the invisible hand is Say’s Law, legacy of the 19th-century economist Jean-Baptiste Say. In shortened form, it argues that supply creates its own demand. In other words, if you make it, people will buy it. John Maynard Keynes devoted his classic General Theory to dismantling the idea. Say’s Law is the corollary to Smith’s premise that economies are self-adjusting as long as government steps out of the way.
Closely related to this proposition is another assumed accounting identity—the claim that savings equal investment. This is true only retrospectively, but too many still accept the proposition that more savings will generate more investment. With more savings, the price of investment—the interest rate—will fall; due to the invisible hand, business will invest more. Not so, said Keynes. If more savings come at the expense of more buying, investment will likely fall, especially in a weak economy.
Again, some economists will tell you they know better than to believe Say’s Law without qualification. There are diminishing returns to savings, for example, so the general gain from more saving peters out in the end. In an Adam Smith world of self-adjusting economies, people would reduce savings as the interest rate falls. But in a recession, worried people ignore these market signals. They often increase savings because they fear they will soon be out of a job. Individual behavior does not aggregate to general efficiency.
But read the literature, and few mainstream economists acknowledge that point. More savings are always good. This thinking is behind austerity economics. Government deficits reduce national savings, so these deficits must be minimized, even if that means tax hikes and reduced social spending during a prolonged slump. Democratic economists, it is largely forgotten, loudly called for reduced deficits in the 1980s under Republican Ronald Reagan. A Say’s Law–type argument was used by President Clinton’s economists to channel budget surpluses into reduction of the national debt rather than, say, increases in public investment. Advisers to President Obama in 2010 called for deficit reduction long before the economy was on the mend. Of course, not all economists believe this; and there are times (during full employment, for example) when even Keynesians favor deficit reduction. Yet simplistic ideas about deficit reduction as a cure-all dominate the profession. They seep into the public consciousness and are not easy to reverse, especially when both Democratic and Republican economists and presidents have advocated them at one time or another. Keynes defeated Say’s Law only temporarily.
Financial Markets Are Efficient. One of the more extreme abuses of the invisible hand has been efficient markets theory. Economists like Eugene Fama of the University of Chicago claimed that markets for stocks, bonds, and other financial instruments were so rational that they accurately reflected the future value of the underlying company. In such a rational marketplace, there could be no lasting speculative bubbles. Moreover, you could tie a CEO’s compensation to the stock price and get better managerial results.
Rational financial markets require minimal regulation. But financial deregulation, which began in the 1970s and was reinforced by Reagan and Clinton, led directly to the subprime bubble and the 2008 collapse. Many economists now know better. Robert Shiller of Yale has been arguing for decades that bubbles exist. But do most economists appreciate the overwhelming evidence for bubbles? As the Dodd-Frank Act is slowly eviscerated, there is no groundswell within the mainstream profession calling for more effective re-regulation of finance.
Inflation Targeting and Price Stability Are Holy. Ben Bernanke, the former Federal Reserve chairman, was a leading theoretician in favor of targeting a low inflation rate as a primary government policy. Low and predictable inflation is said to remove uncertainties about the future, and thus allows the invisible hand to work its miracles. Market efficiency in turn will lead to prosperity.
Right up until the eve of the 2008 collapse, mainstream economists were convinced that inflation-targeting was the main justifiable intervention; the free market would do the rest. They even created their own self-congratulatory measure of success, called The Great Moderation. From the early 1980s to 2007, the U.S. gross domestic product fluctuated less than it had in prior decades. The stability was taken as proof that economists knew what they were doing at last.
But consider what else happened during these supposedly ideal years. Income inequality rose to the heights of the 1920s; debt soared as consumers tried to maintain living standards by borrowing; men’s wages fell dramatically. Median income for the bottom 90 percent declined to levels of the 1960s; public investment was tragically neglected; and there was one financial crisis after another: 1982, 1987, 1990, 1994, 1997, 1998, 2000, and finally 2008.
What’s more, growth was on average slower in this period than in earlier decades. If slower growth is the price paid for stability, what is the purpose of economics? And if inflation targeting helps create the conditions for a market crash, it can’t be the summa of policy.
More Cross-Border Trade Is Always Good. In the 1990s, Western nations developed a set of policies known as the Washington Consensus which involved not merely free trade but also the free flow of capital around the world. It was a classic invisible-hand argument. One-size-fits-all policies should be adopted everywhere, no matter the developmental stage, educational attainment, or culture of a nation.
But the Washington Consensus badly failed in the 1997 East Asian financial crisis. In fact, there is a great deal of doubt that free-trade agreements have created the jobs that economists claim for them. Moreover, widespread assertions that free-market reforms led to enormous reductions in global poverty foundered on a hard fact: Most of the reduction occurred in China, and to a lesser degree in India—countries that did not adopt the Washington Consensus.
Yet simple free-trade agreements are still backed aggressively by many economists. These agreements often favor rich nations, or the elite in poor nations. We don’t even know what is in the new 12-nation Trans-Pacific Partnership proposal, organized by the U.S. government and its corporate allies. The secrecy is apparently needed to reduce likely controversy. We do know that the intellectual property of big companies in rich nations is likely to be well protected; that “trade” norms are intended to be used to undermine domestic regulation; and there are doubts that workers in either poor or rich nations will be similarly protected.
We should learn a few non–invisible-hand lessons about trade. First, nations need space to develop their own industries and institutions. This might require subsidies and other supports that violate trade agreements. Second, free trade should be adopted gradually—no shock therapies, please. Third, we should admit that there are losers in free trade, and the social safety net should always be expanded accordingly.
Markets Invariably Work Better than Governments. Mainstream economics has no strong theory of government, except that it is a corrector of market failures (which are presumed to be rare). We might call this a negative theory. The Fed can intervene to save the economy from collapse. Or anti-trust authorities can make sure markets are competitive (which they don’t do much of these days). Governments are also supposed to fill the hole for social goods that markets don’t provide, like highways and schools and clean air and water. But in free-market economics, failures like these are hard to define; they make mainstream economists uncomfortable because they depart from core theory. The nation needs a positive theory of government, which recognizes how valuable social policies and public investment have been, and how much more of them we need.
Invisible-hand purists often love to oversimplify economic history, claiming, for example, that in the 19th century America lived by the invisible hand of laissez-faire. This is simply not true. Transportation, education, health care, wage protection—all these were the work of government. Today, fears of a big federal deficit block adequate government investment. But the nation won’t grow without more government. The dominance of bad mainstream thinking, which leads to resistance to public investment, has been especially damaging because it undermines the foundation of future prosperity.
What, then, is behind the strong hold these ideological principles have on mainstream economists? There are three main explanations: faux science, careerism, and political acceptability.
Faux Science. The acceptance of the invisible hand is taken as a close approximation of reality not only for a single market but also for the whole economy. This is known as general equilibrium. With that assumption taken almost as scientific fact, economists can build highly complex models. Some economists, even on the left, will say that the invisible hand is much like Galileo’s law of falling bodies, which states that heavy and light objects will fall at the same rate of acceleration to earth, air friction aside.
But the invisible hand, as I observe in my book, Seven Bad Ideas, is not in any way comparable to such verifiable physical phenomena. It is a compelling metaphor, but not a scientific one. The authority it lends economics is false, because many immeasurable frictions keep the invisible hand from producing the best outcome for all. We don’t even know how the magic price at which the supply and demand curves allegedly meet is arrived at. Léon Walras, one of the first theorists to postulate a general equilibrium, argued the price was found through an imaginary auction process, but there is no serious proof that a general equilibrium exists.
From these assumptions, however, it logically follows that an economy is almost always self-adjusting—and the politically conservative assumption that government interference is almost always bad becomes axiomatic. The extreme form is found in rational expectations theory, which argues government stimulus is almost always unnecessary or damaging. On these assumptions complex mathematical models can be built, which divert attention from the real world of work, investment, and wages, and allow economists a studied ignorance of economic history and real-world phenomena. As Robert Lucas, the father of rational expectations theory, put it: “Economic theory is mathematical analysis. Everything else is just pictures and talk.” The faux-scientific principles make a clean theory out of a dirty world.
Careerism. Another reason mainstream economics retains its magnetic hold is that it provides a safe basis for career advancement in academic institutions. Mathematical methodologies can be evaluated. These are often complex and sometimes brilliant, but they are based on foundations that may not relate to the real world. Contradictions in outcomes of economic research are ignored, because the evaluation is of methods, not outcomes. This is hardly science.
For example, Alberto Alesina of Harvard and his co-authors long argued that austerity economics could generate economic growth, even in a weakening economy. IMF economists, led by Olivier Blanchard, more recently showed persuasively that this was not true. But Alesina’s career continues to thrive. As Keynes famously wrote, it is better for reputation to fail conventionally than to succeed unconventionally. As noted earlier, Eugene Fama of the University of Chicago remains skeptical of speculative bubbles because markets are too rational, and his prestige remains undiminished. Robert Shiller argues that there clearly are such irrational bubbles. The two shared the Nobel Prize in 2013. Can this be science?
The lack of a model-building methodology resulted in the reputational suppression of Hyman Minsky, now among the most cited of economists for his work predicting the inevitability of speculative bubbles and the damage they can do. He was more historian and psychologist than technical economist, but now mainstream economists are paying attention. One wonders for how long. Similarly, John Kenneth Galbraith’s championing of public investment over tax cuts was neglected because of his lack of modern methodologies. On the other hand, Joseph Schumpeter, who was also an old-fashioned narrative economist, is still heralded because he was basically a conservative.
Political Reward. Finally, the movement toward simplified ideological economics has had great appeal to increasingly conservative policymakers, think tanks, and business organizations. By its very nature, a firm belief in the invisible hand means a faith in laissez-faire policies: reduced taxes and regulation. The less government, the better. Markets, as noted, will reach the right price on their own. If a stock is priced too high, a smart market participant will sell it. It just happens that these principles provide scientific grounding for the policies that business elites prefer. No wonder mainstream economists are showered with support, prestige, and well-lubricated career trajectories.
The turn in the nation’s attitudes against government began with the high inflation of the 1970s, which many economists, led by Milton Friedman, pinned on government deficits. Ronald Reagan sealed the political argument in a debate with Jimmy Carter when he said: “We don’t have inflation because the people are living too well. We have inflation because the government is living too well.”
Arguments for free-market solutions, rather than social spending, followed as night follows day. Social programs were afterwards mostly tied to tax incentives, like the earned income tax credit, not to cash outlays. Industrial policies, in which government invested in technologies and new business, were scoffed at. Invisible-hand thinking fit this new political environment perfectly. In a 2001 interview, Larry Summers, Clinton’s former Treasury secretary and Obama’s future chief economic adviser, told PBS, “There is something about this epoch in history that really puts an emphasis on incentives, on decentralization, on allowing small economic energy to bubble up.”
To win the ears of government officials and the public, economists not coincidentally fit their theories to the new elite attitudes in America. Believing that markets solved social problems and that expensive social programs did not was music to the ears of business and right-wing politicians. Research could be produced that big government and high taxes diminished economic growth. The research was flawed, but no matter.
The dominating policy ideas of the invisible hand have failed. Combined with deregulation, the Great Moderation and inflation-targeting created a bubble economy. Neglect of public investment in infrastructure, clean energy, and education, a consequence of Say’s Law–type thinking, has undermined the nation’s foundation. All these ideas were compatible with the prevailing conservative economic ideology of the time, and earned economists the attention of Washington and the media, but they failed America. The media in particular fell hard for the putative scientific nature of economics and hardly picked up on the ideological foundation of the economic advice.
Science is universally true. The premise of economics as science was a great cover for conservative ideology. But one-size-fits-all economics, which best describes economic advice over the past 30 years, is a practical failure. Anti-government economics failed, pure and simple.
Only a little seems to be changing. Targeting absurdly low inflation rates is still alive. One wonders whether regulation of finance will ever be adequate. The case for globalization remains oversimplified, an area where one-size-fits-all policies are particularly damaging. We need economists who revise their theories based on evidence, but there is little room for reformers; few prestigious universities make space for heterodox thinking.
It is hard to be optimistic about economics. Being an economist has become a career rather than an intellectual profession. Money talks loudly in academic hallways, and a small-government philosophy still rules the nation, despite the calamities that began in 2008.
One in five children in the United States lives in poverty. That is the highest rate for children in the rich world, with only one or two small countries as exceptions. Breaking down U.S. child poverty by age, the rate is highest for children age five and under. Moreover, the child poverty rate has not fallen in the economic recovery since 2009. In sum, 14.7 million U.S. children live in families below the poverty line.
Many domestic social programs address poverty levels. They include food stamps, the earned income tax credit, the child tax credit, and Medicaid. But they are clearly not adequate.
One policy idea that is disregarded or even disdained in the United States is regular cash allowances to parents. Of the thirty-five economically advanced countries evaluated in a 2012 UNICEF report on child poverty, a group that included most of Europe, Japan, Canada, and Australia, the United States is the only one without some sort of cash allowance policy.
America’s lack of interest in cash allowances for children flies in the face of solid evidence that they are highly effective. The Century Foundation’s Bernard L. Schwartz Rediscovering Government Initiative held a conference on these issues in January, called “Child Poverty Solutions That Can Work.” Below, we briefly present the encouraging findings.
As currently practiced in many nations, a cash allowance is provided monthly or weekly to parents for each child. Usually, all that’s needed to collect is a birth certificate.
The justification for the policy is that additional cash income allows parents to invest more in their children, in ways adaptable to their specific circumstances. In the United States, for example, a single working mother in New York City who doesn’t have enough cash income to buy the cheaper monthly MetroCard has very different needs from an unemployed family of four in rural Utah that needs gas for the car. Transferring cash lets each family use the money as it sees fit, while respecting its autonomy.
A typical American bias against such allowances is the claim that poor parents will simply misuse the money, perhaps on drugs, alcohol, and cigarettes. But there is strong evidence from other nations and from a few programs in the United States that parents tend to spend the money on their children, not themselves. (U.S. policy instead leans toward tax credits, which require the recipient to hold a job. For the poor, however—especially single parents—this requirement can produce enormous stress that is in turn passed on to children in damaging ways.)
Britain offers a highly useful case study on how cash allowances are used. An in-depth analysis by sociologist Jane Waldfogel at Columbia University and co-researchers tracked the spending habits of British parents after a substantial increase in a universal child allowance in 1999. Low-income families prioritized spending of the cash allowance on goods for their children, such as clothes, books, and toys. It turned out that high-income parents who received the allowance were significantly more likely than low-income parents to spend the extra income on alcohol and tobacco.
Lessons at Home
There have been some encouraging studies that focus on examples of cash allowances in the United States at the local level. A leading scholar in the field, Greg Duncan, professor at University of California-Irvine, along with co-researchers, found impressive correlations between cash allowances in some regions and children’s educational achievement.
One report by Duncan and colleagues, synthesizing five large-scale studies, explored the educational achievement of children whose parents were enrolled in welfare programs in the 1990s that had mandatory employment requirements, such as job training or education. Additionally, some of the parents were given an increase in cash income above what others in the welfare program received. Children of parents who received only mandatory employment services did not show any noteworthy achievement effects. However, younger children in this study whose families also received additional cash income had measurable achievement gains, such as higher test scores and better reports from teachers, in comparison to the children of families who didn’t receive the extra income.
A similar study by researchers at Duke University analyzed a North Carolina Cherokee tribe that opened a casino and distributed $4,000 annually to each adult in the community. A comparison of poor Native American adolescents in the area whose parents received the extra income with those who did not found that school attendance and graduation rates increased among those who benefited, while criminal behavior and drug use decreased.
Duncan also has developed research showing the importance of early childhood income for outcomes later in life. Data from the Panel Study of Income Dynamics, a household survey that spanned over thirty years, showed that an income boost for poor families during early childhood was associated with greater adult work hours and earnings of the children.
In 2007, Mayor Mike Bloomberg implemented a version of cash allowances—conditional cash transfers—as an experiment in New York City. Called Opportunity NYC, the program gave cash to low-income families, conditional on their fulfilling certain requirements, such as making sure their children attended school and had regular doctor and dentist visits. An average of $8,700 was transferred to each participating family over three years.
The results were impressive. Material hardship, such as food scarcity and financial worries, was reduced, and families increased their savings. The purchase of health insurance and dental care also increased. On the other hand, there was no significant improvement in educational outcomes, which was one of the study's main goals. Lawrence Aber, professor at NYU, argues that the program should have rewarded more measures of educational improvement, such as getting good grades, and was ended too soon to see whether there would be measurable educational gains.
Cash can relieve the psychological stress that poverty and scarcity impose on parents, leading to parenting that is more nurturing and less punitive. It's hard to worry about quality time with your children when you're about to get evicted from your apartment. This is one reason, researchers hypothesize, why cash allowances produce measurably improved outcomes for children.
While more research is required about the amount and timing of income boosts, it’s increasingly clear that there are substantial positive effects attributable to higher cash incomes for the poor with children.
Lessons from Abroad
As noted, almost every country in Europe has some form of benefit policy that provides cash to parents. Additionally, cash allowances have been implemented as major anti-poverty measures in numerous developing countries in Latin America and Asia.
The United Kingdom is probably the best recent example of a government making a major commitment to reducing child poverty. To do this, Tony Blair’s government made numerous investments in parents and children, such as increasing parental leave and childcare assistance. Prominent among them was the large increase discussed above in the child cash allowance, directed especially to young children.
The United Kingdom also created a child tax credit, in addition to the cash allowance, which, like the U.S. child tax credit, is provided to middle- and low-income parents. However, the U.S. credit is regressive (meaning poorer families get less money) and not fully refundable, so those without adequate income receive no benefit. The U.K. credit, on the other hand, did not require parents to be working and was fully refundable, so those with little or no income could still take advantage of it. In addition, payments were made on a progressive basis, so that poorer families got more money.
The result? In the decade after 1999, child poverty in the United Kingdom fell by an impressive 53 percent (using the official government absolute poverty threshold, comparable to the U.S. measure). Over the same period, U.S. child poverty rose by 25 percent. And as we noted, the money was typically spent on the kids, not the parents.
Advantages to a Cash Allowance Policy in the United States
One of the major benefits of child cash allowances is that they can be paid out monthly or weekly, while tax credits, currently our country's major anti-child-poverty policy, are distributed only annually. Families cannot easily budget their Earned Income Tax Credit (EITC) or Child Tax Credit (CTC) benefits when they need to pay rent at the end of the month or buy groceries every week.
Another potential advantage of cash allowances is that they can be made universal—that is, every parent, regardless of their income level, would receive the benefit for their children. In this way, universal benefits have a nearly perfect take-up rate, or percentage of eligible people who participate in the benefit, because all that is required is to fill out a form and send in a birth certificate. No proof of income is required.
By contrast, around a quarter of U.S. families miss out on refundable tax credits for which they are eligible because of the complicated application process. The simpler universal cash allowance would also allow for fewer errors in making payments.
A universal benefit also acknowledges the basic fact that having children is expensive for all families in an era of stagnating wages and rising educational and health costs. Receiving income support for your child should not be shameful—the USDA estimates it will cost a quarter of a million dollars to raise a child (not including college).
While channeling money to every child seems expensive, it may well be more costly as a society to ignore our child poverty problem. The United Kingdom achieved its radical child poverty cuts by investing an extra 0.9 percent of GDP, which would translate roughly to $150 billion in the United States. This investment could help to offset the $500 billion or 4 percent of GDP that child poverty (by one estimation) costs our country every year due to reduced productivity of future workers, increased costs of crime, and higher health expenditures.
A Better Future
With a child poverty rate of 20 percent, we need to look toward empirically based innovative solutions that work for families. Evidence points to the great potential that cash allowances can have for America’s children. However, more policy-oriented research is needed to determine the most effective model for our country.
Opportunity NYC was one of the first formal government-sponsored cash allowance trials in America—many sociologists and economists call for more.
As Jane Waldfogel has stated, “America’s high child-poverty rate is an outcome of our social policy choices.” We have the funds and the ability to relieve hardship for our children, yet not the political will to do so.
Much is rightly made of the Republican War on Women. But the Republicans are fighting a more deliberate battle against the poor. It is audacious, insensitive, and ugly. Republicans have clearly decided that the War on the Poor is good politics.
The Paul Ryan budget would take two-thirds of its non-military cuts from low-income programs like Medicaid, food stamps, job training, and Pell grants for college, according to the Center on Budget and Policy Priorities. While the Ryan plan would cut the tax rate for the rich to 25 percent, the non-partisan Tax Policy Center reports that taxes for those who make $30,000 or less would go up.
Robert Greenstein, the president of the CBPP, calls it "likely... the largest redistribution of income from the bottom to the top in modern U.S. history."
The budget policy is only the spearhead of the war on the poor. Mitt Romney and Paul Ryan would also reverse as much of Obamacare as possible, including, if they can, the enormous expansion of Medicaid passed by Congress. Few may realize that Medicaid, the healthcare plan for the poor, was designed only for families; no matter how poor, individuals did not qualify. Moreover, the typical cut-off for qualification, even for families, was two-thirds of the poverty line. This all changed with Obamacare. Some 15 to 17 million poor Americans would now get healthcare coverage.
Romney and Ryan would also change Medicare radically -- at least the Ryan budget would. Whose pocket would that come out of? The elderly, who are generally low-income Americans, have low poverty rates only because of Medicare and Social Security. They would immediately start to lose benefits if Obamacare were reversed. The Romney-Ryan camp tries to cover this up by saying their plan would affect only those 55 and under today. Not so. And the Ryan plan of offering premium support -- vouchers -- rather than guaranteeing healthcare, as Medicare now does, would be highly costly to the elderly.
A recent Center for American Progress report found that ending Obamacare would cost today's seniors $11,000 due to higher premiums and higher drug costs as the famed doughnut hole was set to close. As for those who turn 65 ten years from now, the losses are huge because premiums under Romney-Ryan will not keep up with healthcare costs. That could come to $60,000 in higher payments over the typical 2023 retiree's span of retirement.
It's not just Romney and Ryan among the Republicans who are fighting a war on the poor. Republican states led the legal challenge against Obamacare, which would have provided healthcare coverage to two-thirds of the Americans who have none -- some 30 million people. They effectively lost in the Supreme Court. But when the Court ruled in June that states could reject the Medicaid portion of Obamacare, five Republican governors said they would, including the governor of Texas, where 25 percent of the population has no healthcare coverage. The national average is about 18 percent. These five, and perhaps as many as 20 more states led by Republican governors, will do so even though the federal government will pick up 100 percent of the costs in the first few years, and 90 percent thereafter. It's worth mentioning that a new study from Harvard University finds that Medicaid does indeed save lives, reducing the death rate in several states where Medicaid had earlier been expanded.
Then there is the minimum wage. Republicans may now be trying to reduce its reach. In Arizona, Republicans have tried to repeal the minimum wage, claiming business can't afford it in a recession. But the federal minimum wage, now $7.25 an hour, has been raised so rarely in the last few decades that it is well below its 1968 high when adjusted for inflation. Mitt Romney has now backed off his long-held position that the minimum wage should rise with inflation, to satisfy his fellow Republicans. They argue the old simplistic economic story that any increase in wages means lost jobs. But what America needs now is more spending -- and higher wages would help do that. America grew rapidly in the 1950s and 1960s, when the minimum wage was relatively much higher than it is today.
One pro-Republican interviewee on Moyers and Company recently asked whether anyone really believed Paul Ryan was cruel and didn't care about the poor. People bearing harsh policies do not grow fangs. The Ryan argument is a very old conservative one: that social programs make people dependent. One wonders whether he believes there were any poor when there were no substantial programs to redistribute money in America -- say, in the 1800s.
The Republicans of course say they want to provide jobs. Free markets, once released to work their magic, will enable workers to get a job and provide them the pride they lack. And Romney and Ryan know how to do it -- tax cuts.
We'll get back to tax cuts. But, first, the markets don't work their magic -- anywhere. Economists like Tim Smeeding and political scientists like Lane Kenworthy have pored over the data on incomes across countries and have found that markets create a lot of poorly paid work, not only in the U.S. but also in much of Europe. The share of U.S. households with incomes below 40 percent of the typical household's disposable income comes to nearly 18 percent, and the share is higher in England and not much lower in Germany or Sweden (Foreign Affairs, September/October 2012, Campbell). One-third of Americans have incomes below 200 percent of the poverty line -- the poverty line is about $14,000 for an individual, $22,000 for a family of four.
What makes this tolerable is that social programs, such as the Earned Income Tax Credit, Medicaid, and Medicare, redistribute income. Europe relies on those same kinds of programs, and theirs on balance are far more generous.
Paul Ryan has never, to my knowledge, presented evidence that sharply reducing these redistributive programs will work -- that it will motivate the poor to find better jobs. His is an ideological argument, based on no serious theory and no serious experience. He just knows, I suppose in his heart, that such a laissez-faire social state is better for the poor. And the Republicans pull out the Obama record to prove their point. The poverty rate went up in 2009 and 2010 under Obama: he is clearly doing something wrong, they say. Of course, Obama inherited that Bush recession.
But what they propose is more of what George W. Bush did. Big tax cuts to motivate the rich to create more jobs! It is worth noting that Bush's job creation record was the worst of any recovery in the postwar period. Moreover, wages remained essentially flat. And finally, Bush, who inherited from Bill Clinton a poverty rate of 11.3 percent, still had a poverty rate of 12.5 percent in 2007 after years of the housing-led economic expansion. In 2008, he left Obama a poverty rate of 13.2 percent, nearly two full percentage points above the one he inherited, and moving up inexorably. What about Reagan, the tax cutter? He left George H.W. Bush a poverty rate of 12.8 percent. Poverty rates had fallen to well below 12 percent in the 1970s as a result of Johnson's war on poverty. In the 1950s, the poverty rate was estimated at 22 or 23 percent.
Romney and Ryan promise jobs to reduce poverty. There is little doubt that jobs are the healthiest of cures. But their policy of tax cuts is merely a repeat of the failed Bush years. And the only way the Reagan years look passable is if you lop off the severe recession of 1981 and 1982, a habitual trick of The Wall Street Journal editorial page. Under Reagan, the great American wage stagnation began, and productivity growth remained slow.
Romney and Ryan also claim that their cuts to social programs will get the nation's books in better balance and stoke confidence. But what we actually need is a new stimulus to generate business and encourage companies to invest.
Are Romney and Ryan cruel? They are politicians who know their base believes the poor are getting away with something. It is a policy driven by fear and scapegoating. They have sponsored highly deceptive ads about how Obama has taken the work requirement out of welfare and about how Medicare will rob from the elderly to finance coverage for the poor. One man's insensitivity and ignorance is another's cruelty, especially in difficult economic times. Better people would soften the anger, not stimulate it. Racism is always close to the surface when discussing social programs, even though the majority of the poor are white. The flip side of the War on the Poor is the War on Minorities. It is a tragically sad country that will deliberately neglect its least advantaged -- especially when income inequality so starkly favors the rich and taxes are lower than in any other major nation. There is a hole in the Republicans' moral fabric.
This post is part of the HuffPost Shadow Conventions 2012, a series spotlighting three issues that are not being discussed at the national GOP and Democratic conventions: The Drug War, Poverty in America, and Money in Politics.