When war claims a soldier’s life, what does that death signify? Almost reflexively, Americans want to believe that those making the supreme sacrifice thereby advance the cause of freedom. Since freedom by common consent qualifies as the ultimate American value, death ennobles the fallen soldier.
Yet sometimes nobility is difficult to discern and the significance of a particular death proves elusive. Consider the case of Captain William F. Reichert, shot and killed on January 27, 1971, at An Khe in the Republic of Vietnam. Captain Reichert did not fall in battle. He was assassinated. His assassin was an American soldier.
Age twenty-three, unmarried, and a graduate of West Point’s Class of 1968, Reichert was at the time commanding Troop C, First Squadron, Tenth Cavalry. As it happened, I was also stationed at An Khe then, serving as a platoon leader in Troop D.
Despite an impressive lineage, by the time I arrived, the First Squadron, Tenth Cavalry (“Buffalo Soldiers”) rated as something other than a “crack” outfit. By the winter of 1970–71, the dwindling American order of battle in Vietnam boasted few crack outfits. The U.S. Army was heading toward the exits, and those units that remained made for a motley collection.
Higher headquarters had assigned One-Ten Cav the mission of securing a long stretch of highway running west from the coastal city of Qui Nhon through the Central Highlands and on to Pleiku. The squadron’s area of operations included the Mang Yang Pass, where in 1954 the Vietminh had obliterated the French army’s Groupement Mobile 100, thereby ringing down the curtain on the First Indochina War.1
No such replay of the Little Bighorn punctuated my own tour of duty. Indeed, the operative idea—widely understood even if unwritten—was to avoid apocalyptic encounters so that the ongoing drawdown could continue. As long as the withdrawal of U.S. forces proceeded on schedule, authorities in Washington could sustain the pretense that the Second Indochina War was ending in something other than failure.
One-Ten Cav had been allotted little more than a bit part in this elaborate production. Keeping that highway open allowed daily supply convoys to move food, fuel, ammunition, and other essentials to Pleiku and points beyond. To accomplish this mundane task, Buffalo Soldiers in armored vehicles guarded bridges or reacted to enemy ambushes. Others, in helicopters or on foot, conducted reconnaissance patrols, flying above or trudging through the jungle. The assignment offered little by way of glory or grandeur, both of which were then, in any case, in short supply throughout South Vietnam. That late in the war, navigating between honor and dishonor, foolhardy courage and craven cowardice, necessary subordination and mindless obedience posed challenges. It was not a happy time or place to be an American soldier.
Yet if the squadron did not literally share G. M. 100’s fate, it was succumbing incrementally to a defeat that was hardly less decisive. As any homeowner will tell you, a leaky roof, if left unattended, can pose as much danger as a category five hurricane. Collapse is just a longer time coming. In the backwater that was An Khe, the roof was leaking like a sieve.
No one was likely to mistake the United States in 1971 for a land of concord and contentment. During the interval between the assassination of John F. Kennedy and the election of Richard M. Nixon, cleavages dividing left and right, black and white, flag burners and flag wavers, college kids and working stiffs had become particularly acute. Looming in the background was an even more fundamental cleavage between state and country. Depending on which camp you occupied, the government appeared either clueless or gutless. In any case, those exercising political authority no longer commanded the respect or deference they had enjoyed during the 1940s and 1950s. Sullen citizens eyed their government with cynicism and mistrust.
Comparable division and discord pervaded the ranks of those sent to serve in Vietnam. In the war zone, the animosity between the governing and the governed at home found its parallel in the relationship between leaders and led. In Vietnam, sullen enlisted soldiers—predominantly draftees—eyed their officers with cynicism and mistrust.
To vent their anger at policies not to their liking, outraged citizens engaged in acts of protest. To express their animus toward leaders not to their liking, alienated soldiers did likewise, their acts of protest ranging from disrespect to shirking to out-and-out insubordination. (The army’s unofficial motto had by then become “don’t mean nothin’,” usually muttered sotto voce at the back of some annoying superior.)
On January 27, 1971, Private First Class James D. Moyler, a twenty-year-old helicopter crewman from Chesapeake, Virginia, carried matters further. After exchanging words over allegations of barracks theft, the black soldier flipped the safety off his M16 and in broad daylight shot C Troop’s white commander at point-blank range. Captain Reichert bled to death in front of his own orderly room.
With the military justice system cranking into high gear, Moyler was promptly arrested, jailed, charged, court-martialed, convicted, and sentenced to a long prison term. In the blink of an eye, he disappeared. From an institutional perspective, so too did the entire episode. In Saigon and Washington, those presiding over the war had no intention of allowing the death of Captain Reichert to affect their plans.
So in its sole report on the incident, the Pacific edition of Stars and Stripes offered the barest recitation of the facts—a masterful exercise in journalistic minimalism. As if in passing, however, the newspaper hinted at a larger context. Earlier that same month in Quang Tri Province, Stripes noted, one officer had been killed and another wounded “following a quarrel with enlisted men.” Meanwhile, at Tan Son Nhut Air Base outside Saigon, someone had rolled a fragmentation grenade into the quarters of a military police officer, wounding him as he slept. Again, enlisted soldiers were suspected of perpetrating the attack, “although no one ha[d] been charged.”2
In other words, what had occurred at An Khe, however shocking, did not qualify as particularly unusual. Disgruntled soldiers obliged to fight a war in which they (along with most of the country) had ceased to believe were not without recourse. Among the options available was the one PFC Moyler had chosen, turning weapons intended for use against the enemy on those whose authority they no longer recognized.
The implications of Moyler’s action were, in military terms, beyond alarming. To sustain a massively unpopular war, the state had resorted to coercive means: report for duty or go to jail. At home, clever young men had become adept at evading that choice and so the war itself. Those less clever or more compliant ended up in uniform and in Vietnam. There, the nominally willing—now armed—were having second thoughts. In increasing numbers, they not only refused to comply but were engaging in acts of resistance.
The problem was Vietnam, of course. But the war had become inextricably tied to conscription. To save itself, the army desperately needed to free itself of the war—and of those compelled to serve against their will. Allowed to spread unchecked, the poisons made manifest at An Khe posed an existential threat to the institution as a whole. Even to a subaltern as callow and obtuse as I was, that much was apparent.
That other, unforeseen consequences might also ensue, unfavorable to the army, to soldiers, and to the country, did not occur to me. All that mattered then was to escape from an unendurable predicament. If that meant putting some distance between the army and the American people, so be it.
In the years that followed, the army effected that escape, shedding the war, the draft, and the tradition of a citizen-based military. Henceforth, the nation would rely on an all-volunteer force, the basis for a military system designed to preclude the recurrence of anything remotely resembling Vietnam. For a time, Americans persuaded themselves that this professional military was a genuine bargain. Providing fighting forces of unrivaled capabilities, it seemingly offered assured, affordable security. It imposed few burdens. It posed no dangers.
In relieving ordinary citizens of any obligation to contribute to the country’s defense, the arrangement also served, for a time at least, the interests of the military itself. In the eyes of their countrymen, those choosing to serve came to enjoy respect and high regard. Respect translated into generous support. Among the nation’s budgetary priorities, the troops came first. Whatever the Pentagon said they needed, Washington made sure they got.
As a consequence, the army that I left in the early 1990s bore no more resemblance to the one into which I had been commissioned than a late-model Ferrari might to a rusted-out Model T. The soldiers wanted to soldier. NCOs knew how to lead, and smart officers allowed them to do so. Given such a plethora of talent, even a mediocre commander could look good. As for an unofficial motto, the members of this self-consciously professional army were inexplicably given to shouting “Hooah” in chorus, exuding a confidence that went beyond cockiness.
Here, it appeared, was a win-win proposition. That the all-volunteer force was good for the country and equally good for those charged with responsibility for the country’s defense seemed self-evident. Through the twilight years of the Cold War and in its immediate aftermath, I myself subscribed to that view.
Yet appearances deceived, or at least told only half the story. Arrangements that proved suitable as long as deterring the Soviet threat remained the U.S. military’s principal mission and memories of jungles and rice paddies stayed fresh became much less so once the Soviet empire collapsed and the lessons of Operation Desert Storm displaced the lessons of Vietnam. With change came new ambitions and expectations.
For a democracy committed to being a great military power, its leaders professing to believe that war can serve transcendent purposes, the allocation of responsibility for war qualifies as a matter of profound importance. Properly directed—on this, President George W. Bush entertained not the least doubt—a great army enables a great democracy to fulfill its ultimate mission. “Every nation,” he declared in 2003, “has learned an important lesson,” one that events since 9/11 had driven home. “Freedom is worth fighting for, dying for, and standing for—and the advance of freedom leads to peace.”3 Yet the phrasing of Bush’s formulation, binding together war, peace, and freedom, might have left a careful listener wondering: Who fights? Who dies? Who stands? The answers to this triad of questions impart to democracy much of its substantive meaning.4
In the wake of Vietnam, seeking to put that catastrophic war behind them, the American people had devised (or accepted) a single crisp answer for all three questions: not us. Except as spectators, Americans abdicated any further responsibility for war in all of its aspects. With the people opting out, war became the exclusive province of the state. Washington could do what it wanted—and it did.
In the wake of 9/11, as America’s self-described warriors embarked upon what U.S. leaders referred to as a Global War on Terrorism, the bills came due. A civil-military relationship founded on the principle that a few fight while the rest watch turned out to be a lose-lose proposition—bad for the country and worse yet for the military itself.
Rather than offering an antidote to problems, the military system centered on the all-volunteer force bred and exacerbated them. It underwrote recklessness in the formulation of policy and thereby resulted in needless, costly, and ill-managed wars. At home, the perpetuation of this system violated simple standards of fairness and undermined authentic democratic practice.
The way a nation wages war—the role allotted to the people in defending the country and the purposes for which it fights—testifies to the actual character of its political system. Designed to serve as an instrument of global interventionism (or imperial policing), America’s professional army has proven to be astonishingly durable, if also astonishingly expensive. Yet when dispatched to Iraq and Afghanistan, it has proven incapable of winning. With victory beyond reach, the ostensible imperatives of U.S. security have consigned the nation’s warrior elite to something akin to perpetual war.
Confronted with this fact, Americans shrug. Anyone teaching on a college campus today has experienced this firsthand: for the rising generation of citizens, war has become the new normal, a fact they accept as readily as they accept instruction in how to position themselves for admission to law school.
The approach this nation has taken to waging war since Vietnam (absolving the people from meaningful involvement), along with the way it organizes its army (relying on professionals), has altered the relationship between the military and society in ways that too few Americans seem willing to acknowledge. Since 9/11, that relationship has been heavy on symbolism and light on substance, with assurances of admiration for soldiers displacing serious consideration of what they are sent to do or what consequences ensue. In all the ways that actually matter, that relationship has almost ceased to exist.
From pulpit and podium, at concerts and sporting events, expressions of warmth and affection shower down on the troops. Yet when those wielding power in Washington subject soldiers to serial abuse, Americans acquiesce. When the state heedlessly and callously exploits those same troops, the people avert their gaze. Maintaining a pretense of caring about soldiers, state and society actually collaborate in betraying them.
Operational Purpose: Mimicking Israel
Peace means different things to different governments and different countries. To some it suggests harmony based on tolerance and mutual respect. To others it serves as a euphemism for dominance, quiescence defining the relationship between the strong and the supine.
In the absence of actually existing peace, a nation’s reigning definition of peace shapes its proclivity to use force. A government committed to peace-as-harmony will tend to employ force only as a last resort. The United States once subscribed to this view—or, beyond the confines of the Western Hemisphere at least, pretended to do so.
A nation seeking peace-as-dominion will use force more freely. This has long been an Israeli predilection. Since the end of the Cold War and especially since 9/11, it has become America’s as well. As a consequence, U.S. national security policy increasingly conforms to patterns of behavior pioneered by the Jewish state. In employing armed force, decision makers in Washington (regardless of party) claim the sort of prerogatives long exercised by decision makers in Jerusalem (regardless of party). Although this “Israelification” of U.S. policy may have proven beneficial for Israel, it has not been good for the United States, nor will it be.
Credit Israeli statesmen with this much: they express openly views that American statesmen hide behind clouds of obfuscation. Here, for example, is Israeli Prime Minister Benjamin Netanyahu in June 2009 describing what he called his “vision of peace”: “If we get a guarantee of demilitarization . . . we are ready to agree to a real peace agreement, a demilitarized Palestinian state side by side with the Jewish state.” Now, the inhabitants of Gaza and the West Bank, if armed and sufficiently angry, can annoy Israel, though not destroy it or even do it serious harm. By any measure, the Israel Defense Forces (IDF) wield vastly greater power than the Palestinians could possibly muster. Still, from Netanyahu’s perspective, “real peace” becomes possible only if Palestinians guarantee that their putative state will forgo even the most meager military capabilities. Your side disarms, our side stays armed to the teeth; that in a nutshell describes the Israeli prime minister’s conception of peace.
Netanyahu asks a lot of Palestinians. Yet however baldly stated, his demands reflect long-standing Israeli thinking. For Israel, peace derives from security, which must be absolute and assured. Security thus defined requires not military advantage but military supremacy.
Given the importance that Israel attributes to security, anything that threatens it requires anticipatory action, the earlier the better. The IDF attack on Iraq’s Osirak nuclear reactor in 1981 provides one example. Israel’s destruction of a putative Syrian nuclear facility in 2007 provides a second; its 2013 attack on a Syrian convoy allegedly delivering weapons to Hezbollah offers a third.
Yet alongside perceived threat, perceived opportunity can provide sufficient motive for anticipatory action. In 1956 and again in 1967, Israel attacked Egypt not because its leader, the blustering Colonel Gamal Abdel Nasser, possessed the capability (even if he proclaimed the intention) of destroying the hated Zionists, but because preventive war seemingly promised a big Israeli payoff. In the first instance, the Israelis came away empty-handed; their British and French allies caved in the face of pressure imposed by an angry President Dwight D. Eisenhower, and Israel had no choice but to follow suit. In 1967, Israelis hit the jackpot operationally, albeit with problematic strategic consequences. Subjugating a substantial and fast-growing Palestinian population that Israel could neither assimilate nor eliminate imposed heavy burdens on the victors.
Adherence to this knee-to-the-groin paradigm has won Israel few friends in the region and few admirers around the world (Americans notably excepted). The likelihood of this approach eliminating or even diminishing Arab or Iranian hostility toward Israel appears less than promising. That said, the approach has thus far succeeded in preserving (and even expanding) the Jewish state: more than sixty years after its founding, Israel persists and even prospers. By this rough but not inconsequential measure, the Israeli security concept, nasty as it may be, has succeeded.
What’s hard to figure out is why the United States would choose to follow Israel’s path. A partial explanation may lie with the rightward tilt of American politics that began in the late 1970s, affecting the way both Republicans and Democrats have approached national security ever since. Among hawks in both parties, Israel’s kick-ass pugnacity struck a chord. As a political posture, it can also win votes, as it did so memorably for Ronald Reagan, campaigning for the presidency back in 1980, in the midst of the Iran hostage crisis. As a presidential candidate, Reagan not only promised unstinting support for Israel but also projected a “take no guff” attitude that came right out of that country’s political playbook. The contrast with Jimmy Carter, who was seemingly taking a lot of guff from abroad, could hardly have seemed starker. That Reagan proceeded to trounce Carter was a lesson not lost on candidates in subsequent elections.
Over the course of the Bush/Clinton/Bush/Obama quarter century, following in Israel’s path describes precisely what Washington has done. A quest for global military dominance, pursued in the name of peace, and a proclivity for preemption, justified as essential to self-defense, pretty much sum up America’s present-day MO.
Israel is a small country with a small population and no shortage of hostile neighbors. The United States is a huge country with an enormous population and no enemy within several thousand miles of its borders (unless you count the Cuban-Venezuelan Axis of Ailing Autocrats). Americans have choices that Israelis do not. Yet in disregarding those choices, the United States stumbled willy-nilly into an Israel-style condition of perpetual war—with peace increasingly tied to unrealistic expectations that adversaries and would-be adversaries will comply with Washington’s demands for submission.
Israelification got its kick-start with George H. W. Bush’s Operation Desert Storm, that triumphal Hundred Hour War likened at the time to Israel’s triumphal Six Day War. As we have noted, that victory fostered illusions of the United States exercising, perpetually and on a global scale, military primacy comparable to what Israel has enjoyed regionally. Soon thereafter, the Pentagon announced that it would settle for nothing less than what it termed full spectrum dominance.
Bill Clinton’s contribution to the process was to normalize the use of force. During the several decades of the Cold War, the United States had resorted to overt armed intervention only occasionally. Although difficult today to recall, back then whole years might pass without U.S. troops being sent into harm’s way. During Clinton’s two terms in office, however, intervention became commonplace.
The average Israeli had long since become inured to reports of IDF incursions into southern Lebanon or Gaza. Now the average American became accustomed to reports of U.S. troops battling Somali warlords, supervising regime change in Haiti, or occupying the Balkans. Yet the real military signature of the Clinton years came in the form of air strikes. Employing bombs and missiles to blast targets in Afghanistan, Bosnia, Serbia, and Sudan, but above all Iraq, became the functional equivalent of Israel’s reliance on airpower to punish “terrorists” while avoiding the risks and complications of putting troops on the ground.
In the wake of 9/11, George W. Bush, along with Secretary of Defense Rumsfeld, a true believer in full spectrum dominance, set out to liberate or—take your pick—pacify the Islamic world. The United States followed Israel in assigning itself the prerogative of waging preventive war. Although depicting Saddam Hussein as an existential threat, the Bush administration also viewed Iraq as an opportunity. By destroying his regime and occupying his country, the United States would signal to other recalcitrants the fate awaiting them should they mess with or defy Uncle Sam.
More subtly, in going after Saddam, Bush was tacitly embracing a long-standing Israeli conception of deterrence. For the United States during the Cold War, deterrence had meant conveying a credible threat (the prospect of nuclear retaliation) to dissuade its opponent (the Soviet Union) from hostile action. Israel had never subscribed to that view. Influencing the behavior of potential adversaries required more than signaling what Israel might do if sufficiently aggravated or aggrieved; influence was exerted through punitive action, ideally delivered on a disproportionate scale. Hit the other guy first, if possible. Failing that, whack him several times harder than he hit you; not the biblical injunction of an eye for an eye, but both eyes, an ear, and several teeth, with a kick in the groin thrown in for good measure. The aim of these “retribution operations” was to send a message: screw with us and this will happen to you. This message Bush intended to convey when he ordered the invasion of Iraq in 2003.
Unfortunately, Operation Iraqi Freedom, launched with all the confidence that had informed Operation Peace for Galilee, Israel’s equally ill-advised 1982 incursion into Lebanon, landed the United States in an equivalent fix. Or perhaps a different comparison applies: the U.S. occupation of Iraq triggered violent resistance similar to that which the IDF faced as a consequence of its occupation of the West Bank. Two successive intifadas gave the Israeli army fits. The insurgency in Iraq (along with its Afghan sibling) gave the American army fits, too.
Neither the Israeli nor the American reputation for martial invincibility survived these encounters. What did survive was Washington’s belief, imported from Israel, in the efficacy of anticipatory action. The United States now subscribed to the view that the key to successful self-defense was to attack the other guy first. The dictum that force should be a last resort might still apply to others, but after 9/11 it no longer applied to the United States. Nothing that actually occurred in Iraq, the first large-scale application of the Bush Doctrine of preventive war, altered that conviction.
While he was in office, George W. Bush adamantly rejected the arguments of those who characterized the intervention in Iraq as a mistake or a failure. Senator Barack Obama had forthrightly labeled the war a mistake of the first order, a position that became the foundation for his run for the presidency in 2008. Yet when the war became his, President Obama proved less inclined to criticize its conduct. By the time it finally ended in 2011, Obama was spinning the mission in Iraq as anything but a flop or a fiasco. Speaking at Fort Bragg, North Carolina, he described the war as “one of the most extraordinary chapters in the history of the American military,” culminating in “an extraordinary achievement, nearly nine years in the making,” to wit, the emergence of “a sovereign, stable and self-reliant Iraq.”
At best half-truths tailored to suit Obama’s soldier audience, his claims nonetheless meshed with the inclinations of Americans unwilling to face a painful truth: that the culmination of the Iraq War had yielded a verdict almost identical to that of Vietnam. As in Vietnam, U.S. forces had not succumbed to outright defeat. Yet whatever solace Americans might take from that fact paled in comparison with the massive policy failure the United States had again suffered.
Characterizing the Iraq War as a success, however tenuous or qualified, undoubtedly made that disaster easier to swallow, while also shielding the Bush Doctrine of preventive war from critical scrutiny. On that score, Obama expressed no regret and apparently harbored no second thoughts. Bush’s successor was not about to forfeit the option of striking first. In that regard, the Gates Doctrine—no more land invasions in the Middle East or Asia—may have amended but did not revoke the Bush Doctrine. Gates had no wish to limit U.S. freedom of action in employing force. He merely pointed to the need to devise different techniques for doing so. Like Israel, when it comes to “anticipatory defense,” the United States shows no signs of relinquishing its self-proclaimed entitlement.
The Israelification of U.S. policy also extended to the means employed in waging war. As its own preferred approach to preventive action, Israel had for decades relied on a powerful combination of tanks and fighter-bombers. In more recent times, however, it has sheathed its swift sword in favor of the knife between the ribs. Why deploy lumbering armored columns when a missile launched from a single Apache attack helicopter or a bomb fixed to an Iranian nuclear scientist’s car can do the job more cheaply and with less risk? Thus has targeted assassination eclipsed conventional military methods as the hallmark of the Israeli way of war.
Here too, lagging behind by a couple of decades, the United States has conformed to Israeli practice. By the time Barack Obama succeeded Bush in 2009, most Americans (like most Israelis) had lost their appetite for invading and occupying countries. Yet as a political rallying cry, “Never Again Iraq” no more foreshadowed a dovish turn in U.S. policy than had “No More Vietnams.”
Nobel Peace Prize laureate or not, Obama had no intention of forfeiting the expanded latitude to use force bequeathed to him by his predecessor. Yet to maintain his freedom of action—affirming that war was the commander in chief’s business, with others invited to butt out—Obama needed to avoid the mistakes that had tripped up Bush. By reducing the likelihood of costly quagmires, the Israeli approach again offered an attractive model.
With this in mind, Obama demonstrated a keen preference for small-scale operations rather than big wars, quick strikes rather than protracted campaigns, actions conducted in secret rather than under the glare of publicity. The Washington Post columnist Michael Gerson got it right. “Obama wants to be known for winding down long wars,” he observed. “But he has shown no hesitance when it comes to shorter, Israel-style operations. He is a special ops hawk, a drone militarist.”
With his affinity for missile-firing drones, the president established targeted assassination as the very centerpiece of U.S. national security policy. With his predilection for commandos, he expanded the size and mandate of U.S. Special Operations Command, which under Obama maintained an active presence in some 120 countries.
That Obama should further the Israelification of U.S. policy was not without irony. After all, personal relations between the president and his Israeli counterpart, Prime Minister Netanyahu, were famously chilly. Yet in Yemen, Somalia, the frontier regions of Pakistan, and other far-flung places, Obama showed that when it came to using force, he and Netanyahu occupied the same page. Both were committed to the proposition that if you keep whacking bad guys long enough, a positive outcome should eventually ensue.
In the meantime, for the president, the downside of targeted assassination appeared minimal. True, from time to time an errant U.S. missile might kill the wrong people (to include children) or American commandos might “take out” some bystanders along with Mr. Big. Yet back home, reported incidents of this type elicited a muted response. As far as the American media were concerned, the death of a few nameless Somalis or Pakistanis carried about as much newsworthiness as a minor traffic accident. As a determinant of presidential standing, a U.S. fighter-bomber inadvertently wiping out an Afghan wedding party lagged far behind a slight uptick in the unemployment rate.
The government of Israel (along with ardently pro-Israeli Americans like Michael Gerson) may view the convergence of U.S. and Israeli national security practices with some satisfaction. Washington’s now-prevailing definition of self-defense—a self-assigned mandate to target anyone anywhere thought to endanger U.S. security—is exceedingly elastic. As such, it provides a certain cover for equivalent Israeli inclinations. And to the extent that the American roster of enemies overlaps with Israel’s—Iran providing an obvious example—the hope always remains that military action ordered by Washington just might shorten Jerusalem’s “to do” list.
Yet where does this all lead? “We don’t have enough drones,” writes the columnist David Ignatius, “to kill all the enemies we will make if we turn the world into a free-fire zone.”18 And if Delta Force, the Green Berets, Army Rangers, Navy SEALs, and the like constitute, in the words of one SEAL, “the dark matter . . . the force that orders the universe but can’t be seen,” we probably don’t have enough of them either.19 Unfortunately, the Obama administration has seemed willing to test both propositions.
From the Book BREACH OF TRUST: How Americans Failed Their Soldiers and Their Country by Andrew J. Bacevich. Copyright © 2013 by Andrew J. Bacevich. Reprinted by arrangement with Metropolitan Books, an imprint of Henry Holt and Company LLC.