Heroes, Outlaws & the Heretics of Corporate Change

Executives, it is said, are too busy to read. Instead, they skim. They spend their precious reading time -- on the train, on hold during a phone call, or under the porch light while their children sleep -- panning through hyped-up, ponderous management literature, searching relentlessly for any practical nuggets. Give them something they can use, on the shop floor or in a strategic plan. Give them something that can solve their problems, or make their efforts succeed, or let them go home early tonight.

This is not a book of nuggets, but a story of management history. To read it, if I've done my job correctly, is to drift down the broad stream of business thought and activity -- particularly the rebellious and innovative parts -- from the past fifty years. En route, you will relive the intellectual influences and emotionally charged controversies of managerial culture since World War II. Beginning in the late 1950s, a growing number of heretics emerged in the dominant institutions of our time -- mainstream, publicly held, large multinational corporations. These were people within the firm who saw a truth that ran against its prevailing attitudes. They saw how, despite the power of corporate practice, something desperately desirable had been lost in everyday corporate life: A sense of the value of human relationships and community. They saw how, without that human spirit, corporations could not perform.

In 1959, if you were a college student positioned among the best and brightest of your peers, you might have wanted to become an engineer or physicist, and contribute to America's landing on the moon. In 1979, you would set your sights on becoming an investment banker, and a millionaire by 40. But in 1969, there were only two appropriate choices: To be an artist, or to save the world. In both cases, like the genteel aristocrats of the British Empire, you weren't really supposed to think about money, overt status, or even achievement. Instead, as the mythologist Joseph Campbell began to tell his students in lectures, you were supposed to "follow your bliss" -- follow the goals you felt called upon to pursue.

In the separate societies of Berkeley, Cambridge, the East Village, Austin, the Haight, the French Quarter, Ann Arbor, and college towns everywhere, the bottom dropped out of the primary currencies of the business world. Money had no cachet; security was for frightened conformists. Planning for the future -- making the right decision, cultivating the right career -- was meaningless. It meant consigning yourself to a life in prison: a life of making choices based on what someone else would tell you to do, gradually internalizing that tyrannical authority in your own mind.

To people who complained, in effect, "Stop the world; I want to get off," the counterculture replied, in effect, "You can." Millions of young people opted out of industrial society. They supported themselves on odd jobs and allowances from their parents. They lived three, four, or six to an apartment, choosing poverty because it made them free, and because there was no loss of dignity in it. If you were bright, poverty was exalting. It meant (though nobody used this language) that you had found a way to re-embrace the vernacular spirit that society had tried to discard. You could live in a utopian sphere, where everyone was an aristocrat. You did not commute, learn to balance a checkbook, or buy a business suit.
You experienced life at each moment, and felt no need to poison the quality of the moment by preparing for the future. More often than not, your life was wildly creative and inventive, spurred not just by drugs, but by camaraderie and free time.

Counterculture people could afford to take their stance precisely because living was easy. Rents were cheap; the parks, the streets, and the love of fellow members of the tribe were all available for nothing. As Herman Kahn noted, it cost about $500 per year to live as a hippie in 1968, which meant that twelve young people could share a large house, each working one month a year in the post office and taking eleven off. (Nobody actually did this; but it gave Kahn a nice, mythically apt image to use in speeches.) In the heyday of the hippie era, suffused with vernacular spirit, people acted as if sharing with each other, even with strangers, was the most natural thing in the world. "Free" stores and clinics opened in neighborhoods like the Haight-Ashbury of San Francisco or Cambridge, Massachusetts, distributing food, used clothing, emergency medical care, and dense, mimeographed broadsides about the coming apocalypse. They also took time to talk. The counterculture was rife with groups, clubs, cells, meeting grounds, and mutual enterprises -- but devoid of authority. Living in the counterculture, in a sense, was like living in a great, community-wide T-Group. Nothing was decided except through drawn-out dialogues and consensus sessions. Anyone was, in theory at least, capable of raising an objection or changing the flow of activity. The hippies were as intensely Pelagian as the academics of the National Training Laboratories, but their Pelagianism came straight from the heart. They lived and breathed it, without having to cultivate a particular environment for it.

Admittedly, those who had money and success sometimes felt as if they were carrying the burden of the tribe. Rock musicians with record contracts felt under constant pressure to distribute their largesse. The music was an expression of the crowd flowing through them, so didn't the rewards from that music belong to everyone? The San Francisco concert promoter Bill Graham often found himself in shouting matches with young, would-be concert-goers who wanted him to stop charging money for music. "What does your father do? Is he a plumber? Well, I want my pipes fixed for free! Is he a baker? Well, I want my bread for free!"

In retrospect, the startling thing about the counterculture was the disconnect between production and consumption. How could so many smart people fail to see a link between the work people did, the investments they made, and the rewards they gained? Only a very few students understood the issues of infrastructure, and the industrial economy, any better. Most young people had this blind spot, perhaps, because they had lived their entire lives in a world dominated by the giant, post-war, all-nurturing corporations -- institutions that separated work and sustenance. Daddy produced; Mommy consumed. Children never went to the office, for there was nothing resembling day-care there, even in offices with women managers.
("Someday," the writer Betty Harragan would later point out, "reactionary management may understand that the poor 'image' of business in the public opinion polls will never be successfully counteracted as long as children in their most impressionable, formative years grow up with the idea that a business work site is an unfriendly, repelling, and sinister place which refuses to accept them until they are over twenty years old.") Instead of going to work, children of the 1950s and 1960s went to school, where they learned that the rewards for their labor and achievement were never tangible. The reward was an empty grade, a classification of a rank against your peers. What was the point of living a life measured by that?At first, it seemed as if the counterculture would do without the market entirely, and produce everything they needed themselves. Some tried; self-sufficiency was an organizing principle in many of the communes of the late 1960s and early 1970s. But as historian Fernand Braudel has pointed out, existence without a market is "the lowest plane of human existence, where each man must himself produce almost all he needs." Only a few communards who tried that game, the purest of mystics and ideologues, managed to sustain themselves. The rest turned their communes into businesses (the Farm, in Summertown Tennessee, earned its income by publishing books about midwifery, ham radios, and their own experience). Or else they came running back to the city, creating institutions of their own.The first urban countercultural businesses emerged in the early 1960s. A woman who called herself Magnolia Thunderpussy opened a restaurant at Haight and Stanyan Streets in San Francisco in 1962. It was designed whimsically and flamboyantly, so that every meal was a theatrical experience. By the mid-1960s there were restaurants serving cheap, whole-grain food; often, the bulk of their clientele were regulars whose hair was too long, and clothes were too ragged, to fit in at ordinary restaurants. Other hippies, traveling to Katmandu or Amsterdam, brought back the fruits of the life they found there -- for sale, to local boutiques.Counterculture people often learned the rudimentary skills of marketing and management from drugs: by dealing marijuana and LSD. They'd give away a free joint to a new customer: a time-honored sales promotion technique. Like the bootleggers of the 1920s, they had to learn the intricacies of bookkeeping, or else they would get swindled by their suppliers -- or hurt by their customers. Around 1969, organized crime moved into the lucrative narcotics trade of neighborhoods like the Haight-Ashbury and the East Village. The hippies stepped away from selling dope; they parlayed their account-keeping and budgeting knowledge into new lines of work, unabashedly aiming their goods and services at the members of the "tribe." By the early 1970s, most mainstream services had counterculture equivalents. There were clothing stores which mixed new and used clothing, head shops which sold posters and drug paraphernalia, record and music stores of every description, food coops with organic grains that customers scooped into their own jars from large wooden barrels, and there were publishers of magazines and books, rooted in rock and culture, which the mainstream publishers didn't imagine would sell.At first, these new forms of commerce lacked infrastructure. Counterculture business people rarely knew about each other; they rarely knew how to keep going. 
The large-scale information in conventional business publications -- The Wall Street Journal and Fortune -- was worse than useless to them. It suggested that the only way to succeed was to professionalize yourself, to enter school. As an alternative to that horrible fate, the counterculture began to invent its own infrastructure. Instead of slipping through the cracks in the mainstream, they widened the cracks into an extensive surface. Gradually, this alternative infrastructure (which, after all, depended on the mainstream shipping and telecommunications lines) infiltrated and assimilated its way into American business culture.

***

It would have been startling, back in 1964 or 1965, to see how the decade of the 1960s is regarded by management writers today. In our eyes now, the 1960s were a doomed golden age, in which managers and labor (in particular) could feel cushioned by years of peace and a booming economy. Even former counterculture people look back on those years as an intensely protected period, a time when they felt secure enough to reinvent their lives. But it didn't seem so cushioned at the time. People writing about the 1960s at the time seemed, almost without exception, to take an apocalyptic tone, as if the world were about to crash and burn itself out of existence at any moment.

The only thing that people seemed unprepared for was what actually happened: twenty-five years of steady, anxious turbulence. This was the curse of the Jacques de Molays of the 1960s: The heretics who had been scapegoated and ignored, either by the corporations who had hired them, or by the apocalyptic critics of the corporations. As a nation, we were prepared for the collapse of capitalism, or its hegemony -- but not for the kind of rolling, choppy, uncertain economic growth that struck different components of society in turn with prosperity and calamity, so that no component could ever remain secure. We were prepared for a battle over the direction of government, but not for an intensely pluralistic society, in which government was no longer the primary engine of governance, having ceded that role to corporations and interest groups. We were prepared for race war, but not for global interconnectedness, in which economies were held in thrall to the imperatives of bond and currency markets. We were prepared for giant corporations to become public enemies, but not for them to adopt an ambiguous role as both public enemy and social contributor.

Most of all, we were not prepared for the speed of transactions to accelerate once again. In the 1400s, a typical business transaction might take 11 years to complete. Families dominated business, simply because business moved too slowly for individuals to master it. In the 1930s, a transaction took a week, or a quarter, or perhaps a couple of years to complete, and individuals created their own large-scale enterprises. They threw away the vernacular ties of family and community. Now, in the 1970s and 1980s, we lived in a world where a transaction took only a few seconds to complete; or, if it took more than a few months to fulfill an order, that was a sign that something was wrong. As a culture, we were not prepared for the ways in which this speed-up would allow the vernacular spirit, the spirit of the counterculture, back inside the belly of the industrial beast.

Everywhere, the rules of the game changed. First to fall was the sense of legitimacy which business leaders enjoyed.
They had been criticized from the left, sure, but they had always been respectable in the mainstream -- they had, after all, defined the mainstream. Then came the energy crisis of 1973-1974. The easiest villains to pillory, amidst the resulting search for scapegoats, were oil and energy companies -- and, implicitly, the profligate economy that oil companies encouraged. Washington's ascetic consumer advocate, Ralph Nader, began accusing the oil companies of fraud. None of them had increased their refining capacity during the earlier shortages of 1972. Had they deliberately held back their capacity -- creating an illusory crisis -- to force up the price? Were they trying to squeeze out smaller, independent distributors, who sold gasoline without a monopoly's brand name? Or was the oil shortage a blatant effort to undermine environmentalist opposition? Most damningly, nearly all the oil companies announced dramatic profit jumps -- in many cases, record-level net earnings -- in the last quarter of 1973.

Unfortunately for their own cause, the senior executives of American oil companies were all terrible spokesmen. Early in 1974, Senator Henry Jackson subpoenaed them to testify, in televised hearings, before the special investigations subcommittee of the Senate. The executives spent the sessions staring stoically at Jackson, stiffly enduring his charges of "obscene profits," and offering only hazy answers to his questions. The only candid executive at the Senate hearings was Harold Bridges, the president of Shell Oil -- Royal Dutch/Shell's isolated American subsidiary. Bridges wryly pointed out that he and the others would be much more cogent and forthcoming if the Senators interviewed them separately, instead of in a lineup with their chief competitors. The public might think of them lumped together, as a conspiring cartel. ("They're like Siamese twins," Nader said in December. "They don't have to meet furtively. They know exactly what they are doing without meeting.") But the executives themselves were all excruciatingly aware of their long-standing rivalries and differences, as well as the anti-trust laws which kept them from communicating freely.

The calumny trickled down, in various ways, to the middle-level oil supply managers who had actually handled the pressure of the oil crisis. They had worked around the clock, performing miracles with computers and shipping schedules, sneaking oil into the countries most hurt by the embargo, deciding (on the spur of the moment) which contracts to defer, and countering government threats and protests wherever they denied supply. "We were being attacked in the press every day," the head of Gulf oil supply later told Daniel Yergin. "We had to... tell old friends that we were cutting them back, and go around the world explaining the supply/demand balance."

The more heroic their efforts, the more unsung they were -- even in their own companies. In the macho oil culture of Houston and Rockefeller Center, you were supposed to keep your travails secret, and never blow your own horn. Senior oil company executives might have forestalled some of the public protest if they had explained exactly where, and how, they intended to reinvest their windfall profits. But oil men didn't feel like opening up publicly. They felt that the public (and particularly the press) had put them under siege.

By that time, other industries were facing similar crises of confidence.
They were startled to discover, in the post-Watergate and post-Nader era, that their own hidden practices would no longer get a free ride. It came to light that food-packing companies sold cyclamate-laden fruit abroad after the Food and Drug Administration banned it in the United States. It was discovered that McDonnell Douglas managers suppressed their own engineers' warnings of cargo door defects, and (in the public interpretation) thus murdered 346 people in a plane crash near Paris. In early 1975, a 53-year-old man named Eli Black smashed his office window with an attaché case, and leaped 44 floors to his death on a Manhattan street. He was the president of the United Fruit Company. Within a few days came the revelation of a $2.5 million bribe that United Fruit executives had given the government of Honduras. The bribe was particularly damaging because the managers had listed it as a business expense on United Fruit's tax forms -- an equivalent, perhaps, to listing the expenses from dealing cocaine as a tax deduction.

"Nobody gives a damn about us," said a CEO at the Conference Board meetings. "Not the government, not the consumers, not our workers." Executives, used to the adulation of the people who worked for them, couldn't understand what people who looked at corporations from the outside saw: the trappings of hierarchy and stiff competition for perks, the peculiarly sensual rapport between executives and their secretaries, the fear that pervaded every conversation (even senior executives were terrified of cutting loose with what they really thought), and the results: Shoddy products, pollution, and planned obsolescence. They blamed university professors, who'd never had to meet a payroll. They blamed the so-called "public-interest" lawyers (like Ralph Nader) who proclaimed idealistic aspirations; after all, they knew first-hand about the idealistic aspirations of their own lawyers. (At one meeting, the CEOs considered a suggestion to recoup their lost honor by writing up a code of ethics for themselves. "You mean," snapped a voice across the room, "one that would work as well as the Bar Association's?") They blamed the environmental movement and the sinister influence of books like Limits to Growth.

Most of all, they blamed the press. They had never lived in a climate where business journalism was so routinely harsh to private interests. They had never seen harsh news stories cause so many stock prices to plummet. Why couldn't reporters write those glowing paeans to capitalism and progress that the CEOs remembered from their own college days? The reason, they were sure, was profit. Skewering big business sold newspapers. Rawleigh Warner, Jr., the chairman of Mobil Oil, grumbled in a business newsletter article that while Texaco's net earnings had gone up 57% between 1970 and 1973, The Washington Post's net income had risen 160%.

Of course, if a man like Warner had invested any of his analytical skills in studying the newspaper business, he would have realized that the surges in the Post's profit had more to do with the decline of its rival, the Washington Star, and with its own expansion into the suburbs, than with any of its anti-business exposés. Nor did most CEOs have any concept of the real forces that drove the media's anti-business attitudes. They didn't understand the culture of writers and creative people, who had always been the outsiders in their dealings with media managers. In many interviews with a news reporter, a tough businessman, feeling put on the spot, would follow his instincts.
"You know," he would say, "if you print that, it will end up hurting the interests of your own newspaper's corporation." That was a standard tack in the business world -- a man-to-man reminder of the realities of the larger picture. But to reporters, who revered the "Chinese Wall" which protected the editorial staff from interference from advertisers, that sort of remark smacked of intimidation. They retaliated by letting more and more cynicism creep into their stories.By all conventional standards, the cynicism should have eased up after 1975. Price controls had been lifted. Richard Nixon had resigned. His successor, Gerald Ford, had done no visible harm to the economy. Even the OPEC leaders had relented a bit; they had quietly withdrawn the oil embargo, and the price had slipped back down, closer to normal. Yet the rapids had continued. If anything, inflation and recession had gotten worse. Consumer demand had fallen rapidly, in part thanks to the cumulative effect of scare stories, and in part thanks to unofficial tax that OPEC's surcharges placed on just about every purchase. The auto industry, building and construction, and iron and steel -- the industries paying the highest wages -- suffered the most intense declines. For the first time since World War II, profits and returns failed to live up to the levels that managers had expected all their lives. Since a manager whose reality did not match his projections was a manager who had lost control, a generation full of corporate people suddenly found themselves facing the visage of a trickster in their balance sheets.It was the last place they expected to find one. Managers ran through every trick they knew, every conventional device for controlling purchasing, operations, or marketing, and every approach that made sense, and yet the balance sheet projections refused to play along. The numbers even failed for the most prominent masters of numbers management. As the world spun further out of control, the fascination with control in conventional management circles intensified. Ostensibly, the goal of managers was still to return value to shareholders. But to anyone who looked closely, a more dominant "theory-in-use" was guiding managers' behavior. If they could make it through the "rapids" and come out the other side (they believed), there would be clear sailing again. The sooner they found the right hook, switch, or technique, the sooner they could return to the orderly, predictable world where they knew how to function, where they experienced the joy of effortlessly being winners at their work. From the bottom of their hearts, they did not want to hear that the rapids were here to stay. ***When our intuitions fail, we yearn for strategy. Thus, in the mid-1970s, the idea of building a corporate strategic plan, with its echoes of Napoleon, von Clausewitz and Sun Tzu, evolved into an irresistible management fashion. Simultaneously a heretical fad and the epitome of conventional management wisdom, the discipline of strategic planning had its roots in business school. But the most significant progenitor of the idea was a renegade engineer-turned-consultant named Bruce Henderson, the founder of the Boston Consulting Group. BCG was second after McKinsey -- not in size, but in the more important figure of revenues per consultant. BCG's young Harvard and Stanford MBA's were paid almost as much as McKinsey's young MBA's, and they fit the same mold: clean-cut, sharp, highly analytical, and willing to work impossible hours. 
Within that mold, however, Henderson encouraged a kind of quirkiness, even rebelliousness -- he himself was not an MBA, but a former engineer -- that made BCG a little more interesting to work for.

Henderson began his career as a heretic himself; in the mid-1960s, he recognized, when nobody else did, the value of learning to a company's bottom line. He unearthed a discovery called the "learning curve." It dated back to Curtiss Aircraft in the 1920s: a factory manager discovered that when a work team stayed on the line long enough to double their experience, costs dropped by 20 percent. During the last three years of World War II, Curtiss used the learning curve to design a remarkably effective and inexpensive production schedule. But once the War's production demands ended, the learning curve fell out of fashion.

Then Bruce Henderson resuscitated it. In the mid-1960s, he helped Texas Instruments use it to miraculously keep cutting the prices of its semiconductor chips and electronic calculators. As prices fell, more people bought the devices, which pushed production levels up, which meant that TI learned still more about production -- which dropped costs still further. Henderson renamed it the "experience curve." He said it was as real as the law of gravity, and he proved that by plotting the curve for client after client, in industries ranging from beermaking to turbines to farming to retail stores to polyvinylchloride chemistry. He also discovered that the learning curve and market share fed each other. When a company dominated its competitors, it produced more products, which sent it faster along the experience curve. The rich really did get richer in Henderson's world, because being rich brought you more experience.
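The arithmetic behind that claim is simple enough to sketch. The snippet below is only an illustration of the 20-percent-per-doubling rule described above; the function name and the $100 starting cost are hypothetical, not figures from Henderson or BCG.

```python
import math

def experience_curve_cost(first_unit_cost, cumulative_units, learning_rate=0.80):
    """Unit cost on an experience curve: each doubling of cumulative
    output multiplies cost by the learning rate (0.80 here, i.e. the
    20 percent drop per doubling described above)."""
    exponent = math.log(learning_rate, 2)   # about -0.32 for an 80% curve
    return first_unit_cost * cumulative_units ** exponent

# Illustration only: a $100 first unit falls to about $80 after 2 units,
# $64 after 4, and $51 after 8 units of accumulated experience.
for units in (1, 2, 4, 8):
    print(units, round(experience_curve_cost(100.0, units), 2))
```

The compounding is the point: whoever accumulates volume fastest rides the curve down first, which is why Henderson tied it so tightly to market share.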
While his sharp young protégés wandered around the world teaching the experience curve, Henderson began a range of studies in the early 1970s. He developed a simple management framework that made it possible for any manager to think like Harold Geneen. Henderson would have been horrified at the comparison, but it was apt. The tool became popular around 1973 -- ironically, just as Geneen's ITT was sliding downhill -- and it rapidly became a fixture in business decision-making practice. It was called the "growth-share" matrix.

To use the matrix, you thought of your business as a portfolio of separate product lines. Every one of them fit into one of the matrix's four compartments. Its placement depended on how fast the market was growing (for instance, the market for American photocopiers was growing fast, while the number of American car-buyers was stagnant), and on the amount of market share (for instance, in those years General Foods had the dominant share of the market for "semi-moist" dog food like Gravy Train). Ultimately, every product line had one of two destinies, symbolized in the bottom two quadrants. Either it would mature into a cash cow, yielding rewards to the bottom line for years or decades, or it would become a dog -- hanging on tenaciously, barking for attention, and draining the vitality of the organization.

Thus, the job of a manager was controlled ruthlessness. A good portfolio manager shunted investment among the cash cows, stars, and question marks, according to the rules that the matrix implied. It was worth borrowing money to keep a star shining, because a star might end up dominating its market niche. Was your product a cash cow, earning a rate of return on assets of at least 25%? Then milk it -- keeping your investments carefully monitored, so you could draw cash out of it for years. Investing in a question mark required care. It might be a great opportunity, or it might simply mean throwing money into a business which would never dominate its industry enough to jump onto the learning curve. As for the dogs, Henderson said, discard them -- unsentimentally and fast.
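In effect, the matrix reduced portfolio strategy to a two-way classification, something like the sketch below. The growth and share cutoffs here are assumptions chosen for illustration, not BCG's published thresholds.

```python
def growth_share_quadrant(market_growth_rate, relative_market_share,
                          growth_cutoff=0.10, share_cutoff=1.0):
    """Classify a product line on a growth-share matrix.
    Cutoffs are illustrative assumptions: 'high growth' means a market
    growing faster than 10% a year, 'high share' means holding more
    share than the largest competitor."""
    high_growth = market_growth_rate > growth_cutoff
    high_share = relative_market_share > share_cutoff
    if high_growth and high_share:
        return "star"           # worth borrowing to keep it shining
    if high_growth:
        return "question mark"  # invest with care, or walk away
    if high_share:
        return "cash cow"       # milk it for years
    return "dog"                # Henderson's advice: discard it, fast

# A fast-growing market in which you trail the leader:
print(growth_share_quadrant(0.25, 0.6))   # -> "question mark"
```

Part of the matrix's appeal was exactly this mechanical quality: two numbers in, one verdict out, with no room for sentiment about the people attached to the product line.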
(P&G, by now, was implementing its technician system, still ruthlessly kept secret, in plants throughout Europe and South America.) Before anything more could happen, the message of the heretics would have to be heard in the central bastions of the numbers -- in places like the Harvard Business School.In 1979, two members of the operations management department at Harvard Business School -- a section of the school that had been gradually losing status, through the 1960s and 1970s, to the finance-oriented departments -- found themselves thrown together at Vevey, Switzerland for the summer. They were William Abernathy, the HBS resident expert on auto manufacturing, and Bob Hayes, known for applying operations research to the assembly line. Both men were in their late forties; both had backgrounds in industry (Dupont/General Dynamics, and IBM, respectively) before coming to Harvard. Both were considered conventional, easygoing men -- neither would have been considered a heretic (although Abernathy was known for his rapid-fire, stream-of-consciousness, wide-ranging conversation). Hayes had just spent two years researching the differences in management styles between European and American multinationals. He had the idea that, since Europeans tended to speak more languages, they should have an easier time with cultural diversity; Americans, on the other hand, should clearly have the technological edge. But they didn't. Hayes visited a tiny machine tool manufacturer in southern Germany, a company of thirty or forty people. Sophisticated Americans barely understood computer-aided manufacturing software, but this firm was using it on a daily basis, and getting remarkably resilient at quickly producing custom-made tools. It wasn't just Germany; he began to see signs of sophisticated machine tools coming from Czechoslovakia, Switzerland, Hungary, and Japan. The more he looked, the lower his morale fell. Finally, during a class he was teaching to European businessmen, someone asked him why American productivity had declined so dramatically during the past ten years. He hauled out the standard answers that economists gave in those years: Organized labor, government regulations, the oil crisis, and the attitudes of the baby boom. His students looked at him with polite amusement. "We have all those factors here," one of them said, "and our productivity is increasing."Confused and shaken, Hayes began taking regular hikes with Abernathy, who had just arrived in Vevey. Abernathy was going through a similar set of shocks. He had come to compare the European auto industry with the Detroit Big Three. Seeing the same stagnation, they began comparing notes, and eventually settled on the only explanation that made sense to them: The core of the business "magic," the reliance on numbers that had made American business so powerful, was now hamstringing them. Consider, for instance, what happens when a manager becomes dependent on return-on-investment. Managers who used it as their primary yardstick would always give a higher ranking to the projects that got good ROI figures higher. Those projects would get the bulk of investment. In theory, that meant the company would prosper -- fewer "dogs," and more "stars" would be supported. (Hayes and Abernathy singled out the growth/share matrix as a particularly pernicious influence.) In practice, a dependence on ROI meant that risky initiatives, long-term projects, and anything driven by a manager's personal aspiration or care, would be shoved into the background. 
That October, Hayes wrote up their ideas in a lead article for the Harvard Business Review. The reaction was intense. It is still the most-requested reprint in the history of the Harvard Business Review, and it divided the business school. There was vitriolic criticism. Some came from economists who claimed there was no productivity decline, no loss of American competitiveness, no American malaise. And there was a great sigh of relief from a surprisingly wide group of managers, who said that yes, somebody had finally articulated what they had felt for the past few years, but hadn't been willing to say out loud. Hayes and Abernathy had tapped into a growing feeling of shame, in fact, among business school academics, who felt they had to take some of the blame for the virulent wave of hostile mergers that was beginning at that time. No doubt the authors could have taken their protests further, but Hayes and Abernathy were more concerned about validity and research than about making a public splash. The next popularizer of similar ideas didn't have that inhibition.

The greatest asset that heretics have, these days, is that there are so many of them. They exist in every organization, balancing the imperative to do good works with the imperative to keep their jobs and keep earning a living. They have been daunted, but not thoroughly disheartened, by the waves of "downsizing" that emerged in the last decade. Their greatest dream is to bring their work lives in tune with their personal hopes and dreams. They want to earn a paycheck, and yet accomplish their own goals. And they recognize that they can only do it by changing the systems of the world around them, one piece at a time. Perhaps a corporation exists, in the end, precisely for its heretics. Perhaps its purpose, in the long run, is to help people expand their souls and capabilities -- by giving them the venues within which to try things on a large scale, to succeed and fail and thereby change the world.
