I have this vision that because I own a dog, soon I will be legally prohibited from joining a union.
It's not quite that bad -- yet. But a series of decisions expected this summer from the Bush-appointed National Labor Relations Board (NLRB) could destroy the existing union memberships of millions of people in the U.S., and prevent any future unionization attempts by tens of millions more.
Under 1947's notorious union-busting Taft-Hartley Act, supervisors in the U.S. workforce are considered "management" and therefore have no legal right to unionize. The anticipated NLRB rulings on three disputes, collectively known as the Kentucky River cases, would allow employers in a wide array of industries to reclassify as "supervisors" any employee who has any type of oversight, no matter how fleeting, over a lower-ranked or less senior co-worker. Workers who take on apprentices. Lead men in manufacturing crews. Nurses who direct nurse aides.
And, well, I order our dog around. A lot. She seems to like it.
It really is just about that ridiculous. A nurse, for example, has no power to hire or fire, doesn't set schedules, can't mete out discipline. Yet under these rulings, she or he would be considered "management."
Even though no decisions have been issued by the NLRB, the anticipation of a favorable (for them) ruling already has employers chomping at the bit. AFL-CIO Organizing Director Stewart Acuff estimates the NLRB has already deferred about 120 cases in which union election results have been disputed; employers want to wait for the Kentucky River ruling to see if they can simply declare their workers ineligible to conduct unionization votes at all. In New Jersey, nurses with a common contract at twelve New Jersey hospitals threatened to strike before reaching contract language that guaranteed they wouldn't be reclassified as supervisors.
In a mid-June NLRB hearing, the Chief Nursing Officer of Virginia Mason Medical Center in Seattle, Charlene Tachivana, offered a novel argument against a suit demanding that the hospital negotiate with the nurses' union, the Washington State Nurses' Association, before implementing a contested policy. Tachivana argued that all 600 of Virginia Mason's RNs were in fact supervisors -- that they were able to hire, fire, and write policy, and therefore not eligible to be in the union at all. "That's ludicrous," says Virginia Mason nurse Jeaux Rinehart, noting that he and other nurses cannot hire, fire, or write policies. "She's trying to take attention from [the suit] with a much larger issue" -- namely, the anticipated Kentucky River rulings.
"You could see health care and hospital strikes across America if the decision is too broad," says Acuff. He estimates 300,000 nurses could be affected by the rulings, and up to 1.5 million other workers: "Team leaders and gang leaders in ports, lead men in mines, lead men in docks at manufacturing facilities and warehouses, engineers, people who oversee apprentices in trades....almost every senior worker does this to some extent." A study to be released this week by the Economic Policy Institute estimates that in a worst case scenario, up to eight million union workers could be disenfranchised. Such a ruling, covering two-thirds of America's unionized labor force, would not only decimate organized labor, but undercut much of the Democratic Party's organizing and get-out-the-vote infrastructure. Thus far, the controversy has gotten remarkably little media coverage.
The Bush administration has done this sort of thing before: changing the interpretation of regulations so as to effect sweeping changes without the bother of getting a controversial policy through Congress. It's used such tactics extensively, for example, to eviscerate environmental policies. It's happened in labor law, too. "Whenever they redefine a word, working people get screwed," says David Groves of the Washington State Labor Council, which is also organizing to stop the Virginia Mason reclassification.
Given the NLRB record under Bush, labor leaders are not optimistic that Kentucky River will turn out well. They are particularly exercised that in Kentucky River, as in all its other cases since Bush took office, the NLRB refused to hear oral arguments -- the practice in all previous administrations. "To be willing to consider a decision of this magnitude and not even be willing to consider the effect on working people is unconscionable," says Acuff. "This administration has been more antithetical to the rights of workers than any since Herbert Hoover."
The NLRB is expected to rule on the three cases, ironically enough, by Labor Day. Labor leaders want the public to ask members of Congress to, at minimum, force the NLRB to actually hold hearings on Kentucky River. This week, July 10-14, workers will demonstrate in cities across the country; details are at AFLCIO.org.
I'll bring my dog.
Dr. Martin Luther King, Jr., would have been 77 on Sunday. He has been dead for 38 years.
As his living memory fades, replaced by a feel-good "I have a dream" whitewash that ignores much of what he stood for and fought against, it is more important than ever to recapture the true history of Dr. King -- because much of what he fought against is resurfacing, or still with us, today.
King was, along with Mohandas Gandhi, one of the most internationally revered symbols of nonviolence in the 20th century. He spent his too-brief adult life defying authority and convention, citing a higher moral authority, and gave hope and inspiration for the liberation of people of color on six continents.
MLK, Jr. Day, the holiday, has devolved into the Mississippi Burning of third Mondays. What started out as gratitude that they made a movie about it has gradually become revulsion at how new generations of Euro-Americans mislearn the story.
King is not a legend because he believed in diversity trainings and civic ceremonies, or because he had a nice dream. He is remembered because he took serious risks and, as the Quakers say, spoke truth to power. King is also remembered because, among a number of brave and committed civil rights leaders and activists, he had a flair for self-promotion, a style that also appealed to white liberals, and the extraordinary social strength of the black Southern churches behind him. And because he died before he had a chance to be widely believed a relic or buffoon.
What little history TV will give us to commemorate his birthday is as much about forgetting as about remembering, as much about self-congratulatory patriotism -- that King was American -- as self-examination, that American racism made him necessary, and that government, at every level, sought to destroy him.
We hear "I have a dream"; we don't hear his powerful indictments of poverty, the Vietnam War, and the military-industrial complex. We see Bull Connor in Birmingham; we don't see arrests for fighting segregated housing in Chicago, or the years of beatings and busts before he won the Nobel Peace Prize. We don't hear about the mainstream American contempt at the time for King, even after that Peace Prize, nor the FBI harassment or his reputation among conservatives as a Commie dupe.
We don't see retrospectives on King's linkage of civil rights with Third World liberation. We forget that he died in Memphis lending support to a union (the garbage workers' strike), while organizing a multiracial Poor People's Campaign that demanded affordable housing and decent-paying jobs as basic civil rights transcending skin color. We forget that many of King's fellow leaders weren't nearly so polite. Cities were burning. We remember Selma instead.
And we forget that of those many dreams King had, only one -- equal access for nonwhites -- is significantly realized today. A half-century after the Montgomery bus boycott catapulted a 26-year-old King into prominence, even that is only partly achieved. Blacks are being systematically disenfranchised in our presidential elections, and affirmative action and school desegregation are all but dead. Urban school districts across the country these days are as segregated and unequal as ever, and the imminent confirmation of Samuel Alito to the U.S. Supreme Court likely heralds a new era where employers and landlords can discriminate with near-impunity.
But an even bigger problem, as a generation dies off and the historical memory fades, is that Dr. King has become an icon, not a historical figure (distorted or otherwise). History requires context; icons don't. The racism King challenged four and five decades ago in Georgia and Alabama was also dominant throughout the country. Here in Seattle, few whites know that history: the housing and school segregation, laws barring Asians from owning land (overturned only in the '60s), the marches downtown from predominantly black Garfield High School, police harassment of both radical and mainstream black activists, the still-unsolved assassination of a local NAACP leader.
Every city in America has such histories. We don't know the stories of the people, many still with us, who led those struggles. And we rarely acknowledge that the overt racism of Montgomery 1955 is no longer so overt, but still part of America 2006. It shows up in our geography, in our jails, in our schools, in our voting booths, in our shelters and food banks, in our economy, and in the very earnest and extremely white activist groups that often carry the banner on these issues.
If our cities were serious about his legacy, Martin Luther King, Jr. Blvd. would run through downtowns, and there would be MLK, Jr. Elementary Schools in the suburbs. Instead, in just about every big city in the U.S., school districts and city councils put King back in the ghetto, along with both the legions of people who worked with him and the many more who've taken up his work since.
Opponents of affirmative action and racial equality can claim King's mantle and "if he were alive today" approval only because in 2006, pop culture's MLK, Jr. has no politics. And, for that matter, no faith. For white America, King's soft-focus image often reinforces white supremacy. See? We're not so bad. We honor him now. Why don't those black people just get over it, anyway? We did.
All that is a lie. Dr. King's vision is today as urgent as ever. While Jim Crow and the cruelties of overt segregation are now largely unimaginable, much remains to be done. And for those who carry King's banner, the challenges of apathy and official hostility remain the same: the FBI and NSA spying on peace groups, listening to phone calls, monitoring emails. An administration -- voted for by almost no African-Americans -- that reviles nonviolence and labels its critics as treasonous (rather than communist dupes). And the moral outrage of Americans that made King's work so politically effective? We don't do that anymore. We can torture thousands of mostly innocent Iraqis and Afghans in plain sight, and nobody is held accountable. It'd take a whole lot more than Bull Connor's police dogs to make the news today.
The saddest loss in the modern narrative of Dr. King's career is the story of who he was: a man without wealth, without elected office, who managed as a single individual to change the world simply through the strength of his moral convictions. His power came from his faith and his willingness to act on what he knew to be right. That story could inspire many millions to similar action -- if only it were told. We could each be Dr. King.
Dr. Martin Luther King, Jr., nonviolent martyr to reconciliation and justice, has become a Hallmark Card, a warm, fuzzy, feel-good invocation of neighborliness, a file photo for sneakers or soda commercials, a reprieve for post-holiday shoppers, an excuse for a three-day weekend, a cardboard cutout used for photo ops by dissembling Cabinet members and ungrateful Supreme Court justices. Be sure to check out the Three-Day-Only White Sale at Wal-Mart. Always a better price. Always.
King deserves better. We all do.
The ever-widening scandal surrounding Republican super-lobbyist Jack Abramoff threatens to take down at least a half-dozen Congressmen in 2006, along with even more of their aides, Executive Branch employees, and untold numbers of other members of the Republican Beltway hierarchy. At least four dozen lawmakers from both parties are documented as having taken actions favorable to Abramoff clients around the time they received large donations from Abramoff and/or his clients.
It's a sordid tale of Washington corruption, and of crony capitalism at its worst, and it is so dizzyingly complex that few media outlets and even fewer members of the public have yet appreciated just how thoroughly it indicts not just Republican leadership, but the entire bipartisan way of crafting public policy that masquerades as 21st century American democracy.
Abramoff figures in at least four separate, interrelated scandals:
On this, there can be no question. Regarding Iraq, John Kerry is acknowledging reality. George Bush is not.
Bush embarrassed America when he went before a stony-faced audience at the United Nations Tuesday and claimed that all was well in Iraq, calling it a country well on its way to being a "beacon of freedom in the Middle East." More tellingly, he spent far more time defending his decision to invade in the first place, ignoring the consequences of a war that is now dangerously unraveling.
Meanwhile, Kerry seems to have finally found his voice on Iraq. Kerry is in trouble when he tries to parse his explanation of his vote in favor of war in Congress; no matter how sensible it might or might not be, it plays into the "flip-flop" stereotype Republicans have created for him. But there can be no mistaking the current situation in Iraq, and Kerry is spot on when he thunders, as he did Tuesday, that "the president really has no credibility at this point. He has no credibility with foreign leaders who hear him come before them and talk as if everything is going well... The president needs to live in the world of reality."
Alas, on the most critical issue now facing the country – Iraq and Bush's misbegotten War on Terror – reality is not President Peter Pan's strong suit. White House spinmeisters will be working hard this week to pretend all is well, crowned by the address to Congress on Thursday of Iraq's appointed U.S. puppet prime minister, Iyad Allawi. Allawi not only has no credibility in his own country, but his government, like U.S. troops, cannot even access nearly half of the country. He is, in the eyes of his countrymen, tainted not only by his past as a thug – first for Saddam and then for Western intelligence agencies – but by the very fact he was installed by and works with the Americans.
If there was ever a chance that Bush's ideal of a democratic Iraq on the American model could be achieved, it's long gone. No politician acceptable to Washington will be accepted at this point by the vast majority of Iraqis. Bush knows this, or at least he should; his intelligence agencies, as well as Congressional Republicans, have been telling him. But he is either stubbornly clinging to his own fantasy world, or, for political reasons, he's refusing to acknowledge the crisis.
The White House hope is that stunts like Allawi's address to Congress can help maintain the fiction of a normalized Iraq, on its intended course, at least until the US election in November. Oddly, it may not matter much to the election; polling suggests that the fiasco in Iraq is not changing the minds of those coveted swing voters. But that's not the point. Every week that goes by where Iraq military strategy is dictated by the political goals of the Bush Administration is a week where the insurgency grows stronger and more soldiers are put in harm's way for crass political purposes.
Kerry, in an unusually pointed speech in New York on Monday, finally got the situation right: "Invading Iraq has created a crisis of historic proportions, and... the prospect of a war with no end in sight." His prescription of more foreign assistance may not help much at this point; more radical remedies are probably needed. But at least Kerry understands and acknowledges the situation.
Judging from his public pronouncements, George Bush either doesn't understand what he has created in Iraq, or – even worse – he understands it, but is working his hardest to ensure that the American public is misled. Either way is inexcusable. And either way leads inexorably to John Kerry's conclusion: that Bush does not have the credibility to lead the world, or the United States.
Reading or watching the news these days can be frustrating. But there's really only one line of reasoning that brings forth in me the urge to slap somebody.
Like, for instance, Myron Ebell of the Competitive Enterprise Institute. Ebell announced to the world last week: "If global warming turns out to be a problem, which I doubt, it won't be solved by making ourselves poorer through energy rationing."
Ebell, and other East Coast pseudo-academic commentators whose fondness for America's fossil fuel consumption is related directly to their paychecks, were then promptly buried under a foot of snow over the weekend. It can't be easy, insisting that the world is flat while having to shovel evidence to the contrary.
As scientists and negotiators from around the world begin their second week at a U.N. conference on global climate change in Milan, Italy, one thing is eminently clear: the world is not flat. Major global climate change, triggered by rapidly increasing atmospheric carbon dioxide levels, is an established fact. Human activity as the major cause of it is an established fact. Nobody outside the corridors of power in Washington, D.C. and Houston has debated any of this for years. As the body of scientific evidence grows, the scope and speed of climatic changes are, if anything, proving far worse than the most alarmist scientific predictions of only a decade ago, affecting not just temperature -- nine of the ten warmest years in recorded human history have come in the last 14 years -- but extremes in atmospheric pressure, a resulting increase in wind speeds, drought, sea level increases, extreme cold, and extremes in precipitation -- like last weekend's unusually heavy and early East Coast snowfall.
As science has scrambled to track all these changes, and to track the havoc that changing climates are already beginning to wreak on what turns out to be an exquisitely balanced natural world, the phrase "global warming" turns out to be a misnomer -- a euphemism, even, for a cluster of trends so catastrophic that, without dramatic human counteraction, they will, in a matter of decades, threaten food and water supplies and much of the natural and technological infrastructure that we humans have developed to support ourselves. Warming is a symptom -- an important one, as the increased CO2 levels trap more solar radiation in our lower atmosphere -- but only one of many impacts. By using a term that defines the problem as solely one of temperature, we get two levels of denial -- oil company Flat Earthers sneering at "junk science" (didn't Copernicus hear that, too?), or comments like those of Russian President-for-Life Vladimir Putin, who joked earlier this year that for his country, warming "might even be good. We'd spend less money on fur coats and other warm things."
Putin is a central figure this week in Milan. He is expected to announce -- after an electoral victory Sunday that gives him firmer control over Russia's Parliament -- whether Russia will ratify the 1997 Kyoto accord. But Russia only has this much leverage because the obstinacy of the United States leaves Russia's ratification necessary for the treaty to take force -- and Russia's decision is a question only because, after five years of publicly backing Kyoto, Putin's government has backtracked in the past year due to fierce anti-Kyoto pressure from the Bush Administration.
Bush policy on climate change has been nothing less than a crime against humanity -- and, for that matter, a crime against many of our biosphere's other inhabitants too. But it's not just Bush that's been the problem; it's all of us humans, especially all of us in consumption-happy America. As Bill McKibben -- one of the earliest authors to spotlight climate change as an urgent issue with 1989's The End of Nature -- noted recently, global warming is being thought of by leaders and ordinary people alike "in the way they think about 'violence on television' or 'growing trade deficits,' as a marginal concern to us, if a concern at all."
Bush's calculated efforts to torpedo Kyoto, and the ongoing campaigns by oil and energy companies and by Bush Administration officials to cast doubt on the scientific legitimacy of the issue, are reprehensible, but hardly unique. Kyoto's provisions are far short of the steps actually needed to combat the problem -- but it was American negotiators, headed by then-VP Al Gore, who worked to water down the originally proposed treaty. Afterwards, as 120 countries moved to ratify Kyoto, it was Bill Clinton who refused to submit it to the Senate. Enter Bush next. All the while, the clock has been ticking, the seasons turning, the temperatures rising.
Kyoto's provisions expire in 2011 -- meaning that, as 2004 approaches, we're already at the halfway point of Kyoto's life, and it has not even taken force yet, thanks in large part to Washington. At this point, negotiators in Milan shouldn't be worrying too much about the details of Kyoto. Even if Russia ratifies it, negotiators should be more concerned about hammering out a framework for what comes after Kyoto.
By then, China will be a major industrial power. The landscape of carbon dioxide-spewing humanity has shifted significantly since the 1990 levels that provide Kyoto's benchmarks. Russia's post-Soviet industrial economy collapsed, meaning that its emissions in 2000 were down 22.8% from 1990; Germany, with its East German component and with unilateral EU measures, similarly declined by 13.6%. They will rebound. The EU as a whole increased its emissions in the decade by only 1.5% -- a vast improvement over the past, but still nowhere near the modest targets set by Kyoto.
Meanwhile, carbon dioxide emissions here in the U.S., already the world's leading spewer, went up a whopping 18.1% in the same decade -- a decade in which a Democratic president and bipartisan Congress backpedaled on previously set fuel efficiency standards, looked the other way while American automakers foisted gas-guzzling SUVs on the public, scrupulously avoided encouraging energy conservation, and gutted budgets for research into renewable energy sources. This decade's Republican-controlled Beltway has continued all this and launched an unprovoked, unilateral invasion of the country with the world's second-largest known reserves of oil.
Both of our major political parties' approaches to global warming seem to take their cue from Dubya's War on Terror declaration -- namely, bully other governments, and urge ordinary Americans to go about our "normal" lives as though nothing was different. I can just hear some pompous legislator on the floor of Congress: "Mr. Speaker, if Americans seek out cars with better gas mileage, it sends the wrong message! It lets other molecules know that THE CARBON DIOXIDE IS WINNING."
But we will have to live differently, because the world is different. It is already the case that there is no going back to our climatic world of 50, 20, or even 10 years ago. Next year, there will be no going back to the world we are in today. The question now is how to slow the planet's human-caused changes, and how to manage or deflect the impact of the more catastrophic ones. These are issues that transcend borders, domestic economies, and the flat-earth stubbornness of one or another elected official.
This week, the headlines will be about Kyoto. Forget Kyoto; by 2011, it will be history. What is needed, with or without Kyoto, is some sort of momentum, from scientists, governments, and the global public, that demands both changes in individual lifestyles -- especially as they relate to fossil fuel consumption -- and changes in public policy at a global level.
We must look farther ahead, beyond the scope of Kyoto. And we must not look very far at all, because a major part of the problem is in our own front yard.
Rush Limbaugh is back on the air, and his listeners have, since Monday, been effusive in their welcoming of him, both on his show and on other local and national right wing talk radio programs.
Since I wrote a column last month describing my use of the same prescription drug that got Limbaugh into trouble, I've had occasion to be on a number of these shows in recent weeks. Without exception, the host, callers, and I wound up in more or less collegial agreement: that the War On Drugs was a failure, irrational, and has served as a pretext for a vast expansion of intrusive state powers; that if a drug user violates the rights of someone else (e.g., by mugging them for drug money), the full weight of the law should come down on them, but otherwise, it's an individual's own responsibility to decide what substances he or she puts in his or her own body; and that substance users who become addicts -- of either currently legal drugs (like alcohol) or illegal ones, or a combination -- should be treated by society as having a health problem, not as being criminals.
For many of these hosts, and their listeners, this is a nearly complete reversal from two decades of throw-away-the-key rhetoric. The occasion has been the case of Limbaugh -- a man who, as is the case for any successful entertainer, has cultivated over the years an emotional connection with his fans that leaves them feeling that they have a personal relationship with him. They care about Limbaugh's welfare; they want him to get better. If he has run afoul of the law, they are willing to forgive the transgressions because of what they regard as decades of good deeds and good will.
The general response of liberals, progressives, and other long-time Rush-bashers has been to cry "Hypocrites!!" from the highest rooftops. Well, of course -- but no more so than on any other issue of late where conservatives seem to have astonishing memory lapses concerning their past claims. (Iraq, anyone?) But that's not the point. Those of us -- progressive, liberal, libertarian, or conservative -- who have castigated the War On Drugs for a generation need not alienate its newest critics, but should welcome them as allies and figure out how to forge a consensus as to what type of public policies should replace the failed War.
As an exercise in behavior control, the War on Drugs is over. The drugs won. Efforts to ban ingestion of psychotropic chemicals will always be doomed; for too many people, it's either too much fun or too essential a balm. And technology is about to kick the whole effort into its well-deserved grave. So-called "designer drugs" herald an imminent era in which chemists can put powerful concoctions on the head of a pin. Try keeping that from coming into the country, or your teenager's bedroom. Today, it's difficult; tomorrow, it will be flatly impossible.
Among progressives, critics of the War have claimed for years that the War On Drugs has been ineffective, expensive, an invasion of privacy, racist, ageist, classist, and an excuse for lost civil liberties and an enormous expansion of state power. But we've often failed to acknowledge that abuse of drugs (legal or not) really does hurt both individuals and communities. And therein lies the potential for a consensus that transcends ideology.
Prohibition begets violent crime, but so, at times, do the drugs themselves. Car accidents kill users and their victims alike. Lives waste away. Those of us who want people to be free to put whatever they want into their own bodies -- and that day is coming soon, whether the official War on Drugs ends or not -- have an obligation to also propose realistic, effective ways to prevent the harm that might result.
The answer must start with personal responsibility, and expand into community support through notions like low-income health care and harm reduction models. But the personal responsibility must come first. This is not a comfortable, or popular, thing for progressives to say; it's terrain often occupied by conservatives in denial about social forces. We, instead, will cite root causes like poverty or socialization as reasons why some people do bad things. But there's truth in both. People also do such things because they choose to.
I live in a neighborhood called the Central District, a now-gentrifying part of town that for decades has been the heart of black Seattle. It's also been Seattle's poorest neighborhood, and block-by-block, some parts of it -- including ours -- have a serious problem with drugs and the other social ills, like prostitution and delinquency, that seem to go with it.
A case of a nearby police shooting of a young African-American after he'd attacked several others shows why some segments of the public, especially early on, supported the War On Drugs. Forget the fate of Devon Jackson, the shooting victim, and listen instead to the all-too-common description of his life.
At age 20, Jackson had a long string of arrests. A neighbor says cops took countless guns from his house over the years. Jackson had been smoking "sherms" -- cigarettes dipped into formaldehyde, a concoction which, on its own, was completely legal. So was his heavy drinking. He'd been having increasingly violent outbursts while on a drug binge for 10 days with his girlfriend and pals, including the friend he killed, Dante Coleman. Coleman, 20, also had a history with the law. He worked at a nearby Safeway, having left high school (it's unclear whether he graduated) two years previously.
In the apartment across a narrow hall, consider Samunique Wilson (age six) and Tre Vaugn Ford Spruel (age two), children attacked by Jackson after he killed Coleman. Tre Vaugn had just been picked up by his mom, age 19, from his great-great-grandmother's house, and had been dropped off at the apartment of his mom's friend (Samunique's mom) and her boyfriend, while mom went across the hall to the party. Tre Vaugn's mom is Jackson's sister-in-law; the boy visited his dad on weekends. His uncle, age 18, was later convicted of first-degree murder during a robbery committed the following week. Samunique's dad and step-dad weren't mentioned in media accounts; mom is pregnant. Neighbors say the building where Jackson and little Samunique lived has been a notorious, and largely undisturbed, drug and party haven for years.
Even those critics of the War on Drugs who claim that the War has nothing to do with drug use at all, but has instead been a (wildly successful) state tool for social control of the disenfranchised, have a responsibility to explore ways in which we can respond to realities like the world of Devon Jackson. Without such alternatives, it will be impossible to build the broad political coalition needed to curb or even end the War on Drugs.
There is already broad public acknowledgement that the War On Drugs is at best an inappropriate and failed response, and at worst an anti-constitutional outrage. The case of Rush Limbaugh has created inroads of sympathy for those views among the only significant chunk of the public that still believes in the War's premises. But a consensus that a solution has failed is not the same as a consensus on an alternative solution. Without the alternatives, it's all too easy for our society to throw away its Devon Jacksons and Tre Vaugns, and for a destructive mess like the War On Drugs to continue on momentum.
In the world of Devon Jackson, progressives who want to push effectively for a more economically and socially fair society need to be able to acknowledge common sense: a lot of the people involved had life rough, but also engaged in behavior ranging from pretty messed up to grossly irresponsible and destructive. They are not simply victims of society.
Could public policy responses -- health care, day care, education, job training, or (gasp) welfare -- help? Sure. We need more, not fewer, resources for folks on society's margins, resources instead being sucked up in part by the prison-industrial complex the War On Drugs has spawned. But we also must demand that people, families, neighborhoods, and communities -- on the margins or not -- get our own acts together, and hold each other and ourselves accountable for our damaging behavior.
Every U.S. city has plenty of Devon Jacksons visibly waiting to happen. To prevent tragedy, we must insist on a social ethic of personal responsibility -- of, first of all, doing no harm to others or to ourselves. We need to teach people to value themselves and to be able to imagine (and care about) the impact of their actions on others. We need to invest in each other and ourselves.
Otherwise, as drug use inevitably spreads and inhibitions recede, the body count will only increase.
Geov Parrish writes for Working Assets.
On May 1, a little over six months ago, President Bush made his now-infamous flight-suited appearance on the deck of the Abraham Lincoln, delivering the message that the war was over and the mission accomplished.
Only 44 days until the California recall election is over. I can hardly wait.
I happen not to live in the state of California. And I have to say that Californians, bless them, all 39 million or however many are now crammed into the once-Golden State, have become nearly as obnoxious as New Yorkers in their assumption that the rest of our country automatically cares about their local affairs.
And so it is that the alternately gleeful or alarmed coverage of California's gubernatorial recall vote is getting kind of tedious. I don't care that 38 million of those 39 million residents will appear on the ballot. I don't care that at least two-thirds of them are celebrities, former celebrities, porn stars, or other misshapen branches of the human family tree.
But I do care, very much, that progressives who sneer at this recall and at the insta-candidacy of Arnold Schwarzenegger are missing the very important point. Like it or not, California -- unlike New York -- has a solid record as a cultural trendsetter for the nation. And the world, for that matter. Lessons from this campaign will be relevant very soon in every state in the country. In many places -- including the White House -- they already apply. We ignore them at our peril.
First of all, as for the legitimacy of the recall vote itself: Lawdy, I wish every state did this, especially every time the governor and state legislature, regardless of party, slash social programs and load up on new prisons and corporate welfare. This, along with a stunningly inept and typically corrupt approach to the energy-gouging scandal of a few winters ago, has been California Gov. Gray Davis' defining record, and it's why so many people across California's political spectrum despise him. Reactionary Republicans may have organized the recall petitions, but Californians of every stripe signed them, as citizens would in any number of cash-strapped states if given the chance.
As Marc Cooper has noted in L.A. Weekly, the recall is nothing more nor less than a vote of confidence, and Davis has none. Progressives should trust democracy more than this. You can't get much more democratic than throwing the bum out.
More important, however, is who the replacement for the surely doomed Davis might be. At the moment, only two names consistently dominate polls: Lt. Gov. Cruz Bustamante, and Schwarzenegger.
Everyone from career pols to pundits and reporters (especially at the L.A. Times) to progressives has treated Schwarzenegger's campaign -- not to mention his seeming popularity -- with incredulity. And dismay. And disdain. The former champion bodybuilder turned movie star turned one-man industry is, they suggest, a fluke. He's getting poll numbers because of name recognition, or the novelty of his fame intersecting with an electoral campaign. He has no political experience, no experience, really, running or legislating anything beyond his own fame. His campaign will fade when his 15 minutes cycle through, and when he is exposed for the fish out of water that he is.
Nonsense. Schwarzenegger is just as capable of running the state of California as a tired political hack like Bustamante. And voters, contrary to popular pundit wisdom, are smart enough to know it.
I fell into political writing and journalism after a variety of other jobs and careers, and when I began one of the first political myths I was disabused of was the notion that the elected officials running our cities, counties, states, and country were all smart people. Some are, but some are not; it's the normal human range, really, between brilliant policy wonks and walking brain stems. The last two White House occupants show the range clearly enough. Intelligence isn't all it's cracked up to be.
In modern politics, having good ideas for solving difficult public policy questions is a nice attribute to have, but it's certainly not a necessary one. In fact, sometimes it gets in the way. Much more essential are raw ambition, the ability to schmooze and ask for money, and the ability to look and sound personable and competent in front of an audience and a camera.
That's how we elect our public officials now, and Schwarzenegger is as qualified as anyone on these scores. Sure, what candidates actually do matters, but there's no record of that unless they've already been in office. And by then, the advantages of incumbency are so powerful (due mostly to low public interest and the legalized bribery now central to our campaign funding process) that they can only be overcome if the incumbent is stunningly inept.
Like Davis. Which is why he won a tepid election last year, but can't stop the recall this year. His record has become so bad it has angered nearly all his constituents and has overwhelmed the cynical bribery that kept his political star afloat. The recall is likely to draw far more voters than last year's general election that re-elected Davis, for the simple reason that this time it's not more alienating politics as usual.
And this is why Schwarzenegger is for real. Progressives have had a dismal time getting themselves elected and their policies enacted in recent years. The country has instead drifted ever-rightward. It's no coincidence that during this time, while progs and liberals have insisted on talking about what's wrong and how to fix it, the most successful and visible political careers -- from Reagan to Dubya -- have been thick with image and style, and very, very light on the often nonsensical specifics.
For many Americans, both Reagan and now Dubya have been great presidents. This is, in large part, not because of their policies -- which have support, but not in the numbers popularity polls would suggest. It's been a matter of style, the perception that these are decent guys doing a good job.
The fact that Dubya and the fanatics around him are stunningly corrupt, routinely lie, and have hijacked the country hasn't damaged their popularity or power much until recently, when the overreaching and dissembling on Iraq have become too obvious to ignore. But for 30 months, Bush prospered while blithely lying away, and few have cared. Perception has been everything.
Whether such a political environment is healthy is beside the point. Anyone hoping to unseat Bush, or overcome Schwarzenegger in California, had better have some sizzle, some pop, some style. In the last 25 years, Clinton is the only Democrat who has managed the trick nationally. He's also the only Democrat to reach the White House in that time.
Candidates like Mondale, Dukakis, and Gore, or now Kerry, Gephardt, Lieberman, or even Kucinich, miss the lesson of the trendsetters in California. The 2004 presidential hopefuls shouldn't aim to replace Bush so much as to recall him, as we would recall a defective product. (Or one we didn't order in the first place.) They'd also do well to treat their campaigns not as a test of ideas, but as a launch of a competing product.
It's not how a democracy should run. But it's what we've got.
For the first time in memory, this past week has been a bad one in Washington, D.C. for enormous broadcast conglomerates.
The massive media ownership deregulation pushed through the FCC last month by Republican chairman Michael Powell generated a remarkable amount of resistance from a burgeoning, and relatively new, media democracy movement. Deregulation opponents had vowed to override the FCC by taking the fight to the Republican-controlled Congress. It seemed like a futile notion, but Wednesday, the powerful, Republican-run House Appropriations Committee took the first step toward doing exactly that, voting 40-25 to block the portion of the FCC's decision that expanded from 35 percent to 45 percent the percentage of national TV households one company's stations could reach.
The vote wouldn't affect other portions of the FCC decision, and it would still need to be reconciled with a Senate bill; the White House has vowed to veto the House move. Nonetheless, even if it goes no farther -- and it will -- the House vote is an important measure of just how widespread dissatisfaction with corporate control of America's media has become, and of how thoroughly that dissatisfaction transcends the usual ideological labels.
But beyond the headlines, another development on the media democracy front last week may have far greater long-term implications for the ability of ordinary people to be heard on the airwaves.
Before Dubya came to power and Michael Powell assumed the FCC's reins, the media democracy movement that is now bedeviling him cut its teeth on another FCC fight -- Low Power FM (LPFM). A 1999 decision by the FCC, when it was under Democratic control, created a vast new category of non-commercial, low power FM stations. The stations were to be locally run, with a radius of about 2-3 miles, and promised to give access to the airwaves to thousands of community, church, and activist groups across the country.
It never happened -- at least, not as originally envisioned by the FCC. The National Association of Broadcasters (NAB) and National Public Radio mobilized Congress to effectively gut the program by passing as law a more stringent set of technical requirements. The NAB/NPR bill eliminated over 80 percent of the proposed stations, including most of the ones in larger cities and towns. Commercial broadcasters, as well as NPR, claimed (despite the FCC's claims to the contrary) that the FCC's original criteria would create unacceptable interference to existing stations.
Congress bought the idea, and as a result, while some Low Power FM stations are now broadcasting, and many others are in the pipeline, only one open frequency for a low power station is available in any of the country's top 50 markets -- as opposed to over a dozen each that would have been available in some cities under the original proposal.
That was three years ago. Last week, however, results came back in from a technical study that Congress ordered as part of its legislation, a study intended to determine definitively whether the original, more lax FCC guidelines would in fact pose a threat to existing stations.
The verdict: almost never.
The study, farmed out by the FCC to Mitre Corp., conducted field research and also asked for listener feedback, using the relatively poor-quality analog receivers common in many households rather than the much higher-quality receivers the FCC had originally used to determine interference levels. The researchers still found almost no problems, either from complaining listeners or from their own field readings.
In the mostly rural areas where it has been available, the volume of applications for LPFM facilities has far exceeded the FCC's expectations, proving that there's an enormous demand for such voices. The FCC, of course, is now in different, more business-friendly hands, and is probably disinclined to revisit the previous commission's proposal. And in the intervening three years, big media corporations as well as NPR affiliates have rushed to install new translators that would now block some possible LPFM frequencies in larger cities. But the upshot is that media activists now have the data to go back to the FCC and to Congress demanding both that the LPFM program be expanded to its original scope and that a moratorium be placed on new translator applications until the LPFM question is re-examined.
More broadly, for years the NAB, as the lobbying arm of the country's largest media conglomerates, has had free run of Capitol Hill; it has been among the most effective of the trade lobbying groups, with "triumphs" like the appalling Telecommunications Act of 1996 to its credit. Its LPFM reversal in 2000 was another such triumph -- but now, media activists and other broadcast lobby opponents can use the LPFM example to discredit the piteous cries of well-heeled lobbyists.
The damage that LPFM would supposedly cause to broadcasters simply didn't exist, and the case for re-instating the original proposal is overwhelming. Now, with any luck, a powerful new form of community and neighborhood broadcasting can be made available to the vast majority of the country's people.
For over 70 years, publicly owned airwaves have been leased out essentially at no charge to a broadcast industry increasingly dominated by a handful of homogeneous (and often dreadfully idiotic) voices. For the last quarter-century, radio and television have gotten farther and farther away from the notion of local programming, local ownership, and community service. Finally, the trend may be reversing.
Geov Parrish is a columnist for WorkingForChange.com
Remember, way back in December 2000, after the U.S. Supreme Court finally stole, er, ruled that George W. Bush would become the next President of the United States?
One of the primary themes to emerge -- from the ornate hotel lobbies of Washington, from the mouths of AM talk radio hosts, from the new regime's sneering acolytes in cowboy hats and fur-trimmed coats -- was that at last, finally, grown-ups would be running Washington, D.C. No more semen-stained dresses. No more fags in uniform and half-assed missile attacks. No more her. No more children running the world.
At least with Clinton you knew that the most powerful man in the world had reached adolescence, if not much beyond it. But all current evidence suggests that the world is now being run by 7-year-olds.
Oh, to be sure, petulant little children are announcing themselves all around the world these days, from surly little bullies like Ahmad Chalabi (who, after spending years on various playgrounds stealing other kids' lunch money, has come home to be handed a shiny new bicycle called Iraq), to the angry little brat in North Korea trying to get his parents' attention ("I've got uranium now!" "Now I've got a missile!" "Now I'm arming it! Watch me! I really am!" "I said I really am! I mean it this time!!"). Kim Jong Il needs time out and a nap; Chalabi needs reform school.
But the most alarming spectacle is in Washington itself, where Peter Pan went and recruited his whole grade school class.
The result is calamity almost beyond words to describe: an appetite for cool comic-book foreign policy, emphasis on blowing stuff up, combined with a Never-Never Land insistence on how the world works and economics learned from watching older siblings play Monopoly.
Little kids, you'll recall, can be incredibly cruel. And so it is in D.C. these days, a dramatic step down from the last depressing administration, where the Clinton crew (including, no doubt, Janet Reno) had at least discovered girls. This collection hasn't even matured enough yet to learn right from wrong, or that actions have consequences, or even to experience the essential step in human development of understanding that the world doesn't start and stop with them, that other people think and act and feel just like they do. Empathy. Instead, this bunch stays at home, watches TV, and plays army all day. It's a nice day; they should at least go outside and play. Clinton needed to be grounded. Junior needs to have his toys taken away.
You want proof? What was Junior's sole major "accomplishment" before daddy's friends got him elected governor of Texas? He used daddy's allowance money and bought a baseball team. These are rich children. Too much attention is being paid to "rich," and not enough to "children."
But more and more, the emperor's outgrown clothes are showing, especially in recent days as the little tyke has finally been confronted in public with truths that contradict his carefully constructed play world. First, he really did go outside and play, to Africa, just to get away from it. But reality dogged him there, too, so mostly he's been pouting and insisting that the tooth fairy really does exist, there is a Santa Claus, Saddam really did buy uranium from Niger. ("And all that other stuff I made up last week is true, too!")
Frankly, the pile of toys Junior's no longer interested in is starting to clutter the living room floor, and Junior also keeps tripping over his now-discarded Disney videos, too. (He's not much for reading.) It's not like he's ever learned, or been made, to clean up his own messes. And he still believes all the stories in those old videos, too -- Iraq's mystery weapons in trailers, made out of propane tanks, and the cool spy-movie ties to Al-Qaeda and stuff. He still can't tell fact from fiction.
But confronted with it, he's reacting the way many small, spoiled kids do -- by blaming his friends, starting with the one he doesn't know very well, the guy who already lived in his new neighborhood when he got here, little Georgie Tenet. ("Hey, I only made him fall on a play sword! It didn't really hurt.") Every time Junior does this, he squeezes his eyes real tight and hopes it'll all just go away so he can go play army s'more. (He's also supposed to be doing homework -- he hates math! -- but video games are more fun.)
The other little kids in Junior's clubhouse are acting about the same way -- except for little Rummy, who likes to torture the neighbor's cats when nobody's looking. Rummy's gonna be trouble when he gets older.
For years, the adults around Junior and his little pals have been making excuses for their behavior. All kids are above average. It was a misunderstanding. He didn't mean to break it. He's really not that dumb. He just learns differently. Isn't he cute? The parents are rich, so teachers are circumspect, even when the extra lessons they give don't stick or he makes Family Circus-style mispronouncements.
But the behavior coming out of Washington these days has become too destructive, too aberrant to ignore, as sometimes happens when spoiled kids are never reined in from their excesses. These kids are very spoiled, and their excesses are scaring all the adults in the neighborhood, if not the world. Frankly, it would be a huge improvement if this batch got old enough to discover girls.
But that's a long way away, and meantime they're really, really wed to their fantasies and their cruelty and their denials. And their moms and dads don't seem to care. Many, many people could die before Junior and his friends get old enough that they start to learn right from wrong.
At this point, the best hope is that they move to another neighborhood.