Sean Gonsalves

Why You Should Work Less

During his first re-election campaign, FDR came to New Bedford, Massachusetts, in 1936, stumping for four more years of the New Deal.

In the crowd was a young girl with an envelope. She tried to make her way to the President to hand him her note but was turned away by a policeman. Roosevelt told one of his aides: "Get the note from the girl."

The young girl's note read: "I wish you could do something to help us girls ... We have been working in a sewing factory, ... and up to a few months ago we were getting our minimum pay of $11 a week ... Today the 200 of us girls have been cut down to $4 and $5 and $6 a week."

A reporter asked Roosevelt about the note. "Something has to be done about the elimination of child labor and long hours and starvation wages," was his reply.

Two years later, Roosevelt signed the Fair Labor Standards Act (FLSA), establishing the minimum wage and 40-hour work week.

A few years ago, I was invited to the cozy confines of the Fetzer Institute in Kalamazoo, Michigan to participate in a weeklong retreat with other writers, activists and thinkers. For hours on end, we talked about the sorry state of the world and what we thought could be done to make it better. Like most gatherings of this kind, we broke into small discussion groups to probe particular social problems more deeply.

John de Graaf, a longtime television producer and creator of the award-winning documentary "Affluenza," was in my discussion group. He noted the irony that we live in the most affluent society in the history of the world, yet are increasingly time-poor. John had put his finger on the number one reason why people often can't do anything other than try to make their own lives better -- there's no time for anything else.

Then someone brought up the FLSA and said that, since FDR signed the bill into law, the time most people spend laboring has only increased -- to the point where, for millions of gainfully employed Americans, working 40 hours a week doesn't pay the bills. The increased workload has also diminished most people's ability to spend quality time with their families, to say nothing of getting involved in social activism.

What we needed, John said, was to "take back our time." And at that moment, Take Back Your Time Day was born; meant to symbolize a "challenge (to) the epidemic of overwork, over-scheduling and time famine that now threatens our health, our families and relationships, our communities and our environment."

Today, John's vision has grown into a 7,400-member citizens organization, pushing for labor-friendly policies and more free time. This year, Take Back Your Time Day (Oct. 24) is celebrating the 70th anniversary of the FLSA while calling for a new labor law that would make paid vacation a guaranteed right and not just a voluntary benefit employers "offer" workers.

A recent poll conducted by the Opinion Research Corporation found 69 percent of Americans support a guaranteed paid vacation law, with the largest percentage of respondents favoring a law guaranteeing three weeks of vacation or more. Every demographic showed majority support for a vacation law. Only 27 percent said they opposed the idea.

Respondents were also asked how many weeks of vacation were needed to prevent "burnout." Fifty-two percent said they needed three weeks or more, and 82 percent said they needed at least two weeks.

The survey also uncovered a sad feature of working life in America. Almost a third of working Americans (28 percent) took no vacation time at all; half took a week or less; and two-thirds got less than two weeks off. The median vacation time: 8.2 days, far below the three weeks most often cited as the amount of time off needed to prevent burnout.

The eighth annual Expedia.com vacation survey backs that up, reporting for "the eighth consecutive year, Americans received and used the smallest amount of vacation time among their (European) counterparts abroad."

Even worse, despite reporting an average of 14 paid vacation days again this year, about a third of employed U.S. adults will not even use all the vacation days they do get.

"Again this year, employed U.S. adults will leave an average of three vacation days on the table, in essence giving back more than 460 million vacation days in 2008. Despite these statistics, Americans do see the value in vacation, with more than one-third (39 percent) reporting they feel more productive and better about their job upon returning from vacation and 52 percent claiming to feel rested, rejuvenated and reconnected to their personal life."

"Work responsibilities" was cited as the biggest deterrent to taking vacation. And even when Americans do take vacation, 24 percent "report that they check work e-mail or voicemail while vacationing." That's up from 16 percent in 2005.

All of this has significant economic implications, de Graaf points out. "Time off is essential to health. Men who don't take regular vacations are 32 percent more likely to suffer from heart disease than those who do, and women are 50 percent more likely. If we want to cover everyone and reduce the cost of health care, one way to do it is to improve our health, and every study shows that more time off can help do that."

Seeing as how both Obama and McCain like to talk about health care and "creating more jobs," I've been meaning to pass a note to their respective campaigns: "I wish you could do something to help us take back our time. Let me tell you about Take Back Your Time Day."

I just haven't had the time.

Our Rights Are Under Attack

As the Fed tries to shore up the levees against the derivative deluge -- and as politicians seek to redistribute wealth upward -- other equally important things are happening in the world, which is why I have a problem with bumper-sticker phrases like "it's the economy, stupid." It reduces politics to economics, as if political behavior can be explained by economic self-interest. If "it's the economy, stupid," then "What's The Matter With Kansas?"

But we're not in Kansas anymore. And we're not in Oz either, where you can click your heels three-times and everything will be 401-OK. This is the land of "secretocracy," where the "living document" of the Constitution is on life-support.

Here in Secretville, the buzz is all about fusion, and financial markets ain't the only thing melting down. The Justice Department recently finalized new -- and more lenient -- rules for the investigative tactics FBI agents can use. The new rules fuse together the FBI's General Crimes Guidelines, National Security Investigative Guidelines and Foreign Intelligence Guidelines.

In a joint statement before the Select Committee on Intelligence on Sept. 23, the assistant AG and the FBI's general counsel testified that the FBI was no longer primarily concerned with investigating crimes after they are committed. It has become "an intelligence-driven agency capable of anticipating and preventing" crime.

When you're in prevention mode, you have to do assessments. And that's what has civil liberty watchdogs nervous -- how the Justice Department defines "assessment."

The Electronic Privacy Information Center warns: the FBI's new powers "pose serious threats to the right of individuals to speak and assemble freely without the spectre of government monitoring. The policies also threaten Fourth Amendment rights, as (agents have been permitted to) engage in prospective searches without possessing any evidence of suspicious behavior."

The new rules also present a practical problem. "At a time when it is clear that the FBI has been awash in data and unable to process leads effectively," the new guidelines enable "the agency to obtain even more information that is less likely to result in solid leads."

Not only do the new guidelines allow the FBI to conduct surveillance without a court order, they also allow agents to "collect information relating to demonstration activities" without a single iota of evidence that a national security threat exists.

Recall the mid-1970s: when it came to public light that the FBI had been actively working to undermine peace groups and leaders like Dr. King (which included an officially sanctioned effort to persuade King to kill himself), guidelines were put in place to prevent such abuses of authority. The new guidelines unravel those safeguards and fuse them back together again in the name of fighting terrorism, as if terrorism posed an existential threat to America.

Beyond the new FBI guidelines, another area of "secretocracy" has experienced fusion.

Over 40 "fusion centers" have sprouted up across the country, according to the ACLU's report, What's Wrong With Fusion Centers.

Fusion centers are these post-9/11 institutions where local, state and federal law enforcement officials meet with business leaders to share not just criminal intelligence but also private-sector data, in hopes of mining that information to discern patterns of possible future crime. Kinda like that movie "Minority Report," where officers of the Department of Pre-Crime arrest people before any law is broken, except without the gift of "pre-cognition."

"There's nothing wrong with the government seeking to do a better job of properly sharing legitimately acquired information about law enforcement investigations ... But in a democracy, the collection and sharing of intelligence information," especially info about American citizens, "need to be carried out with the utmost care ... because security agencies are moving toward using such portraits to profile how 'suspicious' we look," the ACLU notes.

"New institutions like fusion centers must be planned in a public, open manner, and their implications for privacy and other key values carefully thought out and debated. And like any powerful institution in a democracy, they must be constructed in a carefully bounded and limited manner with sufficient checks and balances to prevent abuse. Unfortunately, the new fusion centers have not conformed to these vital requirements."

We've seen the fusion of political power in a "unitary executive," the fusion of the Fed and financial markets and now we've got Fusion Centers. How many fusions equal full-fledged fascism?

As I was saying, we're not in Kansas anymore.

Financial Weapons of Mass Destruction

First, the U.S. Treasury nationalized Fannie Mae and Freddie Mac, which hold over $5 trillion in combined assets and guarantee most of the mortgages in the country -- an implicit acknowledgement by the government that the mortgage market is broken.

We've overthrown regimes and threatened others with military action for nationalizing industries. When other governments do it, it's evidence of their evil, socialist heart. When our government does it, it's necessary.

Next came Lehman Brothers filing the largest bankruptcy in U.S. history. Then, the following day, the Federal Reserve gave an $85 billion "bridge loan" to A.I.G., the largest insurance company on the planet, holding over $1 trillion in assets with 100,000 employees across the globe.

What we are witnessing is what economists Douglas Diamond and Anil Kashyap call "the most remarkable period of government intervention into the financial system since the Great Depression."

At the heart of this credit crunch mess is something called "derivatives." The Initiative for Policy Dialogue at Columbia University offers a good primer.


Will Wikileaks Revolutionize Journalism?

As popular a reference tool as Wikipedia has become, our newsroom policy doesn't allow our reporters to use it as an official source for any story. And for good reason: anyone with access to a computer can edit entries.

Through the various industry grapevines, I've ascertained that the Cape Cod Times isn't the only news organization that considers Wikipedia to be a potentially polluted source.

Wikileaks, however, is a different animal -- despite the similar interface the fledgling whistleblower site shares with Wikipedia.

If you're not familiar with Wikileaks, you should be because, since it debuted last year, the international transparency network behind the site has forced governments and news media to take notice, most recently with the posting of whistleblower documents that indicate "thousands of sterilizations, and possibly some abortions, took place in 23 Texas Catholic hospitals from 2000 to 2003," as reported by the Catholic News Service in the wake of the leak.

The same day as the Catholic hospitals leak (June 15), Wikileaks posted the 219-page U.S. military counterinsurgency manual, "Foreign Internal Defense Tactics, Techniques and Procedures for Special Forces" (1994, 2004).

Wikileaks investigative editor Julian Assange writes that the manual can be "critically described as 'what we learned about running death squads and propping up corrupt government in Latin America and how to apply it to other places.' Its contents are both history defining for Latin America and, given the continued role of U.S. Special Forces in the suppression of insurgencies, including in Iraq and Afghanistan, history making."

Students of U.S. foreign policy history, particularly guerrilla warfare history, will find no real surprises in the counterinsurgency manual, as eye-popping as it may be to some.

In February, Wikileaks posted the secret rules of engagement for U.S. troops in Iraq -- a leak The New York Times followed up on and one that prompted the Iranian government to hold a press conference warning U.S. military planners about border crossings. The Washington Post reported on leaked Guantanamo detainee policy documents first posted on Wikileaks that forced the Pentagon to respond.

Wikileaks describes itself as a site that's "developing an uncensorable Wikipedia for untraceable mass document leaking and analysis. Our primary interest is in exposing oppressive regimes in Asia, the former Soviet bloc, Sub-Saharan Africa and the Middle East, but we also expect to be of assistance to people of all regions who wish to reveal unethical behavior in their governments and corporations. We aim for maximum political impact."

The site has been briefly banned by a judge in the U.S. (it appears to be based in Sweden), but its anonymous founders are international computer geeks who know how to hide in cyberspace and get around things like China's Great Firewall. In fact, Wired magazine notes that one of Wikileaks' advisers, security expert Ben Laurie, "doesn't even know who runs the site -- other than (co-founder Julian) Assange (who lives in Kenya) -- or where the servers are."

What makes Wikileaks a unique "news" site is that instead of "breaking stories," it publishes leaked documents, now boasting "over 1.2 million documents ... from dissident communities and anonymous sources."

An early criticism of Wikileaks was its posting of anonymously leaked documents without running them through an editing process and without providing any context -- something that many industry insiders (and military brass), including prominent open government advocates like Steve Aftergood, view as "irresponsible," at best.

While Wikileaks' webmasters seem immune to government and press criticism, they're not unresponsive, having changed the site a bit since it first hit the net in January 2007. The home page now features analysis of recently leaked documents, as well as "fresh leaks requiring analysis."

The site also notes: "Wikileaks is not like Wikipedia. Every submitted article and change is reviewed by our editorial team of professional journalists and anti-corruption analysts. Articles that are not of high standard are rejected and non-editorial articles are fully attributed."

As for the possibility of someone, including spy agencies, posting forged documents -- well, Wikileaks has an answer for that too.

"Wikileaks believes that the best way to determine if a document is authentic is to open it up for analysis to the broader community -- and particularly the community of interest around the document."

"So for example, let's say a Wikileaks document reveals human rights abuses and it is purportedly from a regional Chinese government. Some of the best people to analyze the document's veracity are the local dissident community, human rights groups and regional experts (such as academics). They may be particularly interested in this sort of document. But of course Wikileaks will be open for anyone to comment."

"Journalists and governments are often duped by forged documents. It is hard for most reporters to outsmart the skill of intelligence agency frauds. Wikileaks, by bringing the collective wisdoms and experiences of thousands to politically important documents, will unmask frauds like never before."

While journalists should view Wikileaks with a healthy dose of skepticism, its short track record has proven that it cannot be ignored. Welcome to the brave new world of investigative journalism.

In 1788, Patrick Henry said: "The liberties of a people never were, nor ever will be, secure, when the transactions of their rulers may be concealed from them."

In 2008 Wikileaks is poised to test just how much we believe in the idealistic rhetoric celebrated over the Fourth of July weekend.

Who Cares About the Vice President?

Seems everyone's talking about prospective vice presidential candidates. But who cares about the vice president?

Go ahead and laugh but the VP view of the "founding fathers" wasn't far from the sentiment embedded in that question.

An afterthought in the construction of the Constitution, the office of the vice presidency was created on Sept. 6, 1787, when America's powdered-wig wearin' Constitutional Convention approved Alexander Hamilton's proposal, declaring that the Veep should be the runner-up in the race to be president.

That's how VPs were picked until the rules were changed to allow presidential nominees to pick their running mates, which has since been used as a way for candidates to garner more votes with a "more balanced" ticket.

The first two vice presidents had very different perspectives. John Adams famously quipped: "My country has in its wisdom contrived for me the most insignificant office that ever the invention of man contrived or his imagination conceived."

Jefferson wrote: "The second office in the government is honorable and easy; the first is but a splendid misery."

Is it a stretch to think Jefferson considered the vice presidency "honorable and easy" because, as Adams observed, it's "the most insignificant office that ever the invention of man contrived or his imagination conceived?"

As recently as the beginning of the 20th century, VPs were still considered of such little consequence that when Vice President Garret Augustus Hobart died in November of 1899, the office was left vacant, as it had been on ten previous occasions, for periods ranging from several months to as long as four years.

Don't you just love progress? We've gone from the marginal significance of the vice presidency to the shadowy co-presidency of Dick Cheney. And while Cheney is arguably the most powerful (and secretive) vice president in U.S. history, the transformation of the office began long before he was officially embedded in the White House.

The first VP, John Adams, attended a Cabinet meeting in 1791 -- something no other VP did until 1918, the year President Wilson asked Vice President Marshall to preside over the Cabinet while he was off at the Paris Peace Conference.

The expanded role of the vice presidency took another leap when President Warren Harding invited his VP, Calvin Coolidge, to attend all Cabinet meetings.

But, it was Vice President Charles G. Dawes who, in refusing to attend Cabinet sessions, cautioned that by doing so, "the precedent might prove injurious to the country." (Did he foresee Cheney?)

The response to Dawes' warning: precedent, schmecedent!

Eisenhower took it to the next level, directing Vice President Nixon to preside over Cabinet meetings in his absence instead of following precedent in which the Secretary of State presided.

JFK and LBJ kept the VP snowball rolling before handing it off to Carter and Reagan, both of whom further expanded the office; Bush and Clinton followed suit.

First came the JFK expansion, which made LBJ chairman of the National Aeronautics and Space Council and head of the President's Committee on Equal Employment Opportunity.

Johnson kept the ball rolling when he became president, appointing Hubert H. Humphrey to lead his administration's anti-poverty and civil rights programs. Nixon followed up by putting Spiro T. Agnew in charge of promoting the administration's domestic policies with state and local officials.

Walter Mondale helped President Carter craft U.S. policy in South Africa and was at the forefront of Carter's effort to reorganize the U.S. intelligence community.

Vice President George H. W. Bush -- the first VP to serve as acting president (while President Reagan underwent surgery) -- ran the group of advisers that provided Reagan with recommendations on how to respond to foreign flare-ups.

Quayle, believe it or not, headed the National Space Council, while Gore was given a primary role in foreign affairs, environmental policy and the taxpayer-subsidized effort to hand over government communications research and technology to private profiteers.

(And they say government doesn't help create wealth. Tell that to all the entrepreneurs who have made a fortune using technology initially funded with taxpayer R&D money. Gore didn't invent the Internet, but he did carry on the long-standing government tradition of giving taxpayer-subsidized research and technology over to private companies for "free" -- the kind of bottom-up transfer of wealth free-market purists like to pretend doesn't exist and that never enters the hand-wringing discussion over government handouts. But I digress.)

So here we are, coming off eight years of the Imperial Vice Presidency of Dick Cheney, which seems to have conditioned us into thinking the VP must be something more significant than a bench-warmer called in to do PR work like representing the White House at state dinners and funerals the President can't make or getting involved in First Spouse kinda stuff like reading to kindergartners in the "inner-city."

Not to understate the semi-importance of VP candidates (especially in Obama's unique case, given America's historical penchant for assassination, especially popular leaders labeled by racial politics as "black") but, isn't it more important to bring some focus to bear on Israel flirting with an attack on Iran?

An Israeli-Iranian war would undoubtedly draw in the U.S. military, putting over 100,000 U.S. troops now stationed in Iraq in the middle of a mess that'll make the insurgency look like kiddie play -- not to mention the potential for the needless death of even more innocents, fanning the flames of the self-fulfilling prophecy of Armageddon that millions of Bush supporters hope to speed up.

Memo to national news editors: We got into Iraq with cooked intelligence and escalating sanctions under the guise of "diplomacy." Fool us once, shame on them. Fool us twice and -- never mind shame -- say hello to zero credibility.

What Liberal Media?

You hear it all the time, especially during election season. "The media is biased" -- a criticism leveled from both the Right and Left.

In fact, there's a cottage industry devoted to "exposing media bias," most of which has people in the news biz rolling their eyes. And for good reason: not that media criticism is unwarranted, it's just that most of it, to put it bluntly, is oversimplified nonsense that generates more heat than light.

Perhaps the weakest aspect of pop media criticism is its lack of clarity. People talk about the media as if it were a single entity.

"The media"? Are we talking about the broadcast or print media? Are we talking about the Colbert Report, PBS, NPR, Fox News, the Wall Street Journal or the Cape Cod Times? Are we talking about reporters, editors, publishers, radio talk-show hosts, columnists, bloggers or TV pundits?

As Washington Post reporter Paul Farhi wrote in a recent issue of American Journalism Review, "critics often blame 'the media,' as if the sins of some are the sins of all. It's not just a bland, inexact generalization; it's a slur. The media are, of course, made up of numerous parts, many of which bear little relation to each other. Critics need to define their terms. Holding 'the media' responsible for some perceived slight is like blaming an entire ethnic or racial group for the actions of a few of its members."

Still, surveys show ever-increasing public skepticism about the traditional news media. According to survey data cited by media scholar S. Robert Lichter, two-thirds of the public thought the press was "fair" in a 1937 survey; by 1984, that figure had dropped to 38 percent, while only 29 percent said the same about TV news.

Adding insult to injury, a national survey conducted by Sacred Heart University in January found that only 19.6 percent of respondents said they believed "all or most" reporting, while a larger percentage (23.9 percent) said they believed "little" or none of it. Next stop: zero credibility.

These survey results should be taken with a grain of salt, in part because news consumers tend to overstate how closely they pay attention to news, as the Sacred Heart study indicates.

For example, the survey found that Americans described the New York Times and NPR as "mostly or somewhat liberal" -- about four times more often than they described those two outlets as "mostly or somewhat conservative."

"Leave aside the blunt generality inherent in this. (Is all of NPR -- from "Morning Edition" to "Car Talk" -- "mostly or somewhat liberal?") The more important (and unasked) question about this finding is its shaky foundation. Given that only small fractions of the populace read the Times or listen to NPR on a regular basis, how is it that so many Americans seem to know so much about the political leanings of the Times and NPR?" Farhi asks.

Part of this disconnect stems from the lack of actual content analysis among the general public and an over-reliance on anecdotal examples.

Take this year's primary campaign season, for example. Depending on which candidate you supported in the primaries, the universal claim was that the media was biased for or against Clinton or Obama. Yet a study of the A sections of three agenda-setting newspapers (the Washington Post, NY Times and L.A. Times), done by researchers at Bowling Green State University, paints a more nuanced portrait.

The study found Clinton and Obama received about the same number of "positive" and "negative" headlines from those papers (from Labor Day through the Super Tuesday primaries in early February). About 35 percent of the headlines for Obama were positive and 27 percent were negative. Clinton received 31 percent positive and 31 percent negative. The rest of the stories were considered either mixed (with positive and negative elements) or neutral.

So what's the deal? Is the entire news biz soooo biased that it warrants such a profound sense of distrust among the public?

My own biased answer is: of course there are media biases, most of which operate at the institutional level, shaping the way news is gathered and delivered regardless of individual preferences. But the "media bias" news consumers decry doesn't manifest itself in the way most people think, especially as conceived by those who think the media is "liberal."

That's going to sound "liberally biased" to Limbaugh and O'Reilly fans but it's a bias shared by former Bush press secretary Scott McClellan.

"To this day, I'm often asked about the 'liberal media' critique," he writes in his new memoir. "My answer is always the same. It's probably true that most (journalists) are personally liberal or leftward leaning and tend to vote Democratic. But this tilt to the left has probably become less pronounced in recent years."

I would say that's an understatement.

"Everything I've seen as a White House press secretary and longtime observer of the political scene ... suggests that any liberal bias actually has minimal impact on the way the American public is informed. We in the Bush administration had no difficulty in getting our messages out. If anything, the national press corps was probably too deferential to the White House," McClellan observes.

The run-up to the invasion of Iraq is the most obvious example. McClellan argues that the press was asking the wrong questions, focusing on the "march to war" instead of whether war was necessary. When it comes to Iraq, he writes, "the 'liberal media' didn't live up to its reputation. If it had, the country would have been better served."

For those of us who saw the invasion of Iraq as a war of choice and not necessity from day one, McClellan's observation is, by now, a truism. But what is interesting about his conservative view is that he takes it one step further.

"I'm inclined to believe that a liberal-oriented media in the United States should be viewed as a good thing," especially considering that the last several presidential administrations and the bulk of Congress have been "a succession of conservative/centrist leaders, either right of center or just left of center, who pursued mainstream policies designed to satisfy the vast bulk of middle-class American voters."

"Over the past forty years, there have been no flaming liberals in positions of greatest power in American politics. Under these circumstances, a generally liberal or left-leaning media can serve an important, useful role. It can stand up for the interests of people and causes that get short shrift from conservative and mainstream politicians."

Tell your right-wing friends to put that in their Limbaugh pipe and smoke it.

Moving beyond the oversimplified and misleading debate about liberal/conservative media, there's a deeper problem to consider.

These seemingly intractable, polarized news views show no sign of abating. In fact, there's every reason to expect things to get worse. With the Internet and the ability of news consumers to pick and choose which news they engage, I wonder how America will ever have a meaningful conversation about any national issue when we're all living in our own individual media bubbles, clinging to news that affirms our worldview while rejecting as worthlessly "biased" any information that doesn't fit neatly into our political philosophy.

That doesn't facilitate conversation. It encourages us to continue shouting past each other.

We Must Combat Government Secrecy

"They hate our freedoms," we've been told. Over and over again. Never mind the willful ignorance involved, even using that shallow Bushism as a measuring stick, we're losing the war on terror.

To say "we're losing" is not even controversial, much less news -- if you've been paying attention.

But in this neo-PC era, in which talking about obvious realities in the political arena is considered beyond the pale ("chickens coming home to roost," for example), merely suggesting that "we're losing" will be taken by some readers as an expression of eternal "anti-American" defeatism instead of the realistic assessment that it is -- a necessary first step toward the renewal of the Republic.

Exhibit A: Access to information is the cornerstone of a free and open society. I don't think even conservatives who love Ayn Rand, Milton Friedman, F.A. Hayek and the Wall Street Journal editorial page could disagree with that. Yet, under Bush, access to information has been increasingly restricted in the name of national security, at levels never seen before. And it's happening right under our noses.

So, even if you buy into the oversimplified "they hate our freedoms" mindset, the mere fact that since 9/11, America has gone from formal democracy to an official "secretocracy" must have the yet-to-be-captured bin Laden singing praises to Allah.

As former Bush White House press secretary Scott McClellan writes in his new book What Happened: Inside the Bush White House and Washington's Culture of Deception, "keeping the curtains closed and doors locked is never a good idea in government, unless it involves vital matters of national security. Secrecy only encourages people to do things they would prefer others not know about. Openness is critical for accountability."

"The Bush administration lacked real accountability in large part because Bush himself did not embrace openness or government in the sunshine. His belief in secrecy and compartmentalization ... ultimately self-defeating in the age of the internet, blogsphere, and today's heightened media scrutiny."

Last week, an official background paper on the new White House information security policy plan was leaked to the Federation of American Scientists (FAS).

In essence, the new policy, announced early last month, gives federal agencies the authority to designate many government press releases as "Controlled Unclassified Information" under the guise of "standardizing practices" and safeguarding unclassified government information deemed sensitive.

The official acronym is CUI. On its face, the policy gives federal information officers a single catch-all acronym to replace the various labels individual agencies use to control information, e.g., "sensitive but unclassified" or "for official use only." There are about a hundred labels used by Uncle Sam to mark degrees of need-to-knowness.

But, beneath the surface we find this 2006 Congressional testimony from the Office of the Director of National Intelligence (ODNI): "The great majority of information which is now controlled can be put in a simple unclassified, uncontrolled category, it seems to me," ODNI Information Sharing Environment program manager, Thomas McNamara, testified.

Here's the rub: under the new Bush CUI policy, as Steven Aftergood of FAS notes, the "great majority of the information" McNamara said should be uncontrolled is likely to remain controlled and unavailable to the public.

What if a member of the public (or reporter working as a proxy for Joe and Jane Q. Public) wants information that a federal agency has stamped CUI? According to the White House background paper, that person should submit a Freedom of Information Act request.

Speaking from experience, anyone who has submitted FOIA requests knows the process is anything but timely, cheap or even necessarily fruitful. The current FOIA system is, as my grandmother used to say, slow as molasses in January. (Several years ago I submitted a FOIA request to the Interior Department. It took six months and $360 to get a few hundred pages of fairly innocuous documents).

The unveiling of the new CUI policy happens to coincide with news that The National Archives and Records Administration (NARA) is overwhelmed and backlogged, facing a mind-boggling increase in electronic and classified records.

"The National Archives today faces two overwhelming challenges -- the exponential increase in government-held electronic records, and the geometric increase in currently classified and previously declassified records -- with which NARA has neither the resources nor the strategy to cope," National Security Archives director Thomas Blanton testified at an oversight hearing two weeks ago.

The hearing also revealed an interesting budget detail. We spend more than $8 billion a year keeping secrets and only $44 million declassifying them, which helps explain why National Archives administrators are calling for a "classification tax" on federal agencies to fund a National Declassification Center.

They're also pushing Congress to change the standards for how government info is classified; how historical records are released; and for the establishment of independent review boards -- a la the Kennedy Assassination Records Act and the Nazi War Crimes Disclosure Act.

"Institutionalizing presumptive withholding in a government-wide CUI policy could make it harder to overcome current secrecy practices when the opportunity to do so presents itself," Aftergood cautions.

The irony is hard to miss. In our Information Age, the information citizens need to make informed decisions is becoming less accessible.

Inheriting the Mess-o-patamia left behind by Bush & Co., it would be nothing short of miraculous if the first 100 days of the next administration accomplished nothing more than to pull back the curtains, open the White House windows and let the sun shine in. Then -- maybe, just maybe -- we can start "winning."

How the Government Is Passing Secret Laws

Once upon a time, a team of federal attorneys went before the Supreme Court only to discover that their entire case was based on a revoked executive order and therefore moot.

True story. Look it up. Panama Refining Company v. Ryan. The revoked presidential order was understandably missed by the attorneys. The revocation had never been made public -- an example of what legal scholars refer to as "secret law."

Cases like that caused Congress, in the '30s and '40s, to pen legislation aimed at bringing order to the dissemination of vital government information, amid the chaotic complexity of state administrative laws and downright shoddy record-keeping. Congress also established statutes to keep a growing body of secret law in check.

That's how we got the Federal Register Act of 1935, the Administrative Procedure Act of 1946 and the golden key to open government (and investigative reporting) -- the Freedom of Information Act (FOIA).

Those legislative acts exemplify one of the defining features of American government -- the publicizing of laws and regulations. The political philosophy isn't hard to understand. Secret laws are the antithesis of a free and open society, which explains why the first U.S. Congress mandated that every "law, order, resolution, and vote (shall) be published in at least three of the public newspapers printing within the United States."

But, never mind -- for the moment -- the decline of newspapers, and the harmful implications it has for democratic governance. Even more alarming is the underreported increase of unpublicized "secret laws," clandestinely cultivated in recent years.

We're talking everything from secret interpretations of the Foreign Intelligence Surveillance Act and opinions from the Office of Legal Counsel (OLC) to secret Presidential directives and transportation security orders.

And don't let the word "opinion" throw you off. If, for example, they're "opinions" issued by the OLC -- like the now infamous Yoo torture memos -- those kinds of "opinions" are binding on the executive branch.

So, while the Washington press heavy-hitters were analyzing flag pins and pastors, a Judiciary subcommittee hearing was held on "Secret Law and the Threat to Democratic and Accountable Government".

Among the half-dozen or so witnesses to testify was the director of the Project on Government Secrecy at the Federation of American Scientists, Steven Aftergood -- one of the nation's preeminent authorities on secret law. What should have been a top-story across the country was rendered invisible by a tsunami of triviality.

Here's some testimony you probably missed:

"There has been a discernible increase in secret law and regulation in recent years" to the point where "legislative intervention" is required to "reverse the growth."

Unsurprisingly, secret law really became entwined with the government during the Cold War. But today, "secrecy not only persists, it is growing. Worse, it is implicated in fundamental political controversies over domestic surveillance, torture, and many other issues directly affecting the lives and interests of Americans."

The law that governs espionage activity has been reinterpreted by the FISA court, and the specific nature of that reinterpretation has not been disclosed to the public.

In August 2007, the American Civil Liberties Union petitioned the court on First Amendment grounds to make public those legal rulings, after redacting classified information. The court denied the ACLU petition, claiming it didn't have the expertise to decide what information should be redacted.

The denial was issued despite it being evident "that there is a body of common law derived from the decisions of the (FISA court) that potentially implicates the privacy interests of all Americans. Yet knowledge of that law is deliberately withheld from the public. In this way, secret law has been normalized to a previously unknown extent and to the detriment, I believe, of American democracy," Aftergood testified.

Other areas of concern: "there appears to be a precipitous decline in publication of OLC opinions in recent years ... In 1995, there were 30 published opinions, but in 2005 there were 13. In 1996, there were 48 published opinions, but in 2006 only 1. And in 1997 there were 29 published opinions, but only 9 in 2007."

"One secret OLC opinion of particular significance, identified last year by Sen. Whitehouse, holds that executive orders, which are binding on executive branch agencies and are published in the Federal Register, can be unilaterally abrogated by the President without public notice."

Such orders mean "Congress is left with no opportunity to respond to the change and to exercise its own authority as it sees fit. Worse, the OLC policy ... implies a right to actively mislead Congress and the public."

Here's something else that's been waaaay underreported. As of January 2008, the Bush administration had issued 56 National Security Presidential Directives on a range of national security issues. Most of those directives have not been disclosed. "Texts of the directives or descriptive fact sheets have been obtained for about a third of them (19)," Aftergood testified. Only the titles have been obtained for 8 of the directives, and absolutely no information is available for 10.

Congress has also gotten in on the action, having "participated in the propagation of secret law through the adoption of classified annexes to intelligence authorization bills, for example."

Aftergood concluded his testimony, rightly observing that "it should be possible to identify a consensual middle ground that preserves the security of genuinely sensitive national security information while reversing the growth of secret laws."

That's why he's pushing for the passage of the State Secrets Protection Act -- S. 2533 -- which aims to balance conflicting interests of secrecy and public disclosure.

"The rule of law, after all, is one of the fundamental principles that unites us all, and one of the things we are committed to protect. Secret law is inconsistent with that commitment."

Of course, whenever someone points out how civil liberties have taken a back-seat in the name of "national security" under Bush, what's the typical response of true believers?

They call talk radio, blog and write letters-to-the-editor about how "liberals" and "leftists" aid and abet terrorists with a naive insistence that America's political leaders adhere to quaint luxuries like long-established Constitutional freedoms.

The old saw -- "loose lips sink ships" -- has been replaced by another now-familiar brain-dead mantra: "if you're doing nothing wrong, you have nothing to worry about." But the metastasizing growth of secret law pulls the rug out from under that flimsy argument, for an obvious reason: you can't know what you don't know.

Whistle-Blowers Under Attack

Look! In the sky. It's absurd. It's insane. No, it's Irony Man.

Not Iron Man -- Irony Man. News junkies know him as Scott Bloch, after it was reported last week that the FBI had raided the Office of Special Counsel. In the wake of the raid, the National Whistle-blower Center issued this statement, hinting at Bloch's Irony Man identity: "In a notably ironic turn of events ... Bloch -- who is charged with protecting federal whistle-blowers -- has been under investigation for, among other things, whistle-blower retaliation within his own agency. Most of the whistle-blower community has been disappointed with Bloch's tenure in office. He had no actual expertise in the field and was viewed as a patronage appointment."

I couldn't get in touch with Bloch but I think he would agree with my assessment that the FBI's timing couldn't be better -- at least for the folks organizing "Whistle-blower Week" in the nation's capital, which happens to be this week!

But I was able to catch up with the president of No Fear Institute, Dr. Marsha Coleman-Adebayo. NFI is organizing the Whistle-blower Week conference.

Coleman-Adebayo was a senior Environmental Protection Agency representative to the White House under the Clinton administration. While serving on a special U.S. commission working with the South African government, she blew the whistle on the deadly toll she discovered was being exacted in South African communities where vanadium pentoxide is mined by poor villagers to supply the steel alloy America relies on for making bridges, airplanes, cars, even knives, forks and surgical instruments. Vanadium is one of the elements mixed with steel that allows it to bend and contract without breaking.

When Coleman-Adebayo found that the lack of environmental regulations and labor laws in South Africa was literally killing poor miners and their families, she tried to call attention to the silent killer. Her superiors told her to forget about it, but her refusal to do so led to her increasingly hostile ouster.

In August of 2000, a federal jury found the EPA guilty of violating her civil rights. Coleman-Adebayo became an activist and organized a grass-roots campaign that eventually led to the passage of the Notification and Federal Employee Antidiscrimination and Retaliation Act (the No FEAR Act), which was signed into law by President Bush in 2002.

A year later, Congress began to gut the law. Since then, Coleman-Adebayo and the rest of the whistle-blower community has been working with congressional allies on new legislation.

"One of the goals of Whistle-blower week is to push for new legislation," she told me on Mother's Day.

One is called the Congressional Disclosure Protection Act, which is designed to protect workers against whistle-blower retaliation if they should testify before Congress.

"Almost always, when you testify (as a whistle-blower) before Congress, you are immediately retaliated against, usually fired. This Act will provide some protection when workers exercise their constitutional right to talk to Congress."

Another law being advocated is the Civil Rights Tax Relief Act -- to prevent the IRS from taxing compensatory damages awarded to whistle-blowers whose civil rights were violated. When Coleman-Adebayo was compensated for the EPA's retaliation, the IRS took half of it back in taxes! The Civil Rights Tax Relief Act would end that foolishness.

"People don't know it but they've gutted the 1964 Civil Rights Act through taxation. There are people who have gone into bankruptcy trying to get their day in court under a law that Dr. King died for."

Then there's No FEAR Act II, designed to make No FEAR I gut-proof by defining what disciplinary action must be taken when employers violate workers' civil rights in retaliation for whistle-blowing -- something left undefined in No FEAR I.

"I think it will go down as one of the most important pieces of civil rights legislation in history," Coleman-Adebayo said.

Because she has seen corruption in the federal government firsthand through her experience in South Africa, Coleman-Adebayo believes one of the most important things the next president -- whoever that turns out to be -- can do is "to clean up the federal government."

"We are hoping Senator Obama, Clinton or McCain will take up the mantle to clean up the corruption. Until the corruption is cleaned up, nothing the government does will work for the people. Whistle-blowers are simply the canaries in the mine. We die off first -- but not long after, the carbon dioxide comes out of the mines."

"The whistle-blower community is one of the most courageous communities I've been involved with. Through loss of jobs, families, even death, in some cases, they've made the decision to stand and fight. That's what the week is all about -- a community of people willing to stand up and fight for their country."

The canaries are calling. Will you listen?

A Bad Week for Journalism

"Great minds focus on ideas. Average minds focus on events and small minds focus on people" - anonymous

I'm hoping this week is a better one for journalism.

Last week began with the American Society of Newspaper Editors reporting that 2,400 full-time newspaper jobs were lost in 2007 -- the largest annual drop in 30 years, bringing the total number of tanked news workers to about 15,000 over the past decade.

"It was an even larger decrease than the 2,000 drop-off in the recession year of 2001," laments Rick Edmonds, media business analyst for the Poytner Institute.

Then, millions witnessed ABC's Democratic presidential debate moderators George Stephanopoulos and Charlie Gibson putting all of their feet (via their mouths) on the accelerator toward the collapse of modern journalism, in their trivial pursuit of "tough questions" on behalf of "ordinary Americans."

At this point, I wouldn't be surprised if the Washington press corps asked Obama if he's ever given someone the middle-finger and if so, what does that say about his character? "Mr. Obama, are you aware that one time, someone burned a flag somewhere in America at the precise time you were giving a speech. Does that say something about your patriotism?"

Notice when candidates are pilloried with red-herring inconsequential GOP talking points, campaign commentators point to the "wise" public, adroitly able to sniff out political flaws, as the reason. But when there's widespread public opposition to war policies or corporate-friendly trade agreements, the "wise" public turns back into the stupid, unthinking rabble the true elite have always seen it as, in need of a lecture about personal responsibility and what the experts say is in the "national interest."

And do you recall political reporters grilling W about cocaine allegations and his "character" when he was running? Neither do I. Bush said he didn't want to talk about it and that was that.

I won't belabor the absurdity of two highly paid "newsmen" claiming to have the slightest clue about the experience of "ordinary Americans" or the patronizing responses of Clinton and McCain, essentially arguing that economically-assaulted Americans are not obviously bitter, which implies the masses are enjoying the nearly $4-a-gallon ride. But Gibson thinks an annual salary of $200,000 is middle-class, when the overwhelming majority of Americans earn a quarter of that! And we're talking about Obama's elitism?!

Memo to Gibson and former Clinton confidant (conflict of interest?) Stephanopoulos: Other than "analyzing" the meaning of bitterness and the Sean Hannity-inspired Weather Underground nonsense, Obama's "character" issues are well covered in David Mendell's book Obama: From Promise to Power.

Maybe you ought to read it. You might learn why thinking Americans are way past you, having already read about why Obama begrudgingly accepted his campaign manager David Axelrod's advice to dumb down his policy-wonkish speeches to better connect with ordinary people before he ran for the Senate -- speeches later criticized by Washington pundits, Clinton and McCain as "empty rhetoric."

The book, published LAST SUMMER, covers everything from young Barry in Hawaii to the emerging Barack in Chicago who spent his post-Harvard years actually living and working with poor people (unlike his "regular folk" challengers). Mendell also explores Obama's relationship with the "controversial" Rev. Wright, since you can't seem to get enough of questioning the patriotism of a former Marine who happened to passionately articulate what a lot of "bitter" people are thinking and feeling.

If you want to keep a secret from "informed" Obama-critics, apparently the safest place to hide it is inside a best-selling book.

Though most public criticism I see and hear of the "mainstream media" doesn't distinguish between print journalism and TV news (and there are important differences, in terms of format, content and variety), it's still fair to ask: how are newspaper staff cuts and the Stephanopoulos/Gibson farce related?

Newsweek's Tony Dokoupil notes: "less than one person in five believes what he reads in print, according to the Project for Excellence in Journalism ... and a recent Sacred Heart University study found that nearly nine in 10 Americans believe that journalists are actively biased."

It's good that news consumers are skeptical, but the internationally televised gotcha "debate" in Philly has only made a bad news biz situation worse. And when I say "bad," I'm speaking in relative terms. "Bad" means news organizations are only bringing in 20 percent profit, instead of the 30 percent margins owners used to rely on. Not exactly the poorhouse.

Having been a reporter and columnist for 13 years and an assistant news editor for a year, I don't want to be too glib about what many of my colleagues consider a "crisis in journalism." News gathering is an expensive, labor-intensive, boots-on-the-ground mission, and the current business model is in serious financial trouble.

Why should non-journalists care about newspaper cutbacks and why should cynical bloggers hold off on celebrating a dying "old media"?

PBS NewsHour anchor Jim Lehrer provides a succinct answer. "Most of the major stories about which there is so much talk, consternation, blogging, and yelling on shout shows all began with a print news story," Lehrer told USA Today.

He's right. The overwhelming majority of original reporting is done by professional print journalists. Most stories covered on local TV news stations are culled from the morning paper. Talk radio wouldn't exist without newspapers and just about every political and news blog in the country is a literal para(web)site of print journalism.

"The fewer the resources that are devoted to (print journalism), the poorer the public." Take The Washington Post story about the scandalous conditions at Walter Reed Army Medical Center, for example.

"Nobody would have known about that without The Washington Post devoting four months, with two reporters working full time, to that story. Those kind of resources won't be available if newspapers continue to cut back."

Without strong newspapers, "my worry is that nobody else is going to fill in that reporting vacuum."

That giant sucking sound you hear is democracy wheezing.

Murky CIA Activity at Military Outposts

With California weather in my blood, Cape Cod spring feels like an extension of winter.

What keeps me warm until summer comes is baseball -- and fantasies about vacationing on a tropical island like Guam, where my 6th- and 7th-grade best friend, David Reed, and his Navy dad were transferred from the now defunct Oakland Navy Base.

"Where's Guam?" I asked.

"It's some tropical island in the North Pacific Ocean. Kinda like Hawaii, but no tourists," Dave said. Then, already honing my gift of asking conversation-changing questions, I said: "Why do we have a base in Guam?"

It wasn't until years later I learned that Guam is a key FOB. That's military jargon for "forward operating base," just one of a million or so military acronyms.

In a world where America is the self-appointed global cop, a FOB is like a police precinct -- a strategically located substation from which hardware and personnel can be quickly dispatched to keep the neighborhood rabble in line.

Diego Garcia is the other key FOB that people who consider themselves well-informed about the Busheviks' "war on terror" ought to know about.

David Vine, assistant professor of anthropology at American University and author of the forthcoming book Island of Shame: The Secret History of Exile and Empire on Diego Garcia, details the post-9/11 significance of these FOBs, especially Diego Garcia -- the coveted military outpost in the Indian Ocean's Chagos Archipelago, where the beaches look like one of those Corona beer commercials.

In the 1950s, U.S. war planners were worried about local populations catching the decolonization bug sweeping the Third World. So the U.S. Navy came up with the "Strategic Island Concept," which, in part, identified the British colony of Diego Garcia as a good place to build an isolated base, helping to ensure that former colonial subjects in the Middle East and Africa understood that freedom means whatever the hell the Washington consensus says it means.

But, there was one small problem. Actually, 2,000 small problems -- the Chagossians, with ties to the island since the Portuguese first shipped in slaves and indentured laborers from Africa and India in the late 18th century to work the coconut plantations run by French Mauritians.

When British officials were secretly negotiating a 50-year lease with the U.S. in the 1960s, British diplomats were cutting a deal to give Mauritius its independence -- minus Diego Garcia, which just so happens to be in violation of the U.N. Charter, if, like me, you're into that kind of namby-pamby stuff.

The Brit playbook called for the Palestine play -- relocate much, if not all, of the indigenous population into a neighboring country to make way for new settlers. For the Palestinians, GB had Jordan in mind. For the Chagossians, it was Mauritius that was to absorb the dispossessed.

Of course, the Chagossian problem would be a lot easier to handle because there were only a couple thousand refugees and not several hundred thousand with millennia-old roots in "holy land." And just as Golda Meir famously said of the Palestinians, the Chagossians were said not to exist, which explains why most mainstream news accounts of the tiny atoll include some line about it being "an uninhabited island" -- a remnant of British government propaganda intended, as one official put it, to "maintain the fiction" that the Chagossians were transient contract workers rather than people with roots in Chagos for five generations or more, Vine observes.

Vine goes on to point out the growing military importance of DG ever since the Chagossians took their coconuts to Mauritius. Fast forward to forward operating base Diego Garcia during Gulf War I. It served as the prepositioned weapons-and-supply cache for Marines sent to Saudi Arabia in 1991. The island, named after a ship, later became a launch pad for lobbing long-range bombs on Iraq.

After the '91 war, "the dream for many in the military became the ability to strike any location on the planet from Barksdale Air Base in Louisiana, Guam in the Pacific, or Diego Garcia," Vine reports.

After the 9/11 attacks, DG became even more strategically significant. The Air Force sent 2,000 of its personnel to a new 30-acre housing facility there called "Camp Justice."

(Seriously, who the hell comes up with these ridiculous names?)

When the U.S. invasion of Afghanistan began, B-1, B-2, and B-52 bomber sorties were flown out of "Camp Justice" and the island's blue lagoons were used to store prepositioned weapons and supplies for the 2003 invasion of Iraq.

In 2006, with the publication of Stephen Grey's Ghost Plane documenting the presence of a CIA-chartered plane used for rendition flights at DG, reports of "Camp Justice" being a CIA "black site" for detainee interrogation began to leak out. Official rumors were followed by a Council of Europe report identifying Diego Garcia as a secret CIA prison location, along with "black sites" in Poland and Romania.

This past February, British Foreign Secretary David Miliband told Parliament he learned of two instances when the Bush administration, in violation of the base lease with Britain, used Diego Garcia like a Guantanamo University satellite campus.

"The State Department's chief legal adviser said CIA officials were 'as confident as they can be' that no other detainees had been held on the island ... Within days, U.N. special investigator Manfred Novak announced new evidence that others had been imprisoned on the island. Many suspect the United States may hold detainees on secret prison ships in Diego Garcia's lagoon or elsewhere in the waters of Chagos."

With the legal die having already been cast for secret "renditions" of murkily defined "enemy combatants," kept in secret prisons without recourse to habeas corpus, consider yourself warned. If some guy in a suit approaches you on the street and says you've won a free dream trip to the exclusive tropical paradise of Diego Garcia -- RUN!

Photos I've seen of the island are gorgeous but it would be hard to appreciate the beauty while getting waterboarded.

In the meantime, you might ask your Congressman why there haven't been hearings on where -- and what -- in the world Diego Garcia and those other FOBs are.

The Color of Wealth

I'm morally exhausted from dealing with, and talking about, race too.

But if we're going to have a conversation, I like the tone and tenor set by the Senator from Illinois after being forced to go there because of a transparently hypocritical "controversy" in which the black guy is predictably caricatured as "anti-American" and/or "anti-white reverse racist."

In his historic Philadelphia speech on race, Obama envisioned two parallel tracks "on the path of a more perfect union."

Black America, he advised, should embrace "the burdens of our past without becoming victims of our past," while white America, Obama rightly noted, ought to honestly -- and without guilt -- confront the fact "that what ails the African-American community does not just exist in the minds of black people; that the legacy of discrimination -- and current incidents of discrimination, while less overt than in the past -- are real and must be addressed."

Before there can be meaningful discussion, typical Americans will need to come face to face with some meaningful facts about U.S. economic history. Beneath the superficial race talk is the very real and complex issue of the color of wealth.

Actually, the important book The Color of Wealth -- written by a multi-racial research team at United for a Fair Economy -- is as good a resource as any to delve into the complexity of race and class in America. It's also a good way to get acquainted with some basic history that helps explain this great country's persistent racial wealth divide.

The racial impasse: according to poll after poll, the majority of white America sees African-American economic prospects as just as good as, if not better than, their own. The general perception in white America is that "the playing field is level," the polls tell us. And, if you're an ethnic immigrant, you don't see what all that white supremacist history has to do with you anyway. Yet, a good segment of black America continues to talk about the persistence of "institutional racism" and how whites have an unfair and unacknowledged advantage, etc.

Starting with the (obvious) observation conservatives seem to think is some kind of sublime insight into human nature, The COW points out: "of course, individual effort does make a difference in financial success, compared to how the same individual would have fared without putting forth an effort. But Americans begin the race from different starting lines. Not only do well-off people, primarily whites, have significant head starts, but even many working-class whites have modest advantages when compared with working-class people of color, most of whom begin far behind whites' starting line."

So while it's true, for example, that black per capita income doubled between 1968 and 2004, jobs and income are only one small part of the picture. Wealth and assets, and how these economic foundations cascade down through generations by way of inheritance, are at the heart of the matter.

Citing Census data, The COW correctly notes that "three-quarters of white people own their homes, while a slight majority of people of color are renters. In times of inflation, housing becomes easier to afford for homeowners with fixed mortgage rates, while renters see their housing costs rise."

"In times of recession or depression, those with savings accounts can better weather unemployment, while those without savings can be sunk into debt and deprivation. And in times of economic growth, those with assets can invest them or borrow against them to take advantage of business opportunities."

It gets deeper.

And Thomas Shapiro goes deep in analyzing the research on race and inheritance in The Hidden Cost of Being African-American. Shapiro, a sober-minded white guy, reports that whites are much more likely to inherit money from deceased relatives than people of color, noting that one in four white families received an inheritance after a parent's death, averaging $144,652, while only one in 20 black families inherited money or assets with an average worth of $41,985.

Another study found that as of 1989, one third of white baby boomers stood to inherit more than $25,000, compared to one in 20 black baby boomers.

Those numbers indicate that most white people DO NOT get any inheritances from deceased family estates. But when Shapiro interviewed black and white working class families, he found it far more common for white working-class families to hand down modest sums.

And as The COW points out, "whites who get such help often don't think of themselves as inheritors, but consider such transfers to be just a normal part of family life. Contributions to a down payment on a house and college tuition are the most common forms of family financial aid."

"About half of white families give this kind of head start to young adults, compared with about one in five black families."

Citing Shapiro's scholarship, COW notes that "in white families, money flows from parents to children, while in black families, money flows from adult children to their parents and other relatives."

None of these tip-of-the-iceberg facts means that white Americans haven't really earned it, or worked hard. But they do point to the inescapable importance of previous generations' economic status in explaining present-day wealth distribution -- whether a family's financial foundation goes back to the 1862 Homestead Act, when millions of acres of land were given exclusively to whites; involves GI Bill college benefits used by millions of white World War II vets but not accessible to most blacks because of segregation; or traces to restrictive property covenants that prevented white homeowners from selling to black buyers until the 1950s.

Personal responsibility? YES. But, as Shapiro puts it, "the real story of the meaning of race in modern America must include a serious consideration of how one generation passes advantage and disadvantage to the next. While ending the old ways of outright exclusion, subjugation, segregation, custom, discrimination, racist ideology, and violence, our nation continues to reproduce racial inequality, racial hierarchy and social injustice that is very real and formidable for those who experience it."

We can't even begin to have a fruitful talk about race without acknowledging some basic historical facts. Otherwise, you can color this whole debate stuck on stupid.

Why Does Congress Leave War to the President?

"Conservative or liberal, we are all constitutionalists" -- Barack Obama, The Audacity of Hope

It's like a perfect storm. A "unitary executive" jet stream swirls over the nation's capital. There's a hailstorm of 24/7 news coverage of presidential politics. Add to the mix the 5th anniversary of the fog of war in Iraq and we're talking near-zero visibility.

We fair-weather fans can take some solace in the irony that the present fog happens to coincide with Sunshine Week -- a time the Fourth Estate devotes to shining a light on the Constitution.

Let there be light -- even if it's just a sliver of sunshine to chase away the shadows cast over the Constitution -- the explicit source of authority to "declare war ... raise and support Armies," as well as the implicit power of overseeing military matters.

These are powers that America's constitutional authors saw fit to invest in Congress; not the President (see Article I, Section 8).

And that's why the first ever U.S. Congressional investigative committee was established to probe the 1792 military engagement against this continent's indigenous people in the "Northwest Territory." U.S. forces were under the command of General Arthur St. Clair and Congress wanted to know how the hell a bunch of "backward" Indians managed to wipe out half the General's army.

Charles Stevenson, a former longtime professor at the National War College and now with the Nitze School of Advanced International Studies at Johns Hopkins University, tells us that for the next century or so about half of all Congressional investigations were related to military activities. But in the second half of the 20th century, only about 10 percent of all congressional hearings involved defense or foreign policy issues.

Stevenson's scholarship provides useful reference material. He notes that "despite widespread views that the standard -- and preferred -- practice is for Congress to go on vacation once a war starts, leaving all key decisions to the President and his commanders, there are ample precedents showing vigorous congressional involvement in the management and oversight of major military operations. Sometimes that involvement has been disruptive or even harmful, but often it has been constructive."

The origins of this Constitutional debate can be traced to August 17, 1787 when the Committee on Detail took up the power to "make war." The point was made that Congress would act too slowly, but James Madison's argument won the day when he suggested changing "make war" to "declare war," which would give the President "the power to repel sudden attacks" without violating the spirit of checks and balances.

And so the argument goes: supporters of broad Congressional war powers cite Madison, while unitary-executive types call on Alexander "Strong President" Hamilton, though putting war power in the hands of Congress wasn't the real flashpoint of the early debate. Standing armies were the issue; so much so that prominent patriots like Patrick Henry and James Monroe opposed ratification of the new Constitution because of their opposition to standing armies.
In fact, the standing army beef is what gave birth to the Third Amendment, prohibiting soldiers from being "quartered in any house, without the consent of the owner."

Interesting to note: Congress has only declared war in five conflicts but has authorized military action on 15 occasions, using a variety of language with varying degrees of specificity (not including the various military engagements that occurred without formal Congressional approval).

Even more interesting: since World War II, not a single military action has been authorized by Congress using the "declare war" phrase. Wars after World War II have been pursued through other use-of-force authorizations.

That includes authorizations for military force in both Iraq wars -- the difference being that the 1991 war had U.N. Security Council backing before hostilities began. The other big difference is that Poppy Bush's Iraq War was 80 percent funded by other nations. Iraq War II has been financed with borrowed money because of W's stubborn commitment to tax cuts for the wealthy.

Congressional power to end military engagements? There's Nicaragua 1932. Somalia 1994. Haiti 1995. Oh wait, I skipped Algeria 1815, when Congress refused to give the President an I-declare-war card.

The most far-reaching Congressional war power on the books is the 1973 War Powers Resolution, enacted with an override of President Nixon's veto.

That law, among other provisions, requires the President to consult with Congress before committing troops to hostile action.

"The bottom line," to go back to Stevenson, "is that Congress need not sit on the sidelines as wars approach or are fought. The precedents ... provide an ample menu of options, if lawmakers are willing to make the judgments and take the risks and opportunities available."

After five years of war in Iraq, the fog has clouded the Constitution. Of course, even in the sunlight, there are those who will cling to the foggy notion that the President is the be-all and end-all when it comes to military matters. But, in the sunlight, such reasoning can be seen on the wrong side of the Constitution and without historical precedent.

Did Clinton Underplay the Gender Card?

Maybe Hillary just has bad timing. She just so happens to be running for Prez when the country is suffering from a virulent strain of Bush fatigue, exacerbated by a recurring post-Clinton hangover, having downed a bottle of vintage 1990s neoliberal economics served in a really hot-looking stock-bubble glass. Now, we're all on the rocks.

That's one part Newt Gingrich eye, two parts corporate-friendly trade agreement, preferably NAFTA brand, with a few shots of right-wing Monica Lewinsky pucker to really give you that warm and fuzzy feeling inside.
Throw in the fact that Hillary is running against a political Barackstar and -- man -- she's really up against it. (Though it's possible she could pull it off, with the help of superdelegates and a few million stubborn voters in Texas and Ohio).

Not that Hillary hasn't talked about the historic possibility of being the nation's first female president. But, why hasn't she played the "gender card" -- to the hilt? And I'm not saying that just because March is Women's History Month.

Ever hear of "womenomics?"

Though the term has been around since at least 1999 (and probably longer in feminist circles), "womenomics" really made a splash in April 2006 when The Economist chose to focus on the untapped Gross Domestic Product (GDP) growth to be had in closing the gender gap.
"Arguably, women are now the most powerful engine of global growth," , The Economist observed, laying out the case for why it makes economic sense to increase the participation, and pay, of women in the workforce, the world-over.

Anticipating the (sexist?) counterargument that the more women work, the fewer children they will have, Economist editors offered a rebuttal.


Understanding the Obama Surge

I like Hillary. And I don't quite understand the visceral hatred she evokes in some people. But I also don't get why the Clinton camp likes to talk about "experience" and her "record."

She's only been a senator for seven years. What record? I called up my peeps in the Big Apple to ask if I'd missed something. Did Hillary invent the Internet? Cure cancer? Rein in corporate power? What record?

Whatever.

Bottom line: other than Kucinich, any Democrat who's been in the House or Senate while Bush has been in office shouldn't be talking about their record.

But, if you must compare Clinton and Obama's record, a good place to start is Congressional Quarterly.

"Judging by their Senate records, voters could pick either one of them and get more or less the same package. Clinton and Obama may ... (talk) about their differences ... but the reality is that their Senate careers have been more similar than their campaigns would ever admit," CQ reports.

"Their voting records are nearly indistinguishable. Although both have good working relationships with Republicans, Congressional Quarterly's annual vote studies show that Clinton and Obama both had strongly partisan voting records last year. In fact, both of them joined their fellow Democrats in mostly party-line roll calls more often than their own majority leader, Harry Reid of Nevada. In the past year, Clinton voted with her party on 98 percent of the questions that pitted a majority of Democrats against a majority of Republicans, while Obama's score was 97 percent. Reid sided with his party on only 95 percent of those votes."

CQ notes that both have had some successes but those are the exceptions to the rule. "Obama can claim credit for being a central player, along with Democratic Sen. Russ Feingold of Wisconsin, in the enactment of last year's lobbying and ethics law; Clinton's intervention at key points helped pave the way for the creation in 1997 of the State Children's Health Insurance Program, or SCHIP."

Pocketbook issues may be the Number One voter concern, but the invasion and occupation of Iraq is the defining moral/political issue of our time, which CQ barely mentions.

"There is one major disagreement that isn't reflected in their Senate records: Clinton voted to authorize the Iraq War in 2002, while Obama spoke out against it. Obama has won strong support from anti-war Democrats because of that difference, but because he wasn't in the Senate at the time, he wasn't able to cast an official vote against the war."

Missing from the CQ analysis is the fact that Clinton, unlike Edwards, has never come out and said: "You know, I was absolutely dead wrong about Iraq." Instead, she politicks the issue by doing the whole we-were-given-bad-info show, which is simply not credible given that people like me were writing about the lack of WMD in Iraq as early as 2000, based on explicit information provided to me by former UNSCOM inspector Scott Ritter and other on-the-ground experts.

The idea that I had better intelligence on Iraq than Clinton is absurd and so is the I-was-misled line. No, the reality is, she didn't do her homework.

And, to top it off, she voted for the Lieberman-Kyl amendment, declaring the Iranian Revolutionary Guard a terrorist organization, which many see as the needed justification to green-light an attack on Iran and as evidence that Clinton is stuck in her distorted pre-Iraq-invasion judgment.

I'll admit I have a certain natural Gen X affinity for the Senator from Illinois -- not because he's black. That's pre-millennial thinking. (The whole gender/race/Vietnam War/liberal-conservative hang-up the boomers are on ain't -- yes, I said ain't -- an issue for us post-Civil Rights kids. Race and gender still matter, but not ultimately. It's like, even though we've never had a black or woman president, we're past that. You think the Bush twins have Lawrence Welk in their iPod? I bet they jam to will.i.am, and probably even Kanye West, his Katrina emergency fundraiser remarks notwithstanding).

Again, I like Hillary; McCain too -- as people. But, I don't vote based on whether or not I think it would be cool to break bread or have a beer with a candidate. And though I'm on the senior end of the generation that's making its presence felt in this surge of Obamaism, I'm not so naive as to think presidential politics is THE ANSWER to America's problems.

In fact, I've written a number of columns explaining my view of U.S. "change"-history; namely, that every step this nation has taken toward fulfilling its democratic (small d) potential was preceded by a movement -- whether we're talking the abolition of slavery, women's suffrage, labor rights, or civil rights. Obama speaks to that.

But, it's not his rhetoric or record that entices me. I'm intrigued because he's the only left-handed candidate in the field. See, I'm a southpaw myself. And lefties have a tendency to blaze their own path in this right-handed world, partly because lefties use the right (creative) side of their brain. And if there's one thing this Bush-fatigued nation needs, it's to regain its right mind.

What Is the Point of Congress?

Seems our neighbors in the northeast have grown over-weary of the current White House regime treating the Constitution and the laws enacted by the people's representatives like an optional menu.

Petitioners in Brattleboro, VT gathered enough signatures (5 percent of the electorate) to put a question on their upcoming town ballot that calls for Bush and Cheney to be arrested "for crimes against our Constitution."

Predictably, Bush loyalists and assorted Drudge Report readers across the nation are none too pleased -- even though the Vermont measure, which will be voted on March 4, is about as symbolic as they come.

News of the ballot question circulated through cyberspace. A storm of neo-complaints followed, hitting Brattleboro like a wicked Nor'eastah.

"In e-mail messages, voicemail messages and telephone calls (to Brattleboro officials), outraged people are calling the measure the equivalent of treason and vowing never to visit Vermont" the Associated Press reported.

One caller asked: "Has everyone up there been out in the cold too long?" Another said: "I would like to know how I could get some water from your town. It's obvious that there is something special in it."

Others, like Brent Caflisch of Rosemount, Minn., were a bit less circumspect. Oh ya. "Maybe the terrorists will do us all a favor and attack your town next, our country would be much safer with several thousand dead wackjobs in Vermont," according to the AP account.

That's what they're calling defenders of the Constitution these days -- "wackjobs," like Paul Craig Roberts.

Though he's not a Brattleboro resident, Roberts is one of millions of Americans bitten by the same impeachment bug buzzing around Brattleboro. He also happens to be the former Assistant Secretary of the Treasury during the Reagan administration, former associate editor of the Wall Street Journal editorial page, contributing editor of National Review and author of the book "The Tyranny of Good Intentions."

Roberts breaks it down like this: "In truth, Congress gave up its law-making powers to the executive branch during the New Deal. For three-quarters of a century, the bills passed by Congress have been authorizations for executive branch agencies to make laws in the form of regulations. The executive branch has come to the realization that it doesn't really need Congress. President Bush appends his own 'signing statements' to the authorizations from Congress in which the President says what the legislation means. So what is the point of Congress?"

In case you're wondering, a "signing statement" is a presidential footnote attached to a law passed by Congress. It instructs the executive branch on how to interpret the law, essentially giving the president line-item veto power to cherry-pick provisions of law to be followed or ignored, turning the idea of checks-and-balances into a joke.

Historically, signing statements have been used, on rare occasion, by Republican and Democratic administrations going back to the days of James Monroe and Andrew Jackson. But since Reagan, signing statements have been used with increasing frequency. According to the Law Library of Congress, No. 43 has issued over 700!

Ironically, perhaps, it was candidate Clinton's husband -- now her fierce campaigner -- whose promiscuous use of the line-item veto was checked by the Supreme Court in the 1998 case Clinton v. City of New York, which declared line-item vetoes unconstitutional.

Now, with everyone focused on the primaries, Bush gives us another example in signing the 2008 National Defense Authorization Act last week, attaching a signing statement even after having rejected Congress' first version because it would have supposedly made the Iraqi government vulnerable to "expensive lawsuits."

Congressional Quarterly reported on the provisions Bush intends to ignore: "One such provision sets up a commission to probe contracting fraud in Iraq and Afghanistan. Another expands protections for whistleblowers who work for government contractors. A third requires that U.S. intelligence agencies promptly respond to congressional requests for documents. And a fourth bars funding for permanent bases in Iraq and for any action that exercises U.S. control over Iraq's oil money."

Bush may be a lame duck, but don't sleep. If Congress doesn't grow a Brattleboro-like backbone and impeach this administration before it leaves office, the precedent will be set and the Constitutional question of the future will be the one Roberts is asking: What is the point of Congress?

Where Are the Working Class Heroes?

"A working class hero is something to be" -- John Lennon

Dear John,

What do you think about a "Working Class Hero" remix? Maybe change the chorus a bit. "A working class hero was something to be..."

In the years before your death, compassionate politics focused on the poor and the working class. The politics of today, at least the "compassionate conservative" variety, has cut and run from the "War on Poverty," proclaiming the half-hearted effort a failure. In the new millennium, the "War on the Middle Class" is all the rage.

Oddly enough, John, serious people -- mostly Art Laffer-lovin', Ron Paul Republicans -- still argue we live in a "classless society," which means you're considered a "radical" provocateur of "class warfare" if you talk about class out loud. It's classy not to talk about class. Apparently, panhandling policies geared toward removing the poor from sight aren't enough. Now, we don't even want to hear from poor folk. Today's motto is: the poor should not be seen, or heard. Next stop: eugenics. Survival of the richest.

It's no longer compassionate to serve the poor anything other than a nice, warm cup of shut-the-hell-up to go with their healthy portion of Bill Cosby sermon. Outside of pious worship services and stop-gap charity organizations, you can't talk about poverty without suggesting, explicitly or implicitly, that the poor deserve to be poor because they're stupid and lazy.

Even the leading Democratic candidates are careful not to utter the words "poor" or "working class" in their speeches. It's all about "the middle class" -- a phrase more slippery than a hockey rink covered in Crisco.

Of course, there's lots of vague and vacuous verbiage slithering out of politicians' mouths. Words like "change" and "hope" and "experience." And "middle class" -- a term for which there's simply no consensus definition. Ask the world's economists for a definition, line their answers up next to each other, and you still couldn't reach a conclusion.

OK, that's an exaggeration. Economists have a squishy sense of what kind of loot qualifies as middle-class. But even that's misleading because being middle-class isn't just about income. What's middle-class on Cape Cod is different than what's middle-class in Charlotte, N.C., or Marin County, California, for example. Depending on where you live, the price of middle-class life varies.

And depending on what expert you ask, middle-class income ranges from $40,000 to $100,000 a year, give or take. But if you ask Mr. and Mrs. Average American, you'll get a much different picture. According to the National Opinion Research Center, 50 percent of families who earn between $20,000 and $40,000 a year think of themselves as "working class" or "middle-class." Nearly 40 percent of families earning between $40,000 and $60,000 annually, and 16 percent of families who earn over $110,000 a year, think of themselves as "middle class."

Congress recently asked its research service to define "middle class." Using 2005 Census Bureau data, and beginning with a look at income levels, CRS found 40 percent of the nearly 115 million households in the U.S. earned less than $36,000 a year. The next 40 percent rung up the economic ladder made between $36,000 and $91,705 annually. The top 20 percent made $91,705 or more. But, as MSNBC reported, "those numbers don't adequately reflect the state of mind of those who consider themselves middle class. Surveys have shown that, while people consider $40,000 a year to be the low end of what it takes to buy a middle-class life, some people who make as much as $200,000 a year still consider themselves middle class."

The popular middle-class state-of-mind may explain why politicians pander to the mushy middle but that shouldn't be confused with populism or appealing to the true American majority. Close to half of all American households are bringing in less than $36K a year!

Of course, John, it's ridiculous to think the life-opportunities for a family earning $40,000 annually -- a quarter of which might go to pay daycare expenses -- are even in the same ballpark as those of 200K-a-year families. And that's what's got me scratching my head.

When presidential candidates talk about "the middle class," are they talking 200K or the 20-to-40K range? It would be interesting (and maybe disheartening) to hear the candidates get more specific about which "middle class" they're referring to.

I won't hold my breath, waiting for an answer. So I figured I'd write to you, John, because you have a better view. Maybe you can tell me: Where's the working-class hero?

What Happens When Politicians Promise Change

I'm old enough to remember when parents actually made their children go outside and play. Matter of fact, when I was a kid some of my kinfolk would even suggest where to play. "Hey, Sean, why don't you go play in traffic." They were joking. I think.

Maybe it was the Marine in him, but my stepfather's go-outside-and-play ethic didn't let up, even on special occasions like the Super Bowl. I think it was the year the 49ers gave the Broncos a world-class horse-whoopin' that Pop said something like: "Why are you watching people play on TV? Never mind watching the game. You should be outside doing it yourself."

Times have changed.

But to be fair, it was easier to go out and play when I was a kid, back when TV went off the air in the wee hours to the sound of the national anthem; back when there were only three or four channels, including all of those snowy channels on the UHF dial; back when Super Bowl winners stomped their opponents into oblivion by halftime. There wasn't any of this down-to-the-wire, New England Pats-style drama where the game is won by a field goal or a referee's call. Nope. In my day, the Super Bowl was over by halftime. "Oh, it's 48 to 3? Let's go outside and play football, like the man said!"

So, I suppose my stop-being-a-couch-spectator childhood helped condition me to be skeptical -- maybe even cynical -- about presidential politics, which has all the trappings of a spectator sport where muted questions lurk in the dark recesses of the mind: Why am I rooting for this team? Who cares which team wins? I don't win anything, other than the fleeting and overrated feeling of I-told-you-so pride. Nothing changes for me or anyone I know.

I know the conventional wisdom. Voting is a civic duty. Election '08 is a watershed historical moment. "Change" and "hope" are in the air. Reminds me of when Bill Clinton's political star was on the rise, only to be followed by him "feeling our pain," while marching the Democratic Party steadily rightward.

With Mike Huckabee and Barack Obama being the underdog winners in Iowa, "change" and "hope" are with us again. And that's a good thing.

Like most Baptist preachers, Huckabee is a smooth talker; though Obama's Iowa victory speech was on the next level -- genuinely moving and probably a lock for Great American Speech Hall of Fame status. But all this "change" rhetoric is a little suspicious. And it's not just the voice of my stepfather talking.

After studying the foreign policies of past presidents, whenever I hear presidential candidates talking about "change" I reflexively sneer: yeah right. Because one of the most striking aspects of the foreign policy decisions of past presidents is their consistency -- not in style, but substance; no matter who's in the White House defending the "national interest" abroad. The consistency being that "national interest" is understood to mean business (class) interest, as Maj. General Smedley Butler tried to tell us in "War Is A Racket" back when FDR was in office.

Change? One of Hillary's key advisers is Madeleine Albright -- the diplomat who, when asked on "60 Minutes" if she thought the deaths of 500,000 Iraqi children under the U.S.-led sanctions (during the Clinton years) were worth it, answered: "We think the price was worth it."

One of Obama's foreign policy advisers is Anthony Lake. The same Anthony Lake who was "pushing for the invasion of Haiti" as a diplomat in the Clinton administration, as he told PBS' "Frontline" in 2004.

On the Republican side, Huckabee is reportedly yukking it up with political strategist Dick Morris, Clinton's former adviser. And Huckabee's national campaign manager is GOP insider Ed Rollins, Ronald Reagan's former campaign director.

Giuliani is down with Norman Podhoretz, author of "World War IV" and the conservative intellectual who thinks we should bomb Iran, like, yesterday.

In Sen. McCain's corner is Alexander Haig, whose record overseeing U.S. policy in El Salvador, Guatemala, Honduras and Nicaragua sends shivers down the spines of peace-lovers everywhere.

Mitt's man is Cofer Black, longtime CIA officer and Blackwater USA executive.

Change? You want that in quarters or dimes and nickels?

What Does the Future Hold for Us?

It's the start of a new year. 'Tis the season to frolic in the future -- a time to consider our utopian dreams and confront our dystopian nightmares. Or, as my grandfather is fond of saying, now's a good time to hope for the best but prepare for the worst.

So, let's check the futurology front. And who better to consult than the forward-looking folks at The Futurist magazine?

Every year since 1985, TF editors have been publishing "the most thought-provoking ideas and forecasts" that surfaced in their pages in the previous year. This year's "Outlook Report" is fascinating, as always, including the Top 10 forecasts for 2008.

1. The world will have a billion millionaires by 2025.

Meanwhile, water shortages and poverty will grip two-thirds of the earth's population. Wealth gap? How about wealth chasm?

2. Fashion will go wired as technologies and tastes converge to revolutionize the textile industry.

We're talking "color-changing or perfume-emitting jeans and wristwatches that work as digital wallets." They call it "smart fabrics."

3. The threat of another cold war with China, Russia, or both could replace terrorism as the chief foreign-policy concern of the United States.

Citing Edward Luttwak, the editors note, "scenarios for what a war with China or Russia would look like make the clashes and wars in which the United States is now involved seem insignificant. The power of radical jihadists is trivial compared with Soviet missile capabilities, for instance. The focus of U.S. foreign policy should thus be on preventing an engagement among Great Powers."

4. Counterfeiting of currency will proliferate, driving the move toward a cashless society.

There's Big Brother and then there's Big Brother's big brother -- Big Biz Brother.

5. The earth is on the verge of a significant extinction event.

The World Resources Institute reports the coming century "could witness a biodiversity collapse 100 to 1,000 times greater than any previous extinction since the dawn of humanity. Protecting biodiversity in a time of increased resource consumption, overpopulation, and environmental degradation will require continued sacrifice on the part of local, often impoverished communities."

6. Water will be in the 21st century what oil was in the 20th century.

Though the world is full of water (the salty kind), water shortages and droughts are popping up faster than season premieres of yet another new reality TV show. Future war protesters will be carrying signs that say "No War for Water!"

7. World population by 2050 may grow larger than previously expected, due in part to healthier, longer-living people.

Think the world is crowded now? Too much traffic? The UN upped its global population forecast from 9.1 billion people by 2050 to 9.2 billion.

8. The number of Africans imperiled by floods will grow 70-fold by 2080.

Despite the racially tinged savage imagery we have of the "Dark Continent" in the U.S., Africa is actually experiencing rapid urbanization, which is why World Trends & Forecasts cautions that "if global sea levels rise by the predicted 38 cm by 2080, the number of Africans affected by floods will grow from 1 million to 70 million."

9. Rising prices for natural resources could lead to a full-scale rush to develop the Arctic.

And we're not talking only oil and natural gas. There's also a huge Arctic supply of nickel, copper, zinc, coal, freshwater, forests, and even fish, all of which are needed to feed the insatiable and metastasizing global economy.

10. More decisions will be made by nonhuman entities.

"Technologies are increasing the complexity of our lives and human workers' competency is not keeping pace well enough to avoid disasters due to human error," pushing us toward "electronically enabled teams in networks, robots with artificial intelligence, and other noncarbon life-forms" who "will make financial, health, educational, and even political decisions for us."

Memo to Election '08 reporters: It's important to focus on the Big Issues of today. But the time is ripe to get the presidential candidates talking (out loud) about the Big Issues of tomorrow.

Questioning them about any one of the above forecasts might stimulate a much-needed discussion about vision, while getting the candidates to reveal something deeper than poll-driven answers and platitudes. Listening to people talk about their vision for the future is how you know if someone is utopian or eutopian.

The word "utopia" comes from the Greek for "no place" or "nowhere." Utopian thought is meant to describe a vision of a better future society -- but one beyond our grasp.

Eutopia is a vision of a preferable place -- but one with a bridge that gets us from here to there. Visions of a better society alone don't attract a critical mass of people. Only future visions with a visible, viable bridge can do that -- a lesson many progressives have yet to learn.

It's not just a vision thing. It's also a bridge thing.

Juicing Up Baseball Players and Mercenary Soldiers

"The steroid culture ... biceps bulging, chests shaven and buttocks tender." -- Tom Verducci, Sports Illustrated.

Like Michael Vick, Barry Bonds may soon be trading in his uniform to suit up in a Yankee-esque prison pinstripe -- though, perhaps not without a small measure of vindication.

But I won't lie. As a former baller and faithful follower of the great American pastime, it's sad to see players (workers) get Mitch-slapped by MLB's official report on The Steroid Era, while baseball execs get T-ball kid-glove treatment for their culpability in fostering a drug-friendly environment, kinda like the official Abu Ghraib investigation.

In fact, the Mitchell Report is reminiscent of another recent official disclosure -- the CIA's public unveiling of the "family jewels." At that time, I wrote about being left with the same dejected feeling I had when Geraldo cracked Al Capone's vault on live TV: That's it?! Like Yogi said: it's déjà vu, all over again.

Of course, having been accused by some readers of playing the "race card" (whatever that means) for questioning the selective moral outrage heaped on Barry Bonds' enlarged head over the years, it'll be interesting to see the response of the millions of Bonds-haters to the asterisk-sizing of Roger Clemens.

As uncomfortable as it may be for the "colorblind," sports analyst David Zirin hit a home run with his take on "The Rocket" science contained in the 409-page report.

"The Mitchell Report confirms not only suspicions about Clemens, but also the existence of an outrageous media bias and double standard. While seven time MVP Barry Bonds was raked over the conjecture coals for years, Clemens got a pass. Two players, both dominant into their 40s, one black and one white, with two entirely different ways of being treated. It doesn't take Al Sharpton to do the cultural calculus."

Phil Taylor of SI.com concedes the point: "The names in the Mitchell report confirm what Bonds' defenders have been saying all along, that if he did use performance-enhancing drugs, he had plenty of company, and that it's unfair to single out his accomplishments as tainted when so many of his fellow ballplayers also were users. Today, feeling the weight of those 80-plus names, it's hard to argue that point."

But I will say this for baseball puritans -- I mean, purists: their passion is admirable. In a Yeatsian world where "the best lack all conviction, while the worst are full of passionate intensity," it's nice to see that kind of passion for the integrity of a tradition.

Now, if only we could somehow inject that pathos into the body politic, we'd be getting somewhere. There are two other steroid scandals brewing and both of them are far more vital to the health of the nation.

Where's the no-cheating moral tsunami in the wake of the suit recently filed by the estates of Iraqis killed when Blackwater USA personnel opened fire on civilians in Baghdad's Nisoor Square on Sept. 16?

The First Amended Complaint filed two weeks ago alleges: "Blackwater routinely deploys heavily-armed 'shooters' in the streets of Baghdad with the knowledge that up to 25 percent of them are chemically influenced by steroids or other judgment-altering substances, and fails to take effective steps to stop and test for drug use."

Writing for Wired, Noah Shachtman brings the real 'roid picture into clear view, comparing MLB and the Pentagon's common problem: "Our military's use of the private military industry has become an addiction that parallels athletes' increasing turn to artificial substances to get ahead." Just as steroids give athletes the ability to hit the ball further, Shachtman contends, "so too has injecting more than 160,000 private military contractors into Iraq."

Shachtman also points out what should be obvious to even the casual observer: "short-term performance enhancement comes at a cost." Outsourcing military operations "has led to such results as billions of dollars missing in taxpayer funds, soldiers poached away from a stretched thin military, and contractors 'Getting Away with Murder,' as one recent report on the industry was entitled."

"Do we just accept Bonds (Clemens) and Blackwater as the future? Or, are we going to put an 'asterisk' besides the recent era and reign back in our addictions?"

Then, on Pearl Harbor Day, the New York Daily News dropped this bombshell.

"NYPD brass is considering joining the ranks of pro sports and giving cops random tests for anabolic steroids ... The proposal comes after 27 NYPD officers cropped up on the client lists of a Brooklyn pharmacy and three doctors linked to a pro sports steroid ring."

Similar reports are popping up across the country.

There may be "no crying in baseball," but when there's far more hue-and-cry about steroid-using sports stars on athletic fields then there is about juiced up cops and private mercenaries roaming real life battlefields in which the lives of spectators are more at risk than the participants, it's a sad day.

The Missing Link in Creationism

In my neck of the woods -- actually Woods Hole in Falmouth, Mass. to be exact -- a new front in the "Culture War" has opened up.

A federal lawsuit has been filed against the world-famous Woods Hole Oceanographic Institution by a zebra fish researcher named Nathaniel Abraham, a biologist who alleges his civil rights were violated when he was fired because of his belief in creationism.

The same day that story broke in the Cape Cod Times, the Associated Press had a story about how anti-evolutionists have come up with a new strategy in the battle against the unifying principle of the biological sciences.

The AP reported: "arguments for inserting skepticism, rather than religious concepts, into evolution lessons emerged after a federal court ruling nearly two years ago struck down the teaching of intelligent design in biology classes in Dover, Pa., said Michael Ruse, the director of Florida State University's program on the history and philosophy of science."

Ruse calls it "Strategy No. 4." What were the first three strategies? Strategy No. 1: Prohibit teaching it. The 1925 Scopes Monkey Trial put an end to that strategy.

Strategy No. 2: Get creationism taught in schools -- the literal biblical account of creation -- as an alternative to the "theory" of evolution. But courts rejected that strategy in the 1980s, Ruse said.

Strategy No. 3: Promote "Intelligent Design (ID)" -- the notion that "the universe's order and complexity is so great that science alone cannot explain it."

That strategy hit the legal wall in Dover, Pa., where a judge ruled that ID was religion-in-drag, pretending to be science, which meant teaching it in public schools constituted a violation of the separation of church and state.

And that brings us to Strategy No. 4: "Ruse described it as presenting evolution as an 'iffy hypothesis' instead of what it really is -- a scientific theory 'that's accepted like the Earth goes around the sun.'"

The new strategy seems to be losing steam too. "A suburban Atlanta school board abandoned its effort to put stickers in high-school science books saying that evolution is 'a theory, not a fact,' and South Carolina's Board of Education rejected a proposal to require students to 'critically analyze' evolution."

I don't know what strategy our zebra fish creationist is employing but I do know that in the "culture wars," as our conservative brethren call it, the teaching of evolution is considered nothing less than a satanic assault on the image of God.

I confess my heresy: like the Jesuit theologian/paleontologist Pierre Teilhard did 50-odd years ago, I'm a believer who's made his peace with evolution. But then, I've never understood why science and faith are discussed as if they're mutually exclusive. Folks who think evolution is an inherently atheist argument, or those who think evolution disproves the existence of God, are people with little imagination.

The evolution vs. creationism debate may be an unavoidable political fight but much more relevant and revealing is what many evolution-believing secular conservatives and evolution-denying religious conservatives have in common: a belief in social Darwinism.

A popular misconception is that Darwin coined the phrase "survival of the fittest." Actually, Darwin's thing was "natural selection," which turns out to involve lots of cooperation.

The origin of "survival of the fittest" can be traced to British philosopher Herbert Spencer, who had an illustrious career justifying racism and imperialism with his pseudo-science 50 years after Darwin published The Origin of the Species.

Spencer bastardized Darwin's theory and attempted to apply his misunderstanding of evolution to politics and economics. Thus began a political tradition in this country that has reached its apogee today, in which public policy is seen as a vehicle to prevent the weak from being "parasites" on the "fit."

Former Labor Secretary Robert Reich marvels as I do at how "the modern Conservative Movement has embraced social Darwinism with no less fervor than it has condemned Darwinism."

Listen to Spencer's own words: "Society advances where its fittest members are allowed to assert their fitness with the least hindrance."

Listen to any domestic policy debate about crime or education and you'll hear Spencer lurking beneath the surface in arguments justifying everything from war to incarceration rates to wealth disparities.

All that supply-side, Ronald Reagan, freedom stuff about meritocracies and the liberal conspiracy to "dumb-down" America with egalitarianism is social Darwinism -- in defense of the liberty of the "natural aristocracy."

So while science battles evolution-opponents, I'm trying to understand a conservative political species that opposes evolution on religious grounds while supporting social Darwinism on the political and economic grounds.

There's a missing link here.

The Get-Tough-On-Crime Bug is Making Us Sick

Doctors recommend people get the flu shot this time of year. And I recommend folks get inoculated for the get-tough-on-crime bug before the campaign season gets in full swing.

The highly contagious mental malady prevents people from thinking clearly about crime and punishment. It's how we get legislation like the "Aid Elimination Penalty" provision in the Higher Education Act (HEA) that bars students with drug convictions from receiving federal financial aid for college; apparently, to teach them a lesson.

By "them," in this case, we're talking about disproportionately disadvantaged students. Rich, stoner kids don't need financial aid.

The push to repeal the provision from the HEA was dropped in Congress two weeks ago, despite the efforts of the Coalition for Higher Education Act Reform, though the coalition did win one small victory: the law was amended so that the AEP applies only to offenses committed while a student is receiving financial aid.

Still, an estimated 200,000 students have been denied financial aid because of AEP, as Congress moves toward reauthorizing the bill, AEP and all.

It's just one of the many strains of the get-tough-on-crime virus that attacks the popular political mind, re-defining "justice" as an institutionalized form of making sure "those" people get what they "deserve."

In his 1966 study of the American penal system, Dr. Karl Menninger discerned a diagnosis, offering two simple observations that expose the get-tough-on-crime approach for what it is: an irrelevant distraction in dealing with the "crime problem."

First, Menninger observed, most criminals are never caught, meaning: convicted criminals -- the people in prison -- are only the minority of law-breakers foolish enough, brazen enough, poor enough or unlucky enough to get caught.

And, of the minority of offenders who are behind bars, not even all of them are guilty, as we are reminded with alarming frequency by periodic news reports of yet another wrongly convicted inmate later exonerated by DNA evidence. It was true in 1966 when Menninger wrote The Crime of Punishment and it's true today.

According to the most recent figures I could find -- the FBI's 2002 Crime Index Offenses Cleared data -- law enforcement agencies, nationwide, had a 20 percent clearance rate for all crimes.

For violent crimes, the clearance rate was 46.8 percent (for murder it was 64.0 percent; aggravated assault, 56.5 percent; forcible rape, 44.5 percent; and robbery, 25.7 percent) -- compared to 16.5 percent for property crimes, excluding arson.

Given the official numbers, we can see that a symptom of the get-tough-on-crime bug is the idea that "the crime problem" will be significantly impacted by meting out punitive justice to the minority of criminals who are actually caught.

Secondly, and more importantly, Menninger reminds us, most convicted criminals will eventually be released. If we remove law-breakers from society to merely punish and humiliate them for their crimes because they "deserve it," and then release them as economic and political pariahs, by doing stuff like denying drug offenders financial aid for college, the only thing we've done is warehouse workers in labor camps that double as crime universities where penitentiary professors teach their pupils the tricks-of-the-trade.

Arguing about whether the criminal justice system should be about retribution or rehabilitation is beside the point, because it won't change the reality: if you put people with problems in a place that makes them worse off and then turn them loose on society without any resources, the only thing you've done is exponentially increase the chances of that person becoming a repeat offender, guaranteeing many more future victims.

Menninger's medicine goes right to the heart of the get-tough-on-crime rationalization: "'Doesn't anybody care about the victims?' cry some demagogues, with melodramatic flourishes. 'Why should all this attention be given to the criminals and none to those they have beaten or robbed?'"

"This childish outcry has an appeal for the unthinking. Of course no victim should be neglected. But the individual victim has no more right to be protected than those of us who may become victims. We all want to be better protected. And we are not being protected by a system that attacks 'criminals' as if they were the embodiment of evil."

Until the "crime problem" is diagnosed and treated as a "social safety problem," like Menninger suggested, we'll keep having these get-tough-on-crime policies creating more future victims. Inoculate yourself.

Music for the Brain

This week's installment of As the World Burns features presidential politics, which I find amusing for a number of reasons, not the least being the way electoral politics gets covered in the popular press, as if going to the polls every couple of years is the very essence of democratic (or republican) citizenship.

Last week, Michael Tanner of the Cato Institute wrote a "fair and balanced" commentary for Foxnews.com, raising the question: "what if economic conservatives stay home on election day?"

From his laissez-faire libertarian perch, Tanner characterizes most of the GOP presidential candidates (except Ron Paul) as being nothing more than big government hucksters in conservative clothing. He then veers off into anti-intellectual territory to drive home an ideological point about limited government, at the expense of presidential hopeful Mike Huckabee.

What's Huckabee's cardinal sin? He "failed to call for spending cuts. He actually wants to increase spending on a variety of programs, from education to infrastructure. He even wants the federal government to fund art and music programs in the nation's schools." Gasp!

Not that Tanner is interested in engaging the reader in a serious discussion about the role of federal government, education, art or music, but his remarks do touch on a serious question, as it pertains to the indoctri -- I mean, education -- system in this country.

Just to riff on one chord, let's consider the question: what does music have to do with improving education?

Going back to Plato and Aristotle, music has been considered one of the "Four Pillars of Learning."

Plato said: "The decisive importance of education in poetry and music: rhythm and harmony sink deep into the recesses of the soul and take the strongest hold there. And when reason comes, he (the student) will greet her as a friend with whom his education has made him long familiar."

Aristotle said: "We become a certain quality in our characters on account of music."

Even Allan Bloom said: "Music is at the center of education, both for giving passions their due and for preparing the soul for the unhampered use of reason."

Add to that a growing body of research that tells us arts/music education enhances academic achievement, which is why a consortium of the nation's largest educational associations issued a statement of principles on "The Value and Quality of Arts Education," calling for basic arts education to be recognized as a serious, core academic subject.

A week before Tanner entertained us with a bit of ideological idiocy, Harris Interactive, an online polling and market research firm, published a survey, reporting that people with "more education and higher household incomes are more likely to have had music education."

Some survey highlights:

Two-thirds (65 percent) of those with a high school education or less participated in music, compared to four in five (81 percent) of those with some college education and 86 percent of those with a college degree. The largest group to participate in music, however, is those with a postgraduate education: almost nine in ten (88 percent) of this group participated while in school.

Participating in music programs can also provide people with certain skills that can be utilized in a job and career. Just under half (47 percent) of those who were in a music program say music education was extremely or very important in giving them the ability to strive for individual excellence in a group setting.

A plurality (44 percent) say music education was extremely or very important in teaching how to work towards common goals and two in five (41 percent) say it was extremely or very important in providing them with a disciplined approach to solving problems.

Just over one-third say music education gave them the skill of creative problem solving (37 percent) and taught them how to be flexible in work situations (36 percent).

If you're interested in other reference material, check out the May 23, 1996 issue of Nature, which published a study about first-graders who participated in music classes and saw their reading skills and math proficiency increase.

Also, according to several studies conducted by the College Board, music/art students scored consistently higher on both the math and verbal sections of the SAT.

Some people say that art and music are an impractical luxury for schools, given the hyper competitive demands of "globalization" that require a strong background in science and math.

Laying aside the wrong-headed workforce-training assumptions behind such an instrumentalist educational philosophy, another way of looking at it is this: given the "new economy" reality -- where workers won't have one long-term job or career but multiple jobs and careers -- the advantage goes to those with nimble minds and creative intelligence, not the proficient test-takers our education factories are producing.

Closing the achievement gap? Raising test scores? Preventing kids from dropping out? We need more music education, not less. The math is simple.

How to Improve No Child Left Behind

It looks like Congress will recess for the holidays before they take up the re-authorization of the No Child Left Behind Act. That means it's probably a safe bet to assume education will emerge as a central campaign issue in the run-up to regime change in Washington -- the 2008 elections.

In the meantime, plenty of suggestions will be offered as ways to improve NCLB 2.0. So, like an open source programmer, I'll contribute a bit of code to get the idea ball rolling.

But, first, let's test your knowledge of noted achievers.

What did the parents and teachers of Clarence Darrow, the famed attorney of the Scopes trial, say about him when he was a schoolboy? That he would never be able to speak or write.

One of the intellectual giants of the 20th century, philosopher Jean-Paul Sartre had to pretend to do what? Read.

The towering literary figure Marcel Proust had problems in school. Why? He couldn't complete a paper.

What terrified Carl Jung as a student? Math.

What did Beethoven's tutor think about his pupil, who, by the way, is said to have never learned how to multiply or divide? His tutor thought he would never be a very good composer.

Fill in the blank. Behind his classmates in reading and writing, Pablo Picasso ___ school. Answer: Pablo Picasso hated school.

President Woodrow Wilson couldn't do what until he was eleven? Read.

Why did Thomas Edison run away from school? His teacher caned him for not paying attention or being able to sit still in class.

Okay, here's where the test gets a bit more challenging.

Harvard University Professor of Cognition and Education Howard Gardner made a name for himself with his work on "multiple intelligences." What are Gardner's seven intelligences?

Answer: linguistic, spatial, kinesthetic, musical, interpersonal, logical-mathematical, intrapersonal.

Psychologist and dean of arts and sciences at Tufts University, Robert Sternberg is known for developing what theory? Answer: The "triarchic theory of intelligence," which is a fancy way of saying that there are three kinds of human intelligences: componential, contextual, and experiential.

"Intelligence tests and other tests of cognitive and academic skills measure part of the range of intellectual skills. They do not measure the whole range. One should not conclude that a person who does not test well is not smart. Rather, one should merely look at test scores as one indicator among many of a person's intellectual skills," Sternberg points out, which would explain why he served on the American Psychological Association task force that told Charles Murray and Richard Herrnstein where to stick their Bell Curve.

The New York Times Education Life section recently ran an article highlighting Sternberg's most recent effort. For the second year in a row, "Tufts is inviting applicants to write an optional essay to help admissions officers pinpoint qualities the university values -- practical intelligence, analytical ability, creativity and wisdom."

"These attributes make students intellectual leaders, according to Sternberg, a psychologist whose work on measuring intelligence inspired the experiment." I have no idea how to reconcile Gardener's seven types of intelligences with Sternberg's three, but the essence of what they're saying is articulated in psychiatrist and educational consultant Dawna Markova's book How Your Child Is Smart.

"Because of extensive research that has been done, and our own experience, we believe that children who have been taught through their natural learning styles become the achievers in school; those who experience difficulty do so because they are not being taught in ways that respond to how they learn."

OK, test over.

Here's my code (idea): before any new fill-in-the-bubble-crazy NCLB legislation is considered, let's develop a test that measures multiple intelligences and figure out a way to identify teachers' teaching styles.

Then, parents and teachers would have a better way to assess and adjust the long-term learning plans that should be required for every student entering elementary school, matching students with the appropriate teaching styles along the way.
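And since I'm calling it code, here's a toy sketch of what that matching step might look like -- purely hypothetical, with invented scores, teachers and function names; no real assessment instrument works this way:

    # A toy sketch of the matching idea. Everything here -- the scores,
    # the teachers, the styles -- is invented for illustration.

    GARDNER_INTELLIGENCES = [
        "linguistic", "spatial", "kinesthetic", "musical",
        "interpersonal", "logical-mathematical", "intrapersonal",
    ]

    def dominant_intelligences(profile, top_n=2):
        """Return a student's top_n strongest intelligences from 0-to-1 scores."""
        return sorted(profile, key=profile.get, reverse=True)[:top_n]

    def match_teacher(profile, teachers):
        """Pick the teacher whose style overlaps most with the student's strengths."""
        strengths = set(dominant_intelligences(profile))
        return max(teachers, key=lambda name: len(teachers[name] & strengths))

    student = {
        "linguistic": 0.9, "spatial": 0.4, "kinesthetic": 0.3, "musical": 0.8,
        "interpersonal": 0.6, "logical-mathematical": 0.5, "intrapersonal": 0.7,
    }
    teachers = {
        "Ms. A": {"musical", "linguistic"},
        "Mr. B": {"logical-mathematical", "spatial"},
    }

    print(match_teacher(student, teachers))  # -> Ms. A

The point of the sketch isn't the particulars; it's that once you can measure a student's profile and a teacher's style in the same terms, the pairing becomes a solvable problem instead of a lottery.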

Is there a better learning environment than a place where a child's natural will to learn and innate intelligence are tapped by a conducive teaching style? And what's more motivating to a teacher than a student willing to learn?

In the meantime, my oldest daughter and I have been college-shopping. With a passion for Spanish and poetry and a genuine intellectual interest in cognitive science, she's due to graduate from high school next May (holding down a 3.8 cumulative GPA after three and a half years on a strict AP/Honors diet). I'm going to encourage her to fill out an application for Tufts.

How Racism Affects IQ

"If we mean to have heroes, statesmen and philosophers, we should have learned women. If much depends ... on the early education of youth and the first principles which are instilled take the deepest root, great benefit must arise from literary accomplishments in women." -- Abigail Adams
Thanks to my family support network, I never fully internalized the idea that most, if any, of my private-school peers were more intelligent than me. But, I did assume that everyone outside my family did.

That's why I never asked for help in school. I wrongly thought that asking for help amounted to an admission that black people were, in fact, inferior, as was periodically pronounced from the ivory towers of academia and other corners of the race-conscious IQ industry.

Fortunately, I was able to flip that negativity into motivation. But when I read about the remarks made recently by the esteemed biologist James Watson, I winced at the thought of how many black youngsters might continue to internalize the destructive but persistent message that they are inherently less intelligent. And I wonder if they too will turn that negativity into motivation.

Far from the modern conservative utopia of a "color-blind society," Euro-centric racial chauvinism seems to rear its ugly head in the popular press every few years or so -- from the eugenics of Arthur Jensen to Charles Murray and Richard Herrnstein's Bell Curve to Dr. Watson, renowned for his role in describing the double-helix structure of DNA and for representing America on the international Human Genome Project.

Watson told the London Sunday Times he's "inherently gloomy about the prospect of Africa" because "all our social policies are based on the fact that their intelligence is the same as ours -- whereas all the testing says not really."

He expressed a hope for human equality but added that "people who have to deal with black employees find this not true." And here we go again.

But did you hear about the recent work of African-American economist Roland Fryer of Harvard University and University of Chicago economist Steven Levitt, who co-authored the popular book Freakonomics?

Fryer and Levitt responded to Watson's scientifically unfounded claims by pointing to Department of Education research that includes test data on the mental abilities of one-year-olds.

"While you might think it would be impossible to capture anything meaningful at such a young age," Leavitt writes, "it turns out that these measures of one-year-olds' intelligence are somewhat highly correlated with IQ scores at later ages, as well as with parental IQ scores."

They found "no racial differences in mental functioning at age one, although a racial gap begins to emerge over the next few years of life ... The observed patterns are broadly consistent with large racial differences in environmental factors that grow in importance as children age."

"We cannot rule out the possibility that intelligence has multiple dimensions and racial differences are present only in those dimensions that emerge later in life."

Their paper was rejected in the second round of reviews for publication in the American Economic Review. But these findings do seem to dovetail with insights coming from cognitive scientists.

"The sort of learning a child acquires in kindergarten and the early grades is not the true foundation of her education," leading cognitive researcher Dr. Stanley Greenspan notes. "The 'three R's' and all that follows -- symbolic and increasingly abstract academic knowledge -- cannot be understood by a person who has not grasped the sequence of skills that make learning possible."

Greenspan is talking about the preschool emotional roots of cognitive development. But the trend in education, especially under No Child Left Behind, is to focus on external carrots and sticks when it's internal emotive experience that primarily shapes the way brains grow and think.

If we were serious about increasing student academic achievement, we would see to it that all women get top-rate prenatal and postnatal care, especially single mothers of future students. Instead, legislation that would increase the number of children with access to health care -- indispensable for physical and emotional health -- gets vetoed in order to appease the private insurance industry.

If we were serious about education reform, we would put a policy-emphasis on early childhood development, based on what science is telling us about the need for creating learning environments that have nothing to do with the "factory" test-taking model that typifies American public schools.

In the meantime, I've got to build up my three-year-old son's EQ (emotional intelligence quotient) to counter the likelihood that his IQ will be questioned someday simply because of his skin color.

Why Have the Children Been Left Behind?

A child's education should begin at least one hundred years before he is born. -- Oliver Wendell Holmes Jr.

The debate in Congress over whether to reauthorize No Child Left Behind (NCLB) is underway.

What's with these politically-calculated, brand-name, PR-speak, Orwellian euphemisms? Clear Skies Act. Operation Enduring Freedom. USA Patriot Act. No Child Left Behind. Who, other than Hal Lindsey fans, would want a child to be left behind?! What unenlightened creature, harboring "the soft bigotry of low expectations," in the words of the President, would be opposed to legislation that promotes "academic excellence"?

Loaded terms aside, the Repubs are ready to march in line behind Bush, while the Dems say they have issues with the law's mandate that states rely on standardized tests to measure "adequate yearly progress" in reading and math.

The most obvious problem with NCLB is the gap between the lofty-sounding rhetoric coming out of the President's mouth and the money. Since the law came into effect in 2002, it's been underfunded by an estimated $56 billion.

Last month, Bush held up New York City as an example of how to improve "underperforming" schools. "If New York City can do it, you can do it," he said.

Two weeks later, the New York press was reporting how "the feds are cheating the city out of $3.3 billion in education funds -- money promised to help kids pass a slew of new standardized tests."

In 2007, for example, Bush promised $1.8 billion to Big Apple schools, but only delivered $834 million -- a 54 percent shortfall, according to U.S. Rep. Anthony Weiner, D-Brooklyn and Queens.

Despite the under-funding of NCLB, student test scores have generally increased and the so-called achievement gap between ethnic groups has narrowed somewhat since the law was enacted. But, no one can say whether those gains are because of NCLB or other factors.

And there doesn't seem to be any discernible relationship between standardized test results and good grades, class rank or other measures used to predict success in college or the job market.

Ben Sears of Political Affairs magazine captures the situation succinctly. "This is resulting in narrowing the curriculum as schools focus on preparing for the tests and are forced to reduce instructional time for 'non-tested' subjects."

"All this forces one to wonder. Could NCLB as presently written be part of the long range plan of the Administration to undermine public education? If the law's harsh provisions result in more schools being branded 'failures,' could that lead to an exodus from the public schools in to the proliferating charter schools or religious or other private academies?"

"And could the law generate such frustration with the federal government's clumsy attempt to influence education policy, that it causes a 'backlash' movement opposing any federal role?"

Of course, high standards and accountability are worthy ideals but the public appears to be growing weary of this over-reliance on testing. A Phi Delta Kappa/Gallup poll in June found that 52 percent of public school parents felt there's too much testing -- up from 32 percent in 2002. And, 75 percent of public school parents said the focus on testing was forcing teachers to teach to the test, not the subject matter.

This growing skepticism is based on more than just gut instinct or warmed-over Dr. Spock feel-goodism. A survey conducted by the Center on Education Policy shows that 71 percent of America's roughly 15,000 school districts had cut instructional time for "non-tested" subjects like history, art and music.

But, even more basic than these concerns are those coming from the scientific community. "Contrary to traditional notions ... emotions, not cognitive stimulation, serve as the mind's primary architect," Dr. Stanley Greenspan details in his book The Growth of the Mind.

So while we're obsessing over high-stakes testing and focusing on the minds of teenagers, not much attention is being paid to what's in the babies' hearts -- the foundation of their (and our) educational future.

And not just the babies but their mothers too. I think Abigail Adams was onto something when she wrote to her husband and Founding Father, John: "If we mean to have heroes, statesmen and philosophers, we should have learned women."

"If much depends ... on the early education of youth and the first principles which are instilled take the deepest root, great benefit must arise from literary accomplishments in women."

Building a Case to Fight Iran

As if the imperial hubris displayed by the Bush administration in Iraq wasn't enough, we're seeing the pieces to another pre-emptive war puzzle fall into place.

Last week, the Senate passed the Lieberman-Kyl amendment, 76-22 -- a resolution urging the State Department to designate the Iranian Revolutionary Guard Corps a "terrorist" organization.

The proposal by Sens. Joseph Lieberman and Jon Kyl drew bipartisan support, while a small group of Democrats sounded alarms that the Iranian "terrorist" label could be interpreted as congressional authorization to use military force against Iran.

Sen. Christopher Dodd, D-Conn., who voted against the amendment, reminded his spineless colleagues about the 2002 congressional vote authorizing President Bush to launch the illegal Iraq invasion. "We shouldn't repeat our mistakes and enable this president again."

Sen. James Webb, D-Va., called the Lieberman-Kyl measure Dick Cheney's "fondest pipe dream" -- a reference to Cheney cheerleaders and other Bush hawks who have been beating the war drums over Iran.

Over the weekend, John Bolton -- former U.N. ambassador and now a war-mongering shill for the American Enterprise Institute -- gave a speech in England, frothing at the mouth about how the U.S. should "consider the use of military force" against Iran.

Bolton talked about "a limited strike against their nuclear facilities," adding that Iranian President Mahmoud Ahmadinejad was "pushing out" and "is not receiving adequate push-back" from the West. He said air strikes should be followed by an attempt to remove "the source of the problem" -- Ahmadinejad.

"If we were to strike Iran it should be accompanied by an effort at regime change as well, because I think that really sends the signal that we are not attacking the people, we are attacking the nuclear weapons program," he said. Sigh.

Norman Podhoretz, who is now a senior foreign policy adviser with Rudy Giuliani's 2008 presidential campaign, told London's Sunday Times that he urged Bush to bomb Iran in a meeting with the current White House Occupant late last spring at the Waldorf Astoria hotel in New York.

Regime change? A reality-distorting demonization campaign focusing on one man? Sounds familiar.

One hint that the attack-Iran crowd is not dealing with reality was the amateurish attempt of Columbia University President Lee Bollinger to give Ahmadinejad a tongue-lashing. Look, Ahmadinejad's Holocaust denial is disgusting, and he's no champion of human rights, but we shouldn't let emotion blind us to reality. As the Hasidic proverb reminds us, when you add to the truth, you subtract from it.

Bollinger called Ahmadinejad "a dictator." A well-known legal and free-speech scholar, Bollinger ought to have been embarrassed. He should know that the Iranian president doesn't really run the country. The big decisions in Iran are made by the Grand Ayatollah, who has forbidden the development of nuclear weapons by Iran as contrary to Islam.

And what about the 1981 Algiers Accord, which secured the release of the American hostages held in Tehran? In that agreement, the U.S. pledged to forsake attempts to overthrow the Iranian government -- like the one that toppled the Mossadegh government in 1953.

In typical Bush-world fashion, Bolton calls for "regime change" without acknowledging the reality on the ground.

While neocons argue that air and missile strikes against Iran would cripple the regime to a point where opponents would rise up against the government, Iranian opposition leaders, on the ground, say that war "would certainly unify the (Iranian) population around the regime and would be used to justify further repression," Middle East expert Stephen Zunes points out.

Zunes also rightly dashes unrealistic hopes for a coup, given that pro-U.S. elements in the Iranian military were "thoroughly purged soon after the revolution."

"The leadership of Iran's military and security forces, while not necessarily unified in support of the more hard-line elements in government, cannot be realistically expected to collaborate with any U.S. efforts for regime change in their oil-rich country."

What recent history has shown, over and over again, is that the most effective means for democratic regime change is internal nonviolent movements, like the kind that toppled dictatorships in countries such as the Philippines, Bolivia, Madagascar, Czechoslovakia, Indonesia, Serbia and Mali, Zunes elaborates, citing a study by the conservative-leaning think tank Freedom House.

The Freedom House study, after examining the 67 transitions from authoritarian regimes to varying degrees of democratic governments over the past few decades, concluded that the changes were not because of "foreign invasion, and only rarely through armed revolt or voluntary elite-driven reforms, but overwhelmingly by democratic civil society organizations utilizing nonviolent action and other forms of civil resistance, such as strikes, boycotts, civil disobedience, and mass protests."

Lord, wake us from our slumber.

The Counterterrorist Companion

"Whatever is funny is subversive, every joke is ultimately a custard pie." -- George Orwell.

Whether we agree on the precise nature of the legacy of 9/11, one thing's hard to dispute: terrorism is the sun of the U.S. political universe and the issue around which all other issues orbit.

And now, along comes Zack and Larry Arnstein's book, The Ultimate Counterterrorist Home Companion, a welcome gift for the citizen-soldier. (Actually, it was a gift to me. The authors sent me a copy for free! It's one of the cool things about being a nerdy book-loving columnist. Lots of free books!)

The 165-page "manual" is a quick read, providing a hilarious antidote to the fear-mongering fueling the "war on terror," prodding us to lighten up, poking fun at the powers-that-be, while offering a subtle reminder that deserves to be taken seriously: nothing is more subversive to power than to laugh in the face of it - be it the power wielded by terrorists or the power abused by governments everywhere.
The Counterterrorist Companion begins by paying humorous homage to the most hyperbolic political slogan to come down the pike in a long time.

"9/11 has changed everything. No longer can America live the peaceful culture of mutual respect that we have been blessed with since the beginning of our nation," begins the first chapter, exposing the absurdity "changed everything" with another absurdity.

"We are more afraid than ever about terrorism, and rightfully so ... terrorism has risen to become one of the Most Leading Causes of Death in America, preceded only by tobacco use, car crashes, and being eaten by a bear."

That's followed by a chart showing where terrorism ranks among other causes of death with the added caveat that "in order to make a nice looking chart, we have understated the number of deaths caused by tobacco use by a factor of 10."

The Counterterrorist Companion contains 34 short chapters, ranging from "Planning Your Family Antiterrorism Drill: What to Do When Little Lucy Can't Assemble Her Matador 25B Anti-tank Rocket Launcher in under 37 Seconds" on down to "the Religious Chapter."

Of course, there are a few corny parts in the book, but you try writing 165 pages of side-splitting humor -- I'd bet not every word is going to be the greatest satirical barb ever written.

But even when it seems the authors have gone too far astray in their comedic analysis -- Chapter 9 ("Moats: How to Make Them, Are They Still Useful?"), for example -- the reader is brought right back home with an incisive, if indirect, insight into contemporary American culture.

Just when I was saying to myself, "Moats?" -- comes: "Yes. Before you begin the process of isolating yourself from society completely, you will need to read this chapter."

The Moat chapter concludes by snickering at the Hollywood-inspired arguments used by "war on terror" cheerleaders while highlighting an area where America is truly vulnerable -- shipping ports.

Terrorists "prefer the more exciting skydiving/helicopter/snowboarding-type attacks. What could be more boring for a terrorist than launching an attack by a slow-moving container ship?"

The Arnstein father-son duo really hit their stride in their chapter on Iraq, "Invading Iraq: It's the Thought That Counts."

"The 2003 invasion of Iraq was a brilliant move in our government's effort to combat terrorism, filled with subtleties that are not often noticed by radical left wing organizations like CNN. Invading Iraq is the equivalent of avoiding a bar fight by pouring a milkshake on your head. Just when your enemy is so angry that he is ready to punch you in the head, you do something so unexpected and bizarre as to make your enemy reconsider his plan of action ..."

"The invasion of Iraq was just like throwing a wild pitch every now and then to keep the batter guessing, and you know what? It worked. Our enemies and friends alike no longer consider us capable of rational thought, and that, friends, is right where we want them."
Unless you're as uptight and humorless as Ann Coulter, you'll find the Counterterrorist Companion an amusing reminder that the most potent weapon in the fight against terrorism is to not be terrorized.

And sometimes you just gotta laugh to keep from crying.

Winning the Class War

Every year around Labor Day, United for a Fair Economy (UFE) issues a report on the excesses of CEO pay.

This year's report -- Executive Excess 2007: The Staggering Social Cost of U.S. Business Leadership -- found that "top executives averaged $10.8 million in total compensation, which is 364 times the pay of the average American worker, a calculation based on data from an Associated Press survey of 386 Fortune 500 companies."

Another key finding: The top 20 equity and hedge fund managers raked in an average of $657.5 million, or 22,255 times the pay of the average worker. Meanwhile, the study notes, workers at the lowest rung of the economic ladder just got their first federal minimum wage hike in a decade. Over that same decade, UFE reports, CEO pay has increased by 45 percent.

None of that is very surprising. What's interesting is the finding that compensation for U.S. business leaders now "wildly dwarfs" the big bucks being paid leaders in other sectors.

The top 20 CEOs of publicly traded corporations last year took home, on average, $36.4 million. That's 38 times more than the top 20 in the nonprofit sector and 204 times more than the 20 highest-paid generals in the military.

Executive Excess aims to dispel the notion that excessive executive pay is a necessary function of modern economies. If that were true, the report's authors argue, "business executives that American executives compete against in the global marketplace would be just as excessively compensated as American executives. They aren't. Top executives of major European corporations ... last year earned three times less than their American counterparts."

Such grotesque pay differentials essentially mean we, as a society, are discouraging needed leadership talent from entering less lucrative fields, such as education, where we could use an infusion of talent.

The thing I like about UFE reports is that they always include pragmatic policy proposals. This year's report offers six.


The Fear of Terrorism?

Thanks to Temple University math professor John Allen Paulos, it can be demonstrated mathematically why threats to our civil liberties should be of more concern than the threat of terrorism.

Paulos' approach to terrorism draws on probability theory and a bit of common sense -- specifically, on "the obvious fact that the vast majority of people of every ethnicity are not terrorists."

Imagine a near-perfect information gathering and interpretation system that could identify terrorists and stop them before the act of terrorism is committed. Because no system is perfect, Paulos' system is assumed to be 99 percent accurate. And, of course, for this near-perfect terrorist fly-trap to be really effective, it would also have to correctly identify nonterrorists 99 percent of the time.

Such a system would only catch terrorists, right?

"Well, no," Paulos wrote in an analysis for the LA Times back in 2003. It bears repeating as the terrorism-centered presidential campaign season -- brought to you by Fear Inc. -- heats up.

Paulos applies the near-perfect data-mining numbers to a country about the size of America -- a nation of 300 million in which 1,000 "future terrorists" lurk among the citizenry.

With a 99 percent detection rate, the system will identify 990 of 1,000 future terrorists. Pretty good.

But the flip side is ugly. In a nation of 300 million (minus 1,000 future terrorists), there are 299,999,000 nonterrorists. If the system is 99 percent accurate, one percent of them will be improperly detained as "enemy combatants." How much is one percent of 299,999,000? Just under 3 million. That's 3 million innocent Americans for every 990 Jose Padillas.

Just to bring it home, we're talking about 3,000 times more innocent Americans being caught in the dragnet than the number of guilty ones! That alone ought to have each one of us thinking real hard about political priorities.
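For the arithmetic-inclined, here's a minimal sketch of Paulos' numbers in a few lines of Python (the variable names are mine, and the final flipped-around percentage is my extension of his point, not his):

    # A minimal sketch of Paulos' base-rate arithmetic.
    population = 300_000_000   # rough U.S. population
    terrorists = 1_000         # hypothetical "future terrorists"
    accuracy = 0.99            # rate for flagging terrorists AND clearing innocents

    true_positives = accuracy * terrorists                        # 990 terrorists caught
    false_positives = (1 - accuracy) * (population - terrorists)  # 2,999,990 innocents flagged

    # Roughly 3,030 innocents wrongly flagged per guilty catch --
    # the "3,000 times" figure above, give or take rounding.
    print(f"{false_positives / true_positives:,.0f}")

    # Flipped around: the chance that a flagged person really is a
    # terrorist works out to about 0.03 percent.
    print(f"{true_positives / (true_positives + false_positives):.2%}")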

Despite my minuscule efforts and those of others in the dreaded "mainstream media," the national media have fallen short on providing context in the "war on terror," aiding and abetting America's foreign policy cataracts problem.

How often do you see reports of terrorism with context that points out the relative rarity of actually being a victim of terrorism? And how many articles do you see that call into question the alarmism of, say, Gen. Richard Myers, former chairman of the Joint Chiefs of Staff, who said that if terrorists were able to kill 10,000 Americans in an attack, they would "do away with our way of life."

As John Mueller wrote in a recent issue of the Bulletin of the Atomic Scientists, it's the subtext of this kind of fear-mongering that's most interesting. "These hysterical warnings suggest: the 'existential' threat comes not from what the terrorists would do to us, but what we would do to ourselves in response."

Mueller also refers to the 1999 Gilmore Commission, a government-funded advisory group that assessed domestic response to WMD terrorism.

The group "pressed a point it considered 'self-evident,' but one that nonetheless required 'reiteration' because of the 'rhetoric and hyperbole' surrounding the issue: Although a terrorist attack with a weapon of mass destruction could be 'serious and potentially catastrophic,' it is 'highly unlikely that it could ever completely undermine the national security, much less threaten the survival, of the United States.' To hold otherwise 'risks surrendering to the fear and intimidation that is precisely the terrorist's stock in trade.'"

Over the weekend, GOP Sen. John Warner, who wants U.S. troops to start coming home from Iraq by Christmas, said he may support Democratic legislation ordering withdrawals if President Bush refuses to set a return timetable soon.

And then the fear-mongering followed. Sen. John Cornyn (R-Texas), also a member of the Senate Armed Services Committee, responded by saying: "I don't think it's in our best interest to put so much pressure on the new Iraqi government that it absolutely collapses. We don't want to allow that to happen, because it would make us less safe here at home."

Fear isn't just the "stock in trade" of terrorists. It's a booming industry in America. And if we continue to trade true freedom for security, in fear, the "war on terror" will defeat us from the inside.

Unraveling Wartime Myths

Some readers just don't want to hear it from me -- writing about the myths of World War II.

Maybe they'll feel better if it's coming from Edward W. Wood Jr., a guy who was awarded the Combat Infantryman's Badge, the Purple Heart, and the Bronze Star. A retired city planner and author of "Worshipping the Myths of World War II," Wood is quick to point out: "I was wounded in France, 60 miles east of Verdun ... after only a day and a half in combat. I'm no expert on long-term combat experience." But it's way more combat experience than any of the leading architects of the war in Iraq.

"I got hit in the head, the small of my back, and pelvis with shrapnel from artillery fire," he told me last week. "The wound shattered my life. In those days, you couldn't talk about the emotional impact."

He's 82 now and has spent a lifetime trying to understand war and its impact on those involved. "Worshipping the Myths of World War II" is a product of his very personal, honest and courageous exploration.

Sean Gonsalves: Some people consider talk about the myths of World War II disparaging to veterans. Why do some equate demythologizing with anti-Americanism?

Edward W. Wood Jr.: There's two kinds of soldiers: those who have been in combat and the guys who haven't. I think those two groups have vastly different attitudes about war. Another reason people react that way is because people in the United States have absolutely no idea what war entails. I think a lot of very good people believe that these myths really describe what war is. Therefore, to demythologize means, for them, putting down people who have been involved.

But I don't think we want to look at what our tax dollars are doing. We just don't want to look at the reality that's there. None of this is a disparagement to those who've really seen combat. I am really anti-war now and yet, in terms of my own personal life, I have no regrets about having been in World War II. I believe in nonviolence. But, I'm in favor of a draft because I think we all would think a lot more carefully about war.

Why do you think these myths are so persistent?

These myths are really rooted in our past and go beyond World War II -- going back to King Philip's War. There were Manifest Destiny wars with Mexico, the Philippines and the rest. Teddy Roosevelt believed you weren't really a man unless you served in war. I don't think we really want to look at that, and it's not taught very much in school. It's a very subtle message. I think that's why it persists. The myths exist because we don't look carefully at our history and because it gives people a great glow to wrap themselves in it. But I think it would be healthy to just be open about it.

What criteria do you use to judge artistic interpretations of World War II?

There are essentially two really important criteria I use:

1) Does the author or filmmaker say something about himself that he didn't want to tell the world? Does it really delve into what combat does in the deepest kind of way? Does it wrestle with the moral dilemma of killing -- seeing and watching people get killed -- and how is the act of killing treated?

2) Does it reflect the extraordinary complexity of human reaction to death? And modern war is not just about soldiers in combat but also the impact on civilians. If you don't have that part of the equation, you're missing at least 50 percent of the story.

How do you see these myths impacting the way people view the war in Iraq?

If you look at how we got into the Iraq war, you see the president and his administration using (World War II) as an example for Iraq; comparing Saddam to Hitler and comparing themselves to Churchill and Roosevelt. I don't think that's quite appropriate.

We appeal to the idea of the 'Good War' and a war against evil where the enemy is dehumanized into this monstrous evil ... But, actually, World War I and World War II were really one war. If you look at it that way, what happened in the 1930s was a function of the Versailles Treaty and the terrible reparations imposed on Germany ... So we built up a situation that was going to inevitably lead to another war.

Iraq itself was made out of whole cloth in the early 1920s almost by fiat, putting the Kurds, the Shias and Sunnis in one country. So, even now, we are dealing with the consequences of World War I.

Do combat veterans who've served in the infantry have a different perception of what war means, compared to soldiers in the Navy or Air Force?

The guys who flew in World War II I respect beyond measure -- the 8th Air Force, for example. But I do think it's a different experience for soldiers in the Air Force and Navy, compared to soldiers involved in ground combat. It gives you a very different attitude about what war is.

In the final chapter of your book, you write: "the issue today is not just the simple one -- how can we leave Iraq? -- but, rather one far more difficult: How can we turn from war as the solution to our international problems?" What do you see as the answer to the important question you raise?

I don't have the answer. But we need to get away from the myths and try to understand what war is really like. In this war, the only ones who pay the price are the soldiers and Iraqi civilians. That's so terribly unfair. It infuriates me. If we are in a war, the whole country should pay a price for it.

I've turned against war but if we're at war, we can't have a war and all these people making money off it and living normal lives. It hurts me to see people who go about their lives while terrible death and suffering are occurring in Iraq.

America is not very good at recognizing who its real enemies are. We get it all confused with this problem of evil. We see it in others but never in us.

That's hard for people to even hear, never mind think about.

I'm running against the grain and it gets lonely sometimes.
