The Republican 2016 presidential primary season opened with the “Sheldon Adelson Primary.” The eighth-wealthiest person in the country, worth an estimated $40 billion, doesn't have to wait for the official GOP primary season to start. He holds his own primary.

Republicans even called it "the Sheldon Primary." Adelson granted audiences to GOP presidential hopefuls at the spring meeting of the Republican Jewish Coalition, in Las Vegas. Over the course of four days of Scotch tastings, golf, poker tournaments, and private meetings, the 80-year-old casino mogul examined the GOP's most likely 2016 presidential candidates.



Adelson single-handedly kept Newt Gingrich in the 2012 presidential race, with nearly $16 million in campaign contributions, some of which financed Gingrich's infamous documentary, "When Mitt Romney Came To Town." When Gingrich ran out of hot air, Adelson poured more than $30 million into Romney's campaign. Whoever wins Adelson's support will have his billions behind them in 2016.

Spending $93 million on losing candidates in 2012 hasn't made Adelson gun-shy about 2016, but he is placing his bets more carefully. "He doesn't want some crazy extremist to be the nominee," Adelson friend and GOP donor Victor Chaltiel says. "He wants someone who has the chance to win the election, who is reasonable in his positions, but not totally crazy." (Adelson has advocated using nuclear weapons against Iran. So, "crazy" is relative.)

The "Billionaire's Primary" is a return to what Paul Krugman calls "patrimonial capitalism," in which a wealthy few control the "commanding heights of the economy" and use their wealth to influence politics. Thanks to the biggest wealth transfer in U.S. history, the rich are richer than ever. And, thanks to the Supreme Court's Citizens United decision, there's no limit on what they can spend.

The new billionaire political bosses aren't limiting themselves to national politics. Charles and David Koch made the top 10 in Forbes magazine's list of the wealthiest people on the planet. According to a George Washington University Battleground poll, most Americans have never heard of the Koch brothers, but the Kochs' wealth is "trickling down" into local politics.



Along with spending tens of millions of dollars on 2014 Senate races, the Kochs, the Washington Post reports, are funneling money into "hyperlocal" races through their Americans for Prosperity organization. The Wisconsin chapter is engaged in an Iron County board election, challenging incumbents as "anti-mining" radicals and distributing 1,000 flyers in a county with just 5,000 voting-age residents. AFP is also involved in a local race in Iowa, and in property tax fights in Kansas, Ohio, and Texas.

What are the Kochs up to? David Koch says, "Somebody has got to work to save the country and preserve a system of opportunity." But the New York Times is more specific: "The idea is to embed staff members in a community, giving conservative advocacy a permanent local voice through field workers who live in the neighborhood year-round and appreciate the nuances of local issues."

This is nothing new. It's a time-honored strategy, rooted in the notion that, "all politics is local." It worked well for Ralph Reed and the Christian Coalition in the 80s and 90s. Now billionaires are using this strategy, but to what ends, and what are the implications for American politics?

Right-wing billionaires are building their own political machines, to promote their personal interests and preserve their profits. The Koch brothers have poured millions into campaigns against Obamacare and climate science, as part of a broader campaign against government regulation — which they perceive as a threat to their fossil fuel investments and personal fortunes.

Adelson will do "whatever it takes" to stop internet gambling, to protect the profit margins of his casinos. He's hired former Democratic senator Blanche Lincoln's government consulting firm to lobby for his Vegas corporation. Though not a long-time supporter, Adelson has given Sen. Lindsey Graham (R-S.C.) $15,600 in campaign contributions, and Graham is reportedly preparing a bill to ban internet gambling.

Adelson and the Kochs show how the wealthy can use their wealth — in a post-Citizens United political landscape — to impact races and shape policy. Their fire-hoses of money can easily drown out other messages, and narrow the field of candidates for office. Running for office increasingly requires that candidates have personal wealth, or wealthy patrons. Those who have neither need hardly apply, even at the state and local level.

Wealthy patrons like Adelson and the Kochs don't invest without expecting an eventual return. They're likely to get what they pay for. A joint Yale and U.C. Berkeley study is evidence that money does buy access. The study showed that campaign donors are more likely than constituents to get meetings with lawmakers — as a result of, or in hopes of, campaign contributions. Meeting with constituents may secure votes, but meeting with donors or potential donors can secure enough money for re-election campaigns. (So much for Justice Anthony Kennedy's argument that huge campaign contributions "do not lead to, or create the appearance of, quid pro quo corruption.")

Billionaire political bosses like Adelson and the Kochs are America's new oligarchs. Political parties may at least be influenced by public opinion, but American oligarchs act in their self-interest without concern for public sentiment. They are accountable to no one, and the lawmakers on their payrolls are more accountable to their billionaire political bosses than to the rest of the American electorate.

Wendell Potter writes, in a series of posts about the poor state of dental coverage, that millions of Americans — 36 percent, according to a 2013 survey — put off visiting the dentist because of the cost of dental care. Last year, I became one of them. In pain, and in need of emergency dental care, I was shocked to find out how little my dental insurance covered.

At the age of six, I was physically assaulted by my dentist. The experience left me with dental PTSD that made my childhood dental visits terrifying. As an adult, my dental visits became rare, were usually emergencies, and required Valium.

Last fall, a toothache drove me back to the dentist’s office. I was able to get the care I needed, but my fear of being in the dentist’s chair again was nearly matched by anxiety over how much it would cost, and how I would pay for it.

I was fortunate.

I wasn’t one of the many Americans who end up seeking dental care in the emergency room — at 10 times the cost of preventive care. The ADA Health Policy Resources Center puts the number of dental-related ER visits in 2010 at 2.1 million, and the cost to the health system between $867 million and $2.1 billion.

I wasn’t one of the nearly 50 million Americans who live in areas with too few dentists or none at all. Potter describes witnessing thousands of people waiting in the rain for hours to get medical care in barns and animal stalls on a Virginia fairground, during a three-day Remote Area Medical health care expedition. Most were there to see a dentist, not a doctor. Some drove from as far away as Florida and Wisconsin for dental care they couldn’t find or afford back home.

I found a dentist near my office in downtown Washington, who could see me right away. She took an x-ray, and told me that I needed an emergency root canal, set up an appointment with an endodontist, and gave me a referral. I would also need to return for tooth restoration and a dental crown.

I wasn’t one of the estimated 130 million Americans without dental insurance. I had dental coverage — albeit rarely used — through my employer. Otherwise, I might have ended up in an emergency room, in far worse pain, with my health in even greater danger.

I was, however, one of many Americans who didn’t know how much some dental procedures cost. According to the Bureau of Labor Statistics, the cost of dental care has risen faster than the cost of medical care overall. Between 2008 and 2012, only hospital and adult day care costs rose faster, but annual spending caps on dental coverage remain stuck at their original Watergate-era levels, between $1,000 and $2,000.

I was also one of many Americans who didn’t know how little dental insurance covered. The dentist’s office staff assured me I had “very good” dental insurance. Yet, the cost of a root canal, restoration, and dental crown was close to the $1,500 annual spending cap on my coverage.

I got more bad news when I returned to the dentist for restoration and a temporary crown. She did a thorough set of x-rays, and said that I needed another root canal. Unbeknownst to me, a tooth had cracked as a result of bruxism, and decay had set in.

My dental insurance covered two root canals per year, but the cost of just one — and the necessary reconstruction and crown — was within a few hundred dollars of the annual cap on my dental insurance. It would not cover the second root canal, restoration, and crown.

There is really no such thing as true dental insurance. With traditional health insurance, you actually get more back than you paid in premiums, when you have serious medical problems. It’s not so with dental insurance.

You’ve probably paid more in dental insurance premiums than you’ll get back if you ever need serious dental work. Your annual coverage will run out long before your need for dental care. You’ll have to make up the difference out of your own pocket. If you don’t have, and can’t borrow the money — or find a better price — you go without dental care or put it off for as long as you can.

That’s what I did. After my second root canal, I didn’t make another appointment. I knew I’d have better dental coverage starting in January, when I would finally be covered by my husband’s insurance. Before the Court struck down DOMA, I’d have incurred thousands of dollars in taxes if I’d used the family plan that covered my husband and our kids, because it would’ve been considered a “gift.”

I have a new dentist now. He knows all about my childhood experience, and works to make sure my visits are as free of pain and anxiety as possible. The two permanent crowns are done. I’m due back this week for a cleaning and checkup, and plan to return every six months from now on. I hope the regular visits, and my own dental hygiene regimen, will reduce my chances of having another dental crisis.

Like I said, I’m fortunate. Childhood trauma caused me to neglect my dental care, but I was able to get the care I needed — eventually. There are millions of Americans who can’t, through no fault of their own.

This week marks the four-year anniversary of the Affordable Care Act, which mandates dental coverage only for the nearly 5.3 million children eligible for other federal programs, not for adults. Perhaps, as a commenter on the first post in Potter’s series suggests, it’s time for an “Affordable Dental Care Act.”

Rep. Paul Ryan (R-Wis.) has become the latest right-winger to blame black poverty on “culture” and character. Just as he got it backwards on families and poverty, Paul Ryan gets it twisted on poverty and black men. Ryan went on William Bennett’s “Morning in America” radio show to promote his recent “survey” of anti-poverty programs, and to preview his legislative agenda of cutting funding and caseloads for anti-poverty programs. Ryan cited the work of Charles Murray, a conservative social scientist and co-author of The Bell Curve: Intelligence and Class Structure in American Life, who believes genetic differences make African-Americans less intelligent than whites and that “a lot of poor people are born lazy.” Ryan then launched into a dog-whistle politics take on poverty, using coded language about “inner-city culture” to blame poverty on lazy, immoral black people, without coming right out and saying so.

“We have got this tailspin of culture, in our inner cities in particular, of men not working, just generations of men not even thinking of working, or learning the value and the culture of work,” said Ryan. “So there’s a real culture problem here that has to be dealt with.”

Ryan later backpedaled on his remarks, telling Crew of 42’s Lauren Victoria Burke, “This has nothing to do whatsoever with race,” and that his words were “taken out of context.” But Rep. Barbara Lee (D-Calif.) called Ryan out. “Let’s be clear,” Lee said. “When Mr. Ryan says ‘inner city,’ when he says ‘culture,’ these are simply code words for what he really means: ‘black.’”

Ryan has reached for his dog whistle before: once when he told a reporter that the solution to America’s “crime problem” was to go into inner cities and “teach people good discipline, good character,” and again when he complained that “urban voters” gave Obama the 2012 election.

Rep. Lee hit the nail on the head, and Paul Ryan’s attempt to wriggle off the hook drives it home. Ryan’s remarks on “inner-city” men have the ring of the other shoe dropping. The first shoe dropped when Ryan released his survey of the war on poverty. In the introduction, Ryan writes that “the single most important determinant of poverty is family structure,” and cites Daniel Moynihan’s 1965 report identifying “the breakdown of the family” as the primary cause of poverty in the black community. So it’s no surprise that Ryan has returned to the subject of black poverty. Paul Ryan is indeed talking about blacks. In his remarks to Burke, Ryan went on to say:

“This isn’t a race based comment it’s a breakdown of families, it’s rural poverty in rural areas, and talking about where poverty exists — there are no jobs and we have a breakdown of the family. This has nothing to do with race.”

When Ryan addresses poverty in rural areas, where the faces of poverty are mostly white, the problem is not that people won’t work, but that “there are no jobs.” When it comes to the “inner city” poor, it’s a different story. Recent statistics pull Ryan up short. A 2006 poll by the Washington Post, the Kaiser Foundation, and Harvard University showed that “Black men report the same ambitions as most Americans — for career success, a loving marriage, children, respect."

  • Black men want to work. Three in four men in the Washington Post/Kaiser/Harvard survey said they value being successful in a career, a higher share than among white men or black women.
  • Black men value marriage and family. More than half in the Washington Post/Kaiser/Harvard poll said they placed a high value on marriage.
  • Black men believe in the American Dream. Nine in ten in the Washington Post/Kaiser/Harvard study would tell their sons they can become anything they want to in life.

Contrary to what Ryan and a host of other right-wing pundits believe, black “inner city” men want to work. But the reality is that there are no jobs. The 30-year slow bleed of manufacturing jobs out of American cities and out of the country hit black men the hardest, because black men were over-represented in those jobs. In many cases, those were the best jobs — and the only good jobs — black men could get.

Good wages and benefits, fought for and won by labor unions, meant that men without a college education — men like my father — could lift their families from sharecropping to middle-class status in one generation, and give their children opportunities they only dreamed of. Now, good jobs with livable wages have been replaced by low-wage jobs with no “dignity of work,” and no hope of affording even basic necessities, let alone a chance at a better life.

As I wrote last week, the “breakdown” of the family is not the cause of poverty. It’s a symptom. Paul Ryan can spare black men his lectures on character and the “dignity of work.” Bring good jobs, with good benefits and livable wages, back home, and we’ll take care of the rest.

Paul Ryan says that "the left" is offering Americans "a full stomach and an empty soul." The truth is that conservatives like Paul Ryan are offering Americans empty stomachs and empty rhetoric. The American people want more than that. Near the end of his Thursday morning speech at CPAC (the Conservative Political Action Conference), Paul Ryan told a story about a boy who didn’t want his free school lunch.

The story wasn’t true. Eloise Anderson, an aide to Wisconsin Gov. Scott Walker, did tell Ryan the story at a congressional hearing last summer, but she never met or spoke to any little boy who told her he didn't want his free school lunch.

The story was purloined from a book titled “An Invisible Thread.” The book is about a friendship between Laura Schroff and Maurice Mazyck. They met in New York in 1986, when she was an ad executive and he was an 11-year-old panhandler.

The "brown bag" conversation did happen, but had nothing to do with school lunch programs. Ironically, Schroff and Mazyck are now partnering with No Kid Hungry, an organization dedicated to ending child hunger in the U.S., in part by connecting low-income students with federal programs like school lunches.

It’s never a good idea to take anything that an aide to Scott Walker says as gospel. But Paul Ryan can’t even manage a decent copy-and-paste job on the economic data that he misused and misrepresented to support his screed against anti-poverty programs. He can hardly be expected to fact-check such a good-sounding story.

Ryan’s story isn’t real, but the stigma attached to subsidized school lunches is. Lunchtime can be the most socially stressful part of the school day, for any student. Invisible, ever-shifting social boundaries crisscross school cafeterias. So much is riding on where students sit, or even whether they have friends to sit with.

Students who get subsidized lunches have much more to deal with. Lunchroom practices sometimes reveal students’ low-income status to their peers. Some schools have separate lines for students receiving subsidized lunches and students who buy theirs. Others have an “a la carte” line, where students with cash can buy items not available in the subsidized lunch line.

It gets worse.

No wonder some students choose to go without lunch, and not face the stigma.

School districts are finding ways to relieve that stigma.

  • New York schools have held regular promotions, inviting professional athletes to eat subsidized lunches in their jerseys.
  • Other schools have integrated lunch lines, and implemented cashless systems, so that all students go through the same line, and those receiving free lunches are less easily identified.
  • Boston public schools serve free school lunches to all students, even if their families are able to pay, as part of an experimental federal initiative, designed to make it easier for students from low-income families to get free meals, by eliminating the need to fill out forms.

Low income students would face even more stigma if the GOP had its way. Rep. Jack Kingston (R-Ga.) doesn’t mind billing taxpayers for his lunches, but Kingston suggested that schools should have low-income students do janitorial work, like “sweep the floor of the cafeteria,” to “instill in them that there is no such thing as a free lunch.”

Kingston said his remarks were not targeted at a particular income group, but it’s a safe bet that students who can will buy or bring their lunches, and not have to clean up after their classmates. Kingston’s scenario would require “the children from poor families to stick around the cafeteria to sweep up while their better-off friends hitch off to recess.” Students who already skip eating lunch to avoid stigma might just skip school altogether.

Kingston’s views echo those of other conservatives. They reflect a conservative agenda that blames the poor, stigmatizes those who need help, and shames those who receive help.

Republicans are willing to walk their talk.

Paul Ryan need only go to Wall Street – or, for that matter, through the walkways of National Harbor, the shiny new suburban Washington enclave where the CPAC conference was being held – to find “full stomachs and empty souls,” where Americans pick up the lunch tab for some of the very banksters who drove the country into financial disaster and recession. If conservatives prefer full stomachs in corporate boardrooms to full stomachs in America’s classrooms, they are the ones with “empty souls.”

Originally posted at OurFuture.Org.

The GOP's shenanigans surpass even the worst childish behavior, and are far more damaging. The Republican-engineered government shutdown is doing real harm to real people, and endangering an already fragile economy.


It’s hard to imagine a more relevant moment for the National Urban League to release its State of Black America 2013 report. This year, after all, marks the 50th anniversary of the 1963 March on Washington and the 150th anniversary of the Emancipation Proclamation — two historical events of enormous importance to African Americans. It seems even more appropriate that the Urban League’s report is released on the same day that President Obama — our first African-American president, recently re-elected to a second term — presents his annual budget to Congress. Could there be a more appropriate moment to assess how far we’ve come, how far we’ve yet to go, and what kind of leadership is needed to move us forward?


This is not a post I wanted to write, for a number of reasons. I'll get to a few of them later. But the main reason is that I know I'm probably opening myself up to a lot of stuff I'd rather not deal with. But if there's one thing I've learned over decades as a gay activist, it's that it's important and empowering to come out. If you don't, other people will just tell your story as they see fit.

As a long-time reader of Alternet, I've sighed and shaken my head every time I read an article asserting that ADD/ADHD is "not real," "a myth," or a "made-up ailment" created to boost big pharma's profits. I wondered if the writers knew anyone living with ADD/ADHD, or lived with it themselves, and why I saw so few posts offering another perspective.

Then I realized that, as a sometime contributor to Alternet, I had remained silent for too long.

So, in the spirit of ADHD Awareness Week (which, of course, was last week), I'm coming out.

According to the CDC, about 5.4 million children had been diagnosed with ADHD as of 2007. Up to 9% of school-age children, and about 4.5% of the adult population, have ADHD or ADD. There are millions of faces of ADHD, and I'm one of them.

Ten years ago, at the age of 33, I was diagnosed with Attention Deficit Disorder. At the time, I was on the verge of what would be another nosedive in a life that seemed to be one long slow-dance with failure, depression, and anxiety. I was just a couple of weeks away from being fired from yet another job. I knew the cycle well enough by then that I could see the writing on the wall. Try as I might, there wasn't anything I could do to stop it. And I did try.

Don't Know Much About ADHD History

One of the things I hear most often from people who don't "believe in" ADHD is that it has "only been an issue within the last decade or so." This usually follows or is followed by claims that ADHD is a conspiracy of "genocide against black boys" (I've actually heard this), a "conspiracy to make men docile" (I've heard this too), or a "big pharma conspiracy to get us all addicted to their psychiatric drugs." I wrote a post in response to all of this and more a few years ago, which included an ADHD history lesson.

Where to begin? First of all, we’re not talking about “high-energy children” that have just had too much Kool-Aid. And it hasn’t “only been an issue within the last decade or so.” Under one name or another, the characteristics known today as Attention Deficit Hyperactivity Disorder or Attention Deficit Disorder have been associated with each other, observed and recorded for over 95 years.

1902 – Dr. Still, a British doctor, documented cases involving impulsiveness. He called it “Defect of Moral Control.” He did believe, however, that this was a medical diagnosis, rather than a spiritual one.

“Defect of Moral Control.” Kinda has a nice, nearly Puritanical ring, doesn’t it? Unfortunately, many people haven’t even gotten as far in their understanding of ADD/ADHD as Dr. Still’s 1902 assessment. Anyway, it became known as “Post-Encephalitic Behavior Disorder” around 1922, was called “brain damaged syndrome” for a while, got treated with stimulants as early as 1937, and then came Ritalin around 1956. The point is, ADD/ADHD is not a phantom condition that didn’t exist until it was invented in the ’80s.

It’s not new. In fact, it may be more than 100 years old. At least one article I found references an 1845 children’s story called “Fidgety Philip” that may be the first account of ADHD published in medical literature. (And if you ask me, “The Story of Johnny Look-in-the-Air” sounds a lot like a kid with “inattentive type” ADD, such as yours truly.) And that’s just when it was documented and observed. It’s probably existed much longer. If you ask Thom Hartmann, it’s actually prehistoric. Before becoming the focus of scientific study and observation, it was probably mislabeled, much as schizophrenia was once believed to have its origins in demonic possession. But, alas, we live in an age where science alone doesn’t carry much weight when it comes up against what people believe.

It's been known by different names, but ADHD has been around much longer than a few decades. It's been observed and described by scientists, authors, educators, and others for more than 160 years. But it's probably been around as long as people have been around.

The difference is that back in the time of "Fidgety Phillip" and "Johnny Look-in-the-Air," kids with ADHD were labeled as "bad kids." Kids with inattentive ADD were probably labeled as "lazy." People with ADD/ADHD were labeled as dumb, or — as the title of one book about adult ADD put it — "lazy, stupid, or crazy." They got punishment instead of treatment, and ridicule instead of help.

Weird Science

History aside, there's scientific support for the existence of ADHD as a "real" developmental disorder. Granted, the jury is still out on what causes ADHD, and it may never be narrowed down to a single cause, but there is some consensus among international scientists.

The central psychological deficits in those with ADHD have now been linked through numerous studies using various scientific methods to several specific brain regions (the frontal lobe, its connections to the basal ganglia, and their relationship to the central aspects of the cerebellum). Most neurological studies find that as a group those with ADHD have less brain electrical activity and show less reactivity to stimulation in one or more of these regions. And neuro-imaging studies of groups of those with ADHD also demonstrate relatively smaller areas of brain matter and less metabolic activity of this brain matter than is the case in control groups used in these studies.

These same psychological deficits in inhibition and attention have been found in numerous studies of identical and fraternal twins conducted across various countries (US, Great Britain, Norway, Australia, etc.) to be primarily inherited. The genetic contribution to these traits is routinely found to be among the highest for any psychiatric disorder (70–95% of trait variation in the population), nearly approaching the genetic contribution to human height. One gene has recently been reliably demonstrated to be associated with this disorder and the search for more is underway by more than 12 different scientific teams worldwide at this time.

Numerous twin studies demonstrate that family environment makes no significant separate contribution to these traits. This is not to say that the home environment, parental management abilities, stressful life events, or peer relationships are unimportant or have no influence on individuals with ADD/ADHD. They certainly do. Genetic tendencies are expressed in interaction with the environment. Also, those having ADHD often have other associated disorders and problems, some of which are clearly related to their social environments. But it is to say that the underlying psychological deficits that comprise ADHD itself are not solely or primarily the result of these environmental factors.

The International Consensus Statement on ADHD quoted above was published in January 2002. Since then:

Along with biological and genetic influences, research suggests that ADHD can also be acquired -- not through too much exposure to the internet, smartphones, television, computer games, etc., but through prenatal exposure to alcohol and nicotine.

None of this will be enough to convince ADHD denialists, especially those who might consider information from sources like the AMA and NIH to be tainted, or at least suspect. But for me, and anyone who lives with ADD/ADHD, it's at least an explanation for what goes on in our day-to-day lives and in our heads.

In My Head

Back to my story. I was diagnosed with "ADHD Inattentive Type," or "ADD, without the H." The difference between the two basically comes down to the absence of hyperactivity. Thus most people call it "ADD" as opposed to "ADHD." The proposed revision to the DSM requires six out of nine symptoms to "have persisted for at least 6 months to a degree that is inconsistent with developmental level and that impact directly on social and academic/occupational activities."

The nine symptoms of inattention sound like what most people experience from time to time. Who doesn't have trouble sustaining attention during some activities? Who hasn't had their mind wander during a lecture or a meeting? Who doesn't lose things from time to time? Who doesn't get distracted sometimes?

The key phrases in the DSM focus on the length of time the symptoms have occurred, and the impact on "social and academic/occupational activities." I usually explain that the difference between ADD and regular forgetfulness, etc., is that the symptoms occur with enough frequency to have a detrimental effect on day-to-day life. And if symptoms are "inconsistent with developmental level," then they're occurring in a person who has no apparent reason to have such difficulties. (Thus, people with ADD endure a lifetime of hearing, "You should be able to do this." The implication being that … well, I'll get to that later.)

But I don't think any catalog of symptoms can capture what it really feels like to have ADD. A few years ago, I saw a commercial for Strattera, a nonstimulant ADD medicine, that was close to the way I usually describe it to people:

Imagine that you're sitting in a room with no doors or windows. There are no exits. There's a television in this room with you. It's on, it's very loud, and it keeps randomly switching channels. There's no power button. You can't turn it off. There's no volume control. You can't turn it down. You can't escape.

(I recently read a very similar description by Jake E.S. Taylor, a teenager with ADHD, in his book ADHD and Me: What I Learned from Lighting Fires at the Dinner Table.)

Now, imagine that you can carry that room around in your head. The television is just as loud and random as before, but no one else can hear it. Imagine trying to go about your daily life of going to school or going to work, having a relationship, or just trying to carry on a conversation. Imagine trying to do that in a world mostly full of people who don't have random televisions blaring away in their heads, and can't imagine anyone else really does either. Including you. "It's all in your head," they say.

Well, yeah. That's the problem.

Medication, in my experience, doesn't shut off that television. It turns down the volume, and slows down the channel switching. It doesn't "cure" my ADD. But it does make the symptoms of ADD manageable, so that I can use the other tools I need to manage my ADD symptoms.

Some of us manage pretty well without treatment, for a while. We compensate, not always successfully, but maybe enough to get by.

Getting By

I got by for a long time. I actually made it all the way through high school without any major problems. My grades weren't great. They were good enough to get me into the local magnet school I graduated from, but not good enough to get me on the honor roll. I did well in some subjects. English Lit. and History were always my best subjects. Math and Science were always my worst. Always. But I could always count on my other grades to keep my GPA at least slightly above average.

That's probably one of the reasons my ADD went undetected. I wasn't obviously struggling academically. Sure, I had trouble in a couple of subjects, but I wasn't flunking out. And I did well in other subjects. It would have been hard for a teacher to tell that I was missing a lot, because I was sitting quietly and appeared to be listening intently.

What saved me was that I was always an avid reader. I learned that most of what I missed in class I could get if I just did the assigned reading. If there was a term paper, that was even better. I've always been a writer, and I usually turned in well-written (if not perfectly spelled) papers that helped my grades. But behind the scenes, disorganization and procrastination threatened to sink me. In a pattern I would repeat throughout my life, I started the school year the same way almost every year.

At the start of every school year, I'd buy the latest, ultra-organized binder — like the "Data Center" or "Trapper Keeper" binders that were popular in the '80s — with a pocket for everything, and even a handy class schedule waiting to be filled in. I'd get a five-subject spiral notebook for taking notes in class. I swore to myself that this would be the year I stayed on top of things. "This is the year I'm going to get organized and stay organized." I promised myself that I was going to keep track of my assignments, keep track of due dates, pay attention and take notes in class, etc.

It always ended the same way. My efforts at organization lasted a few months, at most, before the novelty of a new system wore off. Then I'd become overwhelmed and start forgetting to write down assignments. Maybe I'd even lose my binder. My notes would be nonexistent or indecipherable because I'd "quietly" zoned out in class, just like Deborah Moore described in an article about inattentive ADD:

Another inattentive tendency could be summarized by the adage, appearances are deceiving. Inattentive students often seem to be paying attention as they sit quietly, and, indeed, they may stare directly at the instructor for an entire class period. Yet, during this time, their thoughts have drifted from the real world around them. In such instances, their bodies remain stationary while their minds wander aimlessly through a universe of ideas and images; frequently, their academic performance reflects this lack of connection with classroom activities.

I could be sitting in class, staring right at the teacher, appearing to listen to every word. But that television in my brain kept switching channels. I would mentally "check out," without even being aware that I was doing so. Then, at the end of the period, I'd suddenly realize that class was over and I'd missed everything the teacher had said. As a result, I missed a lot of stuff, and struggled to catch up or just to keep my head above water.

The reason I missed stuff went back to my Inattentive ADD.

Children are naturally dreamers. It's not unusual to find them staring out a window, lost in thought about some invented escapade. Daydreaming is how they create and explore new ideas.

Snapping back to reality can be more of a problem for some children than others, though. Kids with attention problems will stare off into space in the middle of class, preferring to stay lost in their own minds rather than return to the classroom. If trouble concentrating and focusing are constant problems for your child, they could be signs of ADHD.

That's another part of why my ADD went undetected. I wasn't the kid who was jumping out of his seat, running around, and disrupting the class. I was the kid quietly staring out of the window. The disruptive kid is more likely to get immediate attention, because it's obvious there's a problem. The quietly distracted kid isn't as likely to draw attention. That's why those of us Moore called "Undiagnosed Dreamers" often go undiagnosed.

Ironically, the "low key" nature of inattentiveness may well have made it a more insidious force for personal disaster than the highly visible and dramatic hyperactive variation; these individuals simply attract less notice within classrooms and families. Described as "dismissed and undiagnosed dreamers" by learning disabilities specialist Paula Stanford, inattentive ADDers are usually diagnosed later in life than their hyper counterparts; in fact, many of them may never be diagnosed at all and spend their lives floundering and repeatedly failing to meet expectations.

Inattentive students don't annoy adults or behave in a volatile manner. They don't wiggle in their seats and disrupt students sitting around them. Indeed, they may even appear to maintain concentration by staring fixedly at a textbook or a lecturer for periods of time, but this apparent "focus" may mask a wandering mental state. As Stanford notes, "It's hard to see distractibility."

People...expect to see a child in the back of the class "bouncing off the wall." This example child is always talking and can never concentrate on anything put in front of him. This child was never an example of me....I was not hyperactive; I just could not concentrate, memorize or work on something that did not interest me. (Excerpt from an essay by an inattentive ADD adult client of Brainworks.)

I recognized another part of myself in Moore's essay when I read it years ago. As much as I relied on reading outside of class and writing papers to keep my grades up, I also relied on having good relationships with teachers who would bail me out by letting me turn in assignments late from time to time.

During my elementary years, without realizing it, I learned how to manipulate my teachers into letting me turn my homework or other assignments in late...phrases like "Well, I guess, turn it in tomorrow," were frequent. Even with my ability to manipulate teachers, I still heard, "Why can't you do this? Why don't you concentrate? It's not that hard....You can't see past the end of your nose."

"Why can't you do this?" "Why don't you concentrate?" Those were the constant refrains from my teachers, along with report card comments like "Terrance could do even better if he applied himself." That's when I became acquainted with one of the most dispiriting aspects of ADHD: people assume you aren't trying. You may be dancing as fast as you can, and still falling two steps behind. But nobody sees that. Maybe they don't know about ADHD. Maybe they don't "believe in" it. But they assume you're just doing enough to get by.

Well, you are. You're working your ass off, and just getting by. That is, until you're not.

Hitting the Wall

Everything changed when I went off to college. I graduated from high school, thanks to my ability to compensate for what I didn't know at the time was ADD. It cost me a lot in terms of stress and near-constant anxiety, but I graduated in the summer of 1987. That fall, I set off for college.

I only went 100 miles from home, but in terms of what I was used to it was worlds away. I didn't know then that people with undiagnosed ADD/ADHD often "hit the wall" in college. I managed to get through my freshman year without seeing that wall heading straight for me. Looking back, there were signs.

By the second half of my freshman year, a major depression had set in. Depression had been a reality for me all through my school years. There were probably a number of reasons. (When you're an effeminate, non-athletic, black gay boy growing up in the South during the Reagan era, how can you not be depressed?) Having undiagnosed, untreated ADD was probably one reason for my depression.

ADHD and depression are common bedfellows. (Not to mention anxiety.) Children with ADD/ADHD are at greater risk for depression. Depression is also more prevalent among adults with ADD/ADHD.

It's not that ADHD causes depression, but it contributes to depression by its very nature. Someone with undiagnosed and untreated ADD/ADHD (especially into adulthood) has probably experienced a long series of failures: in school, in work, in relationships. The result is a sense of failure and low self-worth that feeds into depression with every fresh failure. Even when things are going relatively well, familiarity with failure causes feelings of anxiety, because you know it's all going to fall apart again. You never know when or how, but you know it will. It always does.

My freshman year of college was the first time I was diagnosed with depression; or mis-diagnosed. I say mis-diagnosed, because the anti-depressants I was prescribed helped somewhat with my moods, but did little to alleviate the symptoms of untreated ADD that were contributing to the depression. Over the years, I changed dosages, and changed medicines, but little else changed. I felt better for a while, but by then, I was already on my way to hitting the wall.

I did hit the wall at the beginning of my sophomore year. I had reached the limit of my compensatory abilities. Plus, the drinking that started in my freshman year had begun to morph into full-fledged alcoholism. In retrospect, drinking was probably a way of "self-medicating"; not so much to treat my ADD symptoms, but to forget the

The combination of all of the above, plus the death of a close friend that fall, was too much. I became so depressed that it would take me hours to get out of bed in the morning. I started missing classes. When I did go to class, I realized that I was in over my head in some of them (like Biology and Algebra), and stopped going. I flunked every class that first quarter.

Lost Time

After a lot of tears and explanations, my parents agreed not to yank me back home to finish college. (After 18 years of growing up in the closet, I wasn't about to go back to it, which is what moving home would have amounted to.) But they had a condition for letting me go back: I would not take a full load of classes. Instead, I would take a partial load — two classes per quarter, instead of three. That was the beginning of what I would come to call my "lost time."

"Lost time," Benjamin Franklin once said, "is never found again." My sophomore year was the beginning of my "lost time." Though it took me a while to notice, I was inexorably falling behind my peers. In a few years, I looked around and the people I'd started out with were graduating and moving on to graduate school, or starting careers. I was still trying to finish my bachelor's degree. Those same peers would finish grad school, graduate from law school, or advance further in their careers while I continued trying to finish my B.A. After seven years, I finished.

Seven years. Three years longer than it was "supposed to take," made even longer by a couple of failed attempts to pass Algebra during the summer.

I don't think I could have put into words the sense of relief I felt, or the sense of loss it was mingled with. The relief was that I had finally finished. My mom and brother came to my graduation, and watched me don my cap and gown for the ceremony. The loss was that I could look around me and see how much time I'd lost. The friends and peers I'd expected to graduate with had long since moved forward in their lives and careers. I felt that lost time keenly, and I had a strong desire to finally "get started" and begin catching up.

I didn't know about the Ben Franklin quote then. In fact, I only came across it a few months ago. At the time, I had hope that lost time could be found again, or at least made up for somehow. Almost twenty years later, my experience bears witness to the truth of Franklin's words.

Granted, those years were not a total loss. I got sober during my last year of college, and have stayed that way for 20 years. That's something. That I managed to stay sober through all that would pass after graduation is the main reason I lived long enough to get my diagnosis.

It's what I had to live through that gets to me. I'll tell more of that story, and how diagnosis and treatment changed things for me, in the next post.

It has been called the city that moved America; the city that spawned the sound of a generation. For decades, Detroit was the assembly line of the American Dream. Its auto factories produced the cars that made possible the suburban life that defined the American middle class, and provided jobs and wages that lifted more families into the middle class. Now, with its abandoned factories and vacant lots, Detroit symbolizes the deterioration of the American Dream it once fueled.

So it is only right that Detroit is one of the first stops on the road to rebuilding that dream. Today, the Congressional Progressive Caucus brings its Speak Out For Good Jobs tour to Detroit. Caucus members promise to "listen to what everyday Americans have to say and take that back to Washington with them as they continue to fight to reinvigorate the American Dream." If so, Detroit has a story to tell; one of a city and a dream in decline.

A Dream in Decline

It's hard to pinpoint exactly when Detroit's downward slide began, but over the past 30 to 40 years it has paralleled a middle-class decline driven by stagnant wages and the offshoring of the good jobs that grew America's middle class. Those jobs led African Americans in the South to migrate north, and drew immigrants to America's shores, in pursuit of a dream that their willingness to work hard would place within reach. Those jobs and that dream drove Detroit's population to its peak of 1.8 million by 1950.

Today, the jobs that made Detroit the Motor City are long gone, and the dream that fueled its growth has stalled. The unemployment rate for Detroit, at 11.3%, surpasses the unemployment rate for the rest of the state. The city's black unemployment rate, at 25.7%, surpasses the overall rates for both the city and the state. Many of those who have jobs don't earn much. The census shows that Detroit's per capita income is nearly half the national average, and that one third of its citizens live in poverty.

Just as jobs left Detroit, so have its people. Sixty years ago, the promise of good jobs and of a better life caused an influx of workers from across the country, and around the world. Now, the absence of both good jobs and much hope for their return is fueling an exodus. Michigan is the only state that lost population in the past decade, and Detroit probably has a lot to do with that. The city lost 25% of its population in that decade, dropping to 790,000 from 951,000 in 2000, an echo of the "white flight" of the 1960s and 1970s, as black people escape Detroit's high crime and poor services for deteriorating "second-hand" suburbs. As a result, Detroit's vacancy rate has risen to 27.8% from the 10.3% rate reported in the 2000 census, as job loss and the foreclosure crisis fueled population loss. (The city has had 55,000 foreclosures since 2005, and another wave is expected when moratoriums are lifted.)
A lot-by-lot survey of the city revealed that fully a quarter of its lots are vacant.

Scrap City

Detroit has become a city of abandoned buildings, abandoned people and abandoned dreams. Its empty buildings, vacant lots and abandoned factories are sites for the adventures of urban explorers, who wander its ruins as archeologists might wander through the ruins of Pompeii, looking for clues about how the people who once occupied them might have lived, and hints about what caused the decline of this once great city. They post videos and photographs of their explorations on sites like YouTube and Flickr. Professional photographers seem to find a kind of sad beauty and mystery in Detroit's ruins, and capture that mystery in evocative imagery.

But the story of how Detroit went from being the Motor City to Scrap City is no secret. There's no mystery. The decline of Detroit isn't the result of unknown circumstances. It's what happens when manufacturing disappears, taking jobs with it. It's what happens when people and their dreams of better lives for their families and brighter futures for their children and grandchildren are abandoned.

Photos of Detroit show boarded-up and vacant homes. New York Times reporter Katharine Seelye describes this "as dramatic testimony to the crumbling industrial base of the Midwest." The U.S. Labor Department reports that Michigan lost more than 320,000 manufacturing jobs between 2001 and 2008 alone. Little wonder, then, that without job prospects, hundreds of thousands of residents have been forced to leave. Seelye says the massive drop-off in population is "the largest percentage drop in history for any American city with more than 100,000 residents." The only comparable flight would be the "unique situation of New Orleans," where 29% of the city evacuated after Hurricane Katrina in 2005.

What's especially disheartening is knowing that Detroit's exodus was preventable. Failed manufacturing in Michigan, which has left so many without work, is the result of failed U.S. trade policy and little effort by successive administrations to ramp up America's industrial base in the face of changing global economic conditions. Times are getting dire. What's urgently needed is for the U.S. to implement a national manufacturing strategy to bring back good-paying jobs before it's too late.
Now, parts of Scrap City -- the city formerly known as the Motor City -- are actually being scrapped. Last year, in a move to save Detroit by destroying Detroit, the city government used federal money to begin razing 10,000 empty residential buildings by 2013. Remaining residents aren't sad to see the derelict structures go, as they have been crime magnets in their communities. There are even suggestions that much of Detroit, once razed, should be left to return to farmland. It's part of a "managed decline" approach that basically means giving up on a city and finding something better to do with the land, or hoping that someone else does. As one writer put it, after watching residents clap and cry when vacant homes that have long burdened their communities finally came down, "The hope is that as more homes are demolished, the problems they bring will be demolished too."

The Road Back

It will take more than bulldozers and hope to get Detroit and its people on the road back to good jobs, stronger communities, and better lives. It will take a plan; a road map that clearly shows the way back, or at least helps navigate the next leg of the journey. When the Congressional Progressive Caucus pulls into Detroit to listen to what its people have to say, they will find no lack of ideas on how to get Detroit moving in the right direction. A post at Winning Progressive, titled "Saving Detroit," points out that the question isn't how best to raze Detroit, but how to reinvigorate it.
The question becomes how do we reinvigorate Detroit? The city's mayor and others are proposing to "down-size" the city, which essentially means getting people to move out of the least populated neighborhoods so that city services can be shut off to those areas. Others, like Hartz Farms, are proposing to use the large areas of unused land for urban farming. While perhaps understandable given the dire situation that Detroit is in, the problem with these approaches is that they constitute essentially abandoning the idea of Detroit as a major American city. We here at Winning Progressive believe there is a better approach that focuses on reinvigorating the city by repopulating it. This can be achieved in two ways:

* Make Detroit an immigration safe haven - Immigration has always been the life blood of American cities, from the Polish, Irish, and Italian immigrants who came in the late 1800s and early 1900s to the Latino immigrants of the past couple of decades. And many more people want to come to the U.S. from other countries but either cannot get their way through our broken immigration system, or do not want to risk coming here illegally. As more people immigrate to an area, economic activity and jobs are created to provide basic goods and products to them. ...

* Urban Homesteading: In 1862, the government sought to encourage westward expansion through the passage of the Homestead Act, which authorized the sale of 160-acre plots of unoccupied public lands in the west for a nominal fee after someone resided in the area for five years. In Detroit today, the city government has taken possession of tens of thousands of vacant lots, and thousands of other lots in the city have long been abandoned. So, why not enact an Urban Homestead Act that combines free land with a $100,000 grant to build or restore a house on that land, for any law-abiding citizen who agrees to live there for at least five years?
Such a program would repopulate cities like Detroit, assist people in need, and stimulate the economy by increasing home building activity. 100,000 families could participate in such a program at a cost of $10 billion per year, which is one-seventh the amount of what the wealthiest two percent of Americans will be receiving every year if the Bush tax cuts are extended and less than one-tenth what we spent every year on the Iraq war.
Community groups already have plans that will go a long way toward bringing the city back. Their plans reflect an understanding that, despite statistics like those quoted above, Detroit isn't necessarily a shrinking city, as Kaid Benfield at The Atlantic pointed out.
There has indeed been a decline in part of the region. In 1970, 1,670,144 people lived within the city limits of Detroit. By 2010, that number had declined to 713,777, an astounding apparent loss of some 57 percent of the 1970 population. Recently, much has been made of the 25 percent population decline over the last decade, from 2000 (951,270) to 2010. But the extent to which Detroit is such a tragically "shrinking city" depends on your definition of "city." The population of metropolitan Detroit -- the jurisdictional inner city and its immediate suburbs -- did decline from 1970 to 2010, but only from 4,490,902 to 4,296,250, a loss of only 4 percent. Big difference.

Do the math: What that means is that, while the inner city's population was declining so drastically, its suburbs added some 761,000 people, growing at the handsome rate of 27 percent. (In the most recent decade of 2000-2010, the suburbs added some 91,000 people, or between 2 and 3 percent.) Patrick Cooper-McCann writes on his blog Rethink Detroit that, far from shrinking, the physical size of metro Detroit grew by 50 percent in those 40 years. As I've written before, neither the economy nor the environment pay attention to jurisdictional lines; neither should analysts.

...Shrinking city? Really? What this tells me is that an even bigger problem for Detroit than the decline of the rust-belt economy has been that the fringe of the region has been allowed, more than in most places, to expand, not shrink, and to suck the life and hope out of the inner city. So why aren't the self-styled progressive responses to "the Detroit problem" addressing this critical aspect of the problem?
Community organizations on the ground are doing just that. Next Detroit's neighborhood stabilization initiative focuses on private and public investment in the strongest neighborhoods, to stabilize communities and stem the loss of jobs by supporting those neighborhoods with services that will attract and retain more workers. Other organizations focus on improving the lives of citizens living in the city itself. Community Development Advocates of Detroit has proposed reclassifying various neighborhoods as green zones, homestead sectors, or village hubs. Detroit Declaration focuses on developing urban farming and encouraging fill-in housing development. Another proposal would entice more people to move to the city with tax breaks and changes in zoning laws.

What Detroit needs most right now is investment in jobs and in its people. Detroit lost 323,400 jobs during the recession, and experts say that it will take more than a decade for Detroit to recover at the current rate of growth. Only investment in creating good jobs -- jobs that fuel the hopes and dreams of American families and communities -- can get Detroit and its people on the road back to being a great American city, and restart the engine that once fueled its growth and greatness -- the American Dream. When the Speak Out For Good Jobs tour arrives in Detroit, the Motor City will have a chance to get rolling again.
In a post about Wisconsin Governor Scott Walker's bid to strip public employee unions of collective bargaining — the most important and effective tool for protecting workers — Van Jones wrote:
If a foreign power conspired to inflict this much damage on America's first responders and essential infrastructure, we would see it as an act of war.
It is an act of war, a now all-but-openly-declared war — and not just against unions, but against American workers and against the middle class. Americans are accustomed to denying even the existence of classes, let alone class conflict. This week, America's ongoing class war arrived on our doorstep with the subtlety of a daisy cutter — in the form of Walker's union-busting politics, and the massive protests in Madison and beyond. Now that the battle is joined, the big questions are what the outcome will be, and whether Democrats will take the opportunity to tell American workers unequivocally whose side they are on.

What we are seeing in Wisconsin is job killing in action, with the goal of eliminating or permanently weakening the middle class. Conservative policies were responsible for the death of American manufacturing and the loss of "good jobs": jobs with decent wages and benefits that aided the growth of the middle class from the working class.
The Center for Economic and Policy Research defines a "good job" as one with health insurance, a pension plan and earnings of at least $17 per hour. That works out to about $34,000 a year, the inflation-adjusted median income for men in 1979, when U.S. manufacturing jobs numbered 19.6 million, an all-time high. Since then, however, the economy has lost nearly 6 million manufacturing jobs — 52,000 in February alone. Among them were many of the 3.5 million "good jobs" lost from 2000 to 2006, according to John Schmitt, a senior economist at CEPR. As those jobs disappeared, many blue-collar workers were forced to take jobs with far less pay and benefit security. ...Helping fuel the loss of good jobs has been a decline in union membership, industry deregulation, increased outsourcing of state and government services and economic policies that focus more on containing inflation than on maintaining full employment, Schmitt said. (Emphasis mine.)
What Conservatives Really Want

Now conservatives have turned to eliminating what may be the last "good jobs" left in America, in terms of benefits. It's not just public employees and public employee unions conservatives have in their sights, but the very concepts of a common good and a public interest. George Lakoff explained in "What Conservatives Want," a post dedicated to the protesters in Wisconsin (emphasis mine):
Conservatives really want to change the basis of American life, to make America run according to the conservative moral worldview in all areas of life. …The way to understand the conservative moral system is to consider a strict father family. The father is The Decider, the ultimate moral authority in the family. His authority must not be challenged. His job is to protect the family, to support the family (by winning competitions in the marketplace), and to teach his kids right from wrong by disciplining them physically when they do wrong. The use of force is necessary and required. Only then will children develop the internal discipline to become moral beings. And only with such discipline will they be able to prosper.

And what of people who are not prosperous? They don't have discipline, and without discipline they cannot be moral, so they deserve their poverty. The good people are hence the prosperous people. Helping others takes away their discipline, and hence makes them both unable to prosper on their own and function morally.

The market itself is seen in this way. The slogan, "Let the market decide" assumes the market itself is The Decider. The market is seen as both natural (since it is assumed that people naturally seek their self-interest) and moral (if everyone seeks their own profit, the profit of all will be maximized by the invisible hand). As the ultimate moral authority, there should be no power higher than the market that might go against market values. Thus the government can spend money to protect the market and promote market values, but should not rule over it either through (1) regulation, (2) taxation, (3) unions and worker rights, (4) environmental protection or food safety laws, and (5) tort cases. Moreover, government should not do public service. The market has service industries for that. Thus, it would be wrong for the government to provide health care, education, public broadcasting, public parks and so on.
The very idea of these things is at odds with the conservative moral system. No one should be paying for anyone else. It is individual responsibility in all arenas. Taxation is thus seen as taking money away from those who have earned it and giving it to people who don't deserve it. Taxation cannot be seen as providing the necessities of life for a civilized society, and, as necessary, for business to prosper.
The public workers targeted in Wisconsin and elsewhere are the same people who make middle-class life and security in America possible. They are the people who ensure our safety, who safeguard our health, and who get us where we want to go, among other things. They are the police who responded within minutes after our house alarm was set off by a strong wind that blew open a door that lacked a deadbolt lock; the teachers and school staff who helped our son when he needed it; the paramedics who responded quickly when a neighbor's child had trouble breathing; the firefighters who responded when a neighbor detected a gas leak; the bus driver who gets our son to school safely each day; the public transportation workers who get me to work and back home safely each day. The list goes on and on.

When abstract budget cuts translate into fewer teachers, police officers, health workers, firefighters, etc. in our communities, we begin to realize that such cuts hurt rather than heal. The very necessities that support the existence of a middle class are threatened. They will not be replaced if conservatives are successful in eliminating them. They will not be affordable if privatized. The reason there are public services supported by public workers is that there are things we believe need doing, and should be done even if they're not profitable. Where there is not enough of a profit margin for private industry to see a benefit, and too great a need for charitable entities to meet entirely, it becomes a question of the public interest, requiring a public solution. We are faced with a conservative movement that not only doesn't believe in a public good, but sees it as the biggest of our problems.

Ideology vs. Reality: Something Has To Give

Wisconsin and other states are where the irresistible force of ideology meets the immovable object of reality.
While many Americans support the idea of "tough" budget cuts, most Americans want the painful cuts made somewhere else — someplace where they won't feel them. Like the Johnny Mercer lyric says, "something's gotta give." It will either be the will of the people or the ideological drive to increase economic pain and inequality.

As America looked on with the rest of the world at the amazing, dictator-toppling protests in Egypt, we heard reports of how Egypt's economic inequality catalyzed a citizens' movement. And we learned that economic inequality is worse here than in Egypt. It's no coincidence that even conservatives see the parallels between Cairo and Madison. The connection between the uprisings in Cairo and Madison isn't lost on the participants in either. Facilitated by the internet, protesters in Cairo and Madison have exchanged statements of solidarity. Technology may have partly bridged that gap, but what brings the uprisings in Egypt and elsewhere closer to home is not so much the technology as the understanding that passes along it, through barriers of culture, language, religion, etc.

What does it mean when Americans in Madison, Wis., see themselves in the same boat as protesters in Egypt? It means that our domestic economic policies have mirrored our economically driven foreign policy, with consequences as devastating to working- and middle-class Americans as our decades-long support of Mubarak's regime was to Egyptians. A New York Times article recently stated, "Hosni Mubarak's Egypt has long functioned as a state where wealth bought political power and political power bought great wealth." The same can accurately be said of the U.S. over the past 30 years. In Winner-Take-All Politics: How Washington Made the Rich Richer -- and Turned Its Back on the Middle Class, Jacob Hacker and Paul Pierson explain what's happened in those decades.
That shift occurred in the 1970s because businesses and the super-rich began a process of political organization in the early 1970s that enabled them to pool their wealth and contacts to achieve dominant political influence (described in Chapter 5). To take one of the many statistics they provide, the number of companies with registered lobbyists in Washington grew from 175 in 1971 to nearly 2,500 in 1982 (p. 118). Money pouring into lobbying firms, political campaigns, and ideological think tanks created the organizational muscle that gave the Republicans a formidable institutional advantage by the 1980s. The Democrats have only reduced that advantage in the past two decades by becoming more like Republicans–more business-friendly, more anti-tax, and more dependent on money from the super-rich. And that dependency has severely limited both their ability and their desire to fight back on behalf of the middle class (let alone the poor), which has few defenders in Washington.
Americans are fast approaching a crossroads where abstract budget cuts run headlong into the reality of the pain those cuts will inflict on our families and communities. In the communities where Americans live and work, the abstract notion of budget cuts translates into real economic pain. It translates into states taking action to increase economic pain while at the same time undercutting their ability to relieve the worst of it.

A Lost Middle Class, Unbridled Corporate Power

The war against public employees is also a war on the many things government does that support the middle class. That Republicans have not offered alternatives to these supports reflects either their lack of concern for the American middle class or their confidence that the private sector will eventually supply alternatives. Either way, we're probably facing a "lost decade" in which middle- and working-class Americans suffer the loss of these supports, facing stagnation at best and downward mobility at worst. For younger generations, this will come at a crucial time developmentally, during which they would otherwise acquire or inherit advantages they could then pass on to their children, thus perpetuating the middle class. This is an attack on the middle class, both direct and indirect — even on those of us who have fallen for the right's "politics of envy" and thus aim our ire at public employees rather than at those further up the economic ladder who are, still, feeling no pain in this recession. Too many of us look at public employee unions and ask "Hey, why should they have it so good?" instead of asking "Hey, how come we don't have it that good?" (Perhaps because only 6.9% of private employees are unionized now, due in no small part to Republican efforts to aid corporate union-busting.) As Kevin Drum notes, killing off unions removes the only remaining counterbalance to corporate power.
... Of course unions have pathologies. Every big human institution does. And anyone who thinks they're on the wrong side of an issue should fight it out with them. But unions are also the only large-scale movement left in America that persistently acts as a countervailing power against corporate power. They're the only large-scale movement left that persistently acts in the economic interests of the middle class. So sure: go ahead and fight the teachers unions on charter schools. Go ahead and insist that public sector unions in Wisconsin need to take pay and benefit cuts if that's what you believe. Go ahead and rail against Davis-Bacon. It's a free country. But the decline of unions over the past few decades has left corporations and the rich with essentially no powerful opposition. No matter what doubts you might have about unions and their role in the economy, never forget that destroying them destroys the only real organized check on the power of the business community in America. If the last 30 years haven't made that clear, I don't know what will.
Perhaps now more Americans know how high the stakes really are. Recent polls show that 65% of Wisconsin residents and 61% of Americans support the right of public employee unions to bargain collectively. (In Wisconsin, Walker is losing support even among Republican senators.) This attack on the middle class comes at a time when the middle class has already been weakened by the economic impact of conservative policies and politics. Wisconsin illustrates that conservative economic and fiscal policies create crises that Republicans then exploit to accomplish political ends -- weakening their opponents and rewarding their cronies along the way. Naked cronyism is employed in pursuit of what is, to conservatives, a higher goal: to "finish the job" of remaking our economy (to more closely resemble those of other countries also facing citizen revolts) by destroying regulation, consumer protection, collective bargaining, and labor organizing, thus ensuring continued growth of economic inequality. Mother Jones magazine spelled it out in just eight charts.
The importance of this end, the permanent diminishing of the middle class, to Republicans is evident in how far they are willing to go and how unswerving they are in the face of public opposition and the cognitive dissonance of reality. The debt or the deficit is not the point in the first place, because the deficit is merely the symptom, not the disease. The disease is the conservative economics that created the crisis. The crisis they have created is the point. Give conservatives this: they never let a crisis go to waste, the way the Obama administration has thus far. Indeed, in his short time in office Walker has destroyed (and threatened to destroy) more jobs than his policies are likely to create, if previous applications of conservative policy are any indication. The $117 million in tax breaks that Walker and the Republican legislature pushed through for GOP cronies basically created the very crisis he claims to address. That Walker refuses any compromise at all, even though the unions agreed to accept wage and benefit reductions as long as they keep the right to collective bargaining, shows that the budget isn't the point. Power is. That Walker doesn't have time to talk to the state's Senate Democrats, but does have time to take a call he thinks is from one of the Koch brothers, shows exactly whose side he's on. If Walker accepts compromise, then the unions survive to bargain another day, meaning the wage and benefit reductions are not guaranteed to be permanent. If Wisconsin's economy improves, the state's public employee unions would be in a position to bargain for a return to previous wage and benefit levels, on the argument that their sacrifice should end when the state's budget crisis ends.

The Crossroads, In Washington And Beyond

We are approaching a crossroads. It happens that this time it's been reached in Wisconsin, and other states are approaching the same point.
At some point, it becomes impossible to camouflage the blatant cronyism, inequality, and bald-faced contradictions between what conservatives promise and what their policies actually deliver. At that crossroads, things can go at least a couple of ways. Either people resist, because they are still inspired by the possibility of change and believe in their ability to effect change with great effort, or they are crushed by economic pain and effectively disenfranchised, to the point that not only do they no longer believe in the possibility of change, but they no longer bother trying, because they believe "The government does what it wants to do. We can do nothing." "The people united," goes the chant, "will never be defeated." The fate of that union, and the ability of united people to change the direction of government, is being decided in Madison, Wisconsin, today. And maybe in your state capitol tomorrow. We will soon face a stalemate in the federal government similar to the one in Wisconsin. Since congressional Democrats won't have the option of flight, they had better be ready to fight, and to make the case that Republicans have refused to bargain at all, let alone bargain in good faith. Democrats must make the case that they are working to prevent Republicans in Congress from doing to the rest of the country what Gov. Walker and Republican legislators are trying to do in Wisconsin and other states now. What's happening in Wisconsin and across the country may be the beginning of Americans realizing the consequences of voting in Republicans whose policies don't reflect what Americans really want. It may be the beginning of Republicans running smack into the reality that the midterm elections did not give them a mandate or confer a public stamp of approval on their agenda. We can only hope it is the beginning of Americans turning back that agenda when it comes to their hometowns.
A popular business tip advises would-be business leaders to "find a parade and get in front of it." The question is whether Democrats, having failed to start a parade after 2008, will have the sense to jump in front of the one that started in Wisconsin and, at long last, lead it. Democrats' first step toward real leadership, from the president on down, should be an unequivocal statement of support for public employees and public employee unions in Wisconsin and other states, support for the right to organize and bargain collectively, and ultimately support for the progressive values that are the foundation of all of the above. The conservative war against working people, the middle class, and fundamental American values has burst out into the open. It's time for the president and Democrats to speak up and stand up — to leave no doubt whose side they're on, by publicly joining the fight.
Unless something drastic happens between now and the vote on President Obama's tax-cut "compromise" with congressional conservatives, America is headed for its next failed conservative stimulus. Even with the proposed tweaking around the edges, there is nothing in this bill that hasn't already been tried and failed. In a sense, we are still living with the worst economic policies of the George W. Bush era, going all the way back to the tax cuts Bush pushed for almost as soon as he entered office, promising that the cuts would create jobs, stimulate the economy, and stave off the recession that Fed chief Alan Greenspan warned was on the way. On June 7, 2001 — with unanimous support from Republicans, and the help of 28 House and 12 Senate Democrats who should have known better — Bush signed into law $1.35 trillion in tax cuts. It was one of the largest tax cuts in history, much larger than the $127 billion surplus left by President Clinton, Bush's predecessor in the Oval Office. While Bush was in office, the tax cuts failed to pay off. The gross domestic product grew at an anemic rate, and unemployment rose 2.1 percentage points between January 2001 and June 2003. Median household income — adjusted for inflation — dropped between 2000 and 2007, even as families were spending more on such basic expenses as food, housing, gas, and health insurance. Meanwhile, after-tax income for the wealthiest 1% rose by $146,000 in 2004 alone. The poverty rate increased from 11.3% to 12.5% by 2006, and had reached 13.2% by 2008. Over and over again, President Bush, with the support of Republicans in Congress, opted for the same failed "stimulus": cut taxes and hope for the best. Campaigning in 2004, Bush promised "tax refunds" amounting to about $400 per working family.
In 2008, with the economy already in a recession that we now know would only worsen unless the government took major steps to stimulate it, Bush tried "tax rebates," ironically dubbed a "stimulus plan" and passed by Congress in February 2008. But by then the trap was already set and the damage essentially done. In 2003, with job growth stagnant in the middle of an economic upswing, another round of tax cuts had passed with the support of all but one House Republican and all but three Republican senators, consisting of cuts in individual rates, capital gains, dividends, and the estate tax — nearly all of which were set to expire in 2010. As Jacob Hacker and Paul Pierson pointed out, the tax-cut package of 2003 was a trap, in both the political and economic sense, set to ensnare whoever was unfortunate enough to hold power when the bill came due.
When President Obama said he was forced to negotiate with hostage-takers, he conjured an image of ski-masked Republicans suddenly storming the White House and demanding tax cuts for the rich, a screaming Jane Middle Class in tow. The imagery made it seem as if this bitter fight just emerged -- an impression reinforced by the breathless commentary of pundits who act as if history began last week. In reality, the hostage takers laid their "trap" a decade ago, as former Bush spokesman Dan Bartlett helpfully explained to The Daily Beast: "We knew that, politically, once you get [a big tax cut] into law, it becomes almost impossible to remove it. That's not a bad legacy. The fact that we were able to lay the trap does feel pretty good, to tell you the truth." ...In our 2005 book Off Center, we summed up the Republican tax-cut strategy as follows: Republicans carefully calibrated their presentation of the tax cuts to circumvent hostile public opinion. Three strategies were central -- each attuned to the tax cuts' principal liabilities. First, unrealistic projections of federal surpluses and of the costs of the tax changes were used to justify the tax cuts and obscure their effects on competing priorities. Second, Republican leaders managed the legislative agenda to prevent consideration of the tax cuts' specific effects on valued programs. And third, tax-cut advocates worked assiduously to make the cuts look far less tilted in favor of the rich and well connected than they really were... To respond to their base, Republicans misled most Americans. On an unprecedented scale, phase-ins, sunsets, and time bombs were used to give the tax cuts of 2001 the most attractive public face possible while systematically stacking the deck in favor of Republicans' long-term aims. From top to bottom, Republicans larded the tax cuts with features that made sense only for the purposes of political manipulation. Most reporters have done a lousy job of reminding us of this background. 
Why were the tax cuts of 2001 scheduled to expire? Because the Bush administration could not convince enough Senators back then that they were affordable, even at a time of record budget surpluses. The GOP's gamble was that when the tax cuts were due to expire, they would be extended because too many in Washington would be afraid to "raise taxes."
Like a lot of progressives, I hoped the long, dark decade of conservative failure on a host of issues was behind us by 2010, or soon would be. At the beginning of the year, I looked back on a decade I dubbed "The Uh-Ohs: A Decade of Conservative Failure," and on the "tax-cut stimulus" policies that created more income inequality than prosperity.
Uh-Oh! For 99% of us, it was a very taxing decade. From 2000 on, conservatives preached the gospel of prosperity through tax cuts. Tax cuts for the very wealthy, that is. The idea was that the tax cuts would put yet more money into the hands of the wealthiest Americans, who would then put that money back into the economy and "spread the wealth," either by spending it on goods and services that create jobs or by investing it in ventures that would create jobs and benefit all Americans. The reality turned out to be something else. Uh-Oh! We never got the "trickle down" of prosperity the tax cutters promised. Instead, we got a kind of Bizarro World "trickle up" economy, in which billionaire Warren Buffett has a lower tax rate than his secretary. Of course it didn't work. It couldn't work, and we'd known for years it wouldn't work. This long, slow drift actually began decades ago, but really began to pay off in the past 10 years — when conservatives had control of both the White House and Congress, and could finally do a lot of things their way.
By September of this year, David Cay Johnston reminded us how the Bush tax cuts worked out for the economy, and that conservatives were running on a platform of nothing more than the same old tax cuts.
The tax cuts did not spur investment. Job growth in the George W. Bush years was one-seventh that of the Clinton years. Nixon and Ford did better than Bush on jobs. Wages fell during the last administration. Average incomes fell. The number of Americans in poverty, as officially measured, hit a 16-year high last year of 43.6 million, though a National Academy of Sciences study says that the real poverty figure is closer to 51 million. Food banks are swamped. Foreclosure signs are everywhere. Americans and their governments are drowning in debt. And at the nexus of tax and healthcare, Republican ideas perpetuate a cruel and immoral system that rations healthcare -- while consuming every sixth dollar in the economy and making businesses, especially small businesses, less efficient and less profitable. This is economic madness. It is policy divorced from empirical evidence. It is insanity because the policies are illusory and delusional. The evidence is in, and it shows beyond a shadow of a reasonable doubt that the 2001 and 2003 tax cuts failed to achieve the promised goals. So why in the world is anyone giving any credence to the insistence by Republican leaders that tax cuts, more tax cuts, and deeper tax cuts are the remedy to our economic woes? Why are they not laughingstocks? It is one thing for Fox News to treat these policies as successful, but what of the rest of what Sarah Palin calls with some justification the "lamestream media," who treat these policies as worthy ideas? The Republican leadership is like the doctors who believed bleeding cured the sick. When physicians bled George Washington, he got worse, so they increased the treatment until they bled him to death. Our government, the basis of our freedoms, is spewing red ink, and the Republican solution is to spill ever more. 
Those who ignore evidence and pledge blind faith in policy based on ideological fantasy are little different from the clerics who made Galileo Galilei confess that the sun revolves around the earth. The Capitol Hill and media Republicans differ only in not threatening death to those who deny their dogma. How much more evidence do we need that we made terrible and costly mistakes in 2001 and 2003?
Now we know tax cuts are the least effective way to create jobs and stimulate economic growth, because the wealthy don't spend tax cuts. Yet it now appears that we will jump into that same trap with both feet. Let's be clear about what we're doing. By extending the worst economic policy of the Bush/conservative era — tax cuts for the wealthiest one to two percent — without so much as discussing the kind of direct investment in job creation and economic growth needed for a recovery that would have real meaning for the millions of Americans whose fortunes rise and fall on Main Street, not Wall Street, we are setting America up for its next failed conservative stimulus. Beyond that, whether as Democrats or as progressives, we set ourselves up for moral failure if we do not meet the moral obligation this "tax deal" creates, and instead let the discussion end with the extension of the same tax cuts that have consistently failed to stimulate growth and create jobs. If we fail to make the case for and demand direct investment in jobs and recovery, we will be complicit in sticking America with a deal that belongs in the same category as the one Sen. Carl Levin (quoting a Goldman Sachs email) aptly described, while grilling the former head of Goldman Sachs' mortgages department, as a "shitty deal." If we believe America deserves better, we'd better be willing to fight for it, or be held accountable for failing to fight for what we say we believe is right.