Adbusters

The New McCarthyism: How the Fear of Socialism Fuels America's Climate Denial

In the late 1970s, scientists first reached a consensus that global warming was likely to result from increasing greenhouse gases released by the burning of fossil fuels. The idea had been around since the turn of the twentieth century, but the development of computer models made quantitative predictions possible. Almost immediately, a small group of politically connected, conservative scientists began to question this consensus. As the empirical data mounted, their attacks became more unprincipled: they used data selectively and often misrepresented the conclusions of studies undertaken across the scientific community.


Whatever Happened to the Good Life?

Since we're accustomed to thinking of young people and students as the shock troops of social change, explaining youthful inertia has become a national preoccupation (sadly, we expect impassivity from the middle-aged). Many point to the absence of a draft to motivate protest. Others cite the lack of contemporary examples of successful collective action that might inspire faith in the efficacy of protest. But more often than not, the problem is conceived as cultural. The emerging generation, of which I am part, is post-Watergate, post-Monica Lewinsky, and weaned on irony and satire. We expect the government to deceive us and are hardly surprised, let alone outraged, when these expectations are met. Others argue that young people aren't particularly self-absorbed or apathetic; they're overworked and indebted. Today's twenty- and thirty-somethings are so busy struggling to make ends meet, they simply don't have time to take to the streets.

The latter theory has gained traction with the recent publication of three thoughtfully argued books: Tamara Draut's Strapped, Anya Kamenetz's Generation Debt, and Daniel Brook's The Trap (subtitled Selling Out to Stay Afloat in Winner-Take-All America). Compared to our parents at the same age, these authors contend, we're working longer hours for less money, with reduced job security, slashed benefits, and fewer social services. Over the last four decades, opportunities for social mobility have declined dramatically, with wealth concentrating to a degree not seen since the Gilded Age.

In other words, it's getting harder and harder to stay in -- let alone join -- America's crumbling middle class. Today's minimum wage is worth 30 percent less than it was in 1968. According to Draut, "if wages had kept pace with rising productivity between 1968 and 2000, the average hourly wage would have been $24.56 in 2000, rather than $13.74." Instead -- and particularly in fields with a social service component -- salaries have failed to keep pace with inflation, and benefits like health insurance or retirement funds are elusive rarities. Meanwhile, the cost of living has skyrocketed. Between 1995 and 2002, median rents in urban centers like San Francisco, Boston, and New York surged by sixty or seventy percent. The price tag on a simple studio in these cities is well over a thousand dollars a month. Finally, a college degree, often regarded as the key to a middle-class lifestyle, costs more than ever before. In the 1960s and 1970s, when many quality public universities were free, Pell Grants covered nearly three-quarters of college tuition; today they cover only one-third. At the same time, tuition has outpaced inflation three times over since 1980. As a result, the average student leaves a four-year college with over $20,000 in educational debt; a graduate degree means $45,000.

As a member of "generation debt," I know these frustrations firsthand. It's hard to feel footloose when you owe $40,000 in student loans and haven't even started chipping away at the interest. I've had to move back in with Mom and Dad when housing costs were too much to cover. I haven't had health insurance in eight years, and saving for retirement isn't even on the horizon. But are things that bad? Am I really so oppressed? Unlike twenty percent of the world's population, I have my basic necessities covered. I've got food, clothing, shelter, and then some. I'm typing this on a G4 titanium laptop. I have a cellphone. I've traveled the world.

The fact is, even though young people today are making less, we're spending more. Between 1979 and 1990, the spending of the average person working for minimum wage increased by 30 percent. Generation Y has an inordinate amount of buying power in the United States: $175.1 billion per year, much of which is wielded during the twenty-plus hours a week its members spend online. And supposedly we have no time for activism? It makes sense that in a society where young people carry supersized debt, they expect a supersized lifestyle. Though generally inhabited by fewer people, the typical new American home is 40 percent larger than it was 25 years ago. The same period has seen retail space per capita quadruple, which says something profound about rates of consumption. Jumbo SUVs, loaded with luxury options, make up half of all private vehicles on the road. Pleasure and vacation travel have become standard. Air conditioning in dorm rooms, a smorgasbord of dining options, extravagant fitness centers to work off those extra calories -- all amenities unimaginable back when college was cheap.

Since the mid-seventies, when experts started keeping track, Americans' definition of the "good life" has become increasingly materialistic. Over the years, the good life has become more likely to include a home, a vacation home, a car, a second car, a color TV, a second color TV, travel abroad, designer clothes, a pool, a job that pays more than average, lots of money, and so on. Nonmaterial answers -- a happy marriage, children, interesting work, and a job that contributes to the welfare of society -- have either flat-lined or become less popular over the years.

And it's not that people simply want more; they claim to need more. A recent survey conducted by the Pew Research Center reveals that the list of things "we can't live without" has grown steadily since 1973. Many things Americans currently consider necessities didn't even exist a generation ago. Cell phones weren't on the survey in 1996, but are now considered essential by 57 percent of respondents between the ages of 18 and 29 (8 percent of the same group considers their iPod a necessity, not a luxury). In only ten years, the percentage of adults who consider a microwave oven a necessity has more than doubled, to 68 percent. Home air conditioning has climbed from 51 percent to 70 percent in necessity status, and clothes dryers have ascended twenty points as well. It's also worth noting that, according to Pew, "the more income a person has, the more likely he or she is to view goods and gadgets as necessities rather than luxuries." The richer you are, in other words, the more you need.

As economist Juliet Schor has explained, consumer satisfaction and dissatisfaction "depend less on what a person has in an absolute sense than on socially formed aspirations and expectations." That is to say, even the objectively upper class can believe themselves beleaguered because they're ogling opulent plutocrats. Take a recent article in the New York Times pitying "millionaires who don't feel rich" in Silicon Valley. "Everyone around here looks at the people above them," complains a tycoon of substantial means. "You're nobody here at $10 million." Today's citizens aren't "keeping up with the Joneses"; they're keeping up with the ultra-affluent, an unrelentingly upwardly mobile target that shapes the hopes and dreams of everyone below them.

Thus it's no surprise that young people's expectations have expanded over the years. In 1967, 45 percent of college freshmen reported that being well-off financially was important; by 2004 the number had ballooned to 74 percent. Critics take this as conclusive proof of Generation Y's insatiable materialism, while more sympathetic observers point out that such an attitude is simply a practical response to ever-rising costs of living. Young people aren't greedier than their predecessors, argue Draut, Kamenetz, and Brook; they simply need more money to make ends meet; which is why they have to work so much; which is why they have no time to spare; which at least partly accounts for the paltry state of progressive politics in the US.

The problem is, social movements have long been made by people far worse off than our indebted generation -- a fact driven home for me on a recent trip to Tijuana, Mexico. There I met a group of women, many in their mid-twenties and most with children, employed by the foreign-owned factories along the border. They work sixty hours a week assembling televisions and other widgets for American consumers, often for as little as six dollars a day. They live in little shacks made of scrap wood, recycled pallets, and old tires. Their homes lack running water. These women have no money and no free time, yet they have organized themselves into a collective and are effectively advocating for environmental justice in their community. Returning to the US from Tijuana, I felt as though I could suddenly see clearly: our "necessities" appeared to me as what they really are -- luxuries. I have no doubt there is an element of social control built into the massive educational debt imposed on young Americans today. But I also believe social change requires sacrifice -- and imagination. We need to reevaluate the supposed "necessity" of higher education (especially for those interested in the humanities, the arts, and social change, who may find that the fortune they spend on tuition could be more fruitfully invested elsewhere), envision new standards of "success," redefine the "good life," and figure out creative ways to share costs by reinvigorating old ideas (housing, food, and vehicle co-ops come to mind). Above all, we need to remember that our single biggest luxury, our salient self-indulgence, is acquiescence, and that it comes at too high a price.

The Lilly Suicides

The Witness

In the final days of the 20th century, a North Wales psychiatrist named David Healy conducted a curious study with a more than curious result. Twenty volunteers with no history of psychiatric problems were recruited; half were given the drug Zoloft, an antidepressant from the Prozac family of drugs known as the SSRIs, or "selective serotonin reuptake inhibitors." The other half were given an antidepressant that, unlike Zoloft and Prozac, does not selectively target the brain chemical serotonin. Each group took its assigned drug for two weeks and then, shortly thereafter, switched to the other.

Healy had designed his "healthy volunteer study" to compare the psychological experience of being on a serotonin antidepressant versus a non-serotonin antidepressant, but before he knew it, two of his volunteers became dangerously agitated and suicidal. Both were taking the SSRI drug. The adverse reactions couldn't easily be blamed on psychological instability -- these were healthy volunteers. And the rate of 10 percent made it clear that such results were not so rare as to be incidental.

Healy was surprised at the effect, but he would not stay surprised. Some months later, when serving as an expert witness in a civil action against Zoloft's manufacturer, Pfizer, Healy obtained access to the company archives. There he discovered an unpublished study from the 1980s in which healthy female volunteers were given either Zoloft or a placebo. The study was canceled four days later, after all those taking Zoloft began complaining of agitation and apprehension. Healy's own study had not gone so badly; in fact, some of his volunteers rated Zoloft positively. Of the two who did not, one was a 30-year-old woman who, within two weeks of starting the drug, became obsessed with the idea that she should throw herself in front of a car. "It was as if there was nothing out there apart from the car which she was going to throw herself under," Healy reported. "She didn't think of her partner or child."

The Zoloft case was not Healy's first involvement in a civil action against an SSRI manufacturer. Earlier, he had been involved in a wrongful death suit against Eli Lilly, the maker of the much celebrated SSRI drug Prozac. Because Healy is both an internationally renowned psychiatrist and a historian of psychiatric medicine, his recruitment onto the plaintiffs' side was a small but significant victory. Prior to his involvement as an expert witness, Healy had already raised a number of questions about the SSRIs, including the possibility that they might produce agitation and other problems with unusual frequency, sometimes leading to suicide. Healy was also ideal because he is neither a radical nor an outsider: he has done research and consulting for various drug companies, and has himself prescribed SSRIs and other psychiatric drugs. In fact, he had been consulted on several SSRI suicide cases in which he concluded that the SSRIs were not at fault.

This view changed, however, with the case of William Forsyth.

The Victims

William Forsyth met and married his wife June in 1955. After two years of military service in West Germany, Bill and June moved to Los Angeles, where Bill had grown up. After arriving, Bill started a rental car business, and the couple had two kids, Susan and Bill Jr. The business and other investments continued to grow, and in 1986 the Forsyths cashed in. Four years later, Bill and June retired to Maui, the Hawaiian island that their son called home. Bill was 61 at the time. June was 54.

Despite the romance of a new life, the transition was difficult for Bill Forsyth. Personal difficulties led to marital difficulties. Marriage counseling seemed to help, though, and by the next year there was a general sense that Bill was on the mend. Three years after the move to Hawaii, however, with Bill still feeling unsettled, a local psychiatrist prescribed Prozac. The psychiatrist, who had been seeing Bill since the previous year, did not believe Bill to be either seriously depressed or suicidal.

After his first day on the drug, Bill was feeling as you might expect if you've read Peter Kramer's "Listening to Prozac" -- he was "better than well." The next day, however, he felt horrible, and for the first time put himself under hospital care. Ten days later, Bill felt well enough to leave the hospital, but was still taking Prozac. Everyone seemed to agree that he was doing better, and the family scheduled a boat trip for the next day. When his parents failed to show up that afternoon, Bill Jr. went to their home, where he found both his parents lying dead in a pool of blood. Eleven days after starting on Prozac, Bill Forsyth had taken a serrated knife from the kitchen and stabbed his wife 15 times. He had then taken the knife, fixed it to a chair, and impaled himself on it.

Depressed people sometimes do desperate things. Yet these were senseless acts that were simply unimaginable to those who knew Bill Forsyth. For his two grown children, the only possible explanation was the drug. They decided to sue.

The Forsyth case was not the first wrongful death suit to be brought against Eli Lilly. By the fall of 1994, a year after the Forsyth murder-suicide, there were already 160 cases filed against Lilly, linking Prozac to homicides, suicides, and other violence. Many of these cases were dismissed; others ended with cash settlements. But Lilly had not lost a Prozac case, and was determined to keep it that way. By the mid-1990s, Prozac sales were worth $2 billion per year, or about a third of all Lilly's income.

In March 1999, with Susan and Bill Jr. refusing to settle, the Forsyth case finally made it to trial in United States District Court in Honolulu. "I know that with all their power and money I don't have much of a chance," said Susan at the time, "but I feel like I have to try." With David Healy serving as an expert witness, the Forsyths' lawyers went on to argue that the Prozac family of drugs can produce a kind of psychological hijacking -- a bizarre and nightmarish syndrome marked by suicidal thoughts, extreme agitation, emotional blunting, and a craving for death. They also argued that the company knew of these risks and, instead of warning doctors to look out for them, worked vigilantly to sweep them under the rug.

The Evidence

Though Prozac is one of the world's best-known commodities, its most terrifying potential side effect, "akathisia," remains virtually unknown. Akathisia has been described as a unique form of inner torture that, prior to the development of psychiatric drugs, probably never existed. Knowledge of the side effect, however, has been around for a while. In 1978, 10 years before "fluoxetine" would be brought to the U.S. market and become the bestseller known as Prozac, initial clinical trials had already warned of akathisia and other problems. Minutes from Lilly's Prozac project team that year noted, "Some patients have converted from severe depression to agitation within a few days; in one case the agitation was marked and the patient had to be taken off [the] drug ... There have been a fairly large number of reports of adverse reactions."

As the Forsyth case and others would go on to show, Lilly's internal records revealed considerable awareness within the company. A letter sent to it from the British Committee on Safety of Medicines in 1984 reads: "During the treatment with [Prozac] 16 suicide attempts were made, two of these with success. As patients with a risk of suicide were excluded from the studies, it is probable that this high proportion can be attributed to an action of the preparation." Similar concerns were expressed in 1985 by authorities in Germany, where Prozac is sold as "Fluctin" and carries required warnings of possible akathisia and suicide. A Lilly document dated March of that year even quantifies the problem, suggesting a suicide rate for Prozac 5.6 times higher than for the tricyclics, the antidepressants that were popular before the rise of the SSRIs. "The benefits vs. risks considerations for fluoxetine [Prozac] currently does not fall clearly in favor of the benefits," the document concludes. By 1986, clinical-trial studies comparing Prozac with other antidepressants showed a rate of 12.5 suicides per 1,000 users, compared to only 3.8 per 1,000 on older, non-SSRI antidepressants and 2.5 per 1,000 on placebos.

After Prozac's entry into the market in 1988, reports quickly surfaced to confirm that the beast Lilly saw in the laboratory had now, without warning, been unleashed upon the public. In 1990, a report appeared in the American Journal of Psychiatry on the "Emergence of Intense Suicidal Preoccupation During Fluoxetine Treatment." Two Harvard psychiatrists and a registered nurse described cases in which patients developed serious preoccupations with suicide soon after being given Prozac. "We were especially surprised to witness the emergence of intense, obsessive, and violent suicidal thoughts in these patients," they commented. "It was also remarkable how violent these thoughts were. Two patients fantasized, for the first time, about killing themselves with a gun, and one patient actually placed a loaded gun to her head. One patient needed to be physically restrained to prevent self-mutilation."

Two years later, in July 1992, another article appeared, this time in the Archives of General Psychiatry. Again, the article had two senior researchers among its authors, one of whom was a leading expert on akathisia. The psychiatrists stressed in the report that, prior to going on Prozac, none of their patients had a history of significant suicidal behavior. "All described their distress [while on Prozac] as an intense and novel somatic-emotional state; all reported an urge to pace that paralleled the intensity of the distress; all experienced suicidal thoughts at the peak of their restless agitation; and all experienced a remission of their agitation, restlessness, pacing urge, and suicidality after the fluoxetine [Prozac] was discontinued."

The finding that these problems emerge soon after an SSRI drug is taken, and then disappear soon after the drug is withdrawn, provides compelling evidence that the problem is often the drug and not, as the makers of SSRIs have insisted, the depression. Anthony Rothschild and Carol Locke, also of Harvard Medical School, reported three such cases in the Journal of Clinical Psychiatry in 1991. All three individuals had previously attempted suicide while being treated with Prozac -- in fact, each had jumped from great heights and had managed to survive. In turn, all three had been put back on Prozac, only to complain of the same strange desire to kill themselves.

"I tried to kill myself because of these anxiety symptoms. It was not so much the depression," said one of the individuals, a 25-year-old woman. Another, a 47-year-old man, complained that "this is exactly what happened the last time I was on [Prozac], and I feel like jumping off a cliff again." Reflecting on these cases, the Harvard researchers stressed that patients need to know that such overwhelming symptoms are the side effects of medication, and are treatable. "Our patients had concluded their illness had taken such a dramatic turn for the worse that life was no longer worth living."

The Accused

Reports that Prozac might be unsafe at any dose had Lilly running scared. As early as 1990, one executive stated in an internal memo that, if Prozac is taken off the market, the company could "go down the tubes." With the U.S. Food and Drug Administration asking questions, Lilly was pressed to show that their drug was safe. The result was published on Sept. 21, 1991.

Authored by Lilly employees, the report claimed to represent all existing data comparing Prozac with either older antidepressants or placebos. In fact, the data had been hand-picked to favor the drug and the company. The analysis dealt with 3,065 patients, less than 12 percent of the total data from Prozac studies at the time. Among those whose data were left out was the very population most likely to become suicidal -- the 5 or so percent of patients who dropped out of the clinical trials because they experienced unpleasant side effects after taking Prozac.

The Lilly study was rejected by the New England Journal of Medicine. Publication in the British Medical Journal was not as high profile, but it would have to do. And it did. With the study in hand, and with repeated assurances from Lilly that its drug was safe, the FDA's Psychopharmacological Drugs Advisory Committee gave the drug a clean bill of health in September 1991, concluding that there was "no credible evidence of a causal link between the use of antidepressant drugs, including Prozac, and suicidality or violent behavior." Prozac was saved.

It was not until trials like the Forsyth case that Lilly's internal documents would surface, revealing the depth of the deception. This included statements from the Prozac working group in 1978, acknowledging problems with akathisia and drug-induced psychosis. Also among the documents was evidence that the company had drafted (but later abandoned) a package insert for Prozac stating that, "Mania and psychosis may be precipitated in susceptible patients by antidepressant therapy." And there was a memo dated Oct. 2, 1990, which referenced an upcoming Prozac symposium. "The question is what to do with the 'big' numbers on suicidality," the memo states. "If the report numbers are shown next to those for nausea, they seem small."

The Lilly papers also contain a series of memos referencing a study by two Taiwanese doctors entitled "Suicidal attempts and fluoxetine (Prozac) treatment." In a 1992 memo, a Lilly employee reports, "Mission accomplished. Professor Lu will not present or publish his fluoxetine [Prozac] vs. maprotiline suicidality data." In a similar case, Lilly lawyers obtained a cease-and-desist order against Robert Bourguignon, a Belgian doctor who was soliciting his colleagues' impressions regarding Prozac side effects. Bourguignon eventually prevailed, and his survey, "Dangers of Fluoxetine," appeared in The Lancet in 1997.

Lilly's response to "Prozac suicide" court cases was equally forceful. In the first case to go to trial, known as the Wesbecker case, Lilly appeared to score a victory, only to have the judge, John Potter, later declare that the case had been won under false pretenses. What Potter had learned was that Lilly had secretly settled the case during the trial, paying a huge sum in exchange for the plaintiffs' keeping the settlement quiet. This sleight of hand occurred immediately after Potter had decided to allow the plaintiffs' lawyers to present evidence of past criminal behavior on the part of Eli Lilly. After discovering the secret settlement, Potter fought to have the verdict changed, and eventually succeeded in the Kentucky Supreme Court: the case had not been won, but settled. This was, however, too little, too late. Lilly had achieved its objective -- to avoid losing even a single Prozac lawsuit.

The Reckoning

Michael Grinfeld summed up Lilly's legal situation well, and prophetically, writing in California Lawyer magazine in 1998: "Lilly may eventually face a court judgment in a Prozac case, but it has succeeded beyond all expectations in postponing that day." Indeed it has. On April 2, 1999, despite David Healy's testimony and the surfacing of the Lilly papers, the jury in the Forsyth trial found in favor of Eli Lilly. In the eyes of the jury, Prozac did not cause Bill Forsyth to kill his wife and then himself.

While Lilly has continued to survive all legal challenges to date, not all plaintiffs' cases involving the SSRIs have ended in defeat. In May 2001, Australian David Hawkins was freed from prison after a supreme court judge said it was "overwhelmingly probable" that Hawkins would not have killed his wife or attempted suicide had he not been using Zoloft. In another 2001 case, a Wyoming court found against GlaxoSmithKline, maker of the SSRI Paxil. The jury found that Paxil can cause some individuals to commit suicide and homicide, and that it had done exactly that in the case of 60-year-old Donald Schell. After complaining of anxiety, stress, and possible depression, Schell had been prescribed Paxil by his family doctor. Two days later, Schell shot his wife, his daughter, and his infant granddaughter to death, and then killed himself. David Hawkins, too, had committed homicide after his first two days of SSRI treatment.

Stories like these litter the communities of North America and Europe, most of them concealed behind the confusion and secrecy that so often mark sudden family tragedies. By the spring of 1999, 2,000 suicides by Prozac users had been reported to the Food and Drug Administration, at least a quarter of which appeared to be linked to agitation and akathisia. According to the FDA's own estimate, only about 1 percent of serious side effects are ever reported to its "adverse event system." This means that, as David Healy has concluded, as many as 50,000 akathisia-related suicides had taken place by 1999 -- roughly 500 reported akathisia-linked suicides, scaled up by the 1 percent reporting rate. The total estimate for all SSRIs would of course be much larger.

In the face of such statistics, and with the loss of its exclusive patent on fluoxetine, Lilly announced in December 2001 that it planned to bring another antidepressant to market late in 2002. Not surprisingly, the new drug, duloxetine, does not selectively target serotonin. The SSRIs, once hailed as a revolution in the treatment of depression, are now being phased out. Oddly, this is making way for pharmaceuticals that act in essentially the same way as the drugs the SSRIs originally replaced. Given this backward trend, one is left to wonder whether all the death and misery linked to the SSRIs might have been for naught. If so, a final conclusion seems unavoidable: next to Big Tobacco and the marketing of cigarettes, the selling of the SSRIs is perhaps the deadliest marketing scandal of the 20th century.

Richard DeGrandpre is the author of "Ritalin Nation" (1999) and "Digitopia" (2001), and is currently writing a history of drugs in the 20th century. "The Lilly Suicides" first appeared in Adbusters (May/June, 2002). For the full report on "The Lilly Suicides," log on to Prozac Spotlight.

Ritalin? Just Say No

On any given day in North America, almost five million kids will take a powerful psychostimulant drug. The geographical caveat is important: more kids in North America are diagnosed with attention deficit disorder (ADD) and given drugs like Ritalin to "help" them behave than in the rest of the world combined. In fact, the US and Canada account for a startling 95 percent of worldwide Ritalin consumption.

In the midst of this drug epidemic, April 2001 appeared to signal a backlash. Two television magazines, PBS's Frontline and A&E's Investigative Reports, pondered the massive increase in use, as did a five-part series in Canada's National Post newspaper. Still, of all the critical reports in recent months, none has come any closer to facing the hard facts about Ritalin than have the hundreds that came before.

Fact one: While medical "experts" and the media persistently deny it, developmental studies have now established that certain differences in caregiving and family structure cause some children to become impulsive and hyperactive. In a recent example, a ten-year, federally funded study in the US, reported at the April meeting of the Society for Research in Child Development, found that the more time children spent in daycare, the more unmanageable they became. Kids who spent more than 30 hours a week in daycare scored significantly higher on such things as "explosive behavior," "talking too much," "argues a lot," and "demands a lot of attention" -- the very behaviors that so often lead to stimulant treatment.

Fact two: Ritalin is little more than coke for kids. "Cocaine, which is one of the most reinforcing and addictive of the abused drugs, has pharmacological actions that are very similar to those of methylphenidate [Ritalin], which is the most commonly prescribed psychotropic medication for children in the United States." This conclusion, reported by Nora Volkow and colleagues at Brookhaven National Laboratory, appeared in the Archives of General Psychiatry in 1995. A follow-up study, published in the American Journal of Psychiatry in 1998, found that the pharmacological actions produced by oral, therapeutic doses of Ritalin were comparable to those produced by recreational doses of intranasal cocaine. Researchers are quick to point out that children prescribed Ritalin do not (usually) snort or inject it, which alters the drug-taking experience. But do we really believe parents would give their kids cocaine, even if it were only in pill form?

Fact three: Ideology is driving the science. Fighting the drug war, researchers like Volkow have demonstrated that continued use of cocaine and other stimulants causes brain changes. Yet these researchers have never investigated whether chronic stimulant use might produce the same effects in kids. Meanwhile, other researchers have pointed to subtle differences in certain areas of the brain to suggest that ADD is a biological disease -- a claim repeated in the recent Frontline episode. The truth is that all these studies have looked only at hyperactive individuals who have been taking stimulants for years. At least one study, published in Psychiatry Research in 1986, was honest in its findings: "since all of the [ADD] patients had been treated with stimulants, cortical atrophy [i.e., brain deterioration] may be a long-term adverse effect of this treatment."

Fact four: The US Drug Enforcement Administration has long known that massive amounts of Ritalin are being diverted by adolescents and adults into recreational use, where the drug is often crushed up and snorted, or even injected. The DEA reported that Ritalin misuse in high schools increased from 3 percent to 16 percent between 1992 and 1995. Similarly, it found that while children between the ages of ten and 14 were involved in about 25 emergency-room visits connected with Ritalin misuse in 1991, this number had jumped to 1,725 by 1998.

Fact five: Stimulants are no cure. Perhaps all this hypocrisy could be excused if stimulant "treatment" somehow worked, but it doesn't -- at least not for the children themselves. Parents have been encouraged to believe that pharmacological control will boost their child's learning and social skills, but this rarely happens. Dozens of objective studies have assessed the long-term effectiveness of stimulants on children's academic performance, social development and self-control. None has shown them to be effective for anything but controlling kids' behavior -- an effect that vanishes once the drug wears off. Such studies rarely make the headlines, however. Instead, we hear about recent research from the US -- "the MTA study" -- that relied heavily on subjective reports from teachers and parents while ignoring its own objective findings, which showed little promise for drug treatment. Reporting on this research, the media, too, has found a cure where there isn't one.

So where does the cure lie? It lies in prevention. This means getting back to basics as a culture, with parents who have and take the time to truly matter in the lives of children.

Richard DeGrandpre is the author of Ritalin Nation: Rapid-fire Culture and the Transformation of Human Consciousness (Norton, 1999), and Digitopia: The Look of the New Digital You (Random House, 2001).

All This Talk of Anarchy

Over the wire comes a report of an anarchist punching a police officer in the face, "repeatedly," during a street protest in Philadelphia. I imagine that little clot of information exploding outward through the endless fractals of the Information Age. I picture it reaching the suburban dinnertime conversations of a hundred million American Beauty households, and if I listen closely, I can hear America tut-tutting.

But then, there is something shocking about some punk putting one up in a cop's face. In a culture that can absorb, without flinching, the fact that certain individuals can afford to order take-out for the world's poorest billion without losing their seats in the Billionaire's Club, punching a cop remains a genuine shock. If you make an effort to understand it, your internal pop-psychologist kicks in: I'm getting the sense that you're angry. More than likely, you give in to an almost gut-level feeling that this is very, very wrong: In America, One Does Not Punch an Officer of the Law.


Consumerism 101

Hey kids, here's a newsflash: You are being sold. You are being sold every time you walk in the door of your school -- or you soon will be. The marketing business has put a price on your head, to the tune of $300 billion a year -- that's what experts believe people your age collectively spend, whether it's money out of your own pocket or the way you influence your parents' spending habits. And now these commercial-crazy headhunters are hunting you in the one place you can't escape -- school. Here are some examples:

* From New Mexico to Nova Scotia, Coke or Pepsi will pay schools $10 to $20 per student for exclusive rights to sell their soft drinks to you. What do you get out of it? Lots of advertising plastered around your school and, if you're lucky, a free Coke or Pepsi t-shirt.

* Companies create "free" ready-made lessons for teachers to use on students. Chips Ahoy has a counting game for little kids where you figure out the number of chocolate chips in their cookies. Kellogg's has an art project where you make sculpture out of Rice Krispies. Procter & Gamble sponsors lessons on oral hygiene that include giving away Crest samples. Campbell's Soup created (then shamefully recalled) a science lesson where students compared the viscosity of Prego sauce to rival Ragu. The Consumers Union has stated that 80 percent of these "lessons" contain wrong or misleading information.

* Companies profit by changing the way you think. Representatives of the drug Prozac will come to your school to "teach" you about depression. Exxon has an ecology curriculum that shows how clean the environment of Alaska is. Some schools actually sell ad space in the hallways, on the sides of school buses, or on billboards out in the yard.

* Companies collect information about you at school. In New Jersey, elementary school kids filled out a 27-page booklet called "my all about me journal," basically a marketing survey for a television channel. Students in Massachusetts spent two days tasting cereal and answering an opinion poll. The ZapMe! corporation puts "free" computers and internet hookups in schools, then monitors your web browsing habits and sells the information, neatly broken down by age, gender and postal code, to its customers.

* Channel One gives schools "free" televisions and audiovisual equipment. The catch is that you and 8 million other students have to watch its daily news broadcasts, including the commercials.

* YouthStream has message boards in 7,200 high school locker rooms. They carry product advertisements and try to sell you on visiting the company's website, where even more advertising can be seen. The boards reach almost 60 percent of US high school students.

So, maybe you don't like the idea of school becoming one giant rat maze for the marketing lab. But what will you do about it? You could fight fire with fire, using the laser printer supplied by ZapMe! corp. to make your own anti-spam sticker campaign and jam every company logo in the school. Unfortunately, the see-one-sticker-one approach will land you in either the principal's office or jail. But there are other ways.

Start by figuring out the chain of command -- teacher, principal, school board officials, state and federal education department officials, and so on. Then complain as far up the ladder as you need to go until you get results. Contact the news media. If you call the local newspaper and tell them you are being forced to watch TV commercials during school, for instance, you'll get attention fast. Jamming meetings, protesting outside schools, hanging banners from classroom windows -- it's at least as much fun as doing homework, and you'll probably learn more.

For a detailed, step-by-step game plan to ad busting at school, contact the Center For Commercial-Free Public Education at unplug@igc.org or visit www.commercialfree.org.

HERE'S WHAT SOME ADULTS ARE SAYING BEHIND YOUR BACK

"The advertiser gets a group of kids who cannot go to the bathroom, who cannot change the station, who cannot listen to their mother yell in the background, who cannot be playing Nintendo, who cannot have their headsets on." -- Joel Babbit, former company president, on the advantages of Channel One for advertisers.

"If you own this child at an early age, you can own this child for years to come. Companies are saying, 'Hey, I want to own the kid younger and younger.'" -- Mike Searles, former president of Kids-R-Us, a major children's clothing store, on the business of marketing to kids.

"Because physical education is required for high school students, over 70 percent of students see GymBoards every week." -- promotional material of YouthStream Media Networks, makers of school locker-room advertising boards.

"Teens will get your message when they see your logo within the materials ... Depending upon your product, sampling and couponing can often be incorporated into the program ... In addition, we are able to gather demographic information about the school and market." -- promotional material of West Glen, a company that distributes corporate-sponsored "teaching" materials to schools.

A FEW ADULTS COME TO THEIR SENSES

California has passed a law prohibiting schools from selling exclusive vending rights to soft-drink manufacturers unless public hearings are held first. Another recently passed law prohibits the use of product placements in school textbooks. A third bill in the works would place limits on electronic advertising in schools.

Rep. George Miller has introduced a bill in the US House of Representatives that would prohibit companies like ZapMe! Corp. from collecting marketing information in schools without written permission from parents.

In Canada, the provinces of New Brunswick and Manitoba have declared that the Youth News Network, a Canadian version of Channel One, will not be allowed in public school classrooms.

