Will We Ever Learn to Think in Moderation?
The media seems to have three modes when it comes to psychoactive drugs: intense promotion of advances and benefits; general disregard; and full-on panic about negative effects, including the potential for misuse and addiction. During both the hype and the panic phases, myths and misinformation proliferate. But between these bouts of euphoria and alarm, there is little coverage at all, especially of addiction. This up/down/off pattern does a disservice not only to people suffering from addiction, but to those with other diseases as well.
Right now, we seem to be moving from a period characterized mainly by indifference into one of attention and fear. Though we’ve never returned to the peak freak-out of the late ’80s and early ’90s—in 1989, a Gallup poll found that Americans viewed drugs as the number one problem threatening the nation, eclipsing even the economy during a recession—we have seen brief but blinding spotlights on OxyContin, methamphetamine and now prescription drugs more generally.
A recent front-page New York Times story on Adderall addiction is suggestive of the new turn. After years of focusing on these drugs primarily to ask whether they enhance cognition, or allow people to cheat in school by faking ADHD, the article puts their addictive potential front and center; it tells the story of a college student who faked the disorder and the physicians who enabled him to keep getting the drug, despite desperate warnings from his parents about his addiction. Over the course of several years, he became psychotic and ultimately committed suicide.
That Adderall, an amphetamine drug, can be addictive and can sometimes cause mental illness and suicidality is no surprise. If the Times searched its own archives, it would find several earlier periods of promotion of speed as a cognitive enhancer and study aid, each followed by hysteria over psychosis and addiction. (Indeed, way back in 1937, the paper of record called it “high octane brain fuel.”) And anyone old enough to remember the ’60s probably recalls the admonition “Speed Kills.”
Why can’t we recognize that a drug can simultaneously benefit some people and harm others? Why do we swing from seeing particular drugs as panaceas to viewing them as the devil’s own poison?
Part of it stems from “generational forgetting”—a well-documented condition that prevails when the addicts of one era have aged out or died and those who saw the damage done are also past their youth. When America was still in a frenzy of fear that the ’80s crack epidemic would keep escalating until every last youth was a glassy-eyed zombie, the younger siblings of crack addicts were already observing the devastation wrought by the drug and choosing a different, less demonized high—often marijuana, sometimes opioids. Crack use fell rapidly.
That was far from the first time that an epidemic had burned itself out. Epidemics are inherently self-limiting because once the use of a particular drug is widespread, its dangers become obvious to everyone—and because when a culture becomes familiar with a drug, it develops ways to minimize harm. For example, our long-term relationship with alcohol has produced bans on drunk driving; price, sales, and advertising restrictions; and advice on moderation, like alternating alcoholic drinks with water or soft drinks—not to mention AA.
Unfortunately, this can also create the impression that panic is a productive way of changing behavior, when it actually contains the seeds of the next epidemic. Since the new generation is not using the previous one’s “demon drug,” it assumes its own drug use will not become a problem. Indeed, the newly popular drug appears to be safe, beneficial, fun—at least, that’s how the media tends to portray legal drugs when they first come on the market. Of course, during the early stages of addiction, it does seem like everything’s under control.