Psychotherapy Networker

Here Are the 14 Habits of Highly Miserable People

Most of us claim we want to be happy—to have meaningful lives, enjoy ourselves, experience fulfillment, and share love and friendship with other people and maybe other species, like dogs, cats, birds, and whatnot. Strangely enough, however, some people act as if they just want to be miserable, and they succeed remarkably at inviting misery into their lives, even though they get little apparent benefit from it, since being miserable doesn’t help them find lovers and friends, get better jobs, make more money, or go on more interesting vacations. Why do they do this? After perusing the output of some of the finest brains in the therapy profession, I’ve come to the conclusion that misery is an art form, and the satisfaction people seem to find in it reflects the creative effort required to cultivate it. In other words, when your living conditions are stable, peaceful, and prosperous—no civil wars raging in your streets, no mass hunger, no epidemic disease, no vexation from poverty—making yourself miserable is a craft all its own, requiring imagination, vision, and ingenuity. It can even give life a distinctive meaning.

The Addict in All of Us: Dr. Gabor Maté on the Problem We All Live With

Over the past several decades, popular attitudes toward addiction have undergone a radical destigmatization. Many attribute the beginning of this shift to former first lady Betty Ford and her decision to go public about her addiction to alcohol and opiates soon after leaving the White House. She hadn’t been a public nuisance or a barfly. She’d never driven drunk, she said, or stashed bottles so she could drink secretly when she was alone. But by openly addressing her problems and becoming an outspoken advocate for rehabilitation through the Betty Ford Clinic (now the Betty Ford Center), she helped change the face of addiction. Perceptions of addicts as out-of-control gutter drunks and junkies were replaced by images of glamorous celebrities like Liza Minnelli, Mary Tyler Moore, and Elizabeth Taylor as they checked in and out of Betty Ford.

Examining the Science of Torture: The Price of Coercive Interrogation

In 2014, Pulitzer Prize–winning New York Times journalist James Risen made headlines by revealing that the American Psychological Association (APA) had supported and helped legitimize the Bush administration’s use of torture in its post-9/11 war on terror. After first dismissing the claims, the APA commissioned an independent committee to investigate the allegations. That committee’s 542-page report, which appeared in July 2015, documented the truth of everything Risen had reported.

The Profound Challenge of OCD: One Woman's Story

Long before I’d ever heard the word archetype, I discovered a defining myth for my life in the story of the Athenian hero Theseus and his cunning plan to slay the Minotaur. This fearsome monster, half-man and half-bull, was caged in a structure of tangled, dead-end pathways called the Labyrinth. Young men and women were periodically forced inside as sacrifices to feed the beast, never to return. Until Theseus. With help from Princess Ariadne, who instructed him to unwind a ball of thread to trace his pathway in and guide his safe return, Theseus boldly entered the Labyrinth, killed the Minotaur, and lived to tell his triumphant tale.

How to Break the Emotional Cycles That Make Weight Loss So Hard

Most therapists have been taught that if we can help clients understand the emotional triggers of their overeating, they’ll be able to control their behavior and lose weight. Some of us, when working with clients on the continuum from occasional overeating to binge-eating disorder, build strategies around nutrition, portion control, and exercise habits. But more often than not, weight loss—should it occur—is fleeting. In fact, the pursuit of weight loss typically triggers and sustains overeating.

5 Harmful, Puritanical Myths That Keep Us Mired in Self-Loathing

Most people don’t have any problem with seeing compassion as a thoroughly commendable quality. It seems to refer to an amalgam of unquestionably good qualities: kindness, mercy, tenderness, benevolence, understanding, empathy, sympathy, and fellow-feeling, along with an impulse to help other living creatures, human or animal, in distress. But we seem less sure about self-compassion. For many, it carries the whiff of all those other bad “self” terms: self-pity, self-serving, self-indulgent, self-centered, just plain selfish. Even many generations removed from our culture’s Puritan origins, we still seem to believe that if we aren’t blaming and punishing ourselves for something, we risk moral complacency, runaway egotism, and the sin of false pride.

How to Fight Stress and Burnout on the Cheap

It was a series of upending life events over a period of years — some bad, some good, all unexpected and disorienting — that gradually propelled me into a state of mind-numbing, body-exhausting burnout. First, there was my husband’s cancer, his surgery, and the seven months spent watching him suffer through the spirit-breaking ordeal of chemotherapy. During those months, I’d prayed and cried and white-knuckled my way through an endless, dark valley of alternating fear, anguish, and desperate hope.

Why Can't Americans Stop Talking About Their Weight?

Body of Truth: How Science, History, and Culture Drive Our Obsession with Weight—And What We Can Do About It
By Harriet Brown
Da Capo Press. 243 pages.
ISBN: 9780738217697

The Science of a Happy Love Life: It's A Lot Simpler Than We're Led to Believe

We often imagine the English as reserved stonewallers, even more emotionally constricted than we Americans are. But Susan Johnson, the daughter of two London pubkeepers and the inventor of Emotionally Focused Couples Therapy (EFCT), has devoted her career to breaking that old stereotype and developing an approach that zeroes in on showing couples how to express their deepest feelings for each other. To get there, she had to break what was perhaps the biggest taboo of all.

The Club Drug Emerging As a Popular Antidepressant

Since it was introduced as an anesthetic in the 1970s, ketamine has occupied an uncertain pharmacological status. It’s been used as both a Vietnam-era battlefield painkiller and an illicit party drug, better known as Special K. But recent findings in studies around the world have some researchers wondering whether it might be the silver bullet for depression that Prozac and its sidekicks never turned out to be.

'You' Don't Exist: Why an Enduring Self Is a Delusion

What if our therapeutic goals of improving self-esteem, developing a stable and coherent sense of self, and identifying and expressing genuine, authentic feelings all turn out to be symptoms of delusion? And what if the current mindfulness craze—if we take it seriously enough—changes who we think we are and what we’re trying to do in therapy?

The Man Who Couldn’t Stop: OCD and the True Story of a Life Lost in Thought

The Man Who Couldn’t Stop: OCD and the True Story of a Life Lost in Thought
By David Adam
Farrar, Straus and Giroux. 324 pages.
ISBN: 9781447238287

How the Mindfulness Movement Went Mainstream -- And the Backlash That Came With It

In 1979, a 35-year-old avid student of Buddhist meditation and MIT-trained molecular biologist was on a two-week meditation retreat when he had a vision of what his life’s work—his “karmic assignment”—would be. While he sat alone one afternoon, it all came to him at once: he’d bring the ancient Eastern disciplines he’d followed for 13 years—mindfulness meditation and yoga—to people with chronic health conditions right here in modern America. What’s more, he’d bring these practices into the very belly of the Western scientific beast—a big teaching hospital where he happened to be working as a post-doc in cell biology and gross anatomy. Somehow, he’d convince scientifically trained medical professionals and patients—ordinary people, who’d never heard of the Dharma and wouldn’t be caught dead in a zendo or an ashram—that learning to follow the breath and do a few gentle yoga postures might help relieve intractable pain and suffering. In the process, he’d manage to reconcile what was then considered fringy, New Age folderol with empirical biological research, sparking a radical new approach to healing in mainstream medical practice.

Why People Act in Self-Defeating, Irrational Ways - and How They Can Stop

At the tail end of a sweltering, humid Chicago day in 1993, I took my family to the community pool for a dip. As the children splashed gleefully, I sat nearby reading Robert Ornstein’s new book, The Evolution of Consciousness, unaware that my life was about to change.

The Fascinating Differences Between the Conservative and Liberal Personality

"There are three things I have learned never to discuss with people: religion, politics, and the Great Pumpkin," laments Linus van Pelt in a 1961 Peanuts comic strip. Yet in today's hyperpartisan political climate, religion and politics are obsessively debated, while the "American people" that politicians and reporters constantly refer to seem hopelessly divided. Meanwhile, psychologists are increasingly exploring the political arena, examining not just the ideological differences, but also the numerous factors - temperamental, developmental, biological, and situational - that contribute to the formation and maintenance of partisan political beliefs.

Can You Rewire Your Brain to Change Bad Habits, Thoughts and Feelings?

Nearly 90 years after F. Scott Fitzgerald wrote his classic The Great Gatsby, Baz Luhrmann’s film version gave renewed currency to the novel’s famous final line: “So we beat on, boats against the current, borne back ceaselessly into the past.” What’s afforded this passage such staying power is not only its haunting poetry, but the worldview it expresses—however hard we may try to reinvent ourselves, we’re doomed to remain captives of our pasts. Another celebrated author, William Faulkner, put it this way: “The past is never dead. It’s not even past.” Eugene O’Neill penned these words: “There is no present or future, only the past, happening over and over again, now.”

Are Therapists Seeing a New Kind of Attachment?

Over the past decade or two, seasoned therapists who treat young people have been seeing some increasingly worrisome trends. Although solid statistics are hard to come by, one indication of a surge in troubled young adults comes from the reports of college mental health services. A 2010 survey by the Higher Education Research Institute (HERI) at the University of California, Los Angeles, of almost 202,000 incoming college freshmen at 279 colleges and universities showed a shocking decline in self-reported mental and emotional well-being—at its lowest level since 1985, when HERI began conducting the surveys. In this recent survey, the percentage of students who rated their emotional health “above average” fell from 64 percent in 1985 to 52 percent. According to the June 2013 APA Monitor, 95 percent of surveyed college counseling-center directors said that the number of students with “significant psychological problems is a growing concern,” citing anxiety, depression, and relationship issues as the main problems. Another 2013 survey, the American College Health Association–National College Health Assessment, reported that 51 percent of 123,078 respondents at 153 US colleges had experienced “overwhelming anxiety” during the previous year, 31.3 percent had experienced depression so severe it was difficult to function, and 7.4 percent had seriously considered suicide.

Psychologist: Stop Bubble-Wrapping Your Kids! How Overprotection Leads to Psychological Damage

I’m sure Shyam wasn’t thinking about the harm she was causing her 8-year-old daughter, Marian, when she demanded her daughter’s school put an extra crossing guard closer to their home. Nor did she doubt herself when she insisted that children be barred from bringing oranges to school because Marian developed a minor rash every time she ate one. At home, Marian was closely monitored and never allowed to take risks: no sleepovers, no playing on the trampoline with friends, no walking to the corner store (less than a block away) by herself.

Have Antidepressants Lived Up to Their Promise?

The research literature on the effectiveness of antidepressants is filled with contradictions and controversy. Few have the scientific know-how and patience to wade into the Great SSRI Debate and make sense of it. An exception is neuropsychologist John Preston, author of Clinical Psychopharmacology Made Ridiculously Simple. While he’s a critic of the role of Big Pharma in the mental health field, in the interview below, he tells us how SSRIs may have unfairly gotten a bad rep.

Falling in Love Again: The Amazing History, Marketing, and Wide Legal Use of Today's "Dangerous" Drugs

In the fall of 1987, a story appeared in the business section of the New York Times about a new antidepressant drug, fluoxetine, which had passed certain key government tests for safety and was expected to hit the prescription drug market within months. Just this brief mention in the Times about the prospective appearance of the new, perkily named Prozac propelled Lilly shares up $10, to $104.25—the second-highest dollar gain of any stock that day. By 1989, Prozac was earning $350 million a year, more than had been spent on all other antidepressants together in 1987. And by 1990, Prozac was the country’s most prescribed antidepressant, with 650,000 prescriptions filled or renewed each month and annual sales topping $1 billion. By 1999, Prozac had earned Lilly $21 billion in sales, about 30 percent of its revenues.

Morphine, the injectable alkaloid derived from opium, was considered superior to opium because it was stronger and thought to be nonhabit-forming. As injected, it was even considered a cure for opium addiction. (The only downside to this theory, of course, was that morphine was even more addictive.) American Civil War vets getting morphine injections to relieve the lingering pains of war wounds continued taking it decades after the war to feed their habit. Mothers gave it to colicky newborns and teething babies. And while men drank alcohol openly in saloons, opiate-addled ladies nodded off quietly in their parlors or dosed themselves in secret before heading off to the theater or opera.

Opium or morphine was, in fact, the ladies’ drug of choice, or at least the drug of their doctors’ choice. Women were given opiates for “female troubles”—from menstrual problems to the pains of childbirth. In addition, because of the innate delicacy of the female nervous system, they were thought to be more susceptible to neurasthenia and melancholy than were men. By the late 1860s, more than two-thirds of those estimated to be opiate addicts (80 percent in some places) were white, middle-class women. But besides relieving physical pain, women undoubtedly found morphine to be as good an escape from the tedium of daily life as they were likely to find. As one female morphine taker, “a lady of culture and distinction,” wrote in 1909, “[D]on’t you know what morphine means to some of us, modern women without professions, without beliefs? Morphine makes life possible. . . . I make my life possible by taking morphine.”

Altogether, by 1900 more than four percent of the American population was addicted to opiates—a bigger percentage of the population than heroin addicts represented in the 1990s. Even by the 1880s, however, the medical community as a whole was beginning to perceive opiate addiction as a threat to the mental and physical health of the republic. What to do? Luckily, at about that time, the next big, bright idea on the pharmaceutical scene was born—which itself would become a kind of proto-blockbuster. This was heroin, a synthetic introduced in 1898 by a German company known for making color dyes, founded by one Friedrich Bayer. The company had another little compound in the works called acetylsalicylic acid (or aspirin). But they put it on the back burner to focus on what they thought would be a bigger money-maker, heroin (so named because, twice as strong as morphine, it was considered heroisch, the German word for ‘heroic’). Bayer’s company massively promoted the new drug in its adorable little bottle, sent out free samples to physicians, and advertised it as an effective, nonhabit-forming drug—a great cough suppressant and, again, a good aid for demorphinization, the process of weaning addicts off their morphine.

The drug was a huge commercial hit in America but, within a few years, it was painfully obvious to most that the hype had been a shade too positive: heroin was far more addictive than morphine. Bayer stopped making it in 1913 (whereupon it, too, would go on to a booming career as an illegal street drug) to get rich instead on the sales of that other compound the company had laid aside, aspirin, which itself turned out to have quite a nice run.

In the meantime, what would replace the fading reputation of the opioids? In 1884, a young physician named Sigmund Freud wrote his first important monograph on a drug he’d been experimenting with on himself. The monograph was “On Coca,” and the drug was, of course, cocaine. Although shortly thereafter, he’d have serious regrets about his new find, at the time, he sounded like a classic cokehead enthusiast. Cocaine, he wrote, promoted “exhilaration and lasting euphoria, which in no way differs from the normal euphoria of the healthy person.” There was no unnatural excitement, no “hangover” effect of lassitude, no unpleasant side effects and, best of all, no dangerous cravings for more. Cocaine, Freud believed, could be not only a mental stimulant, but a possible treatment for “nervous stomach” and other digestive disorders, asthma, and once again, morphine and alcohol addiction. To his fiancée, Martha, he wrote, “I take very small doses of it regularly against depression and against indigestion and with the most brilliant success.”

Freud certainly didn’t introduce cocaine to America. In fact, it was introduced to him via promotional materials sent by Parke, Davis & Company (soon to become the world’s largest producer of pharmaceutical-grade cocaine), who eventually paid him to say something nice about the drug. By this time, cocaine infatuation was well underway in Europe and America. In contrast to the drowsy, dreamy languor induced by the opioids, cocaine woke people up, gave them new vim and vigor, and increased their optimism and self-confidence to forge ahead in life. As Stephen Kandall writes in his book Substance and Shadow: Women and Addiction in the United States, fashionable women visited their doctors for hypodermic injections of cocaine “to make them lively and talkative” in social situations.

Parke, Davis & Company produced a clutch of cocaine-based products (as did other companies), including wine, soft drinks (to compete with Coca-Cola, made by a Georgia company), cigarettes, a powder that could be inhaled, and an injectable form with a hypodermic kit. Cocaine would, Parke, Davis & Company promotions read, “supply the place of food, make the coward brave, the silent eloquent and . . . render the sufferer insensitive to pain.” Wonderfully enough, among the innumerable products containing cocaine made by different companies was a soda simply and perfectly called Dope.

But, like opium before it, cocaine gradually lost its bloom with the growing awareness that this addiction was, yet again, possibly even worse than those of opium and heroin. Unlike the opioids, which rendered people semiconscious, too much cocaine made people crazy—hallucinatory, psychotic, violent. As middle-class users fell away, cocaine became increasingly associated with the lower orders of society, making it even scarier to polite society. Gradually, beginning in the early 20th century, and particularly after the Harrison Act of 1914, increasingly stringent laws severely tightened the sale and distribution of these drugs, as well as cannabis. They then went underground, only to emerge 50 to 60 years later as part of a huge, and still apparently unstoppable, illegal drug trade.

Barbiturates

After a kind of pharmaceutical respite, the 1930s saw the more or less parallel rise of uppers and downers in the form of amphetamines and barbiturates. Both would be extolled, in the same familiar way, as perfectly safe, nonaddictive, and wonderfully effective for a wide variety of physical and mental ailments.

The first barbiturate drugs, synthesized and marketed in the early 1900s by Merck and Bayer under the brand names Veronal and Luminal, proved themselves hugely more effective as soporifics than previous nostrums—like the bromides, which had terrible side effects and took 12 days to work themselves out of the body. Eventually the basis of the first intravenous anesthetics, barbiturates dampened down the central nervous system without necessarily knocking people out. By the 1920s, they were being used to relieve surgical and obstetric pain, control seizures and convulsions, ease alcohol withdrawal symptoms, soothe ulcers and migraines, suppress asthmatic wheezing, and reduce the irritability accompanying hyperthyroidism.

The biggest selling point of the barbiturates, however, was that they made people feel pleasantly drunk and sleepily euphoric, so it only seemed natural that they might be a good drug for soothing the perpetually anguished and agitated American psyche. Thus, the drugs soon acquired a reputation as the psychotherapist’s friend. One psychiatrist, in fact, claimed in 1930 that Amytal not only quickly dispatched “manic-depressive psychosis” symptoms, but also midlife depression, with “complete recoveries in two to four weeks.” Amytal was marketed to psychoanalysts for “narcoanalysis” because it relaxed fears, anxieties, and inhibitions and engendered childlike trust in the therapist—features of the drug that, coincidentally, attracted law-enforcement authorities, who used it as a truth serum to ease confessions out of befuddled suspects.

Amphetamines

Although Benzedrine Sulfate didn’t actually improve thinking ability, what it did do, from a marketing standpoint, was even better. It almost magically lifted people’s spirits, increasing their optimism, self-confidence, sociability, mental alertness, and initiative. Researchers who tried the pill on children with learning and behavioral problems noted that it seemed to help. Paradoxically, while revving up adults, it seemed to calm down unruly kids, though the ultimately explosive ramifications of that discovery wouldn’t be revealed for decades. Yes, the drug tended to make people more aggressive, but it made them feel positive and confident, too. So the company had a winner!

When Benzedrine Sulfate debuted in 1937, Smith, Kline & French (SKF) sent out mailers to almost every member of the American Medical Association (90,000 doctors) with this simple message: “the main field for Benzedrine Sulfate will be its use in improving mood.” Ads that followed marketed it for “the patient with mild depression,” but not severe depression or schizophrenia or anxiety (it made the last two worse). The symptoms of “depression” that Benzedrine Sulfate promised to relieve closely paralleled the familiarly vague miscellany associated with neurasthenia: apathy, discouragement, pessimism, difficulty in thinking, concentrating, and initiating usual tasks, hypochondria—that same generic, hard-to-define sense of “unwellness” in body and mind that seemed to be a perennial affliction among Americans.

The marketing effort was helped immeasurably when a leading psychiatric expert at Harvard, Abraham Myerson (the drug industry would today call him a “key thought leader”), took up the cause of Benzedrine Sulfate as a nearly infallible remedy for what he called idiopathic anhedonia—a condition of apathy, low energy, lack of motivation, and inability to take pleasure from life. Myerson himself liked the way amphetamine made him feel (it would turn out that doctors enjoyed taking speed as much as prescribing it) and thought it worked wonders at cheering “anhedonic” people up, giving them more pep, vigor, and gusto for life. So it happened that Benzedrine Sulfate became, in effect, the world’s first prescription antidepressant, not to mention the first, and probably only, prescription medication intended for a condition called, simply, “discouragement.”

Once Benzedrine Sulfate was launched upon the world, the rush was on. Of course, college students—always willing guinea pigs for any new mind-altering substance, legal or illegal—already loved the inhalers because they not only kept them awake and alert, but seemed to improve their powers of concentration, even their thinking ability. By 1937, students were using the inhalers enthusiastically for all-night cramming, to an extent that alarmed college authorities and warranted a brief article in Time calling the drug “poisonous.” In fact, by the end of World War II, amphetamine-based decongestants of many different brands were available everywhere, easily bought without prescription—but not very often, it would seem, for the purpose of clearing one’s nasal passageways. People routinely broke open the inhaler and ate the amphetamine-soaked paper contents.

During World War II, American and British soldiers kept Benzedrine in their emergency kits along with Band-Aids, hydrogen peroxide, morphine solution, tweezers, burn ointment, scissors, and whatnot. Not only would the drug keep military men awake and alert—on long bombing runs, for example—it would shore up morale. For this, the drug worked sensationally well, too well, in fact. Sure, it increased soldiers’ confidence and fearlessness, but also their foolhardiness, often worsening performance and judgment. There was also the problem of hallucinations and paranoia following amphetamine-caused sleep deprivation, not to mention the addictiveness. In On Speed, Nicolas Rasmussen estimates that about 15 percent of the 12 million servicemen probably became regular users—a habit they brought back home with them after the war. During the Vietnam War, the United States dispensed more amphetamine to its troops than the British and Americans had done throughout World War II.

After the war, the drug went civilian big-time. A whole generation of beatniks, musicians, writers, and other aficionados of “cool” took their doses of amphetamines, either as inhaler paper wads or pills. Indeed, Jack Kerouac’s On the Road is a kind of long prose poem to speed. But it was in the perfectly legitimate world of mainstream medical practice during the ’50s that amphetamine really hit its stride. The times were particularly friendly to psychiatrists (who’d played an active role in the war, screening soldiers and treating combat fatigue) and to psychoanalysis. The public readily accepted the analytic concept of “neurotic depression”—referring to a grab bag of quotidian suffering caused by family strife, financial worries, overwork, life dissatisfaction, anxiety, and/or various specific depression-inducing conditions, such as physical disease, chronic pain, and old age. And for the first time in American history, medical authorities now claimed that a large percentage of the population was mentally ill: 1 in 10, they estimated. Since there weren’t enough trained psychiatrists to take care of all these people, general practitioners would have to step in and act in loco psychiatrist, so to speak. These doctors didn’t do analysis, but they sure could do the next best thing: prescribe amphetamines.

Then in the late ’40s, amphetamine makers hit the jackpot when they concocted a mental/physical condition for which their drug was the most perfectly marketable solution: fatness plus depression. Whereas being overweight was once associated with slow metabolism or, say, piggish overeating, now it was described as a psychosomatic problem, caused by a mental illness, most likely depression. For years, doctors had prescribed amphetamine pills off-label for weight loss, but in 1947, SKF began marketing both Benzedrine and a new product, dextro-amphetamine (brand-name Dexedrine), specifically for this purpose, citing emotional disturbance as a direct cause of weight gain and, as an added spur, reminding doctors that even a few extra pounds were potentially dangerous.

Between 1946 and 1949 (when Dexedrine lost its patent), SKF’s annual amphetamine sales rose from $2.9 million to $7.3 million—which doesn’t count the proliferating sales of D&D (diet and depression) products based on methamphetamine, a newer synthesis, invented by other companies to compete with Dexedrine. When Dexedrine was released as a “spansule,” a time-release capsule, sales rose from $5 million in 1949 to $11 million in 1954—not including new, proliferating competitors (some by SKF itself), with vaguely space-age names like Eskatrol, Thora-Dex, AmPlus Now, Opidice, Obedrin, and so forth.

This perfect dream of a drug only had one flaw: it made people feel jittery, edgy, jumpy, and agitated—and many people seemed to find that more pep and energy weren’t worth a chronic case of the heebie-jeebies. The answer was, once again, a stroke of marketing genius: combining amphetamine and barbiturate in one all-purpose, upper/downer drug that would elevate people’s moods and mellow them out at the same time—and help them lose weight in the bargain! The first such pill was called Dexamyl, a combination of Dexedrine and Amytal (the barbiturate) packaged in a bright blue, heart-shaped little tablet and promoted to relieve anxiety and depression—as one glossy brochure put it, “for the management of everyday mental and emotional distress.”

By 1960, Rasmussen estimates, amphetamines accounted for about three percent of all prescriptions sold in the United States and Great Britain, and by 1962, US production of amphetamine was 80,000 kilograms per year; that’s 8 billion tablets, or enough for 50 tablets a year for every person in the United States. By the mid-’60s, however, the truly appalling downside of amphetamines (withdrawal symptoms like panic, nightmares, seizures, severe depression, hypersomnia followed by hyposomnia, as well as the psychotic features afflicting people who started taking the drug with a doctor’s prescription and then just couldn’t stop) was becoming hard to ignore. There were exceptions: amphetamine’s die-hard champions, including psychiatrists, ignored or rationalized away such side effects, even arguing that abusers were rare and abused the drug only because they had weak personalities to begin with, or preexisting schizophrenia, which the drug simply uncovered.

Miltown

What clearly helped sell Miltown (meprobamate, the first of the so-called minor tranquilizers, launched in 1955) to millions of people in the first place was the sudden, explosive celebrity mania for the drug, particularly in greater Los Angeles, which became known as Miltown-by-the-Sea. In fact, so many show-biz types loved Miltown that it became a television comedy meme, with comedians wisecracking about the pills, even as they were tossing them back like M&M’s. Jerry Lewis, Red Skelton, Bob Hope, Jimmy Durante, and Milton Berle incorporated Miltown jokes into their routines. In 1958, Salvador Dalí was commissioned to do an art installation for the Miltown exhibit at the American Medical Association’s annual meeting. It was a silk-sided, undulating tunnel that people could walk through, with Dalí’s slightly creepy murals on the side symbolizing the transition from anxiety to tranquility.

Meanwhile, Equanil, a carbon copy of Miltown that had debuted at the same time by agreement with another pharmaceutical company, was being marketed only to doctors, who felt that the crass, showbiz-tainted promotion of Miltown to the unwashed masses was beneath the dignity of their profession. Soon, because doctors backed and recommended Equanil and not its cheesy (but chemically identical) twin sibling, its sales outpaced Miltown’s. By 1959, 19 different companies were making me-too tranquilizers, and Miltown’s reputation was flagging. Over the next few years, it would paradoxically be judged both relatively ineffective—a namby-pamby kind of party drug—and highly dangerous. This double message—a drug that was too weak, but also too strong—was summed up by Time in 1965, the year Miltown was dropped from the US pharmacopeia. “Some doubt that it has any more tranquilizing effect than a dummy sugar pill,” the article read. “Others think that it is really a mild sedative that works no better than older and cheaper drugs, such as the barbiturates. A few physicians have reported that in some patients Miltown may cause a true addiction, followed by withdrawal symptoms like those of narcotics users ‘kicking the habit.’”

Nighty-night, Miltown! But even before Miltown began drifting off into oblivion, the drug companies had their periscopes up and swiveling around in search of the next Best Drug Ever. Soon enough, Hoffmann-La Roche (Roche), a pharmaceutical company in New Jersey, had a new blockbuster, which would not only out-blockbuster all previous blockbusters, but wouldn’t itself be out-blockbustered until the advent of the godlike Prozac in 1987. Actually, the inventor, Leo Sternbach, came up with two similar blockbusters, both based on his newly created class of chemicals, the benzodiazepines. The company named them Librium (from the last syllables of equilibrium), released in 1960, and Valium (from valere, the Latin verb meaning ‘to be in good health’), released in 1963.

Roche pushed the development of Librium by relentlessly promoting the drug’s originality and difference from the rest: “completely unrelated chemically, pharmacologically, and clinically to any other tranquilizer.” Librium’s other major selling points were its greater potency and versatility: it would presumably help more and different kinds of people with a wider range of problems than anything else on the market. As in every such promotional campaign, Librium’s makers insisted that the drug had no bad side effects and was completely unaddictive (this time, we really mean it!). And they set out to convince the world that it would help absolutely anybody, from the mildly to severely impaired, who wasn’t flat-out crazy (and it might help even some of them, too). This included a wide spectrum of impairment, including the middling group of so-called psychoneurotics, the ones with agitated depression and severe anxiety, as well as just about every other category of mental misery, such as chronic alcoholism, phobias, panic, obsessive-compulsive reactions, “schizoid” disorders (excessive detachment and social avoidance), epilepsy, emotionally generated physical symptoms (skin disorders, gastrointestinal issues), and narcotics addiction. The drug showed its chops with volatile prisoners, apprehensive college students, hysterical farm wives, querulous elderly people, and those old standbys of the perfectly normal, all-American nuclear family: the tense businessman with heartburn; his moody, premenstrual, sex-avoiding wife; and their unruly, school-phobic child. Three months after its introduction, Librium had become the most often prescribed tranquilizer in the world, with American doctors writing 1.5 million new prescriptions for it every month.

Three years later, Valium was introduced to the market, probably just to keep the exploding market for tranquilizers firmly in the Roche camp. If Librium had been a hot-ticket item, Valium was a supernova. Substantially more potent than Librium, but without Librium’s bitter aftertaste, it was the first drug brand in pharmaceutical history to hit $100 million in sales, rising to $230 million by 1973, and a cool billion by 1978, when 2.3 billion tablets were sold. Between 1968 and 1981, it was the most widely prescribed medication in the Western world. No wonder that in 1975, Fortune magazine called the benzodiazepines “the greatest commercial success in the history of prescription drugs.”

Tranquilizers were vastly successful among both men and women, but they seem to have had a special place in the lives of women. Even though, as the historian Andrea Tone points out, slightly more ads for Valium were directed toward men, the different ways male and female anxieties were presented and used to sell drugs were telling. Whereas men’s anxieties produced real, life-threatening bodily ills, the physical ailments provoked by women’s anxieties often showed up as hypochondria or “female troubles,” related to menstruation, sexual difficulties, and menopause. One prominent Valium ad features a woman standing alone on a cruise-ship balcony with the accompanying title “35, single and psychoneurotic.” The copy goes on to say that “Jan” has low self-esteem, hasn’t found a man to measure up to her father, and now feels lost because she realizes she may never marry. It suggests that whether in conjunction with therapy, or all by its own little self, Valium could relieve the anxiety, apprehension, agitation, and depressive symptoms that somehow keep Jan from finding the man of her dreams. In the late 1960s, women were the biggest consumers of psychotropic drugs, accounting for 68 percent of all antianxiety drugs prescribed. In the same period, they were also prescribed 63 percent of all barbiturates, 80 percent of all amphetamines, and—20 years before Prozac—71 percent of all antidepressants.

By the early ’70s, Valium was still on a roll, with Librium not too far behind. So successful were the drugs, so popular, and apparently so immediately effective at soothing the savage beast that they actually succeeded in undermining the religion of psychoanalysis. How could anybody argue that anxiety was based on some buried inner psychic conflict when all it took was a pill to make all things bright and beautiful? In fact, during these years, both the medical community and the public began regarding anxiety less as a psychoneurotic problem than a biomedical one—a shift that would receive official sanction in 1980, when the newly published DSM-III first included generalized anxiety disorder (GAD) among its diagnostic categories. If ever a diagnosis seemed tailored to fit the drug prescribed for it, GAD was it.

Yet—low, ominous drum roll—dark clouds were beginning to dim the sunny skies of tranquilizer heaven. By 1975, although there were 103 million tranquilizer prescriptions a year in the United States, the reports of abuse and dependence were growing increasingly hard to dodge. By 1967, the FDA had collected enough complaints about the drugs to launch an investigation into their abuse potential, but bureaucratic inertia, drug industry pressure, and resistance among many Americans to having their favorite pills taken away stalled action for almost a decade. At the same time, however, the mass media were on the front lines of the story, particularly magazines aimed at women, such as Vogue, Ladies’ Home Journal, Good Housekeeping, and Ms. They ran articles describing people’s harrowing attempts to stop taking benzos: the intolerable rebound anxiety and depression, insomnia, headaches, agitation, tremors, even occasionally delirium and hallucinations (the sensation of bugs crawling all over the skin being one). Research studies appeared revealing that long-term use of tranquilizers actually increased people’s anxiety and depression, and caused their life-coping skills and personal grooming to go downhill. New trials on the drugs’ effectiveness demonstrated efficacy for the first week or two or three, after which they proved no better—and in some cases, worse—than placebo.

In 1978, it was sensational news when former first lady Betty Ford checked herself into rehab for her alcohol and Valium addiction. Then in 1979, television producer and documentary filmmaker Barbara Gordon’s book, I’m Dancing as Fast as I Can, about her self-destructive Valium addiction, became an international bestseller and was turned into a feature film, starring Jill Clayburgh and Nicol Williamson. The same year, Senator Ted Kennedy opened Senate Health Subcommittee hearings about the dangers of benzodiazepines, thundering that tranquilizers had “produced a nightmare of dependence and addiction, both very difficult to treat and recover from.” And the coup de grâce came in 1981 when John Hinckley, a deranged young man with a crush on movie star Jodie Foster, attempted to assassinate President Reagan. As it happened, his psychiatrist had been prescribing 15 milligrams of Valium a day for him (he’d taken 20 milligrams just before the assassination attempt). One of Hinckley’s lawyers said that giving him Valium was like “throwing gasoline on a lighted fire.”

In the short run, benzo sales took a serious dive. The prescription rate fell from 103 million in 1975 to 67 million in 1981, to about 61 million in the mid-’80s. Valium prescriptions had fallen by nearly half. The release of newer tranquilizers—Klonopin (clonazepam) in 1975, Ativan (lorazepam) in 1977, Xanax (alprazolam) in 1980, and many others after that—let the tranquilizer market recover a good deal of territory (almost 86 million prescriptions in 2013). But even in recovery, the bloom was off the rose: the benzos had lost their prelapsarian innocence, and it would never again be possible to see them as no more worrisome than a nice, soothing cup of chamomile tea (but so much more effective!).

The Coming of Prozac

By the mid-’80s, there was a kind of existential void in the pharmaceutical feel-good market—but not for long. What would take the place of the tranquilizers, amphetamines, barbiturates, cocaine, and opiates to help millions of people relieve the genuine, if often hard-to-define, suffering of mind and body that seems such a perennial affliction of Western, certainly American, society? This brings us to the era of antidepressants, beginning with the totally unique, the unparalleled Prozac, which spawned many, many descendants—a psychotropic empire in full triumph and glory!

During its early, palmy years, Prozac seemed the answer to the depressed person’s prayers. There was a time during the early 19th century when the “melancholic” temperament was admired, at least in theory, as the mark of a sensitive, artistic soul, but in the 1980s, that time was past. Prozac was an all-American, 20th-century go-getter sort of drug. It made people feel the way they thought successful people felt: self-confident but not arrogant, cheerful but not manic, cognitively on the ball but not hyperintellectual. And unlike so many other psychoactive drugs, the shiny little green-and-white capsule didn’t make people feel drugged, didn’t sedate them or jolt them into hyperconsciousness, didn’t send them drifting off into dreamy euphoria. Undoubtedly, some people taking Prozac felt “better than well,” as Peter Kramer famously wrote in Listening to Prozac, but most seriously depressed people were probably thrilled just to feel “normal,” able to go about their day with more emotional equilibrium than they’d experienced in years, if ever.

By 2011, the rate of antidepressant use had increased nearly 400 percent since 1991. By 2013, more than 40 percent of Americans had used an antidepressant at least once, and 270 million prescriptions were being written annually. And why wouldn’t people want them? Here was a class of drugs promoted as completely safe, without serious side effects, and effective for the wide-ranging symptoms of depression and anxiety and, increasingly over the years, obsessive-compulsive disorder, alcoholism, narcotics withdrawal, ADHD, eating disorders, bipolar disorder, PTSD, hypochondria, and so forth, as well as a long miscellany of nonpsychiatric conditions, including snoring, chronic pain, various neuropathies, chronic fatigue, overactive bladder, and so on. By now, thanks to their prevalence and a flood of direct-to-consumer print and broadcast ads, antidepressants are deeply familiar even to people who don’t take them.

But they’re not the benign, uncontroversial consumer product they might seem. A mere three years after Prozac’s introduction, negative side effects of the drug had already been noticed: intense agitation, tremor, mania, suicidal ideation. Not only that, there was already evidence for what came to be known as “Prozac poopout”: the drug often stopped working after a few weeks or months. Such worrying signs increased over the next two decades, becoming a steady cascade of possible side effects and withdrawal symptoms that ranged from the prosaic (headaches, rashes, insomnia, digestive problems, joint pain) to the more disturbing (dyskinesia, rigid or trembling limbs, loss of fine motor control, sexual side effects amounting to a virtual shutdown of erotic life) to the frankly dire (abnormal bleeding and the potential for stroke in older people, and self-harm and suicidal thinking in children and adolescents).

Even worse, during the ’90s and the first decade of the 21st century, a small flood of psychiatric papers and research studies argued that antidepressants didn’t work well in the first place. After a few weeks of improvement, depressed patients often returned to the status quo ante and had to have their dosage increased or supplemented with other drugs. When antidepressants did work, the results were often—particularly for mild and moderate forms of depression—little if any better than placebo. And what was described as a relapse in people who stopped taking the drugs—redoubled depression, anxiety, insomnia—might really be withdrawal symptoms. There was even evidence for the depressogenic effects of the drugs, which acted on brain chemistry all right, but in the wrong way, making some people even more depressed and turning what might have been a temporary depressive episode into a chronic condition.

One reason we know about these developments at all is because a dedicated little army of muckraking mental health experts has produced a steady stream of books, articles, blogs, and websites devoted to publicizing what they regard as the money-driven, ethically challenged “drug culture” perpetrated by Big Pharma in collusion with the much-compromised field of psychiatry. Books like The Emperor’s New Drugs: Exploding the Antidepressant Myth (2009) by Irving Kirsch, Let Them Eat Prozac: The Unhealthy Relationship Between the Pharmaceutical Companies and Depression (2004) by David Healy, Comfortably Numb: How Psychiatry is Medicating a Nation (2008) by Charles Barber, and Anatomy of an Epidemic: Psychiatric Drugs and the Astonishing Rise of Mental Illness in America (2011) by Robert Whitaker not only excoriate the societywide overuse of antidepressant and other psychotropic medications, but relentlessly dig up dirt about the research that presumably justified the drugs in the first place.

Besides the well-known charge that drug companies routinely publish more than twice as many of their positive findings as negative ones (hiding the trials that proved unfavorable to the drug), these detectives revealed they also produce meta-analyses that are themselves drawn from pools of successful trials (ignoring the unsuccessful ones). Drug companies, it turns out, ghost-write studies (designing and conducting them, then analyzing the data) and publish them under the names of prominent physicians, who are often paid consultants for the companies. Overall, because there’s so much dough riding on encouraging outcomes, drug companies—and even the National Institute of Mental Health—stand accused of routinely inflating positive findings, omitting or hiding relevant negative data, and gaming the numbers in ways that make the results look fishy, to say the least.

Thus, the evidence for the routine use of antidepressants has increasingly come to be seen as pretty feeble. Generally, antidepressants seem to work well for seriously depressed people, but not so much for the vaster, far less emotionally disabled population for whom they’re usually prescribed. Even in drug company trials, placebos relieved mild or moderate depression as well as antidepressants. In other words, for the kind of ordinary, scratchy unhappiness that Freud found characteristic of human existence (at least among the Viennese bourgeoisie) and that primary-care doctors see all the time in modern America, antidepressants were not measurably helpful.

Perhaps a major problem lies in the possibility that nobody really knows what depression is. As Edward Shorter puts it in his latest book, How Everyone Became Depressed, “The current DSM classification is a jumble of nondisease entities, created by political fighting within psychiatry, by competitive struggles in the pharmaceutical industry, and by the whimsy of regulators.” The result is the metastasizing growth of conditions for which antidepressants are considered the remedy. Depression is today such a malleable, shape-shifting term for such a mass of mental and physical symptoms that no single class of drug could be expected to resolve them all. Is depression a debilitating hopelessness, sense of guilt, loss of all ability to feel pleasure, or even to feel anything at all? Is it also a somewhat lesser state of misery, which might include anxiety, irritability, fatigue, disappointment, and discouragement, with maybe some somatic symptoms (insomnia, digestive problems) thrown in? Hmmm. Is it the reactive misery and fear following the loss of a job, a spouse, a house to foreclosure? Or is depression the hidden lever behind a chronic sense of insecurity and low self-esteem, loneliness, and social awkwardness? Who knows? Since both the formal and informal definitions of depression have become so murky and inclusive—boundless, even—any combination of these, any subjective experience of unhappiness, can qualify.

At this point, you might wonder what difference it makes since—right or wrong, good or bad—antidepressants rule. We might just as well get used to it and take our meds, right? But, in fact, the whole antidepressant empire shows signs of being seriously in trouble. Sales are down from their high-water mark of $13 billion in 2008 to $9.4 billion in 2013—and still slipping about 4 percent a year. The reason isn’t that people are taking fewer of them, but that so many patents are expiring, leaving, at the moment, 40 percent of the market to generics—a percentage that will increase over the next two years. At this point, the great promises of neuroscience for new and better pharmaceutical treatments haven’t been fulfilled, not remotely. It’s been more than half a century since the discovery of lithium, antipsychotics, and antidepressants, but that serendipitous wave of discovery was followed by what can most charitably be described as a lull, and any understanding of the neurobiology underlying psychopathology remains severely limited. For the first time, the fabled pharmaceutical antidepressant pipeline, from which ever more ingenious combinations of molecules have gushed forth, seems to have run dry.

***

So are there any lessons to be drawn from this 150-year boom/bust cycle of American psychopharm? One would seem to be that most of these drugs (even heroin, on occasion, and the once-popular cannabis, now making its comeback tour) often did and still do—in the right conditions, with the right people—bring almost miraculous relief from intractable pain and suffering, physical and mental. Many psychoactive drugs loved not wisely but too well actually did “work”—in their own way, they could be considered as miraculous as antibiotics. Today, millions of people who’ve taken antidepressants feel they’ve been helped; many are convinced that the drugs have literally saved their lives. But like antibiotics, the drugs could be victims of their own success, done in by runaway marketing, excessive enthusiasm, and flagrant overuse.

At this point, to vilify antidepressants after having idolized them for so long suggests an old American temptation to think in black-and-white terms, veering from infatuation with a pharmaceutical quick fix to furious outrage when the fix doesn’t immediately fix everything. So far, antidepressants have escaped the moral outrage and panic of earlier drugs because they don’t carry even a whiff of the threat associated with cocaine or methamphetamine. It’s hard to imagine antidepressants having much appeal as illegal street drugs, given that it takes weeks for them to have any psychotropic effect. Although their cultural stock, not to say their financial stock, has declined, it’s a little early to say they’re through. As biological psychologist Peter Etchells put it in an August 2013 Guardian article, none of the controversy “means that antidepressants don’t work. It just means that we don’t know yet.” Etchells went on to say more research was needed “that isn’t influenced by monetary gains or emotional predispositions toward one outcome or another.” To which one may be tempted to respond, “Good luck with that.” From what planet are these judicious, not-influenced-by-money-or-emotion types supposed to come?

In the meantime, we can anticipate the excesses likely to come with the legalization and spread of marijuana. Today, in Colorado, you can already buy marijuana elixirs (sparkling peach, red currant, blueberry, pomegranate), foods (chocolates, cheesecakes, hard candies, candy bars, sodas), botanical drops (cinnamon, ginger-mango, vanilla, watermelon), capsules, balms, pills, and old-fashioned smokables, including hash, in different flavors no hophead of yesteryear could imagine: Lemon Skunk, True Purple, Mother’s Poison, and Black Bubba, to name just a few. Stocks in weed production and marketing have shot up in Colorado and Washington (where pot is also legal), and there is now a growth industry, not only in pot sales, but in books and financial services for people who want to invest in the cannabis sector of the economy. Are we on the verge of another best-drug-ever bubble? Is America a capitalist country? Do Americans like mind-altering drugs?

How will the next psychotropic miracle fare? Tune in here a decade from now for the follow-up story.

How to Escape the Awful Feeling of Being Trapped in Life

You’re sitting with a client, fighting your own feelings of frustration and boredom as she tells you the same sad story that you heard last week and the week before. She’s explaining to you, again, what’s wrong with her and why she can’t change. You long to be able to help her, but nothing that you say seems to get through. You start wondering if someone else could do a better job. You even wonder if you should refer her to a physician for medication, which shows you’re really starting to get desperate. In the end, all you want is to see her eyes light up, her shoulders lift, her breath deepen, as she finally “gets it” and makes important connections, sees her life in a new way, feels fresh hope. These are the moments we live for as clinicians.

How We Become Enslaved to Our Mindless Habits -- and How to Break Out

With my right foot planted firmly on the floor and my left heel just barely off the ground, my body leans slightly to the right when I pee.

Is It a Habit or Addiction? What Happens in Your Brain When You Start to Get Hooked

“I have something to tell you,” Jolie said as she sat down in my office. “It’s really bad.” She went on to describe how she’d ruined a dinner with friends because she’d drunk two glasses of wine, gotten irrationally upset at something, and left before dinner was over. With a tone of mixed embarrassment and surprise, she remarked that she must have been drunk. But as I probed further, she admitted that she’d drunk “a little” before the dinner to take the edge off her day. And further probing revealed that her two glasses of wine at dinner were more like a bottle. As we talked more about when and how much she drinks, she began to acknowledge that her use of alcohol was starting to cause problems she’d been unwilling to see.

The Secret to Breaking Out of Our Most Destructive Habits

Charles Dickens’s A Christmas Carol is one of my all-time favorite stories, as it’s been for millions of others since it was written in 1843. Who doesn’t start sniffling when reading this classic tearjerker about Ebenezer Scrooge, a cold, bitter old man dragged—by the ghosts of his past, present, and potential future—on a terrifying midnight journey of self-discovery, from which he emerges transformed and redeemed? Most people love movies about driven, selfish people who, struck by the life-altering experience of sudden love or near loss, eventually see the light and blossom into life-affirming menschen. Miraculous conversion stories appeal to the wishful thinker in all of us. We want to believe that hitting bottom—being forced into genuine awareness of one’s bad behavior and experiencing true remorse about it—is the key to transformational change, a comforting daydream shared by many therapists.

Salt Sugar Fat: How the Food Giants Hooked Us on Deadly Junk

Bet you can’t read just one page of Salt Sugar Fat: How the Food Giants Hooked Us, the unapologetically unsugarcoated exposé of the processed food industry’s tricks to spur addiction. Although you may not immediately recognize the name of author Michael Moss, you’re probably familiar with his investigative report on pink slime, the controversial ammonia-treated beef trimmings that meat producers are legally allowed to add as cheap filler to lean ground beef. Moss won the Pulitzer Prize for his New York Times series on beef contamination and safety, and his scoop on pink slime started a chain reaction of public concern and outrage that led to a reduction or discontinuation of its use by several companies.


10 Best Ways to Manage Your Anxiety

"I don't think I want to live if I have to go on feeling like this." I hear this remark all too often from anxiety sufferers. They say it matter-of-factly or dramatically, but they all feel the same way: if anxiety symptoms are going to rule their lives, then their lives don't seem worth living.

What is it about anxiety that's so horrific that otherwise high-functioning people are frantic to escape it? The sensations of doom or dread or panic felt by sufferers are truly overwhelming—the very same sensations, in fact, that a person would feel if the worst really were happening. Too often, these literally dread-full, sickening sensations drive clients to the instant relief of medication, which is readily available and considered by many insurance companies to be the first line of treatment. And what good doctor would suggest skipping the meds when a suffering patient can get symptomatic relief quickly?

But what clients don't know when they start taking meds is the unacknowledged cost of relying solely on pills: they'll never learn some basic methods that can control or eliminate their symptoms without meds. They never develop the tools for managing the anxiety that, in all likelihood, will turn up again whenever they feel undue stress or go through significant life changes. What they should be told is that the right psychotherapy, which teaches them to control their own anxiety, will offer relief from anxiety in a matter of weeks--about the same amount of time it takes for an SSRI to become effective.

Of course, therapists know that eliminating symptomatology isn't the same as eliminating etiology. Underlying psychological causes or triggers for anxiety, such as those stemming from trauma, aren't the target of management techniques; they require longer-term psychotherapy. However, anxiety-management techniques can offer relief, and offer it very speedily.
