Janelle Brown

When the Drug War Invades the Chess Club

On Thursday, the Supreme Court ruled 5-4 in Board of Education v. Earls that it is "reasonable" under the Fourth Amendment to randomly administer drug tests to all high school students who participate in extracurricular activities. In other words, it is now perfectly legal for a school to force a cheerleader or the president of the chess club to pee in a cup -- anytime -- as a condition of membership in after-school programs.

The decision didn't come as a surprise. During oral arguments on the case in March, several Supreme Court justices expressed strong support for student drug testing. At one point, Justice Antonin Scalia taunted Graham Boyd, the ACLU lawyer who argued the case on behalf of respondent Lindsay Earls: "So long as you have a bunch of druggies who are orderly in class, the school can take no action. That's what you want us to rule?" At another juncture, Justice Anthony Kennedy asked Boyd a hypothetical question about whether a district could have two schools, one a "druggie school" and one with drug testing. As for the first, Justice Kennedy said, "no parent would send a child to that school, except maybe your client." (Earls, a former honor student at Tecumseh High School in Oklahoma, had objected to drug testing as an intrusion on her right to privacy.)

Even though they had anticipated defeat, opponents of the war on drugs -- and of its new battlefield in the classroom -- found the ruling deeply disappointing. These critics argue that schools that test students in extracurricular activities, be they athletes, prom queens or Future Farmers of America, unfairly single out the kids who are often the least likely to be doing drugs in the first place, and drive students at risk for drug use away from the very activities that might take the place of getting high. Furthermore, they argue, drug testing erodes the privacy of high school students, and it has never been proven an effective method of reducing drug use among kids.

Justice Clarence Thomas, who wrote the majority opinion for the case, clearly wasn't swayed by these arguments. "Students affected by this policy have a limited expectation of privacy," he wrote. "This policy reasonably serves the School District's important interest in detecting and preventing drug use among its students."

In an interview following the ruling, Boyd, who is the director of the ACLU Drug Policy Litigation Project, talked about the impact of the decision, which he believes will result in more drug testing in the nation's schools, and, perhaps more importantly, a serious erosion of constitutional rights to privacy for everyone -- students and adults.

What's the message being sent by this ruling?

Basically, it's disappointing on a lot of levels. It certainly erodes students' privacy in a way that has never been done by the court before, and really puts students on par with prisoners. In his decision, Justice Thomas focuses almost dispositively on the fact that students are in what he calls the "custody of the school" and that drugs are themselves dangerous. There's no drug problem in this school in question, no safety issue ... but the mere fact of being a student seems to be, in Thomas' opinion, a reason to drug-test.

The logic of the opinion is so different from Vernonia, Ore. [a 1995 Supreme Court ruling that allowed drug testing of student athletes only]. In that case, there were a lot of reasons to uphold the drug testing: There was a drug problem in general, which was centered on athletes, who were school role models. Those reasons were absent in Tecumseh.

Do you expect schools across the nation to immediately start or expand drug-testing programs?

The good news in this is that schools have not previously been that responsive to the Supreme Court on this issue. It's been seven years since the Vernonia decision, and still only 5 percent of all schools drug-test their athletes. Cost is one factor [drug tests cost upwards of $25 per person] but it's also a question of effectiveness. A newspaper reported yesterday that the Dublin, Ohio, school board decided to stop drug-testing students, saying that it was ineffective. They said they knew that the Supreme Court was about to rule on this issue, but they didn't care: They wanted to do what would actually help the kids.

I think it's not going to be a legal question for most schools; it's a question of what works. The politics around drug testing are very thick but the evidence of it actually helping anybody is absent. And most school boards are controlled by people who want to help kids, so they aren't going to go down this road.

So, you don't think many school districts were waiting for a Supreme Court go-ahead?

I haven't heard of a single school district that is doing this, but I'm sure they are out there. What invariably happens is a small number of parents start to make a lot of noise about needing to do something about drugs; and drug testing is an easy decision that shows that the school board is tough on drugs.

It's not that different from what Congress does with mandatory minimum sentencing: It's a cheap political gesture that keeps voters happy but in the cold light of day doesn't hold up [as a meaningful action]. No one thinks raising jail terms for crack possession is going to have an effect, but it gives them something to campaign on. Everyone knows that drug testing is at best a waste of money, but it makes [school boards] look good.

The Bush administration's lawyers have suggested that they are interested in pushing on to test all students for drugs. Do you think this is likely to happen?

I don't think that it's going to sweep the nation as a popular cause to drug-test all students, but sure, I think some school district will probably push it to the limits to see what happens. I've already litigated such a case in Lockney, Texas, where a school began drug testing all their students; but there, the very conservative judge struck it down.

Do you think the current Supreme Court would lean towards drug testing all students if the issue came up?

I think you've got to read a lot into Justice Breyer's opinion. In one little paragraph he says that for the student who really doesn't want to [be tested] they could always just not participate in the extracurricular activities; and he acknowledges that this is a serious matter, but a very different matter than saying the kid would be expelled from school altogether.

That's really what it comes down to in the question of drug testing all students: What do you do with a kid who refuses to be tested? To say that a kid can't be in the choir if they refuse the test, that's harsh. But say that they can't get a public school education at all if they object? That's awful.

So, what recourse is there currently for extracurricular students who don't want to take this test? Do they have a leg to stand on?

You've got to look further down the road. Drug testing doesn't exist in most schools right now, and if schools want to do this they have to hold hearings and get input from the public. So it's important for students and parents to engage in that process, bring information into those meetings.

What's nice is this isn't just about people complaining about privacy; it's also about educating school boards that this kind of drug testing is counterproductive. That's a very empowering argument, and we're doing everything we can to really encourage students and their parents to exercise their own voice.

In the Supreme Court decision, though, Justice Thomas wrote that "testing students who participate in extracurricular activities is a reasonably effective means of addressing the school district's legitimate concerns in preventing, deterring and detecting drug use." Why did he write this? Is there, as he suggests, proof that it was reasonably effective?

Your guess is as good as mine: There is none. There really is no evidence whatsoever that drug testing is at all effective. The evidence is the opposite.

Thomas also suggested that the Supreme Court wasn't ruling that the decision to drug-test was "wise," just that it was constitutional.

It is affirmatively unwise, in the view of most experts. In a limited sense, one could read this as support of school-board discretion. I hope the boards will be responsible in use of that discretion.

What message does this decision send about student privacy rights? Is this essentially arguing that students have no Fourth Amendment rights anymore?

This comes close to saying that high school kids have no Fourth Amendment rights. It doesn't completely upset the balance, but it does suggest a drug-war exception in schools. Lower courts are going to be tempted to endorse any kind of anti-drug measure a school wants to take.

We'll fight that, it's not a foregone conclusion, but that's one of the dangers of today's decision.

What's the likelihood that these rights will ever be returned to high school students? Is this a slippery slope of privacy loss that is impossible to climb back up?

It's a huge concern. When the Supreme Court rules, it often remains precedent for a generation at least. And they've now set the bar very low for intrusions on student privacy in the name of the war on drugs.

Of equal concern for me is that what happens to young people in the privacy realm could also have an impact on the privacy rights of that generation when they come of age. One of the fundamental ways the Fourth Amendment is measured is by reasonable but subjective expectations of privacy: "Do I personally feel offended by this?" If I don't feel it's a big deal, the government can do it. To the extent that kids become accustomed to various intrusions on their privacy, because of drug policies, they have no standing to object to other intrusions as they get older.

What other long-term impact do you believe this decision will have on kids?

I'm going to make a mischievous argument. I think that in some ways the best thing that government officials could do to bring an end to the war on drugs is continue this trend of cracking down on young people.

For every student who is drug-tested, for every student who has to prove her innocence by passing a drug test, you'll have one more student that questions the drug war. Every time they teach D.A.R.E. and teach lies to kids, you'll have one more kid that doesn't believe in the drug war. Needless and groundless drug testing of high school students is just taking one more step down the road of having people say we've had enough.

Porn Provocateur

The most reviled woman in pornography stands before me, a 25-year-old bleached blond in tattered floral house slippers. On the cover of a catalog for Extreme Associates, the porn film company she runs with her husband, Rob Black, director Lizzy Borden wears spider-web tights and a skull-and-bones T-shirt that she accessorizes with a malicious grin and impressively pneumatic breasts. But in the privacy of her own office -- a cramped cinderblock warehouse covered with posters of naked women and autographed Hulk Hogan photographs -- she wears sweat pants and a Quiksilver T-shirt, with no visible makeup and her hair pulled back in a ponytail. Plus, the slippers.

"I'm totally bloated today," she tells me, sounding more like an awkward teen than a porn movie maven. "Don't take any photographs!"

In person, Lizzy Borden comes off as a chatty girl with a quick sense of humor, a potty mouth and a slight attitude problem. And yet, during her five-year career as an actress/director, Borden has emerged as a porn powerhouse who manages to offend, disgust and/or alienate not just feminists, politicians and most Americans with a conscience, but a great percentage of the unshockable pornography industry as well. This is a woman whose filmography includes such gems as "Fossil Fuckers" (geriatric women having sex with young men), "Cocktails" (post-coital women drinking vile concoctions of vomit and bodily fluids) and, most horrifyingly, the recent slasher-porno "Forced Entry."

In fact, Borden's films are so repugnant and evil that it's difficult to justify their existence, let alone comprehend why anyone -- especially a woman -- would want to make this kind of garbage in the first place. But Borden talks about her work with pride and a kind of twisted logic. She considers herself a moralist, an artist, a realist and a provocateur (though perhaps not in such grand vocabulary). In an industry that treats women like second-rate citizens, that considers them useful only as long as their breasts are perky and their orifices exploitable, Borden sees a route to power and respect in out-boying the boys. Furthermore, she believes she's just giving audiences the same kind of violence and vileness they've come to expect from their other entertainment outlets, like the World Wrestling Federation, "Jackass" and Eminem.

And thanks to her own abusive childhood, Borden is just screwed up enough to believe this rationalization. "As people stay in this industry they tend to get pretty twisted," explains journalist Luke Ford, a porn insider and longtime chronicler of the business. "It both attracts twisted people and then exacerbates those tendencies." No one epitomizes this more than Lizzy Borden, and none of her work manifests it so completely as the recent "Forced Entry."

The film premiered, oddly enough, as part of the Frontline documentary "American Porn," which screened on PBS this February. During their exploration of the extreme end of the porn industry, the Frontline producers visited the set of "Forced Entry" (actually, the back room of the Van Nuys, Calif., warehouse that Extreme Associates calls its office), where Borden was taping the climax of her film. The PBS producers were so disturbed by what they saw -- star Veronica Caine being "raped" and "beaten" -- that they filmed themselves walking off the set because it was too hard for them to watch.

I felt the same way when I watched the film. "Forced Entry" is purportedly the story of a serial killer and his gang who rape and murder a series of women -- an 18-year-old virgin, a pregnant woman, etc. -- before being caught and lynched by an angry mob. The actresses in the film are slapped, spit and urinated upon, and violated in every orifice, while sobbing and screaming and begging for mercy. Watching it, I was aware that it was just a movie -- that these were consensual acts taking place between actors and actresses who had already had sex with each other dozens of times in past films; and that the blood and screams were fake. The video even included an entire "blooper" reel of the actresses laughing and joking on the set. Still, I was so traumatized by the movie that it brought me to tears: It was like witnessing a real rape, seeing the nadir of man's contempt for womankind brought to life with no holds barred.

It's hard to equate this horror with the outgoing girl in slippers who walks me around the Extreme warehouse, through the spray-painted room where, in "Forced Entry," Veronica Caine is violated and stabbed to death. All the while Borden chats about her beloved bulldog and asks me questions about my career. She seems anxious to prove to me that she's not the evil, actress-beating director that Frontline depicted.

"I've fucked up on a lot of interviews, said things that people take all the wrong way," she tells me, as she sits me down in a broken office chair and puts her feet up on a big wooden desk. (I eye the desk nervously, wondering if it figured in the scenes of any of her movies.) "I'm very honest. Anything you want to know." Despite these assurances, Borden has not always been totally honest about her past. In her early Extreme years, she pretended to have spent her youth in an insane asylum after murdering her family, à la the original Lizzy Borden. But this is the story she tells me now: Lizzy Borden was born in conservative Orange County, Calif. -- "behind the Orange Curtain" as she puts it -- to a "white trash" Italian Catholic family. Her stepfather was an abusive alcoholic, she says, who beat her mother viciously and regularly. Her mother stayed at home to raise Lizzy and her three half-siblings, and took out her own frustrations, says Lizzy, by beating her with her fists and assorted sharp objects. Lizzy moved out of her mother's house when she was 12, and finished high school while living with her conservative grandparents.

When she turned 18, Borden headed off to a local community college on a scholarship. "For the first time in my life I could actually see the world," she says, and the world she chose to see was full of drugs and partying. Frat parties led to raves; acid and Ecstasy led to speed and cocaine use, which led to a job as a stripper, assorted abusive boyfriends and her first lesbian experience.

Borden derives immense pleasure from describing how naive and introverted she once was. "I was an ugly duckling: I had a big Italian nose, the long brown hair. I never ever shaved my cunt until I started to strip! I didn't know anything," she confides, all wide-eyed astonishment. When she got her first job at a strip club, she tells me, "I didn't know how to use a tampon: When I saw someone else do it I was like, 'Oh no, Oh no. My grandma told me that once you do that, you are going to get toxic shock, you are going to die!'"

Nevertheless, Borden made quick work of losing her naiveté. After a brief career dancing at strip clubs, she met a porn star who convinced her that there was good money to be made in the business. She took her mother -- with whom she had reconciled, thanks in part to the drug problem they share and over which they had "bonded" -- to her first porn shoot and sat her outside the front door.

"I told her she had two choices: 'Either you don't go with me, and I get fucking killed and you see it on the news. Or you go with me and make sure no one kills me.' So she sat outside," says Borden. "And I was like, 'OK, my mom's out there just so you know.' I didn't know this world, I thought no one was gonna hurt me if my mom was out there."

Her mother cried over her decision, Borden says, until she saw the money that Borden was making. These days, Mom has a job at Extreme Associates. (Her mom, she says, "has her own fucked-up issues.")

After less than a dozen films, Borden got a job performing in a film for Extreme Associates, which led her to the man who would become her husband -- the porn director Rob Black, who already was gaining notoriety for his no-taboo-unexplored films. But, explains Borden, Rob "can't be with a woman who does porn, he comes from a Catholic Italian family," so she quickly gave up acting and, after assisting on Extreme sets, convinced Black to let her direct instead. It was just a matter of months before the two of them invented Lizzy Borden, the twisted mascot-slash-mistress of the most horrific films in the Extreme catalog.

Rob initially resisted Borden's desire to direct. "He said, 'Women don't make good directors,'" Borden explains. The idea that women don't make good directors is a commonly held belief in the porn industry, she says, because women "shoot all the soft stuff, all the lovey-dovey stuff that there's not a big market for. In the video stores, that's not what you go see: You want to see hardcore ass-fucking, DP [double-penetration], cum, piss, shit, whatever you can."

Borden, in turn, began to see herself as a kind of female challenge to the male-dominated industry: "I said: 'I'm fucked up! I can write something! I can be a man!' My mission after that was to prove everyone wrong."

Around her office, she says, she now acts just like a guy: She describes, with glee, how she recently peed on the chair of a co-worker and then made him sit on it, and how she and her best friend, star Veronica Caine, hid a dead fish in the office of another co-worker. She says that she farts and scratches and takes no grief, and that this means that the men of the porn industry no longer treat her like one of those "fluffy-puffy" porn queens.

"It's a power thing," she says. "These people told me I couldn't do something, and that's the only reason I wanted to do it. Because they told me I can't ... So I started to get more and more hardcore, until now no one can top me. I can get anyone to do anything because I am a woman. I think I've earned that respect." It didn't take long for Lizzy to establish herself as a woman who went where no woman, and most men, would dare to go. The covers of the films that she has produced are difficult to even look at, covered as they are with hardcore snapshots of sex and blood. There's "Cannibalism," a horror-porno in which various internal organs are consumed after an orgiastic release. There's the "Sexually Intrusive Dysfunctional Family" series, which features such props as a decapitated pig's head. "Cocktails" features a grinning girl with a filth-smeared face and a bowl underneath her chin. ("Forced Entry," fortunately, has no cover art.)

Sex in Borden's films is almost always violent. Urine, excrement, blood and spit are prominent. Many films feature witches, Satan, robots, aliens and assorted otherworldly creatures. No orifice goes unviolated, and the more revolting the means, the better.

In fact, Borden says, she often revolts herself, but in a good kind of way. "It's disgusting but I like to watch it because it's shocking," she explains, and says that she sadistically eggs on her actresses to see how far they'll go. She's inspired by the likes of Eminem and Marilyn Manson, she says; she also compares her nastier videos, like "Cocktails" -- in which, it seems, the sex is almost ancillary to the shock-horror-revulsion -- to shows like "Jackass," in which star Johnny Knoxville will don a face mask and go "swimming" in a porta-potty, or "Survivor," where contestants eat bugs and drink blood.

"Those reality shows, where people eat bugs and shit: That's disgusting! How can you watch it? But I watch it. It's the same with what we do: People are shocked by it, but they watch it."

And apparently enough people watch to make Borden's business profitable: "Cocktails" and "Fossil Fuckers," the films of which Borden says she is most proud, have sold so well that she has done sequels to both. "Forced Entry," her magnum opus, has sold 20,000 copies through mail order alone. This is the most disturbing aspect of her films: The fact that viewers watch these movies as a way of getting off. What kind of person masturbates to the sight of girls being slapped, drinking their own vomit or being raped?

This is the only question that gives Borden pause. "That's the one thing that we deal with every day," she says, and quickly meets the eyes of Veronica Caine, whom she has hauled into our interview in order to assure me that the actress wasn't really beaten and murdered in "Forced Entry." There is a momentary silence.

But Borden quickly brushes off this concern by insisting that, really, those creeps who get turned on by the violence in her films are actually being taught a lesson. She argues that many of her films are moral tales, based on "real" stories you might read in the news, in which the bad guy or girl gets caught in the end. A bad alcoholic mother runs over her babysitter and her son, and ends up slashing her wrists (after having sex with assorted strangers first); the rapist in "Forced Entry" is murdered by a vigilante mob; a woman with a cheating husband leaves him in the end.

"If you watch it and don't fast-forward it, and if you think about it, you'll see there's a moral to it!" Borden argues. "Most of this is awareness. I try to show what could happen to you. Do this [violent act] and you are going to get fucked up." Instead of believing, as some do, that linking sex and violence encourages rape, she points out that people get turned on by violence against women in movies like "Halloween" or "The Accused" all the time -- the only difference being that they don't actually see the sex.

Caine, a freckle-faced 32-year-old who is as calm as Borden is hyperactive, chimes in. She was shunned for performing in "Forced Entry," Caine says, and she admits there are moral concerns about the film. But, she adds, "There's nothing wrong with what I'm doing. The one thing that people want to pick out that makes it wrong is the fact that there is hardcore sex in it. Everything else is completely acceptable on a daily basis. I simply performed something that involved sex with a completely mainstream kind of entertainment. And porn is mainstream to me."

I can vaguely understand the argument that Eminem's "Kim," in which we hear a woman being raped and murdered, is no more socially acceptable than "Forced Entry," in which we see it. Except "Forced Entry," of course, is also a porno flick intended to sexually arouse us. Perhaps the juxtaposition that shocks me has simply been normalized in the porn industry, in which sex can be as banal an act as eating cornflakes -- even when it is embellished with a beating. Still, it's difficult for me to comprehend how these women can calmly talk about performing acts that make me shudder with horror when I see them on tape. Maybe, I think, Borden's a sadist who reenacts violence in order to purge the violence of her own past; maybe this is a game to prove herself to her extreme husband; maybe it's a way of rising above helplessness and futility.

I find one thing striking, though: Borden admits that she isn't actually aroused by her own movies. I ask her what turns her on in porn, and she says it's "when two people have a connection ... Not so much lovey-dovey, but when they are like, yeah, fuck me harder, fuck me harder! Well, I guess you could say it's lovey-dovey, but with more of a hardcore edge to it."

Borden swears that she isn't a feminist, but she sees herself as a kind of gender pioneer. Her films, she says, have powerful messages for women: "For the most part I hope women will look at [my movies] and say, 'A woman made this, she directed it.' I'm saying, you can get your revenge, maybe not the way I did, but in your own way you can rise above it. You got to reach within yourself and apply it."

But the respect that Borden says she has earned is dubious: There are women, she admits, who no longer meet her eyes when they see her, and men who think she's a freak. Porn insiders say that Extreme fare is generally considered too gross by the more elitist hierarchies of the industry, and that Borden is considered an anomaly. But Borden also extols the fact that men on the sets no longer ask her to "fluff" them or demand blow jobs; she's sometimes even approached by female porn stars who admire what she does.

She's happy, she insists: Messed in the head, maybe, but happy. She's off drugs and alcohol -- the only thing she does now is drink too much coffee and take Zoloft. She's not sure if she wants children, but says that her porn friends are her family: All she needs is her husband Rob, whom she describes as "my savior, the love of my life," and her best friend Veronica, "a saint ... she completes me.

"We're like the land of misfit toys," she giggles.

Borden's not sure where she wants to be in 10 years; maybe on a beach in Florida, she says. She recently launched a second career as a wrestler, as part of Extreme Associates' new foray into WWF-style entertainment (more violent, of course, including accessories like barbed wire and thumbtacks). She's also thinking of going back to college with Caine, to learn how to do movie makeup, or set design, or maybe nursing.

When she talks like this, she sounds like an average young adult, unsure what she wants from the world but confident that she can conquer anything if she just sets her mind to it and has her best friend at her side.

Borden tells me that she is seeing a therapist. When I ask her what her therapist thinks of what she does, she tells me, "She thinks I'm interacting with the world, which is good. I accomplish goals instead of being 'Poor me, I was abused as a child so I'm going to sit on the couch and be depressed and eat dum-dums and watch Oprah and commiserate with every woman and man in the world that feels like that.' No, I'm going out there and doing my own thing."

She looks me in the eyes and puts on a challenging face that flickers between blasé and bravado. "Yeah, I'm fucked up," she says, shrugging. "I can admit it. People say they are sorry for me, and I'm like, why? It's made me a better person. I don't want to be a pansy."

Janelle Brown is a contributing writer for Salon, where this article originally appeared.

Here Come the Buns

Just a few years ago, it was considered in bad taste to reveal your butt crack. Getting cheeky was an icky faux pas reserved for plumbers and the odd teenage boy with unresolved pant-to-boxer issues. Now, however, the tender cleft is in your face. Girls in low-slung jeans sit insouciantly on bar stools, "presenting" their rears like primates in heat. The jeans tug downwards, the butt balloons upwards, and at least an inch of crack blooms above the belt loops. Some have tattoos just above the crack, a titillating invitation to stare. Others brandish g-strings, which ride above the waistband -- a hint of Monica Lewinsky.

The posterior has, intentionally or not, recently become the focal point of fashion and pop culture alike: The butt crack is the new cleavage, reclaimed to peek seductively from the pants of supermodels and commoners alike. Blame it on J.Lo and the rise of booty-centric hip-hop culture; or point your finger at the return of the low-rise jean, a familiar fashion rehash that has exposed millions of unwitting lower clefts.

Perhaps, having grown weary of nipples and thighs, we simply needed a new body part to fetishize. And maybe that's not a bad thing. It's hard to oppose a trend that extols a generous posterior. But like so many trends that spring from retail, this one comes with a punishing beauty protocol. We're going to expose your ass now, the fashion industry has said, and it had better look good.

This is a rather abrupt change from the recent decade's parade of anemic asses. But the behind does have a glorious and full fashion history. Consider the corset, designed to show off the bosom and ample rear by cinching in the waist; or the bustle, which was all about giving baby some back even if she didn't have any to speak of. The round behind lost some momentum after these prosthetically supported heydays, making only rare appearances, often as a bonus body part on the statuesque bods of busty movie stars like Marilyn Monroe and Jane Russell. Then it vanished for what seemed like forever, from Twiggy to waif models, from tiny to nonexistent.

Maybe, then, the return of the rear end was inevitable, one more body part cycled through the fashion wayback machine to keep things fresh (and selling like hot cakes). But bootyliciousness also comes to us by way of the street, a byproduct of the mainstreaming of hip-hop culture, and the accidental fallout of the low-cut pant. Every teenager in America has turned on MTV at some point to see serious backsides jumping to a beat while a rapper extols the glories of the booty. ("I like the way she shake it in the thong -- OOOHHHWEE!" says Master P.) After "Out of Sight," Jennifer Lopez's butt was the toast of the nation. Suddenly big wasn't bad and fashion was accommodating.

The invention of the thong bikini (and subsequently the pop sensation Sisqo) was the next step in the public blessing of the backside. It was just a short step from tight pants that emphasized the butt, to bathing suits that tanned it, to fashion that exposed the behind altogether.

Not that butt cleavage was an intentional move. It hit the fashion mainstream more as a consequence of the low-rise pant than as a brainstorm of Seventh Avenue. Arguably, this accident can be blamed on fashion enfant terrible Alexander McQueen who, in 1997, sent models down the runway in bumsters that revealed several risqué inches of butt crack. By last summer, it was impossible to find a pair of jeans that actually covered your hipbones, and the result was a parade of inadequately covered posteriors. It is possible that many among us never intended to expose butt cracks, but the cut of the pants made it inevitable: Simply put, it is impossible to sit down in a pair of low-rise pants without displaying at least an inch of cleft.

Whether the revealed ass-crack was simply a side effect of bad fashion design or a malicious conspiracy on the part of the jeans industry, the jean-buying population bought in. The youth of America seem blithely undisturbed by the fact that they now feel gentle breezes in places where the sun don't usually shine. Meanwhile, at the high end of the trend, risqué haute couture designers like Donatella Versace and McQueen have picked up on the look and are now intentionally dipping their ball gowns a few inches lower to expose the tops and sides of the buttocks.

Fashion pioneer Sharon Stone best embodied this trend, appearing, over the course of this spring, in a dipped-in-the-back Oscar dress that revealed the tip of her butt crack and a sheer-paneled dress that entirely revealed the sides of her behind (and the fact that she wasn't wearing any panties).

In other words: The butt crack has arrived, an affectation of lowly citizens and shock-driven fashionistas like Stone and Versace. The butt crack is the new, 21st century kind of sexual fetish: It's naughty and slightly tawdry, but with the soft round charm of a perfect pair of breasts.

Unfortunately, a key word here is "perfect." Perfection is the fuel of the fashion retail-industrial complex. And so, even if the visible butt crack was a kind of serendipitous fashion accident, it has been co-opted by salaried "tastemakers" as a phenom ripe for rules and regulations.

"The derriere isn't a body part as much as an embodiment of personality," claims the June issue of Elle magazine in its six-page spread devoted to the visible posterior. This from a glossy magazine which, like all fashion magazines, has spent decades sternly prescribing squats to flabby readers -- the same magazines that for years have lectured that the best kind of ass is no ass at all (see: Kate Moss). All is forgiven, they write. Aspire now to the ampler behinds of Jennifer Lopez, or Sophie Dahl, or Catherine Zeta Jones.

Of course, J.Lo's ass is hardly elephantine, nor is Dahl's or Jones' or that of anybody else featured as a bearer of good butt. A thunder butt is still a fashion taboo, and no one is bothering to make low-rider jeans in a size 16 since no one wants to see that much butt crack. In fact, the rise of the noticeable behind is ultimately putting even more stress on the body conscious: No longer do you just want a skinny butt, now you also need one that is "high, round, shapely and firm," according to Elle. It has to have enough curve to round out those low-rise jeans, and enough oomph to create a comely cleavage without evoking the Hottentot Venus.

And the proper grooming of your behind no longer begins and ends at the gym. No, no: The newly revealed ass must be treated to an entire beauty regimen of its own. You must wax your bootie to eliminate all unsightly hairs that might peep from between your cheeks. You must massage your buttocks daily to "remove excess water and facilitate lymphatic drainage, causing the skin to plump, making dimpling less noticeable," says Elle. Your butt must be tan, but not sun-damaged, making G-string beach sessions problematic and self-tanning lotions a must. And don't forget to moisturize and exfoliate to eliminate embarrassing acne.

Finally, there's the underwear problem. Visible Panty Line is unforgivable (despite the unfortunate youthful trend of wearing your G-string hiked above your belt), and as the waistline of jeans plummets, your underwear options will be equally constrained. You'll need to purchase an entirely new wardrobe of intimates that ride as low as your pants. (The undergarment industry has happily obliged by inventing low-cut panties, which are selling up a storm.)

Of course you can always just buy a new butt. "Good Morning America" recently reported that butt implants, designed to give a boost to an otherwise nebbish rear end, are on the rise at $10,000 a pop. (Pop star Kylie Minogue is a customer, the gossip rags claim.) If this proves too costly for you, you can simply purchase a pair of padded panties to mimic what nature denied you. (Gay men have been wearing them for years.) These are to ass cleavage what the Wonderbra was to the breast.

Alas, it is true. The embrace of the generous posterior and its cleft is quickly devolving into yet another reason to feel lousy about your body. The good news is that, if this trend is like all others, it won't last. Super-low-rise jeans are already disappearing from the runways, along with butt-cheek revealing micro-minis. High-waisted pants are coming back, the glossies say. By the time the fashion slave has spent a summer's salary on new underwear and cellulite creams, implants and Brazilian bikini waxes, the visible butt crack will have vanished underneath the fall fashions.

And what beyond the butt crack? Total nudity is probably out, if the fashion industry has anything to say about it. Visible pubic hair may be a possibility, though that seems like a bit much for even Alexander McQueen. Perhaps the pendulum will simply swing back and the call will go out for high collars and more buttons. Hard to say. Best, under the circumstances, to investigate your plumber's closet and move on.

Janelle Brown is a contributing writer for Salon, where this article originally appeared.

Smoke a Joint and Your Future Is McDonald's

America loves a happy ending: The prisoner on the brink of release decides it's time to straighten out and go to college; the addict gets himself off drugs and becomes a community leader; the teenager grows up and gets responsible. Rebounding from a troubled past is a great American tradition, rewarded even with the highest post in the nation: President George W. Bush is a former alcoholic turned born-again Christian turned world leader.

Chris Berry wanted to be the subject of one of those stories. A factory worker in his 20s with a wife and four kids, he was caught and convicted of possession of marijuana several times before he decided it was time to go to college and get on with his life. But he needed financial aid to afford an education; and this, unfortunately, was where his plans went awry. Thanks to a provision in the Higher Education Act -- a federal law governing the funding of public colleges and universities, as well as student financial aid -- Berry discovered that he was ineligible for federal aid because of his prior drug convictions.

Despite the setback, Berry was able to scrabble together a $2,000 loan from the nonprofit group Students for Sensible Drug Policy, and he entered Mountain Home Arkansas College last year. But the money ran out after a year, and so did his time in college. "I'm not a student right now," he complains. "I just can't afford to go to school."

The federal law that foiled Berry's plan to restart his life is called the Drug-Free Student Aid provision, a piece of legislation passed four years ago in the hysteria of the war on drugs. It is a textbook case of knee-jerk lawmaking, a measure that was ill-conceived and poorly implemented. Not only does it fail to reach the population it was supposed to address, but it unfairly burdens struggling minority and low-income students. The provision singles out drug users and gives a free pass to those convicted of other crimes. And most important, the legislation effectively thwarts young adults who are trying to clean up their lives and get an education, throwing up barriers that stop them from accomplishing their goals.

There was some outcry when the law was passed in 1998, but it wasn't until the 2001-2002 school year that the provision was put into effect and students began losing their aid. Now, as its impact finally becomes evident, students and civil libertarians are taking a public stand against the law and organizing protests to get rid of it. Meanwhile, the financial aid officers at a handful of universities have reimbursed students affected by the law, and their colleagues around the country are looking for ways to follow suit.

Last month, Yale University became the fourth university to announce that it would begin reimbursing students who lost their financial aid because of the Higher Education Act. A bill that would repeal the provision, proposed by Rep. Barney Frank, D-Mass., has gained some momentum; and Rep. Mark Souder, R-Ind., who wrote and sponsored the law, is now backing a bill to change his own legislation.

"I think the law is mean-spirited and short-sighted," says Barbara Hubler, director of financial aid at San Francisco State University. "We're out here trying to improve students; education is a way to help people become better citizens. It shouldn't be a bureaucracy that throws up obstacles and frustrates their attempts to move ahead with their lives."

The battle against the Drug-Free Student Aid provision is, in effect, one more protest against laws quickly cobbled together in an unrealistic drive to purge the country of drugs. In the quest to ensure that no one, particularly not young Americans, touches drugs, Congress managed to pass sweeping and draconian measures that fail to differentiate between degrees of drug use and abuse, or between victims and villains. Opponents of the financial aid provision, like the opponents of drug conspiracy laws and the harsh sentencing of drug offenders, are questioning the effectiveness and fairness of broad rules that have tended to punish many in hopes of sending a message to the few, and are now derailing an educational system once known for its inclusiveness.

The measures proposed to change or eliminate the Drug-Free Student Aid provision, even if they pass, will not help the thousands of students already denied financial aid under it. Nor are they likely to bring back the many students who, when denied aid, gave up on higher education altogether. In fact, this year's freshmen, unless they are enrolled at a university that has taken a stand against the measure, will be widely affected by the rule.

The provision may seem like a relatively small piece of legislation, reaching only a fraction of all college students, but its implications are far-reaching. The message it sends about our nation's priorities is ominous: As the ACLU's director of drug policy litigation, Graham Boyd, sums it up, "The government is creating two classes of people: One class to whom we want to give an education and succeed in life, and another class of low-income drug users who we want to relegate to a life of working at McDonald's."

The Drug-Free Student Aid provision was tacked on to the Higher Education Act of 1998 as part of Souder's aggressive anti-drug agenda. The law was intended to be both an incentive and a punishment for currently enrolled students who were battling drug problems: Students who were receiving aid but had strayed from the straight and narrow would lose that money unless they could prove that they'd cleaned themselves up. "We wanted to ensure that students who were receiving taxpayer subsidizations were not breaking the law," explains Seth Becker, Souder's press secretary. The provision passed quickly, with support from both sides of Congress.

Under the rule, any student who has been convicted of the possession or sale of a controlled substance is temporarily -- or perhaps permanently, depending on the offense -- ineligible for any federal grants, loans or work assistance. Students with one drug possession conviction lose their aid for one year from the date of conviction; with two convictions, they lose two years; and upon a third offense they may lose their aid forever.

Sanctions are even stricter if students are caught selling drugs: A first offense is punished with a two-year aid loss, and a second conviction gets you indefinitely barred. There is one caveat: If a student completes a federally approved drug rehabilitation program, and then passes two unannounced drug tests, he or she could get the aid reinstated.

But in their rush to get the law on the books, Congress failed to conceive of a reasonable strategy to enforce the provision. The lack of an implementation procedure has resulted in far more students being affected by the law than anyone, even Souder, ever intended. This glitch is compounded further by the inherent unfairness of the law: It specifically targets minority students of lower income, and ignores any financial aid applicants who have committed crimes unrelated to drugs.

Even though the provision was intended to deny aid to currently enrolled students who are convicted of drug offenses, it has had the effect of punishing new students who have sinned in the past. What the law didn't take into consideration is that the Department of Education has no way of knowing when recipients of financial aid get in trouble with the law. Indeed, college financial aid officers have no means of tracking the arrests and convictions of their students.

Under the circumstances, applicants for aid are required to self-report their past drug convictions when they sign up for support for the first time. The outcome of this system is that the only students who can be identified as having drug convictions are new applicants who haven't even begun to receive aid (and who also haven't been savvy enough to lie about their drug histories when filling out their application).

Both opponents and proponents of the provision agree that this is an unfair way to ferret out drug offenders; because of the way the law is being used, it is no more than a belated punishment for crimes that happened long before the student applied for college. But the sides disagree on how many students have lost their aid because of it.

The Department of Education's numbers fail to clearly illustrate the impact of this provision. It's possible, in the strictest sense, to say that only 1,019 students have lost their aid because of a past drug conviction since the provision took effect. But this ignores thousands more students who are dropping out of the financial aid process halfway through, thanks to the way the application is formatted. In that time, some 59,543 students have either left the question about drug convictions blank on their applications or failed to return worksheets that grilled them about their drug convictions, thereby making them ineligible.

No one is tracking these students to find out why they gave up, but it's probably a fair assumption that many of them had drug convictions they didn't want to reveal and so simply decided not to bother. Similarly, no one is tracking what happens to students who are denied financial aid: Do they continue on to college anyway, find other forms of financial aid and take on second jobs? Or do they just drop out altogether?

There is no national data on these questions, but an informal survey of some local colleges and universities in the San Francisco Bay Area unearthed some disturbing trends. At San Francisco State University, for example, 48 students were turned down for financial aid last year because of their prior drug convictions; of those 48, says financial aid director Hubler, only four ultimately enrolled in school. At City College of San Francisco, 10 students were turned down; a slight majority continued on to school and the rest disappeared. (Financial aid officials at the more prestigious University of California at Berkeley and Stanford University didn't recall any students being denied aid under the provision.)

"One could theorize from the numbers that students are being denied aid and then not enrolling," says Hubler. "And another group of students with convictions surely looked at the application and said, 'I'm not even going to apply for financial aid.' We have no idea how many of those students there are."

Although it's possible to go through drug rehab to reclaim eligibility, it's a circuitous and insulting process that could deter many students. Students who already have gone through Narcotics Anonymous, for example, are informed that the program doesn't meet federal standards. Their only option is to enter drug rehab again. Students convicted of a minor possession -- say, having a small amount of marijuana -- have to go through the same measures as a heroin addict. Some, if not most, approved drug-rehab programs are both expensive and time-consuming.

Only one of the 48 students at SF State who were denied federal aid bothered to go through rehab, which deeply concerns financial aid administrators. Jorge Bell, the associate dean of financial aid at City College, observed several students drop out rather than undergo rehab. "There might be some students out there who are deciding not to continue their education because of this extra hoop," he says. "Financial aid is so important, if you have to wait for your aid -- or even just wait weeks to go through a drug rehab program -- you may give up altogether."

The Drug-Free Student Aid provision also imposes class and racial biases, in addition to an oddly arbitrary rating of various types of crime. For example, the measure, because it concerns only drug-related transgressions, does not apply to rapists, batterers or armed robbers, among others.

Student groups, infuriated at being singled out as lurking enemies in the war on drugs, have organized against the provision. "Are you telling us that drug laws in the United States aren't deterrent enough?" asks Shawn Heller, national director of Students for Sensible Drug Policy, which is coordinating students across the nation to protest the legislation.

"It's drug war politicking," he says. "Souder wants to go back to his district and say 'I'm tough on drugs and I created laws that buckle down on users.' But if he wanted to do anything about drug problems on campus, he'd give additional funds for local programs that actually reduce drugs on campus. It's another zero tolerance law in which you take discretion away from judges and people who deal with these problems on a daily basis, and give it to the federal government instead."

Civil libertarians argue that this law is an example of blatant discrimination, based on income and race. By definition, they say, the law is designed to penalize the less fortunate college applicants: Students who can pay for college with their own (or more likely, their parents' own) money, are unaffected by the law, while needy students are subjected to unfair scrutiny and the loss of an education. The law also unfairly targets minority citizens, by virtue of the deep racial inequality of the nation's war on drugs: Sixty-two percent of those with drug convictions are African-American, even though they make up only 12 percent of the population and 13 percent of all drug offenders.

"It's well documented that drug war enforcement is heavily skewed towards blacks," says the ACLU's Boyd. "Since the Drug-Free Student Aid provision is more about who gets convicted by the system than it is about the drug offense, it's much more likely that you'll lose your funding if you are black than white.

"It's extending the racial injustice that you see on the street corner into colleges, and the consequences are profoundly unfair, since education is extremely important."

Because the provision is a federal law, only prestigious private institutions, with their own money on hand, have the freedom to financially assist students who have lost their aid under it. At public universities -- where more students are likely to have been affected by the law in the first place -- the financial aid offices are bound by their reliance on public funding. But that has not stopped many of them from speaking out against the provision.

A handful of institutions have used that freedom to defy the provision, reimbursing students who have lost their aid or offering them special scholarships. Last year, three schools -- Hampshire College, Swarthmore College and Western Washington University -- took this step. In April, Yale University, where students have protested the law, joined in. Although no students at Yale have ever lost aid because of the drug provision, any who do in the future will be reimbursed from Yale's private scholarship funds. (The student does have to agree to participate in a drug rehab program, however.)

Yale spokesman Tom Conroy is careful not to condemn the law, pointing out that Yale historically has guaranteed all students the financial aid they require for their education, regardless of reason. Still, the provision's opponents are thrilled by Yale's move, and hope that if one Ivy League school takes action, the rest will fall in line. They are further delighted by the symbolism of defiance from Yale, alma mater of George W. Bush.

"Yale's decision is tremendously important for political reasons," says Boyd. "Because it is an elite institution, which many of our elected leaders actually attended, it sends the message that this is a law that is so fundamentally unfair that the universities are effectively opting out."

Still other universities have shown some interest in taking action against the measure. Students for Sensible Drug Policy (SSDP) held a national conference about the provision last year, explaining how financial aid officers could reimburse students who lost their aid. Administrators from 50 colleges, including Yale, attended. SSDP also has organized student action groups at more than 200 colleges around the nation, including strong chapters at Harvard and Wesleyan, to push their schools to follow Yale's lead.

Some schools, which have yet to lose a student due to the provision, appear to be waiting to see what happens (or if any other universities stick their neck out first). At Stanford University, one financial aid administrator explained that the university would consider a similar action "if the situation were to arise here."

In the meantime, there's a chance that the law could simply go away. In February 2001, Barney Frank introduced a bill to Congress that would repeal the law. Since then, 66 cosponsors from both parties have signed on. But as Frank's secretary Peter Kovar complains, "There's been no action on the bill, and frankly it's an uphill fight with the Republican administration."

Meanwhile, in December, Souder submitted an amendment to his bill that would restrict the disqualification of students for drug offenses to "those students who committed offenses while receiving student financial aid." The amendment is slowly working its way through the committee process, but it still doesn't address the question of how, exactly, the Department of Education would find those students in the first place. (Souder's press secretary Becker suggests that maybe there could be some kind of "reporting mechanism between legal agencies" that would track student drug convictions, but he's noticeably vague on the details.)

This piece of panicked legislation, like others that emerged during the war on drugs, is likely to be slow to change. Nearly 20 years after the national hysteria about crack, for example, we still have laws on the books that will put a person in jail for decades for possessing even the tiniest amount of crack, or for throwing a party where drugs are used. The Drug-Free Student Aid provision, which is so flawed that even the congressman who wrote it wants to change it, may not disappear any faster.

The nation's war on drugs has consumed vast amounts of funding -- for the criminal justice system, interdiction and heavy-handed propaganda designed to bring national drug use to a halt. But education, with its psychological and financial benefits, is widely acknowledged to be perhaps the best deterrent of all. Isn't it ironic, then, that this piece of legislation denies even a small amount of financial aid to those with troubled pasts who are now trying to improve their lives?

The proponents of the provision argue that public money shouldn't be going to students who are using drugs. By this backhanded reasoning, we must be satisfied with the fact that the money we save keeping kids out of college can be put to better use building jails, where these same kids, hopeless and unsupported, will eventually end up.

Saying No to Propaganda

This is your brain on drugs. Just say no. What's your anti-drug? D.A.R.E. to keep kids off drugs.

Billions have been spent on catchy slogans and flashy branding to make the rejection of drugs as appealing as the consumption of candy. But have the dollars devoted to educating, cajoling, pleading and frightening us away from drugs done the job? Even those who make the ads admit a limited return on this investment: Teenagers see anti-drug ads 2.7 times a week, according to the government's numbers. And yet 54 percent of all teens try drugs before they graduate from high school.

Propaganda from the War on Drugs was supplanted by dispatches from the War on Terrorism during the waning months of 2001. But last month, the Office of National Drug Control Policy (ONDCP) found a way to marry the two battles in its latest anti-drug campaign, which equates drug use with financing terrorists. At the same time, the Partnership for a Drug-Free America debuted its own ambitious anti-Ecstasy crusade entitled "Ecstasy: Where's the Love?"

This new offensive is fueled by serious money. Congress has allocated more than $1 billion for anti-drug advertising over the next five years; $180 million will be spent this year alone, and those are merely the quantifiable sums (untold amounts have been donated in free airtime and ad creation). Although advertising demands only a tiny portion of the government's total anti-drug budget, it's considered the cornerstone of the War on Drugs -- even though there is little proof that anti-drug ads really work. In fact, according to the newest research, there is evidence that some anti-drug ads don't work at all, and that others even (unintentionally) encourage drug use.

But the most vocal critics of the government's new anti-drug advertising haven't focused on the questionable efficacy of the ads. Instead, they have accused the Bush administration of using the War on Drugs to push a broad and moralistic political agenda, while overlooking community-based approaches to drug abuse. Rather than offering real solutions, they claim, the drug-terror campaign simply fans drug hysteria in the course of painting a new administration's face and philosophy on the War on Drugs.

Can an ad campaign that ostensibly seeks to warn teens away from drugs serve as political propaganda? Perhaps, if you subscribe to the idea that good advertising can sell anything to anyone. Would this matter if the ads in question, regardless of their political agenda, managed to make a dent in drug abuse? Maybe not. But so far, that appears to be the problem. Advertising can be used to create habits and sustain them, but, when it comes to drugs, it isn't necessarily an effective tool in snuffing them out.

Anti-drug propaganda, both government-funded and privately sponsored, has existed since the 1930s (think "Reefer Madness"), but it wasn't until cocaine -- and then, crack cocaine -- became a national epidemic that federally funded anti-drug advertising as we know it was born. Nancy Reagan launched the memorable "Just Say No" campaign in the 1980s, at the height of a cocaine "epidemic" that was galvanizing concerned parents and authorities; her "Just Say No" advertisements, bumper stickers and T-shirts were ubiquitous. Then, in 1987, a collective of advertising professionals created the Partnership for a Drug-Free America, hoping to do pro-bono work as a private contribution to the War on Drugs, and began peppering the airwaves with their own anti-drug advertising. The goal was to "decrease demand for drugs by changing societal attitudes which support, tolerate or condone drug use." The idea was to condition kids to reject drugs, using the same branding and market-testing principles that sell Crest toothpaste and Nike sneakers.

According to the 1979 National Household Survey on Drug Abuse, 34.4 percent of all American high school seniors reported having tried drugs, and 18.5 percent said they had done so in the last 30 days. By 1992, those figures had dropped to 17.9 percent and 6.6 percent, respectively. Believers in the power of anti-drug advertising invariably point to this impressive reduction in drug use as evidence that campaigns like "Just Say No" and those created by the Partnership for a Drug-Free America actually work. Then drug use began climbing again in the 1990s: By 1997, 11.4 percent of all high schoolers had done drugs in the last 30 days. The rise coincided with the waning of the anti-drug advertising movement, a parallel that proponents of the campaign also used as "proof" of its efficacy when lobbying Congress for new funds.

But as much as the precipitous fall and rise of drug use in the 1980s and 1990s looked like evidence of successful anti-drug advertising, some researchers are wary of directly connecting the two. Robert Hornik, a professor of communication at the University of Pennsylvania's Annenberg School and the researcher behind a new study of the effectiveness of anti-drug ads, says that there's a "possible correlation" between the ads and the statistics of this period, but that the drop in drug use could have had as much to do with any number of other factors: youth disillusionment with drugs as cocaine wreaked its havoc and ran its course, and a general nationwide furor that kept drugs in the public eye.

"There was much more noise in the environment about drugs during that period," Hornik says. "So the number of exposures someone would have had [to messages] about drugs was much more substantial."

When drug use again began to rise in the late 1990s, the Partnership for a Drug-Free America and the ONDCP renewed their efforts: They began working together, and in 1998 they launched the National Youth Anti-Drug Media Campaign. Congress apportioned some $1 billion to pay for advertising space for the ads produced by the two groups, and a media blitz flooded the nation with an assortment of anti-drug advertisements. Despite its association with the earlier drop in drug use, the "Just Say No" message was declared irrelevant: It was the message of a former administration, one long since eviscerated by both press and youth as the simplistic slogan of an exceedingly unhip First Lady. The government shifted gears and came up with a new series of approaches.

Although the ONDCP has been releasing its own anti-drug ad campaigns since the 1980s, the new National Youth Anti-Drug Media Campaign imposed a more regimented strategy on the group. Over the last four years, the ONDCP has released a series of "platform" advertisements: the "Negative Consequences" platform, for example, includes ads that depict kids getting in trouble when they do drugs; the "Resistance Skills" platform includes tips on how to say "no" to peer pressure; the "Parenting Skills" platform instructs parents to talk to their kids about drugs; the "Norm Education" platform sends the message that "the coolest kids don't do drugs." The main theme of the ONDCP's campaign has been "The Anti-Drug" brand, which extends across several platforms and instructs kids to find their own "anti-drug" (such as music or sports or a pet) to keep them straight.

When Bush appointed John Walters drug czar in May of last year, drug war watchdog groups anticipated the beginning of another guns-and-jails era for the ONDCP, with a greater emphasis on military and criminal punishments. Walters, a drug "hawk" who had served under William Bennett, was well known for his moral condemnation of drug use and his criticism of Clinton's drug war techniques. Although the War on Drugs dropped from the national agenda in the days after Sept. 11, it came rushing back in January with the ONDCP's first effort under Walters -- an ad campaign that managed to conflate moralism and nationalism with a heavy dose of guilt, and which immediately generated a flurry of both positive and outraged media coverage.

The new ads essentially warn drug users that when they buy drugs, they are funding terrorism. In the ads, a series of shrugging teens confess their culpability in a variety of ugly terrorist activities: "I helped a bomber get a fake passport. All the kids do it." The tagline: "Drug money supports terror. If you buy drugs, you might too." The terror-drug ads seemed to usher in a new philosophy of social guilt: Buying drugs isn't just bad for your body and your future, but it also makes you personally liable for politically motivated mayhem.

The drug-terrorism ads were "a definite departure" from the ONDCP's softer find-your-anti-drug campaigns, which sought to inspire or distract kids tempted by drugs, says ONDCP spokesperson Jennifer de Vallance. The new ads, she says, are representative of a new philosophy in the War on Drugs: "Forever people have said you shouldn't use drugs because it's bad for your body, bad for your brain, bad for your parents," says de Vallance. "These ads take a broader perspective." Trying to convince teens that drugs are bad for them was a losing battle, she adds. "Talking to teenagers is like talking to Olympian gods, because they see themselves as invulnerable. But they do appreciate the concept of social responsibility."

Bush personally described the ONDCP's strategy as ushering in a new "period of personal responsibility" -- moving away from "if it feels good do it" to an age of "morals." Explained the Office of National Drug Control Policy in a news release: "Americans must set norms that reaffirm the values of responsibility and good citizenship while dismissing the notion that drug use is consistent with individual freedom."

But critics have claimed that the ads are merely heavy-handed propaganda for the Bush administration's conservative agenda: By associating the War on Drugs with the popular War on Terrorism, they say, the administration hopes to curry support for its more militaristic approach to battling drug use. "There's a new troika driving U.S. drug policy -- Attorney General John Ashcroft, Asa Hutchinson [head of the DEA] and Walters," says Ethan Nadelmann, director of the Drug Policy Alliance, a nonprofit organization that advocates drug war reform and harm-reduction approaches to drug abuse. "All three of them are interested in drug policy primarily in terms of advancing a more reactionary political agenda in the U.S. They are making an effort to resuscitate the Bennett-politicized drug war of a decade ago."

(In the weeks following the release of the ads, Walters also announced a new plan to reduce national drug use by 25 percent, relying heavily on interdiction, criminal justice, and military approaches, with additional dollars going to specialized treatment programs. Meanwhile, the DEA staged two high-profile drug busts, including one on the controversial legalized cannabis clubs of San Francisco.)

Even those in the advertising industry concur that the drug-terror advertisements appear to have as much to do with maintaining support for the government's efforts as they do with actually reducing drug use. If the administration associates policy of any kind with the popular War on Terrorism, say veterans of the advertising industry, it is likely to maintain high approval ratings. As Mark DiMassimo, C.E.O. of DiMassimo Advertising (and the creator of a series of Ecstasy ads for the Partnership for a Drug-Free America), puts it, "This is wartime propaganda. It's sort of like going back to World War II and World War I when they related what you eat and don't eat -- whether you threw out leftover rice -- to the war effort."

But even as the debate rages over the nuances and approach of this campaign, new research shows that, regardless of their content or gimmick, anti-drug advertisements aren't necessarily making an impression on the audience they are meant to sway. The political propaganda behind the terror-drug ads would be forgivable, theoretically, if the ads were actually convincing vast numbers of American youth to steer clear of drugs. But judging by the most recent research on anti-drug advertising efficacy, the ONDCP may need to return to the drawing board.

It is possible, of course, that guilt about terrorism as a means of enforcing "social responsibility" will, in fact, cause drug usage to plummet dramatically. Maybe teens really will steel themselves with thoughts of Osama bin Laden the next time someone offers them a joint, and just say no. (Never mind the fact that the joint was more likely to come from Humboldt County than Afghanistan or Iraq.) But the experts aren't counting on it, partly because recent reports show that, in general, there's no concrete link between anti-drug propaganda and teen drug use rates.

According to de Vallance, the new terror-drug ads have been hugely successful -- both because of the buzz they've created (some 175 articles have been written about the campaign already), and because of the impact they supposedly have had on youth. "These ads, in focus group testing, had among the highest results of reducing intention to use that we've seen in the history of the campaign," says de Vallance, who reports that more than 70 percent of the focus group teens said the ads would deter them from trying drugs.

While encouraging, the focus group reports do not ensure that the drug-terror ads will work. In fact, it is quite possible that, in these days of fulsome anti-terror rhetoric, the focus group teens felt pressured to report that they wouldn't support terrorism by doing drugs.

When the National Youth Anti-Drug Media Campaign launched in 1998, a massive research effort launched with it: With more than $1 billion apportioned for anti-drug advertising, the stakes were high enough to initiate a process to establish whether the money was well spent. The effort was spearheaded by Westat, an independent research group in Maryland, with the University of Pennsylvania's Annenberg School conducting much of the actual research. Every six months, researchers visit some 8,000 kids and their parents in their homes to interview them about their personal drug use and the ads they've seen. (They are promised anonymity.) Three reports have been issued since the research began in September 1999; three more are still to come. In October of 2001, the researchers published their latest report assessing the cumulative effectiveness of all the new ads that had been issued by the National Youth Anti-Drug Media Campaign since its launch. The good news was that drug ads targeting parents often do encourage parents to talk to their kids about drugs. The bad news was that, thus far, the media campaign hadn't had a measurable impact on the kids at all.

The average kid is currently seeing an anti-drug ad 2.7 times a week, according to Robert Hornik, Annenberg School professor and the scientific director of the report. "We're seeing lots of reports of exposure," says Hornik. But "we haven't seen any real change over time, and no real association between exposure and outcomes." The kids are seeing the ads, but the exposure doesn't seem to have an immediate impact on their drug-use behavior.

Hornik warns that the October data represents only 18 months' worth of research, and that there will be three more reports: "It could be that it will take more time for the kids to be affected," he says. Still, Hornik's report isn't the only one with bad news for anti-drug advertisers: In the American Journal of Public Health, an unrelated group of University of Pennsylvania researchers also discovered that many of the approaches used by anti-drug ads are not only ineffective, but often even encourage kids to do drugs.

"Although there is some evidence that mass media campaigns can be successful, most studies evaluating mass media campaigns have found little or no effect," the report posits. The researchers selected 30 anti-drug advertisements created by the Partnership for a Drug-Free America in the last four years and showed them to 3,608 students in grades 5 through 12. Afterwards, they interviewed the students about their responses to the ads. The researchers broke down the ads into categories -- ads that focused on the negative consequences of drug use (i.e., "This is your brain on drugs"), ads that focused on self-esteem issues (i.e., "The anti-drug"), ads that stressed "Just say no," as well as celebrity testimonials; and a category of ads about the dangers of heroin or methamphetamines. They then used the students' responses to measure the overall efficacy of each approach.

The results were decidedly mixed. Researchers discovered that 16 ads seemed to be effective in discouraging drug use; but another eight ads had no measurable effect whatsoever, and six ads actually spurred the viewer to either want to go try the drugs, or feel less confident about how to reject them. Unfortunately, the ads that had the greatest impact on the viewers were the ones that scared kids away from heroin and methamphetamines -- drugs which most teens are not likely to try anyway. The least effective ads were the ones that addressed marijuana and "drugs in general" -- ironically, the drugs that most teens are doing in the first place.

As the report concluded, "it may be much more difficult to change young people's beliefs, attitudes and intentions regarding use of marijuana than use of 'harder drugs' ... The PSAs appear to have the biggest impact on those who seem to need them the least; or, those who most need to be influenced by these PSAs (i.e., those who do not view these risky behaviors as harmful or dangerous) are least likely to view the PSAs as effective." In other words, the kids who are already prone to try drugs aren't going to be discouraged by what they see in the ads; and the kids who wouldn't try them anyway are going to be most affected.

The Partnership for a Drug-Free America acknowledges the results of the study, but has no plans to change its approach. In general, says Steve Dnistrian, the executive vice president of the Partnership for a Drug-Free America, it's difficult to find concrete evidence that advertising does or doesn't work -- to draw a direct line from cause (advertisement) to effect (purchase or, in the case of drugs, the lack thereof).

"There is no perfect way to measure advertising effectiveness," he says. "These [research results] are numbers we would take on any day of the week; in our mind, this is a very, very strong case to be made for the effectiveness of these ads. It also points to the issue that we've known for a long time -- no single ad will do the trick, which is why you need multiple ads and multiple strategies."

Dnistrian does have a point: Critical as many people are of anti-drug campaigns, it's difficult to advocate that they be removed from the airwaves entirely. Even if the ads aren't individually effective, they keep the issue of drugs in the public dialogue. And during those serendipitous times when anti-drug ads dovetail with national alarm over a topic -- the influence of "Big Tobacco," or the sudden widespread use of crack -- they likely inspire a broad, if brief, disgust with all drugs. But even if anti-drug campaigns succeed in keeping drugs in the public consciousness, there is a nagging issue, exposed in the research: some ads are so bad that they alienate their intended audience. DiMassimo says the ONDCP's ads are particularly egregious, at least from an advertising executive's point of view: "The ONDCP generates long lists of approved messaging: Much of it comes out in the clunky language of social scientists, and it is a source of amusement and consternation among the creative people and communication professionals who make up the Partnership."

The various advertising agencies that contribute to the Partnership's campaign tend to use traditional tools in creating their ads. DiMassimo describes this as "going to hang out with teens, learn about them, and then coming back with details in their language, like a cultural anthropologist." This type of saturation research works much of the time, he says, though he admits that ad industry veterans who have used the approach to make anti-drug ads have often missed the mark as well.

Based on his own experience advertising to kids, DiMassimo believes that ads that try to be "cool" are the ones that will be received most skeptically -- for example, the clunky series of ads that educated teenagers on how to say "no" to the drugged-out "cool" kids who hang out at "hip" parties. The ads appeared to have been made by out-of-touch authorities who have no idea how kids dress, talk or dance.

The biggest mistake, says DiMassimo, is when the ads "overstate the danger" of drugs. "Kids believe anti-drug people are stiff, uptight, overnervous parental-type figures, and when you overdo it you play into that side of the brand," he says. Kids know perfectly well that drugs are fun, he says, and there is little point in trying to tell them otherwise, à la "Reefer Madness." He describes the best kind of ad as a cost-benefit analysis: "The Partnership's work on marijuana is understated -- we say that no one says pot will kill you, but that there are better things to be than a burnout." He uses the ONDCP's terrorism ads as an example of the worst kind of authoritarian browbeating of teens and believes most kids will know that the ads are overstated.

Still, DiMassimo's own campaign -- the Partnership's ambitious new anti-Ecstasy initiative -- could be accused of overstatement. Twenty-seven people out of an estimated 3.4 million who used Ecstasy between 1994 and 1999 died under the influence of the drug; yet the new campaign chooses to focus on the death of one young woman as a warning against using the drug. You could say that the ads are merely focusing on the worst-case scenario, but kids who are aware of just how rare Ecstasy deaths are might simply reject the ads wholesale as authoritarian exaggeration. Other anti-Ecstasy ads are equally dramatic, depicting teens partying it up on E while their friend lies passed out and alone in the bathroom, under the tagline "Ecstasy: Where's the Love?"

Critics of anti-drug advertising who follow this research wonder whether ads that try to discourage kids from doing drugs aren't mostly futile. They often insist that the money would be better spent addressing kids who do drugs and need help dealing with their addiction. "Everything the ONDCP and Partnership does is focused on 'Just Say No,' mostly scare tactics, and occasionally a positive message about why you shouldn't choose drugs," says Nadelmann. "We think you should do messages directed at young people who are already experimenting or doing drugs, aimed at keeping them out of trouble." He notes: "Surveys show that campaigns directed at getting people to not do things are the least effective."

Drug war reformers like Nadelmann and David Borden, executive director of the liberal Drug Reform Coordination Network, tend to support peer education programs and harm-reduction principles over blanket advertising (and, similarly, they prefer legalization or treatment to expensive interdiction). "You have to meet people where they are. Every young person is in a different place, so the programs that will work the best are the ones that are run by or with their peers," says Borden. "You can't do that by running ads during the Super Bowl."

It will take months, even years, to know whether the new anti-drug campaign has an impact on drug use, although Walters has promised that these efforts and others will reduce drug use by 25 percent by 2007. It is a bold commitment given that the ingredients of effective anti-drug advertising remain something of a mystery; and since youthful tastes are as flighty as the videos on MTV, they probably will remain so for quite some time. But there is also little evidence to suggest that Walters would get better results if he moved his $180 million ad dollars to peer education programs and harm reduction groups. The terror-drug ads are perhaps best viewed as a public relations machine for the Bush administration, summing up in a few words (and a lot of taxpayer money) the government's moral philosophies, the way "Just Say No" summed up the Reagan era.

Government drug propaganda is just that: propaganda veiled as a behavior modification tool. It seems that no number of simplistic, catchy anti-drug slogans can fully shape America's convoluted and varied attitude towards drugs. Even certain Bush family members have been known to stray, and surely Bush Senior told them all about "Just Say No." Perhaps some Americans will always have an appetite for drugs, and no remedy -- advertising, interdiction, education or criminal punishment -- will ever eradicate it.

Janelle Brown is a senior writer for Salon Technology. This article was originally published on Salon.com. Reprinted with permission.

The Music Revolution Will Not Be Digitized

Once upon a time, a revolution brewed. Righteous artists, technologists and youthful entrepreneurs launched digital music start-ups, determined to take power away from the conglomerates that controlled the recording industry and deliver it into the hands of the little people.

The dream was everywhere: Artists would use the Net to connect directly with fans and everyone would escape the tyranny of record labels and onerous contracts and overpriced CDs. Music would flow like water, and herringbone-suited executives in Hollywood offices would gnash their teeth as they finally received their comeuppance.

In those glory days, the Net gave birth to start-up after start-up dedicated to the proposition that online music had a brilliant future. Emusic.com, Napster, Nullsoft, MP3.com, SonicNet, Scour, IUMA, and dozens of other Web sites offered bold promises of how they would use the new medium to reboot the entire music industry. No doubt, much of the rhetoric was little more than marketing hype designed to give shaky start-ups a bit of power-to-the-people marketing cred, but for consumers weary of Top-40 radio and CD price-gouging, the vision was exhilarating.

Five years after it all started, the revolution is nowhere to be seen. The record labels, once railed against by those impertinent start-ups, now own their former enemies. Fiercely independent Internet companies have been picked off one by one by the same media conglomerates they once saw themselves as alternatives to. Through a brutal combination of business savvy, legal warfare and simple cartel power, the Big Five record labels have maneuvered the digital distribution industry into their control.

The process of consolidation and legal annihilation has been going on for years, but the month of May witnessed an impressive flurry of activity. Vivendi Universal purchased MP3.com. Bertelsmann bought Myplay.com. Encouraged by its success at mortally wounding Napster, the Recording Industry Association of America (RIAA) filed lawsuits against Aimster, a file-sharing utility that works with instant messaging software, and Launch, an Internet music site with impressive personalization capabilities. In the months previous, independent companies fell like dominoes: eMusic, Scour, IUMA, SonicNet, Musicbank, CDNow -- those that haven't been bought by their competitors have gone bankrupt or been forced to lay off virtually their entire staffs.

Selling out, it should be noted, isn't always a disaster; it's a time-proven "exit strategy" and often an explicit goal for software-related start-ups. It's also hard to know how many tears should be shed for companies like MP3.com and Napster; they were, from the beginning, just as greedy for profits as the record labels they lambasted. And it's difficult to argue with the reality that the labels do own the music so coveted by consumers and start-up entrepreneurs; it was always inevitable that they would fight to wrest back control of their content.

But what about all those consumers who bought into the revolutionary rhetoric and spent the last few years expectantly thrilled about the promise of digital technology: do-it-yourself radio, subscription services with all the world's music at your fingertips (the so-called celestial jukebox), personalized interactive streaming radio stations, and file-sharing services that introduced you to independent music you'd actually like? It's certainly not a victory for them that the big labels are taking control of online distribution. The recording industry's vision of the future is one in which we will all be paying $2.50 for every digital single we download or $.25 for every streaming song we hear -- and you'd better forget about ever swapping those MP3s with your friends or, God forbid, an All-Metallica-All-The-Time radio station accessible through your Web browser. Innovation is being sledge-hammered out of existence by legal threats and buyouts. It's all about control -- and right now, consumers are set to lose what little gains the Internet offered them.

The news isn't all bad for online music lovers. The indie spirit of the MP3 revolution is not entirely dead. Use of Gnutella, a distributed file-sharing program that isn't tied to a single commercial company, is skyrocketing in Napster's wake, and smaller music companies continue to burst forth with interesting ideas and amazing technologies. But the trend of events in the industry warrants caution. Gnutella's success will just make it a bigger target for an ever-more-confident recording industry, and any other company that raises its head high enough will also likely provoke a severe reaction. Meanwhile, the war is all but over for the original start-up guerrilla warriors -- those so bold as to think they could cash in on a new medium without paying a price to the old regime.

Who would have imagined it? MP3.com CEO Michael Robertson, a roguish gadfly who was once happiest when delivering sermons against the evils of the recording industry, is now about to start working for his formerly avowed enemy.

"It is a little on the crazy side, and dripping with irony," says Robertson. "But I guess that's what makes the Internet so darn interesting."

When did everything start falling apart? Historians will no doubt pick over the remains of the early days of the Internet for decades to come, but it's probably fair to say that the success of Napster signaled the beginning of the end. Napster singlehandedly turned millions of consumers on to the world of MP3s. Before Napster, MP3 usage was steadily rising but still far from widespread, since mainstream music was hard to find. The advent of free, all-you-can-eat music changed that forever: with a few clicks, you could access the world's music, anywhere, anytime.

Napster got the recording industry's attention. Previously, the labels eyed the Net warily -- releasing the occasional downloadable digital single, chasing down the wayward MP3 pirate, lobbying Congress for strengthened copyright laws. But after an abortive start -- the RIAA's first online music-related lawsuit, aimed at stopping Diamond Multimedia's Rio MP3 player in 1998, failed -- the industry kicked into action. In December 1999, the RIAA charged Napster with copyright infringement.

The courts, it would soon become clear, were the recording industry's preferred method for dealing with the upstart digital music industry. Mere weeks after Napster received its summons from the RIAA, a similar lawsuit was filed against MP3.com. The online MP3 search engine MP3Board.com came next, followed by Scour, a rival P2P service backed in part by ex über-agent Michael Ovitz.

In retrospect, the money spent by the RIAA and the recording labels appears well spent. Scour.com shut down its file-sharing service and sold the remains of its assets to a joystick company called CenterSpan (it has yet to relaunch). Napster sought cover by selling a controlling stake in itself to Bertelsmann -- usage of the service, according to the online entertainment news site Webnoize, has plummeted at least 25 percent.

MP3.com settled with the Big Five labels for an estimated $160 million. Post-lawsuit, MP3.com still had some cash in the bank, and saw profit potential in increased fees from its 150,000 artists (including the cost of on-demand CDs and a subscription fee for artists who wanted to participate in the Payback for Playback program), but the battered pioneer of the MP3 movement finally gave up on the idea of going it alone. On May 20, Michael Robertson sold the company to Universal, his former adversary, for $372 million.

The recording industry's approach to the digital music business appears to have been to wallop the competition with lawsuits until they gave up -- and then pick up the bruised remains to use to their own advantage. "The music industry looked at legal maneuvers as simply a business strategy," says Robertson. "And, quite frankly, years of lobbying have helped them construct a labyrinth of laws and rules and complexities and advantages. When you look across the digital music space, they've outmaneuvered Scour, Napster, MP3.com -- I don't think we're the only company that has suffered from being sensitive to legal maneuvers."

Even those start-ups that did manage to evade lawsuits haven't done well. Launching revolutions turns out not to be all that cheap. Once the venture capital was gone and the stock price was under water, many impoverished music start-ups had to shut down or sell out. Emusic.com? Sold to Universal. Sonicnet? Bought by MTV, and subsequently dismantled by layoffs. IUMA? Bought by Emusic, and eventually shut down (although its remains are being revived by Vitaminic). Musicbank? Closed before it even opened. MyPlay.com? Purchased by Bertelsmann, where it joins CDNow and Napster. And those companies that can still boast of independence, such as ArtistDirect or Launch or Listen.com, are bleeding staffers, pinching pennies and teetering on the verge of being delisted from NASDAQ.

What does it all mean? It'll cost you, big, to have a new idea in the entertainment distribution business.

"There is right now a climate of oppression among inventors, who are unable to market, fund or even freely distribute their work," complains Johnny Deep, the founder of Aimster. "As [RIAA head] Hilary Rosen has said, quoted by Larry Lessig, 'unless we approve, your idea will not be permitted. It will not be allowed.'"

Without recording industry support and music licenses, distribution platforms like MyMP3.com or Napster or Launch have no major artists to (legally) distribute and therefore, no mainstream customers. The record labels have been notoriously stingy with those licenses; and even when they do grant the rights to their music -- for example, in the case of the LaunchCast radio station -- the labels are quick to employ the industry-friendly Digital Millennium Copyright Act to micromanage exactly how the music is listened to.

"There is no place for a small company to pull off a monster vision in digital music," says Robertson. "If you're making a tiny widget that's a bolt-on feature for listening to music, fine -- that can be a small company. But if you want to be the grand vision, the place where everyone stores their music and listens to it wherever they go, that's a very big undertaking and a small company simply cannot do that. What you're witnessing on the digital music front is that all the small to medium companies are going away. The window of opportunity is over."

With the promising early dot-com music companies falling by the wayside, who is stepping into the void? Naturally, the record labels, now setting themselves up as online distributors. Not only are individual record labels like AOL Time Warner, Bertelsmann and Universal scrambling to set up their own Napster-like services, distributing their own music on P2P platforms, but the entire recording industry is now dividing into two larger camps for distributing music licenses. The upcoming MusicNet and Duet platforms are supposed to step in where MP3.com and Napster are being forced to step aside. MusicNet (a partnership of AOL Time Warner, EMI, Bertelsmann and Real) and Duet (Universal, Sony and Yahoo) are both setting themselves up as music-licensing platforms that will sell the labels' catalogs for subscription services -- of course, available only in their chosen music formats, and only to carefully approved music services, and only in low-quality streams.

"If you are a small player, you're kind of locked out of the game now," Robertson says, explaining his decision to sell to Universal: "It's shaping up to be a two-horse race: MusicNet and Duet, with Duet powered by MP3.com."

The power, then, is consolidated squarely back in the hands of the same record industry executives that held the reins before. Everyone with a good idea that doesn't fit into what the music moguls have already deemed appropriate is out of luck. That personalized radio station will be shut down, that peer-to-peer network will be decimated before it even has a chance to offer a subscription plan, prices for music downloads will be set sky-high, and new music-exchange services will contain only limited catalogs.

The loser in this equation is, of course, the customer, as the pre-Internet status quo of high-priced CDs and generic radio playlists is simply replicated for the digital age.

The digital music start-ups were hardly saints or true freedom fighters by any measure -- many were plagued by greed and blithe disregard for the rights of artists. But they were all focused almost entirely on customer-friendly innovation -- personalization, portability, interactivity, access to hard-to-find tracks, exposure to new music. Wooing customers was a requirement for the start-ups.

It's possible that in the long run the recording industry will pull through for customers -- that one or all of these upcoming, label-endorsed competing music distribution services will offer significant catalogs of music at reasonable prices, and that unique online radio stations will blossom once licensing issues are ironed out. It's also possible that Congress, which is currently reviewing the activities of the record companies and debating the merits of the Digital Millennium Copyright Act, will step in on behalf of the start-ups.

Digital music is not dead. There are hundreds of companies that are still innovating or offering new ways to listen to music -- whether online radio stations like those at Live365.com, or small independent MP3 download sites like Epitonic, or groovy technologies and add-on applications like Kick.com. And statistics for Gnutella are through the roof -- according to Clip2, which monitors the P2P service, Gnutella use grew 4 percent in the last week of May alone. Since March 4, Gnutella traffic has risen 400 percent, thanks in part to user-friendly applications like BearShare and the frantic work of programmers who have shored up the technology's weakest links. For every Britney Spears song you can no longer find on Napster, you can find 10 copies on Gnutella.

The prospect of anyone making money off of digital music other than the recording industry that is so successfully defeating the online rebels is a different story. There is no middle ground. The demise of digital music dot-coms points to a future in which the music industry is utterly split between for-profit, label-controlled services, and decentralized distribution technologies designed to evade and circumvent the authorities. How that story will play out is impossible to say yet. Gnutella may eventually succumb to the might of the RIAA, which is already making noises about targeting software developers, ISPs and individual users of the network with lawsuits.

But the Internet itself is fundamentally about making distribution of content easier. If Gnutella falls, another Gnutella-like structure will rise. So far, hackers have always found another workaround. There will always be music-loving programmers who want to continue to innovate on consumers' behalf, with or without the approval of the RIAA; and as long as the record industry continues to exploit consumers and artists, that's exactly what those programmers will do.

The collapse of the independent digital music industry brings us back to the beginning, back to the truly do-it-yourself indie roots of the Net's earliest days. Collective projects that are free from any corporate ties are still flourishing, and small companies with nifty ideas lurk on the fringes. The major record labels, in turn, will do what they've always done: They'll take advantage of their newly acquired Internet start-ups to develop music services designed to reap even greater profits for an already profitable industry.

Could it ever really have been any different?

Janelle Brown is a senior writer for Salon, where this article first appeared.
