Brian Awehali

Life After Corporate Death Care

It has been a bad few years for the corporate death care or "after-death" industry, and people aren't dying fast enough or expensively enough to fix the problem. As traditional religious death rituals have given way to more secular alternatives, a consumer revolt against the high cost of dying in America is well underway.

After more than a decade in which corporate death care providers aggressively sought to expand their market share, particularly in communities of color, to run funeral homes and crematoriums like stealth franchises, and to introduce concepts like "branding" into the death care mix, they're now clearly in retreat. Several major providers have filed for bankruptcy, while others have faced serious legal troubles related to their business practices.

The dramatic contrast between the present and the recent past is evident in a March 1998 U.S. News & World Report article, which reported that funeral prices had risen three times faster than the cost of living over the preceding five years; that the "Big Three" death care businesses -- Service Corporation International (SCI), The Loewen Group and Stewart Enterprises -- owned 15 percent of the country's 23,000 funeral homes and handled one in every five funerals; and that they enjoyed average profit margins approaching 25 percent. The same article put the average cost of a funeral in the United States in 1998 at $8,000.

Houston-based industry leader SCI saw its revenues drop from $3.3 billion in 1999 to $2.2 billion in 2002, and lost more than $1 billion during that time period. In 2000, SCI owned 3,382 funeral homes. As of April of this year, that number had dropped to 2,393. The San Antonio Business Journal reports that SCI "faces stiff competition from a growing number of independent funeral homes," most of which, the article continues, were once owned by SCI. "In the past few years, many of those independents were able to repurchase their autonomy... Now these independents are aggressively bleeding revenues from [SCI]."

Canada-based Loewen Group, which deliberately targeted black-owned funeral homes for acquisition, bought up more than 340 properties between 1996 and 1998, and reported an operating profit of nearly 58 percent in 1997. After studying federal census and crime statistics, the Loewen Group apparently concluded that higher mortality rates -- coupled with a cultural preference for high-markup burials -- made funeral homes in black communities attractive properties. In addition to buying up large numbers of funeral homes in black and Latino communities, the Loewen Group went still further, making a deal with the National Baptist Convention USA -- the largest black organization of churches in the U.S. -- to appoint two Loewen-trained "funeral counselors" to every congregation. These "counselors" sold graves, tombstones, vaults and other Loewen Group services to congregants for a 10 percent commission.

Loewen declared bankruptcy in May of 1999, but has since restructured and now operates as The Alderwoods Group, albeit on a far smaller scale. The decline of the Loewen Group was precipitated in part by two judgments against the company -- one for $150 million and the other for $50 million -- for unfair pricing.

SCI and the Loewen Group represent two of the more dramatic examples of the corporate death care industry's decline, but across the board, corporations attempting to turn a profit on death have seen their incomes and market shares dwindle.

So what happened to an industry once considered recession- and inflation-proof?

What Went Right

The best answer is that the death care industry, long one of the most ethnically diverse and economically stable sectors of the economy, simply proved to be incompatible with corporate values and business practices.

Despite the best efforts of companies like SCI to provide "value to families at their time of need or on a prearrangement basis" while "blazing new trails through corporate innovations and revolutionary services in quality, value and care," a clear shift is taking place in the way Americans deal with -- and pay for -- death and grieving.

"Funeral service is very personal and doesn't lend itself to a standardized cookie-cutter approach," says Bill Isokait, Director of Advocacy for the National Funeral Director's Association (NFDA). "Most funeral home owners have been in business for over 60 years as family-owned businesses. They know most of the people in their communities... There are different cultures, ethnicities, different customs, which these owners are more responsive to."

According to Isokait, the corporate consolidation of the death care industry has stopped. "[Death care] corporations like SCI, Stewart Enterprises and the Loewen Group were anticipating, with the aging of the baby boomers, an increase in the death rate. They saw a market opportunity which didn't materialize... death rates have been steady over the past ten years, and that consolidation has begun to reverse itself... it's down now to about 9 or 10 percent."

"Consolidation has reached a point of saturation," agrees Ron Hast, publisher of the industry publications Mortuary Management Monthly and Funeral Monitor. "None of [the corporate providers] have brought anything better to the community. In fact, they're generally known for higher prices and certainly nothing of any higher standard than has been delivered by private ownership. One would think that the economics of scale would bring lower prices, but it has done just the opposite. In order to carry the burden of the price they paid to acquire a business, plus the cost of middle and corporate management, they've had a difficult time."

Simplicity and cremation are the two most significant trends in death care today, according to Hast. "Those two factors have a strong influence on what people choose. In the past, it was taken for granted that there would be a visitation of the body, typically a religious ceremony, and a procession to the grave. This has changed noticeably, particularly in the coastal regions. Many funeral directors now are serving people for simplicity instead of tradition."

"There are more personalized funeral services," says Bill Isokait, "with an emphasis on the celebration of the life of the deceased, their interests, their hobbies. The rise in the cremation rate, as opposed to traditional burials, is anywhere as high as 55 or 60 percent out west and 30 percent nationally, creeping up every year."

When asked about the economic and religious implications of this trend, Isokait says he sees less and less religious and cultural reluctance to cremate. "Many people these days don't have a religious affiliation at all. You have Catholics who at one time did not approve [of cremation] and that's changed. It's an increasingly viable alternative -- the scattering of ashes to reflect the life and interests of the deceased."

Still Dearly Beloved

One interesting aspect of the rise and decline of the corporate death care industry is how people came to entrust the final care and disposition of their loved ones to businesses in the first place. Certainly, it wasn't always this way.

Jerri Lyons, the founding director of Final Passages, a non-profit program in Sonoma County that educates people about their rights to home funeral arrangements, has some insights on the subject.

"It's similar to births. We've institutionalized many things in this country -- we used to do home schooling, home births, and home weddings. Service businesses have grown up in this country and we've turned things over to other people to handle. We started taking less time to really celebrate the passages in our lives. We hire people to help us do it all and they've kind of gotten bigger -- that's what happened here -- weddings got bigger, people fly now from place to place to attend things and made it harder for people to handle all of their own arrangements. Eventually it evolved to where we got more and more help.

"But then at a certain point," she continues, "greed took over and people being in that emotional state at that time makes them all the more vulnerable to people wanting to make more money off of the situation."

Lyons is one of three women interviewed in "A Family Undertaking," an upcoming PBS documentary that profiles several families who chose to forgo the typical mortuary funeral and instead care for their loved ones at home. The stealth practices of death care corporations, she says, are something she makes a point of discussing with people.

"I do my best to educate people at my workshops and presentations, making them aware that corporations are buying up independent funeral homes all over the place, and that they don't change the name. Most people aren't even aware of it. One of the easiest examples to give is the Neptune Society, because most everyone has associated them with low-cost funerals and cremation -- they were one of the first crematoriums. People don't realize that they've been bought out by Stewart Enterprises, and are one of the highest cost crematoriums. Their starting cost is around $1500 for direct cremation. When people learn this, they seem completely unaware that there are other services in the area besides Neptune Societies."

When asked for estimates on low-cost, alternative cremation arrangements, Lyons says they can run as low as $300. (She adds that this is the cost for cremation alone and does not include fees associated with filing paperwork or any additional services.)

Compared to the average cost of a traditional burial in the United States -- roughly $6000 in 2001 -- it's not hard to see why the rising popularity of cremation and simple ceremonies honoring the life of the deceased has corporations dying to get out of the business.

David and Goliath in Indian Country

"I've never seen more egregious misconduct by the federal government," said Royce C. Lamberth, the federal judge overseeing the Individual Indian Trust (IIT) reform case, which stands as the largest class action lawsuit ever filed against the U.S. federal government. His comments were elicited in 1999 when the Department of the Interior and Treasury Department announced they had "inadvertently" destroyed 162 boxes of vital trust records during the course of the trial, then waited months to notify the court of the "accident."

Filed more than seven years ago by Elouise Cobell, a member of the Blackfoot tribe, and her legal team, and representing more than 500,000 individual Native American landholders, the Indian Trust case -- also known as Cobell vs. Norton -- is easily one of the biggest stories of government mismanagement and malfeasance in modern U.S. history. The case has turned into such a nightmare for the Department of the Interior (DOI) and the Treasury that the government now has more than 100 lawyers assigned to it -- more than it employed in the Microsoft antitrust litigation.

At stake is over $100 billion rightfully belonging to the nation's most impoverished people. The Interior, the Bureau of Indian Affairs (BIA) and the Treasury say they have simply "lost" the money and claim they cannot now provide an accurate accounting of how much is owed to whom. In the course of the lawsuit, the federal government has destroyed vital accounting documents, filed false reports to the court, and generally conducted itself in such bad faith that 37 past and present government officials, including current Secretary of the Interior Gale Norton and former Secretary Bruce Babbitt, have been held in contempt of court.

During the first of two trial phases, the government attempted to limit its liability, but suffered a resounding defeat when Judge Lamberth ordered the Interior Department to conduct a full accounting of Indian trust assets -- dating all the way back to 1887 -- and imposed strict liability on government trustees.

"This is a landmark victory," Cobell said of Judge Lamberth's Phase One decision. "It is now clear that trust law and trust standards fully govern the management of the Individual Indian Trust and that Secretary Norton can no longer ignore the trust duties that she owes to 500,000 individual Indian trust beneficiaries."

Unlikely Warrior

Elouise Cobell lives in Browning, Mont., where she founded and helps run the first tribal-owned bank in the country. She also served as treasurer of the Blackfoot tribe from 1970 through 1983.

Growing up, Cobell's family had no plumbing, electricity or running water. She remembers family members often complaining about sporadic or suspiciously small checks from the government for the land given to Cobell's grandmother during "allotment" at the turn of the century -- land that is now leased out by the Department of the Interior and the BIA to ranchers and timber companies.

Allotment was the policy ushered in by the Dawes Act in 1887 in an attempt to break down tribal structures and reduce Native American land holdings. The heads of tribal households were given up to 320 acres of land, but were forbidden actual ownership; instead, the government forced a trust relationship, in which the Interior and the BIA would oversee the land and disburse all revenue from its use to the individual landholders.

The Act also allowed for the sale of "surplus" land, a provision white settlers exploited fully in their purchase of around 90 million acres -- almost 65 percent of all Native American lands -- by 1932. The government stated during this period that this trust relationship was necessary because it did not believe Indians were capable of managing their own affairs.

Yet it's difficult to conceive of how anyone could have done a worse job managing the trust.

After discovering a number of anomalies in the Blackfoot tribe's trust, including an account drawing negative interest, as well as a host of unauthorized transactions, Cobell began asking questions. She claims that her initial questions to the BIA were met with derision. "They said, 'Oh! you don't know how to read the reports,'" Cobell recalls. "I think they were trying to embarrass me, but it did the opposite -- it made me mad."

Steadily more resolved, Cobell caught a break when a deputy commissioner for Indian Affairs in the first Bush administration, David Matheson, arranged for her to meet with several well-connected government officials and banking experts. Among the group was Dennis Gingold, who would become lead attorney for the plaintiffs. Of his first meeting with Cobell, Gingold told the Los Angeles Times, "From my experience, American Indians were not involved in banking. I was looking for a bunch of people with turbans."

What Gingold came to learn about the case astonished him. Each year more than $500 million comes into the Individual Indian Trust from companies leasing Native American land for grazing, oil, timber, coal and other natural resources. The money is collected by the Interior and sent to the Treasury, where it is then theoretically deposited into individual trust accounts.

The problem is, the Interior and the BIA are widely regarded as the worst-run agencies in the entire federal government. Fifty thousand trust accounts lack names or proper addresses. One such account has $1 million in it.

In 1992, the House Committee on Government Operations issued a damning report entitled "Misplaced Trust: The Bureau of Indian Affairs' Mismanagement of the Indian Trust Fund." Two years later, Congress passed the Indian Trust Fund Management Reform Act, which provided for a trustee to oversee the reform process. The first appointee was Paul Homan, who brought to the job a reputation for cleaning up troubled financial institutions: he had been CEO of four problem banks and executive vice president of another. No shrinking violet, Homan eventually quit in disgust, claiming he received no cooperation from the Interior, the BIA, or then-Interior Secretary Babbitt of the Clinton administration.

Before he left, Homan reported that it was impossible to ascertain how many people were owed money. Of the 238,000 individual trusts Homan's crew located, 118,000 were missing crucial papers, 50,000 had no addresses, and 16,000 accounts had no documents at all. Homan further reported that one could assume money had been skimmed extensively from the trust: "It's akin to leaving the vault door open."

Forced To Sue

Faced with the prospect of a wide-open vault door the government seemed to have no interest in closing, Cobell was eventually compelled to file a lawsuit on June 10, 1996, demanding a full accounting of all IIT monies.

In the intervening seven years, Cobell's team has piled up an impressive series of victories. In February 1999 Judge Lamberth held Babbitt, Treasury Secretary Robert Rubin, and Asst. Interior Secretary Kevin Gover in contempt of court. In August 1999, Lamberth ordered the Treasury to pay $600,000 in fines for misconduct.

And on Dec. 21 of that same year, the judge issued his Trial One opinion (the case is divided into two phases), wherein he found that the government had breached its trust responsibilities to Native Americans and ordered the government to file quarterly reports detailing its reform efforts. Lamberth also retained jurisdiction over the reform for a period of five years.

During the first phase, the Interior seems to have bungled things in every conceivable way. The Senate Governmental Affairs Committee cited the Interior's handling of the Indian trust as one of its "Ten Worst" examples of federal government mismanagement. It also came to light, through a report filed by court-appointed Special Master Alan Balaran, that Interior and Justice Department lawyers were destroying emails even as they assured the court those emails were being preserved.

The second phase of the trial, to determine the sums owed to Indian trustees, is now underway. As the suit nears its conclusion, the federal government has tried every means available, ethical or otherwise, to derail or minimize the imminent settlement.

On Sept. 29, 2003, Special Master Balaran filed a site visit report after being ordered to vacate the premises of the Dallas office of the Minerals Management Service. In the report, Balaran cited "chaotic document management, an inability to locate audit files...and the unexplained presence of an industrial shredder."

The destruction of vital documents has continued and has been cited repeatedly in reports to the court. Numerous experts have testified that a true accounting of the trust based on existing records is impossible. As a result, the plaintiffs have submitted an accounting plan employing a satellite mapping technology known as Geographic Information Systems, or GIS, to estimate how much money individual Native Americans should have received from oil leases on their lands. Gingold explains that such a methodology becomes necessary once the records needed for a conventional accounting have been destroyed.

Using detailed production records from every well drilled in the West, the plaintiffs can determine how many of those wells sit on Indian reservations -- and, from there, estimate the revenue those wells would have generated through government-managed leases. The mapping approach also includes methods for calculating revenue from timber, grazing and mineral leases on Native American lands in the West.
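In rough outline, the estimate is a spatial join followed by simple multiplication: flag the wells whose coordinates fall inside reservation boundaries, then apply production volumes, prices and royalty rates to the flagged wells. The following is only a minimal sketch of that logic in Python, using the shapely geometry library; the boundary, well records, oil price and royalty rate are invented placeholders, not figures from the case.

    # A toy version of a GIS-style estimate: a point-in-polygon test to
    # find wells on reservation land, then an estimate of royalties owed.
    # All data below is hypothetical and for illustration only.
    from shapely.geometry import Point, Polygon

    # Hypothetical reservation boundary as (longitude, latitude) corners.
    reservation = Polygon([(-113.0, 48.0), (-112.0, 48.0),
                           (-112.0, 49.0), (-113.0, 49.0)])

    # Hypothetical well records: (longitude, latitude, barrels per year).
    wells = [
        (-112.5, 48.5, 120_000),   # falls inside the boundary
        (-111.0, 47.0, 250_000),   # falls outside the boundary
    ]

    OIL_PRICE = 25.0      # assumed dollars per barrel
    ROYALTY_RATE = 0.125  # assumed one-eighth royalty, a common lease term

    # Sum estimated royalties for the wells located on reservation land.
    owed = sum(
        barrels * OIL_PRICE * ROYALTY_RATE
        for lon, lat, barrels in wells
        if reservation.contains(Point(lon, lat))
    )
    print(f"Estimated annual royalties owed: ${owed:,.0f}")  # $375,000

A real accounting would layer in decades of production histories, shifting prices and varying lease terms, but the core of the plaintiffs' method is this same overlay of well locations onto land records.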

Divide And Conquer?

Trust reform is taking a heavy toll on the national treasury. The administration has requested $554 million in the 2004 budget to reform the trust fund, an increase of $183.3 million over the $370.2 million set aside in 2003. In a recent letter, Sen. Ben Nighthorse Campbell (R-Colo.) of the Indian Affairs Committee urged a speedy settlement and predicted that Congress would soon intervene to negotiate one itself if the suit was not resolved.

Support for Cobell's efforts in the Native American community is far from universal, however, and this may prove the greatest remaining obstacle to a fair and final settlement. Because part of her team's strategy involves removing trust responsibilities from the BIA and Interior, some worry that this could provide grounds for terminating the government's trust relationship with tribes that depend on federal funding.

In March, five tribal chairmen published an article in Indian Country Today, the nation's leading Native American newspaper, alleging that the Cobell suit was employing "scorched-earth" tactics, and that "an attorney for the plaintiffs has publicly stated that the Cobell suit has the potential to destroy tribal governments."

The chairmen's concerns seem to hinge on three main assertions: that the tactics of the Cobell legal team are akin to "warfare"; that the Cobell team has not pursued diplomacy and negotiation to the fullest extent; and that, in requesting a third-party receiver to resolve the trust's problems (taking it away from the Interior and BIA) without first consulting tribes, Cobell and Co. demonstrated a disregard for tribal governments.

A reply written a week later by Cobell and John Echohawk, the executive director of the Native American Rights Fund, pointed out that the "consultation, communication and cooperation" urged by the chairmen in their letter are doomed to failure because of the Interior's manifest, well-documented bad faith. In defending the "warlike" decision to bring the authority of a federal court to bear on the Interior, the letter reasoned, "Our approach is to ensure accountability when people mismanage Indian assets and that can no more be described as 'scorched earth' than holding Enron and Arthur Andersen executives accountable for their misdeeds."

"It is curious that now, when a multi-billion dollar judgment and accountability seems inevitable, officials within the Interior are pushing the notion that there is 'no end' and that a congressionally forced 'settlement' is the only solution. Tribal leaders and Indian people must not fall for this ploy" the letter closed, "and must see these actions for what they are -- an attempt to get Congress to step in at the eleventh hour and bail out the government. ...We cannot allow the Interior Department, their proxies, or anyone to 'divide and conquer us.' The government is losing and they are desperate. They are banking on being able to make us war against one another. [But] what's wrong with the Indians winning for once instead of the cavalry?"

As the suit draws slowly to a close, the extraordinary scenario of the Indians winning for a change seems closer and closer to being a reality. "I've heard from friends that the government thinks I'm tired and that they'll wear me down, so that I'll just go away," says Cobell.

Just outside her hometown is a marker that tells the story of the winter of 1884, when 500 Blackfoot died of starvation and exposure while awaiting government-promised supplies. They were buried in a mass grave now called Ghost Ridge. During the more difficult stages of the lawsuit, Cobell says, she visits Ghost Ridge to reflect on her ancestors who perished in the cold almost 120 years ago, waiting on the government's good will.

With that lesson from history firmly in mind, it seems unwise for the government to bet on Elouise Cobell or her team going away any time soon.

Brian Awehali is a freelance writer and the editor of LiP Magazine.

Held in Contempt

Since 1887, the U.S. government has been entitled to lease Indian lands and utilize their natural resources for everything from logging and mining to grazing cattle to pumping oil. Today, the government does a brisk business in leasing, as royalties from the use of the land add up to more than $1 billion annually.

According to the Interior Department's own figures, 56 million acres of Indian land are now held in "trust" by the U.S. government, which is charged with redistributing most of those royalties to the individuals and tribes whose lands are being leased. Altogether, the Department of the Interior manages over 100,000 leases for approximately 236,000 Individual Indian Money (IIM) account holders--in addition to 1,400 tribal accounts.

Individuals and tribes alike depend on these trust fund disbursements for rent, food, and the basic operation of social services in Indian Country.

The problem: Sometimes those checks arrive, and sometimes they don't. Sometimes the checks might arrive for hundreds or thousands of dollars, and sometimes those checks might only amount to pennies on the dollar. On Indian reservations, the problem has reached crisis levels; a check written out for a smaller amount than expected--or no check at all--can mean the difference between housing and homelessness.

All the while, the Interior Department's officials have made it clear that they don't know how to fix the broken trust disbursement system, much less how much money is missing or where the missing funds have gone. For their part, lawyers representing hundreds of thousands of Indians in the largest-ever class-action lawsuit against the government have put the cumulative total owed at $137.2 billion.

No matter what the final figure, there's no doubt that the nation's single most impoverished ethnic group could use a bit of that cash.

It's for this reason that a group representing 300,000 Indian plaintiffs has spent the last six years trying to get the Interior Department to account for all the money they are owed. The plaintiffs, led by Elouise Cobell, an outspoken banker and member of the Blackfoot Nation, insist that the Interior Department's officials and employees have broken the "trust" relationship between Indian people and the federal government, and are therefore neither fit nor equipped to continue overseeing the vast sums.

Even the federal judge overseeing this landmark case, Cobell v. Norton, has called the BIA the most "historically mismanaged federal program" in the U.S. In February 2002, U.S. District Judge Royce Lamberth had these sharp words for the Interior Department and Secretary Gale Norton: "[T]he department has now undeniably shown that it can no longer be trusted to state accurately the status of its trust reform efforts. In short, there is no longer any doubt that the secretary of the Interior has been and continues to be an unfit trustee-delegate for the United States."

When Gale Norton became the second consecutive Interior Secretary to face contempt charges in federal court -- in November 2001, for failing to provide an accounting of the Bureau of Indian Affairs' (BIA) management of IIM accounts -- it's unlikely that she or her co-defendant, former BIA director Neal McCaleb, a Chickasaw Indian, anticipated the emotional force and organizational unity of her detractors.

Both Norton and McCaleb were held in contempt in September 2002 for failing to heed the court's orders to fix trust oversight problems. (McCaleb, the nation's highest-ranking American Indian, resigned three months later, citing the "contentious and litigious environment" ahead of him.)

Norton and McCaleb were not the first government officials to be held in contempt for the handling of Indian trust monies: President Clinton's Treasury Secretary Robert Rubin had his turn at this dubious honor as well. Government officials who let down their constituencies and mismanage millions should be held accountable.

For the Interior Department, the accounting and management of the IIM trust fund has been an exercise in acronyms.

From the perspective of the department, Secretaries Babbitt and Norton have legitimately tried a variety of approaches to reconcile and improve a broken system as quickly as possible.

On January 6, 2003, the Interior Department met a court-imposed "compliance plan" deadline, assuring the court that it took the responsibility of meeting the conditions of the 1994 American Indian Trust Fund Management Reform Act (ITRA) seriously.

The 16-page document (intended to demonstrate to the court what kind of progress the Interior had made in meeting its fiduciary trust obligations) announced a reorganization within the BIA and the Office of the Special Trustee for American Indians. According to the document, the Interior was also undergoing a "reengineering of Interior's trust business processes" and the implementation of a strategic plan formerly known as the Indian Trust Business Plan, and now entitled the Comprehensive Trust Management Plan (CTMP).

The CTMP, in turn, would be administered under the "leadership of the Office of Indian Trust Transition (OITT)."

After a while, the Interior's acronyms, appointed offices and grand restructuring plans jumble together into a steaming, swirling bowl of alphabet soup.

In 2001, Norton had already proposed the creation of a new agency, the Bureau of Indian Trust Assets Management (BITAM), to manage IIM accounts. Tribal leaders chafed at not being consulted about her plans, but BITAM remained the Interior's buzzword well into 2002.

In the spring of 2001, Norton announced the staffing of a new division, the Office of Historical Trust Accounting (OHTA), which was supposed to perform a limited accounting of owed trust monies. But in a July 2002 Report to Congress on the Historical Accounting of Individual Indian Money Accounts, the dozens of employees and contract workers hired into the office concluded that it would take 10 years and $2.5 billion to actually do that job. And even then, they admitted, such research and accounting would not necessarily produce usable results.

Similar examples of expenditure without result riddle the Interior's recent history. Thirty million dollars spent on data cleanup, for instance, culminated in a computer specialist's admission, in court, that he could not certify that a single account had actually been cleaned up.

In their own "Compliance Action Plan" submitted to Judge Lamberth on January 6, 2003, Cobell and her peers point out that the government now has a genuine opportunity to redeem itself to Indian trust beneficiaries by taking quick and decisive action.

In doing so, plaintiffs took the opportunity to excoriate the Interior Department's tactics: "[The] defendants are forever reorganizing themselves, moving organizational boxes around on a chart, devising new acronyms, and renaming tasks and entities in deeper and deeper bureaucratic jargon in a pathetic effort to create the phony impression of, if not progress, at least movement."

Unwilling to wait for the Interior to make real progress, the plaintiffs asked Judge Lamberth to consider their own proposal: Take trust management out of the hands of the Interior altogether. Key to the proposal is the idea that the IIM trust accounts should immediately be taken over by an "unconflicted" trust administration solely devoted to fixing and administering the trust debacle.

Cobell and lead attorneys from the Native American Rights Fund have emphasized that such an administration, made up of non-governmental employees, and funded with permanent appropriations, could hire the competent staff and supervisors necessary to ensure the proper management of trust money.

As Tex Hall, president of the National Congress of American Indians, told the New York Times in early January, "This isn't taxpayer money. This is our money that the government took, and they have to give it back."

Moving Forward

Because American Indians are so few in number, the majority of congressional representatives outside of states like South Dakota, New Mexico and Arizona can safely ignore all but the most cursory issues pertaining to their modern struggles. For U.S. politicians, the benefits of pushing for meaningful reform of federal Indian policy are negligible.

Between American politicians whose interest in Indians is tangential and inconsistent, and a majority non-Indian citizenry whose awareness of Indian realities is minimal, at best, the categorical failings of the BIA and Interior Department are simply allowed to pile up, year by year.

"Where has Congress been while this mugging has gone on for nearly six years?," asked Cobell of the House Resources Committee in 2002.

One saving grace has been increased attention from Senators John McCain (R-AZ), Tom Daschle (D-SD), and Tim Johnson (D-SD), who introduced S. 2212, the "Indian Trust Asset and Trust Fund Management and Reform Act" in April 2002.

The "discussion" bill--so entitled to encourage comment and modification, as warranted--focuses on the creation of a Deputy Secretary for Trust Management and Reform, and specific provisions for tribal participation and self-determination in the trust reform process.

"There is no more important challenge facing the tribes and their representatives in Congress than that of restoring accountability and efficiency to trust management," said Sen. Daschle when the bill was introduced.

Added Sen. Johnson, "Of all the extraordinary circumstances we find in Indian Country ... I do not think there is any more complex, more difficult and more shocking than the circumstances we have surrounding trust fund mismanagement."

But what happens from here, of course, is anybody's guess.

Will the Interior Department continue to invent an endless stream of new proposals and official acronyms to conveniently skirt their fiduciary obligations? At least in the near future, such a strategy seems likely, even predictable.

Will Cobell and her fellow plaintiffs eventually wear the government down? Anything is possible. Even now, signs are emerging that the White House itself wants to push the Interior Department to settle to prevent the additional costs of further litigation, as noted in a June 2001 letter from the Office of Management and Budget to the Interior Secretary.

But such settlement seems furthest from the Interior's agenda. J. Steven Griles, Deputy Interior Secretary, had this to say to the New York Times: "I am not settling a case with taxpayer money for billions of dollars when there is no supporting evidence that the money they say they lost ever existed."

In fact, a critical June 2002 report from the Office of the Inspector General (OIG) seemed to point toward the Department's "bunker mentality" where trust reform is concerned. "The Cobell litigation has so embroiled and angered those involved that they cannot see or think clearly in order to make a correct decision," as the OIG reported. "Every effort is thwarted by internal discord, distrust and a dysfunctional reluctance to assume ownership."

And so while a legitimate accounting of monies owed by the Interior becomes less and less likely, hundreds of thousands of Indians continue to go without what they're owed.

From the standpoint of Indian trust account holders, the trust debacle is but the latest insult in what amounts to a historical miscarriage of decency and justice toward the descendants of America's first inhabitants.

"Many of the intractable problems the tribes and federal policy makers wrestle with today stem from the wreckage caused by these misguided policies of the past," Senator John McCain noted while introducing S. 2212 last year.

"It took over 100 years to create the problems we now confront with the Indian trust funds and assets," he added. "The Indian people did not create these problems. The Federal Government did."

Silja Talvi and Brian Awehali are the editors of LiP Magazine.

Inventing Thanksgiving

"On Thanksgiving Day all over America, families sit down to dinner at the same moment - halftime."
--Unknown

Every year, as Thanksgiving approaches, I am filled with profound ambivalence. Even as a child, the standard Thanksgiving story always seemed too simple, too wholesome, and too peaceful to be true or truly American. Finally, past the faux-historicism of school textbook-styled Pilgrims and Indians, I was able to delve into the actual construction of the story of Thanksgiving. And, in this way, I learned just how fabricated and utterly bizarre this American "holiday" really is.

In 1621, at Plymouth Plantation on Massachusetts Bay, 50 Pilgrim settlers joined with at least 90 Native guests in a three-day feast which is now traditionally cited as the "First Thanksgiving."

In reality, this seasonal, quasi-secular New England harvest celebration was not repeated in Plymouth, and was in fact forgotten until a reference to it was rediscovered almost 200 years later in a contemporaneous account known as "Mourt's Relation."

Contrary to the widely accepted, idyllic account of two cultures sitting down to share a meal in harmony, most 17th-century colonial images relating to Native Americans depict violent confrontation. It was only around 1900, when the western Indian wars had largely subsided due to a shortage of Indians left to kill--and when it was safe for Euroamericans to supplant fear with nostalgia--that the romantic Thanksgiving narrative most Americans today are familiar with took hold.

Thanksgiving Day provides an ideal opportunity to consider the formation of national identity and the concept of a civil religion. It's also a living metaphor for the prevailing American model of immigrant assimilation, and for the ways in which history can be reinterpreted, and indeed wholly reinvented, to serve competing ethnic, patriotic, religious, and commercial ends.

A Host of Victory Thanksgivings

An overview of historical documents reveals the many uses to which various thanksgivings have been put. The Continental Congress declared the first national day of thanksgiving on November 1, 1777, to celebrate an American victory over British general John Burgoyne:
Forasmuch as it is the indispensable Duty of all Men to adore the superintending providence of Almighty God; to acknowledge with Gratitude their Obligation to him for benefits received, and to implore such further Blessings as they stand in Need of: And it having pleased him in his abundant Mercy, not only to continue to us the innumerable Bounties of his common providence; but also to smile upon us in the Prosecution of a just and necessary War, for the Defence and Establishment of our inalienable Rights and Liberties... It is therefore recommended to the legislative or executive Powers of these UNITED STATES, to set apart THURSDAY, the eighteenth Day of December next, for the Solemn Thanksgiving and Praise: That at one Time and with one voice, the good People may express themselves to the Service of their Divine Benefactor.
Did such a weighty declaration to the Divine Benefactor cement the basic contours of the holiday? Hardly. Then as now, political struggles (electoral and military) were often interpreted as theaters for the enactment of divine will, and so victories great and small led to a rush of thanksgiving declarations. The Confederate Congress proclaimed separate thanksgiving observations in July 1861 and again in September 1862, after the First and Second Battles of Bull Run. And it wasn't just the South. President Lincoln similarly set aside days of thanksgiving in April 1862 and August 1863 to commemorate the important Union victories at Shiloh and Gettysburg. These ad hoc decrees fell in some cases on Sundays (a common day for religious observance) and in other cases on Thursdays. Lincoln declared yet another Thanksgiving Day in 1863, for the last Thursday in November--and it has been celebrated annually in late November ever since. In his proclamation he drew attention to affairs both national and international:
In the midst of a civil war of unequaled magnitude and severity, which has sometimes seemed to foreign states to invite and to provoke their aggression, peace has been preserved with all nations, order has been maintained, the laws have been respected and obeyed, and harmony has prevailed everywhere, except in the theater of military conflict, while that theater has been greatly contracted by the advancing armies and navies of the Union.
It was not until 1931, when President Herbert Hoover made his proclamation, that any of the presidential declarations of thanksgiving mentioned the Plymouth Pilgrims and the 1621 harvest festival as a precursor to the modern holiday. By this time, yet another willfully amnesiac reinvention of Thanksgiving was under way.

Industrialization, Commercialization, Assimilation

The general anxieties of the 1920s and 1930s provide telling insights into the creation of Thanksgiving Day as it is generally practiced and taught in the present-day United States. Elizabeth Pleck, writing in the Journal of Social History, asks why it's historically important that "domestic occasions" like Thanksgiving be old-fashioned:

Thanksgiving eased the social dislocations of the industrial and commercial revolutions... The growth of commerce, industry, and urban life created a radical break between past and present, a gap that could be bridged by threshold reunions at the family manse. Nostalgia at Thanksgiving was a yearning for a simpler, more virtuous, more public-spirited and wholesome past, located in the countryside, not the city. In gaining wealth, the family and nation, it was believed, had lost its sense of spiritual mission. Perhaps celebrating one special day might help restore the religious morality of an earlier generation.

In the aftermath of World War I, at a time when many Americans were concerned both with preserving and promoting (in Pleck's words) a "close-knit, religiously inspired [Protestant] community," and, not coincidentally, with the "Americanization" of Southern and Eastern European immigrants, Thanksgiving Day provided a compelling occasion for emphasizing civil religion, the quasi-religious belief in national institutions, purposes, and destiny. Furthermore, the model of the Pilgrim as the archetypal "good" immigrant, peacefully coexisting in prosperity with other ethnic communities, proved all but irresistible. The ideal of the "melting pot" in the United States -- often less about relishing a diverse mix of ethnic elements than about reducing ethnic culture to an assimilated national identity -- also exerted a powerful influence. By the time of Hoover's 1931 proclamation, the codification of Thanksgiving as the fundamental American holiday was essentially complete.

Which is not to say that Americans were done tinkering. One noteworthy and almost quintessentially American reformulation was ushered in by the Macy's Thanksgiving Parade. This commercial pageant began in 1924 as the Macy's Christmas Parade because, as Elizabeth Pleck observes, "The department store wanted to stage a parade as a prelude to the Christmas shopping season." Pleck also notes that even in the 1920s, the parade did not exist in the shadow of the family feast or the church service, but in very real competition with another Thanksgiving tradition: the afternoon football game.

Football was clearly the more significant of the two forms of out-of-home entertainment, as changes in the timing of Macy's parade in the 1920s indicate. Initially, Macy's parade offended patriotic groups, who decried a spectacle on "a national and essentially religious holiday." Macy's hired a public relations man, who decided the critics could be placated if the parade in the morning was postponed until at least after church services had ended. The parade, pushed to the afternoon, began at the same time as the kickoff for most football games. Customers and football fans complained. By the late 1920s, Macy's had returned to an early morning parade, presumably so as not to compete with afternoon football games.

The parade featured different groups of immigrants demonstrating their American cultural fluency with floats that echoed and reinforced the core Thanksgiving origin myth. At about the same time, schoolchildren were being exposed to similar ideas about celebration, national history, customs, and cultural symbols, all of which came together to form the narrative that persists more or less intact to this day.

Lies, Half-Truths, and What a Nation Will Tell Itself

Perhaps, given the patent falsehood of the Story of Thanksgiving, one of the better questions to ask as the holiday approaches is what, in fact, it really stands for. As a Cherokee, I have never felt much like celebrating an event that essentially commemorates one of several stages in the genocide of Native Americans by European settlers, a process which continues to this day in the form of environmental racism, structural poverty, and lack of educational resources. There were times, to be sure, when I appreciated sitting with my family and devouring an embarrassment of culinary riches. But those I hold separate from the holiday itself.

For me, this now-agreed-upon Thanksgiving symbolizes first and foremost the alarmingly subjective nature of history, which, as Howard Zinn reminds us, is almost always written by the winners. It symbolizes the triumph of football over religion, and of American commercialism over virtually everything standing in its wasteful path. And perhaps most importantly, it symbolizes the lies and half-truths on which a profoundly diverse country must depend in order to prop up the specious concept of a broadly shared civil religion or national identity.

Thanksgiving, then, symbolizes that there is still great work to be done before a nation that readily prides itself on its goodness, honesty, and wholesome relationship with Divine Grace will actually resemble the stories it tells itself.

Brian Awehali is the publisher and co-editor of LiP Magazine.

New World Disorder

What's wrong with this picture? The world's lone superpower, fearful of being attacked by one of many real or perceived enemies, sets out to solve the problem by increasing weapons sales and military aid around the world -- and not just to existing allies. Indeed, in the wake of Sept. 11, the race is on to arm governments formerly considered unstable or otherwise "off-limits" due to gross human rights violations, on the grounds that these nations are assisting in the sweeping "war against terrorism."

If that sounds illogical, then perhaps you're beginning to understand the perverse logic that pervades the U.S. arms industry. Since Sept. 11, the industry has -- with the support of the Bush Administration -- stepped up its efforts to further reduce oversight and regulation of arms sales and military aid. This, despite a clear track record of providing weapons to the very forces now portrayed to a frightened public as threats.

In the process, the administration is apparently jettisoning efforts to use military aid as a carrot to encourage the advancement of human rights. In the past year, restrictions on military aid and arms sales to formerly off-limits regimes have largely been eliminated. Of the 67 countries which have received or are set to receive U.S. military aid, 32 have been identified by the State Department as having "poor" or worse human rights records.

"Two key [FY2002] Defense Department funding allocations -- $390 million to reimburse nations providing support to U.S. operations in the war on terror and $120 million 'for certain classified activities,' according to a report in the Arms Sales Monitor (Aug. 2002), can now be delivered 'notwithstanding any other provision of the law'."

In other words, Congress has approved a staggeringly large sum of military aid for regimes fighting an ill-defined "war on terror," and is working to ensure that there will be little or no public scrutiny of how such aid is spent.

The central question is: does this make the world a safer place for anyone but arms manufacturers and the politicians who love them?

Conflicts of Interest

From 1991 to 2000, the U.S. delivered $74 billion worth of military equipment, services and training to countries in the Middle East, according to a Sept. 2002 General Accounting Office (GAO) report. You might expect that a majority of that military aid went to our staunch ally in the region, Israel, which has been cited repeatedly by the U.N. and Amnesty International for human rights abuses. In fact, military aid to Saudi Arabia -- home to a majority of the terrorists reportedly involved in the Sept. 11 attacks -- topped $33 billion for the period, outpacing aid to Israel by a more than 5-to-1 margin.

What's more, there is ample evidence that arms sales to the Middle East are, in fact, destabilizing and dangerous.

"Foreign [military] assistance to the Middle East," noted West Virginia Democratic Senator Robert Byrd in 2001, "virtually ignores the spiraling violence in the region."

In a Nov. 9, 2001 interview with Pakistan's Ausaf newspaper, none other than Osama bin Laden justified the Sept. 11 attacks by noting that the U.S. sells to Israel advanced weaponry that it uses in the military occupation of Palestinian territories. Bin Laden was specifically discussing the sale of Lockheed Martin's F-16 fighter planes. It's worth noting that Lynne Cheney, the wife of Vice-President Dick Cheney, was on the board of Lockheed Martin from 1994 until 2001, and would have been involved in overseeing this sale.

On July 13, 2002, the New York Times also reported that Vice President Dick Cheney's former employer, the Halliburton Company, is "benefiting very directly from the United States' effort to combat terrorism." From building cells for detainees at Guantanamo Bay ($300 million) to feeding American troops in Uzbekistan, the Times reported, "the Pentagon is increasingly relying on a unit of Halliburton called KBR, sometimes referred to as Kellogg Brown & Root." KBR is the "exclusive logistics supplier for both the Navy and the Army, providing services like cooking, construction, power generation and fuel transportation."

And then there's the Carlyle Group, described by The Industry Standard as "the world's largest private equity firm," with more than $12 billion in assets. A Washington merchant bank specializing in buyouts of defense and aerospace companies, the Carlyle Group stands to make a substantial sum of money from a global "war on terror." Former U.S. President George Bush, Sr. -- whom current President Bush is known to consult about policy matters almost daily -- works for the firm.

According to the Baltimore Sun, so do former Secretary of State James Baker III and former Bush Sr. campaign manager Fred Malek. Former Republican Defense Secretary Frank Carlucci (a college roommate of Defense Secretary Donald Rumsfeld) is the Carlyle Group's chairman and managing director.

The bin Laden family, hailing from Saudi Arabia, is also heavily invested in the Carlyle Group. On Sept. 27, 2001, the Wall Street Journal published an article entitled "Bin Laden Family Could Profit From Jump in Defense Spending Due to Ties to U.S. Bank." The "bank" in question? You guessed it: the Carlyle Group.

Cold War Communism vs. New World Terrorism

One of the more disturbing aspects of post-9/11 arms sales is the wanton redefinition of various dissident groups around the world as "terrorists." Even longstanding conflicts such as the 38-year-old civil war in Colombia have been re-cast as a war between our Colombian allies and "terrorists."

In the Philippines, "counter-terrorism aid" has been released to fight a band of Islamic militants, the Abu Sayyaf Group (ASG), despite the fact that even government analysts admit the ASG poses no credible threat to the U.S. In Nepal, counter-terrorism aid has been allocated to help the Nepalese military quell Maoist dissent, despite State Department testimony that there is no evidence these dissidents are connected to al-Qaeda. Military aid flowing to Central Asia under the auspices of fighting terrorism seems equally ill-justified, with virtually every country in the region receiving increases in U.S. military aid despite connections to the war on terrorism that are, at best, tenuous.

What seems clear from a close look at military aid policy over the past year is that the U.S. military is using the threat of terrorism to garner support for its ambitious goals for extending its reach around the world, and that it doesn't mind arming unstable or anti-democratic regimes in the process.

The "weapons against terror" rationale is strained even further by a 2001 report released by the Centre for Defense Information, an independent non-profit research group. The report, entitled "US Arms Exports to Countries Where Terror Thrives," found the following:

"There are 28 terrorist groups currently operating in 18 countries, according to the State Department's bi-annual list of active foreign terrorist organizations...In the period of 1990-1999, the United States supplied 16 of the 18 countries on the State Department list with arms....In addition, the U.S. military (and CIA) has trained the forces of many of these 18 countries in U.S. war fighting tactics, in some cases including individuals now involved in terrorism."

In sum, the U.S. has sold weapons or training to almost 90 percent of the countries it has identified as harboring terrorists. A severe restructuring of U.S. arms export policy is in order, but little or nothing is being done to ensure a safer future.

Guns and History in the Middle East: Why Insecurity Sells

Perhaps nowhere is the correlation between arms sales and violence more apparent than in the Middle East, where the U.S. sells an enormous amount of weapons.

According to an Aug. 6, 2002 congressional report on arms sales to developing countries, "The Persian Gulf War....played a major role in further stimulating already high levels of arms transfer agreements with nations in the Near East region. The war created new demands by key purchasers such as Saudi Arabia, Kuwait, the United Arab Emirates, and other members of the Gulf Cooperation Council (GCC) for a variety of advanced weapons systems."

"The Gulf states' arms purchase demands," the report continued, "were not only a response to Iraq's aggression against Kuwait, but a reflection of concerns regarding perceived threats from a potentially hostile Iran."

The U.S. dominated the arms market in the region from 1994 to 2001, selling more than $13 billion worth of weapons to Bahrain, Egypt, Israel, Jordan, Kuwait, Lebanon, Saudi Arabia and the United Arab Emirates. Russia and China, meanwhile, sold $8 billion worth of weapons to Iran, Algeria, Syria, Yemen, and Libya. Judging by the numbers alone, it's hard to miss the parallels to Cold War-era geopolitical strategy.

Also hard to miss is the profit motive. The year 2001 marked a slump for arms dealers, as sales to developing nations dropped 43 percent, according to a Congressional Research Service (CRS) report. Peace, obviously, is not good business for the "defense" industry.

Why would countries siphon money from all manner of social programs in order to purchase expensive weapons systems if they didn't feel threatened? The reason has more to do with insecurity than fiscal logic, as evidenced by the fact that Israel, despite a declining economy, was the number one U.S. arms importer in 2001, purchasing, among other weapons, 52 F-16 fighter jets and six Apache helicopters.

Given that Israel has repeatedly violated international humanitarian law with its advanced U.S. weapons systems, it's clear that profits -- and geopolitical advantage -- trump human rights when it comes to selling weapons.

Focusing on the War, Not the Battle

"Every gun that is made, every warship launched, every rocket fired signifies, in the final sense, a theft from those who hunger and are not fed, those who are cold and are not clothed," proclaimed former U.S. President Dwight Eisenhower. "The world in arms is not spending money alone. It is spending the sweat of its laborers, the genius of its scientists, the hopes of its children... This is not a way of life at all, in any true sense. Under the cloud of threatening war, it is humanity hanging from a cross of iron."

Fifty years later, the figures seem to back Eisenhower up. The 2002 federal military budget stands at a mind-boggling $343 billion. Consider that the same budget allocates a comparatively paltry $39 billion to children's health, $6 billion to the Head Start program, and $1 billion to combat world hunger. It's estimated that it would cost just $6 billion a year -- or approximately 1/57th of the military budget -- to provide healthcare for all uninsured children in the United States.
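For readers who want to check that fraction, the arithmetic is a single division over the article's own figures. A trivial sketch in Python (the variable names are mine; the dollar amounts simply restate the numbers above):

    # Checking the "1/57th" claim against the article's figures.
    military_budget = 343_000_000_000   # FY2002 federal military budget
    child_health_care = 6_000_000_000   # estimated yearly cost to insure all uninsured children
    ratio = military_budget / child_health_care
    print(round(ratio, 1))  # 57.2 -- i.e., roughly 1/57th of the military budget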

Given the broad bipartisan support for a war with Iraq, and considering the largely abysmal quality of most mainstream coverage of the subject, even the encouragingly large number of spirited anti-war protests around the world may not be enough to prevent an attack. However, there are battles and there are wars: The battle to prevent an attack on Iraq might fail, but the war to end a global arms race and U.S. militarism can still be won.

The U.S. military-industrial complex is a giant enterprise: it employs hundreds of thousands of people, rakes in billions of dollars in profits every year, and fields a veritable army of lobbyists and Washington insiders to maintain its dominant position in the U.S. economy. The struggle to wean the country from its dependence on the defense industry has been -- and will continue to be -- a difficult one.

The good news is that the defense industry is not a monolith, and that opposition to U.S. arms sales is actually a popular, majoritarian stance. The problem is not so much educating the public on why arming the world to the teeth is a bad idea as deciding what can be done about it.

We can start by supporting efforts to end export subsidies on U.S. arms sales. Every year, defense contractors receive billions of dollars in subsidies -- taxpayer money poured directly into the pockets of arms dealers -- and this needs to stop. Defense industry representatives claim they need these subsidies to remain competitive around the globe, but at a time when U.S. arms sales dwarf those of our nearest competitor, Russia, by a margin of more than 9-to-1, the argument simply demonstrates the greed and lack of restraint that define the industry.

The Bush Administration has been working, with relative success, to end all export controls on weapons in the name of fighting terrorism. Rebuffed in its efforts to eliminate weapons controls outright, it has turned to a strategy of incrementalism, successfully weakening or circumventing a host of weapons export controls, including the Export Administration Act. All efforts to weaken the control, oversight, and regulation of arms exports should be vigorously challenged.

Most importantly, the defense industry must not be allowed the secrecy it seeks. Public servants of both major parties must be scrutinized for conflicts of interest, and barred from public office if such conflicts come to light. This should include virtually everyone in the Bush Administration.

The agency once called the Bureau of Export Administration, which controls weapons exports, recently changed its name to the Bureau of Industry and Security (BIS). The BIS is part of the Commerce Department, and although lip service is paid to its responsibility for controlling arms exports, the bureau is also charged with promoting them.

Politicians cannot simultaneously serve the interests of peace and war, nor can an office like the BIS serve two masters well. This office must be restructured or split in two if the concept of arms "control" is to be taken seriously. Instead of crowing on its website about its Defense Trade Advocacy Program generating "high-level, government-to-government advocacy on behalf of U.S. firms," helping them "succeed in today's highly competitive global defense market," and supporting "$22 billion in U.S. [weapons] exports since 1994," the BIS might instead make it its business to actually help stem the flow of arms to the rest of the world.

In 1925, President Calvin Coolidge uttered the famous line, "The business of America is business." However repugnant a truth that may be, fighting over the long haul against U.S. arms exports to the world -- and diminishing the political influence of the defense industry -- is important if we, as a nation, wish to avoid the continuation of an even uglier truth: that the business of America is the business of war.

Brian Awehali is co-editor of LiP Magazine.

Challenging the War on Drugs

A unique coalition of religious leaders, politicians, former inmates and addiction specialists gathered at a Los Angeles conference the weekend of Sept. 27-29 to discuss the impact of the "war on drugs," which has led to the incarceration of more than half a million Americans. While highlighting the disproportionate impact of the drug war on ethnic minorities, organizers and attendees of the conference -- called Breaking the Chains: People of Color and the War on Drugs -- sharply criticized the failure of incarceration as a strategy for controlling drug abuse.

Today, African Americans and Latinos make up over three-fourths of prisoners serving time in state prisons for drug-related offenses, even though, according to the Department of Health and Human Services, the majority of drug users in the United States are white.

"Virtually every drug war policy -- from racial profiling to length of sentencing -- is disproportionately carried out against minorities," said Deborah Small, director of public policy and community outreach for the Drug Policy Alliance (DPA), which organized the event.

In addition to nearly 600 attendees who traveled from across the U.S. and Europe to sit in on dozens of sessions and workshops, politicians including California Rep. Maxine Waters, Texas Rep. Ciro Rodriguez and Massachusetts Rep. Barney Frank weighed in to support efforts to end what many participants referred to as the nation's "failed, prohibitionist" drug policies, including mandatory minimum sentences and three-strikes-you're-out legislation.

In 2002, federal and state governments will spend more than $40 billion in their battle against illegal drugs, compared with just $1 billion spent in 1980. Critics of the drug war point out that, despite this dramatic increase in drug war funding, neither street-level dealing nor drug trafficking has been reduced, and illicit drugs are now cheaper, purer and more readily available than they were decades ago.

"The least successful policy we have is the war on drugs," said Rep. Frank in a video presentation during the opening session.

Others in attendance included Kurt Schmoke, the former three-term mayor of Baltimore, who spearheaded one of the nation's first needle exchange programs. AIDS is now the leading cause of death among both African American and Latino men between the ages of 25 and 44; among African Americans, more than 60 percent of these deaths are associated with injection drug use involving contaminated needles.

During his tenure from 1987 to 1999, said Schmoke, he worked hard to get his constituents and Congress to understand that "there is no simple solution to these problems, but we should consider [addiction] to be a public health problem, not a criminal one."

Schmoke was joined by Antonio Gonzalez, President of the William C. Velasquez Institute in Los Angeles, who stressed the pressing need for movement-building across ethnic lines to address the "incarceration crisis" afflicting communities of color.

Religious and faith-based leaders issued some of the conference's strongest statements in favor of abandoning moralistic, punitive and ineffective drug policies. The task is as important for religious organizations and ethnic communities as it is for the government, said Ana Garcia-Ashley, Senior Staff Organizer for the Gamaliel Foundation in Wisconsin.

With regard to the drug war in America, "The church needs to become a spiritual body that cares about human beings and stands up for what's right when nobody else is doing it," said Garcia-Ashley, an outspoken Catholic activist and immigrant from the Dominican Republic.

Garcia-Ashley explained that she has two brothers who struggle with addiction.

"One brother is an alcoholic. One is dependent on illegal substances. [But] for me," she said. "These are good people. I don't have an evil brother and a Christian brother ... I have two good brothers."

Garcia-Ashley was one of several panelists in a workshop addressing issues of faith, morality and drug use. The workshop, moderated by Antionette Tellez-Humble, director for the New Mexico Drug Policy Project, also explored the unique impact of the drug war on Native Americans. Because Native Americans who live on tribal lands are subject to the mandates of federal -- not state -- law, those who are arrested for drug-related crimes are processed through the federal court system. Indians now comprise almost two-thirds of those prosecuted for criminal offenses in federal courts.

But the answer to addressing alcohol and substance abuse issues in Indian communities, said Gayle Zepeda, a community organizer with the Northern Circle Indian Housing Authority in Ukiah, Calif., is not prison. "Our communities must do that healing ourselves. We can't enact a law that we won't put something in our bodies or shoot something into our arms. For us, it's a healing issue and connecting with our own spiritual power so that we can arrive at a place of balance again."

More effective and less expensive models for addressing both recreational and habitual drug use might be found by studying the approaches of other countries, suggested several conference participants, including David McFarlane, a Scotland Yard detective and coordinator for the National Black Police Association of London.

McFarlane noted that police officers in the UK have begun to take a more relaxed approach toward "soft" drug use, including the possession of small amounts of marijuana.

James A. Pitts, a conference panelist and public health expert based in Sydney, noted that Australia, which now has the lowest HIV infection rate in the western world, has had considerable success with its treatment-oriented approach toward drug addiction.

"People are going to continue to use drugs, but you can do something to prevent the health consequences," he said.

Brian Awehali and Silja J.A. Talvi are co-editors of the online magazine, LiP.
