Galileo's Ghost

What makes a scientific revolution? It could be the birth of a new idea, like Einstein's theory of relativity or Galileo's theory of motion. Or it might come in the long haul after a sudden insight, the way the Human Genome Project extended the vision of James Watson, who helped build the first model of DNA's double-helix structure.

What the past 400 years have taught us, however, is that scientific revolutions come out of an entire culture. They're nurtured by educational institutions, but more important, they're cultivated by a willingness on the part of the general population to remain open-minded about radically new perspectives on, well, the structure of the entire universe. You cannot have a scientific revolution in a society like medieval Europe's -- not because there are no Einsteins but because there is no cultural context for them. Einsteins are never given a chance to develop, and if they do, they are expunged in a more or less messy way.

And this is why I worry, now that President George W. Bush seems about to act on his doctrine of preemptive strike, that the United States stands to lose far more than a war. The Bush administration is purging my country of its scientific culture. Just take a cursory look at the White House cybersecurity proposal, along with the administration's evisceration, over the past several weeks, of its extensive team of science advisers. While the public is diverted by crazed flag-waving and xenophobic hysteria, Bush is murdering the ideas that make the United States a country worth caring about.

Bush's special cybersecurity adviser Richard Clarke has prepared a draft of the government's new cybersecurity proposal, which was released a couple of weeks ago for comment. Clarke, whose office has always been tight with Microsoft execs, has, not surprisingly, placed Microsoft's controversial Palladium technology at the center of a plan for making sure every citizen engages in "trustworthy computing." Along with several recommendations that range from the sensible to the silly -- securing the Domain Name System (not a bad idea), urging ISPs to monitor their customers, creating a National Cyberspace Academy (huh?) -- there are some deeply alarming "national priorities" listed. One such priority is keeping close tabs on scientific developments in "intelligent agents" and nanotechnology.

Intelligent agents are programs that can carry out commands on their own to a very limited extent -- that is, you tell them to do something, and they go off and do it without any further input from you. They are mostly being developed for useful and innocuous artificial intelligence projects that do things like keep track of your schedule and find the bathroom for you in a building. Likewise, nanotech has literally thousands of peacetime uses in everything from materials engineering to medicine. Being singled out for negative attention by the government will obviously have a chilling effect on research in these potentially rich areas -- after all, who wants to give a grant to a project that the president believes will endanger our cybersecurity?

Meanwhile, Bush has been quietly "retiring" numerous scientific advisers and committees. Last week the Washington Post reported that the Department of Health and Human Services pulled the plug on two expert committees, one of which had recommended more oversight for human test subjects in mental institutions. The other had lobbied strongly for the Food and Drug Administration to regulate home genetic tests that are sold to the public at great expense and whose results are almost entirely unreliable. Bush has also begun packing other committees that advise the government on issues like pollution and bioterrorism with industry-friendly scientists such as Dennis Paustenbach, the California toxicologist who was an expert witness for Pacific Gas and Electric Co. in the Erin Brockovich case.

Science, as any pragmatist will tell you, is made possible by a combination of brilliant insight, political scrambling, and cash flow. During other periods in U.S. history, politicians who created tech and science policy attempted to balance their economic concerns with advice from scientists whose views, if not impartial, at least represented a swath of opinions. Scientific innovation and investigation were -- perhaps grudgingly -- valued for their ability to provide new perspectives, not for their conformity to corporate party lines and other dogmas.

When I look at the direction our culture seems to be headed, I think a lot about Galileo, imprisoned by the Catholic Church in the early 17th century for refusing to categorically reject the idea that the earth revolves around the sun. The church came out on top: Galileo died pretty much in disgrace, and church policy remained unchanged. In Galileo's culture, science was sacrificed in the name of ideology, in the name of spiritual security, if you will. And truth was sacrificed too. What will we sacrifice in the coming years?

Annalee Newitz (galileo@techsploitation.com) is a surly media nerd who wishes she had heart tissue-repair nanotech right now. Her column also appears in Metro, Silicon Valley's weekly newspaper.
