Have you heard the horrifying news? The future doesn't need us. It belongs to robots and superintelligent machines -- destined to seize the planet from us pesky humans as surely as any superior species defeats a weaker biological competitor. And -- surprise! -- it could happen this century.
At least, that's the theory proposed in a new and controversial treatise from computer scientist Bill Joy. His wordy opinion piece graces this month's cover of Wired magazine, and at almost 20 pages, you can be forgiven for not reading it.
Still, it's causing quite a stir in the tech community, and even gave Joy some decidedly mainstream face time with the "Today" show's Katie Couric. Quite a feat for a guy previously known only in geek circles.
The article is certainly a nail-biter. Joy, chief scientist and co-founder of Sun Microsystems, proposes that the emerging technologies of the 21st century will have greater potential for mass destruction than nuclear weapons.
"We are on the cusp of the further perfection of extreme evil," Joy writes. Here's why: Because these new technologies -- specifically genetics, nanotechnology and robotics (or GNR, as he calls them) -- are advancing at such exponential speeds, Joy believes they will soon become available to the average person. And that's where the danger lies.
Joy (what's in a name?) hypothesizes that terrorist groups and crazed Ted Kaczynski types might use GNR technologies to destroy mankind. His article's big dramatic device comes in the form of a long quote from the tech-hating Unabomber himself, who was convinced that intelligent machines would eventually take over the earth.
"People won't be able to just turn the machines off," rants Kaczynski in a quote from his infamous manifesto, "because they will be so dependent on them that turning them off would amount to suicide."
After that, it's hasta la vista, baby. The robots take over. Or as an overpaid Austrian film star once said, "You're terminated."
To a technology advocate like myself, this sort of doomsday rhetoric usually slides off like so much yolk on Calphalon. But Bill Joy's a different sort of egg. He's no Luddite; in fact, he's spent his entire adult life designing computer software and systems. And in 1993, one of Joy's friends was gravely injured by a letter bomb sent by a certain Montana shut-in. Chicken Little? Joy doesn't seem like the type.
Joy roots his fears in a trend many of us have become blasé about: the ever-increasing rate of technological change. In the 1960s, Intel co-founder Gordon Moore predicted that computer chips would double in performance every 18 months. He was right, and Moore's law hasn't stopped yet. Think about it -- that bleeding-edge 500 MHz PC you got in '98 is half as fast as today's 1-gigahertz screamers.
Still ... only a few years ago, technologists believed that Moore's law wouldn't last another decade. That was before nanotechnology. Scientists are now experimenting with molecular electronics -- transistors the size of molecules. Because of this and related nanotechnologies, Joy believes "we should be able to meet or exceed the Moore's law rate of progress for another 30 years." Many computer scientists concur.
Joy has simply done the math. If Moore's law holds, by 2030 we'll have machines a million times as powerful as today's PCs -- roughly the computing power of the human brain. But that's not the real Dr. Frankenstein stuff -- one of the realistic goals of nanotech research is to create tiny machines that self-replicate.
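You can check that math yourself. Here's a minimal back-of-the-envelope sketch, assuming the column's own figures: an 18-month doubling period and three more decades of Moore's law, from 2000 to 2030.

```python
# Back-of-the-envelope check of Joy's Moore's-law arithmetic.
# Assumptions (from the column, not a formal model): performance
# doubles every 18 months, and the trend holds from 2000 to 2030.
DOUBLING_PERIOD_YEARS = 1.5   # "double in performance every 18 months"
years = 2030 - 2000           # three more decades of Moore's law
doublings = years / DOUBLING_PERIOD_YEARS
speedup = 2 ** doublings
print(f"{doublings:.0f} doublings -> about {speedup:,.0f}x today's power")
# prints: 20 doublings -> about 1,048,576x today's power
```

Twenty doublings is 2 to the 20th power -- a bit over a million, which is where Joy's figure comes from.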
Smart robots that copy themselves? Now that's scary.
It gets worse: The main engine driving this progress is not the scientific community, but big business. Warns Joy, "We are aggressively pursuing the promises of these new technologies within the now-unchallenged system of global capitalism."
OK, you got me, Mr. Joy. Now I'm really afraid.
But what's Joy's joyless solution? Stop the GNR research. Or at least slow it down and manage it.
"We must find alternative outlets for our creative forces," he suggests, adding, "My immediate hope is to participate in a much larger discussion of the issues raised here."
After reading Joy's article, I got on the Net to see what the tech heads were saying. The response was merciless.
"It's very disturbing that one of the architects of Java is so strongly advocating restricting individual rights," posted one angry member of slashdot.org's tech community.
Others were more philosophical. "Even if extinction is (inevitable), that does not mean the journey ... is not worth it," wrote another hacker. "If you could live forever in some cave or live a normal life span and see the wonders of the world, which would you choose?"
Me, I'd rather choose a third alternative. I think Joy's onto something, so let's continue to talk this through. A scientist with a conscience is a good thing. Just as only Nixon could go to China in '72, Bill Joy is the ideal messenger to deliver this cautionary tale to the world.
It's our turn to decide what to do next.