Who's Watching Your Web Surfing?
December 29, 2002
News & Politics
Picture this: you're a high school student who just received an assignment to research the lifecycle of the humpback whale. On your way up the stairs you remember that you need to visit the web site of Congressman Dick Armey for your civics project. Strolling into the school's brand new, fully loaded computer lab (partially funded by federal grants), it occurs to you that your mother's birthday is coming up and that she'd really like some fresh pussy willows for the house.
Opening your Internet browser, you decide to start with the pussy willows. A window pops up: "This Internet site is blocked due to adult content." Frustrated, you turn to the humpback whales for help, but the same message appears. Forget about Dick Armey's homepage. Those federal grants for the lab didn't come string-free. Your school's computers have been fitted with Internet filtering software.
"People are seeking a quick technological fix to a complex social problem," says Will Doherty, organizer of a San Francisco press conference opposing filtering software. It and similar rallies across the country have been asking a poignant and perhaps overlooked question: why shouldn't teens be trusted to make decisions for themselves?
Congress showed its sentiments on the subject by passing the Children's Internet Protection Act (CIPA) in 2000. The measure stipulates that any school or library receiving a certain stream of funding known as the E-rate is required to install software capable of blocking material deemed "inappropriate" for minors.
Laden With Problems
This quick technological fix is laden with additional problems that critics have been quick to cite. Most filtering software, they argue, prevents patrons from accessing legitimate research material by over-blocking. Administrators of libraries and school computer labs are only allowed to choose what should be blocked from a list of ambiguous categories. "Sex" and "cults" are some of the options that software companies like N2H2 and Solid Oak, maker of the popular CyberSitter, offer. Many teachers have complained that students were unable to obtain information on Columbine because school administrators had selected "terrorism" as a blocked category.
The broad strokes that filtering software cuts with can often block an entire site. Therefore, a Wiretap Magazine article containing a few choice watchwords may render all of Wiretapmag.org inaccessible. Considering this, some online publication editors could be led to censor their writers rather than risk losing readers. Conversely, the software allowed entrance to a number of pornographic web sites clever enough to keep common watchwords out of their URLs or site descriptions.
Worse, say critics, is that the actual lists of the sites being blocked are kept confidential by the software companies. Students and teachers are then left to guess as to what the offensive material might have been. This leaves the reins of power in the hands of software companies rather than educators, leading many to wonder whose standards are being expounded in the restriction guidelines.
In 1996, an organization called Peacefire began a campaign against filtering software based on the belief that it was restricting minors' access to important information online. Bennet Haselton, the group's 24-year-old spokesperson, says that Peacefire found some disturbing examples of this problem. For instance, he told WireTap that the American Cancer Society's site was blocked by a filtering program called Cyber Patrol as a "sex" site, and the Mothers Against Drunk Driving site was blocked by a program called Bess as an alcohol site. In another example, a site that advocated for the use of filtering software in libraries, www.filteringfacts.org, was itself blocked by SurfWatch as a "drugs/alcohol" site.
As Haselton also points out, blocking that's based on individual "danger" words is silly at best, destructive at worst. He says these processes affect young people, in particular, because there is a lot of very useful information about sexuality, drugs and alcohol that young people may not be able to access otherwise. Sites for organizations like Planned Parenthood, for instance, are often blocked.
"But more importantly," says Haselton, "I think people should get in the habit of requiring an explanation from people who want to impose rules on them. People who are easily controlled as [youth] can grow up to be easily controlled by manipulative politicians, or cynical marketing campaigns. It's everybody's responsibility to critically examine the rules that others want to impose on them."
By Whose Standards?
Sixteen-year-old Emma Rood began visiting the library often during her early teenage years as she prepared to come out to friends and family as a lesbian. Visiting sites such as Planetout.com allowed her to chat online with other gay teens and adults who gave her reassurance and advice on dealing with friends and family. After filtering software was installed in the library, such sites were no longer accessible. Her mother, a librarian and supporter of Emma's decision, argues that access to this kind of information is necessary to assuage the alienation and depression that befall many gay youth.
Who decided that Planetout.com should be blocked? Who holds the decision-making power in the companies that manufacture filtering software? According to a report released earlier this year by Nancy Willard of the University of Oregon, conservative Christian organizations are often the ones holding the strings. Of the eight companies researched, seven have blocking categories that strongly suggest their blocking is "based on religious or other inappropriate bias." She also found that the companies make no attempt to inform school officials and other clients of their religious affiliations, and that such information can only be found through diligent research.
The connections become all the harder for school officials to make because the lists of sites that the software blocks are kept confidential. All this puts schools in a quandary as to how they should deal with the Internet.
A Better Solution
Many adults share this view, and espouse a more progressive approach to the issue. "It's impossible to filter something out completely, because there are millions of sites added every day," said Stephanie Elizando Greist of the Free Expression Policy Project. "A far better alternative, we believe, is media literacy -- teaching youth how to navigate the Internet."
The Free Expression Policy Project believes that youth should stay actively critical of what they're looking at online, rather than being "passive consumers." But media literacy doesn't just refer to the Internet. In a recent statement, the Project said that rather than looking at TV shows and video games as "neutral vehicles for information possessing some valid claim to authority or truth," students should be taught that "media 'realities' are 'constructed' -- whether to produce an adrenaline rush, sell a product, or reflect a social or cultural idea."
Haselton agrees. Instead of protecting youth from harmful content or information, he believes adults should be encouraging youth to choose for themselves. And if that doesn't work -- as groups like Peacefire have demonstrated -- youth should encourage one another to do their own filtering.
"I think the points of view that blocking software promotes are archaic," Haselton says, "and nobody's tax dollars should go towards promoting ideas that don't stand up to common sense. Why is the word 'fuck' harmful but the word 'screw' isn't? If people want to follow these rules for themselves, fine, but we shouldn't raise people to blindly accept rules [without being informed of] the reasons behind them."
Dave Kender writes, lives and works in San Francisco.
