April 10, 2003 1:18 PM PDT

Report criticizes Google's porn filters

WASHINGTON--Children using Google's SafeSearch feature, designed to filter out links to Web sites with adult content, may be shielded from far more than their parents ever intended.

A report released this week by the Harvard Law School's Berkman Center for Internet & Society says that SafeSearch excludes many innocuous Web pages from search-result listings, including ones created by the White House, IBM, the American Library Association and clothing company Liz Claiborne.

The omissions occur because of the way Google designed the feature, which can be enabled or disabled through a preferences page. Rather than relying on Google employees to review pages, SafeSearch uses a proprietary algorithm that automatically analyzes each page and makes an educated guess about its content.

That technique reduces the cost of the SafeSearch service, but it can lead to odd results. Few humans would likely have classified a BBC News report on East Timor, Mattel's site about its Scrabble game--the URL includes the word "adults"--or the Nashville Public Library's teen health issues page as unsuitable for minors. Some articles from CNET News.com and CNET Software are also invisible to SafeSearch users.

"If Google put some of its smart people on this task, they could do a much better job than they have so far," said Ben Edelman, the student fellow at the Berkman Center who performed the research. "They've got a lot of smart people. It would be shocking if their great engineers couldn't do better. The question is whether that's a priority for Google."

Google admits that the thousands of innocuous sites listed in the Berkman Center's report are invisible to SafeSearch users. But the company challenged the methodology of the study, saying that some of the sites are missing because their Webmasters employ a device called the "robots.txt" file, which is designed to limit automated Web crawlers in various ways.

Such a file might, for example, ask Web crawlers not to visit a certain area of the site because repeated visits would slow down the server considerably. Social etiquette dictates that crawlers should obey a robots.txt file. Google chooses not to include pages that use such files in SafeSearch listings because its crawler can't explore the entire site and thus, the company says, can't be expected to judge the site's content.
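
To illustrate the mechanism in general terms (this is not Google's crawler code), the short Python sketch below uses the standard library's urllib.robotparser module to check whether a page may be fetched; the site address and user-agent name are hypothetical.

    # Minimal sketch of how a well-behaved crawler consults robots.txt before fetching a page.
    from urllib import robotparser

    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # hypothetical site
    rp.read()  # download and parse the site's robots.txt rules

    # If robots.txt contained "User-agent: *" followed by "Disallow: /private/",
    # this call would return False and a polite crawler would skip the page.
    print(rp.can_fetch("ExampleBot", "https://www.example.com/private/page.html"))

A crawler that honors such rules never sees the disallowed pages, which is the gap Google cites when it leaves those sites out of SafeSearch results.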

Edelman said he was unaware of the robots.txt exclusion when he conducted the study and on Thursday revised his report, originally released Wednesday, to include a discussion of the issue. He said only 11.3 percent of the sites listed in his study are filtered because their Webmasters created robots.txt files. Those include sites from IBM, Apple Computer, the City University of New York, Groliers and the Library of Congress.

"It doesn't matter whether SafeSearch omits a site because the site has a robots.txt file or because SafeSearch is imperfect," Edelman said in an interview. "Either way, the site would have been relevant but disappears from results."

Some of the thousands of nonpornographic sites without robots.txt files that are filtered include offerings from the Vermont Republican Party, the Stonewall Democrats of Austin, a U.K. government site on vocational training and the Pittsburgh Coalition Against Pornography. News sites take a hit too, with articles from Fox News, Wired News, The Baton Rouge Daily News and some Web logs affected.

Google argues that SafeSearch is designed to err on the side of caution. David Drummond, Google's vice president for business development, said: "The design was meant to be overinclusive. The thinking was that SafeSearch was an opt-in feature. People who turn it on care a lot more about something sneaking through than they do about something getting filtered out."

Drummond said that the list of off-limits sites is created "in an automated way" without human intervention. "It looks at keywords, it looks at certain words, the content of the page, the weighting of certain words that are likely to be found on something that's a bad site," Drummond said. An employee becomes involved when Google receives a complaint about a legitimate site that should have been visible or a pornographic one that was, Drummond added.
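
Drummond's description amounts to a weighted-keyword score. The Python sketch below is only a loose illustration of that general approach; the term list, weights, threshold and function names are invented for the example and bear no relation to Google's actual system.

    # Loose illustration of weighted-keyword filtering; terms, weights and threshold are invented.
    SUSPECT_TERMS = {"xxx": 3.0, "porn": 3.0, "explicit": 1.5, "adults": 1.0}
    THRESHOLD = 4.0  # pages scoring at or above this would be dropped from filtered results

    def score_page(text: str) -> float:
        """Sum the weights of suspect terms appearing in the page text."""
        return sum(SUSPECT_TERMS.get(word, 0.0) for word in text.lower().split())

    def filtered_out(text: str) -> bool:
        """True if this purely automated filter would exclude the page."""
        return score_page(text) >= THRESHOLD

    print(filtered_out("scrabble rules for adults and kids"))  # False: one mild term scores low
    print(filtered_out("explicit xxx porn gallery"))           # True: several heavy terms add up

A scheme along these lines is cheap to run without human review, but it also suggests how the overblocking documented in the Berkman report can happen: a page that merely discusses pornography can accumulate the same trigger words as a pornographic one.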

Google is hardly alone in encountering problems when separating the wheat from the chaff on the Internet. In fact, filtering software is so problematic that Edelman, with Harvard professor Jonathan Zittrain, has made something of a career out of documenting overblocking and underblocking flaws in the programs. A federal appeals court relied on that research when deciding that Congress' attempt to force filters on public libraries was unconstitutional. That decision is on appeal to the U.S. Supreme Court.

There seem to be few consistent patterns in SafeSearch's overblocking, but one does stand out: Web pages about Edelman and other Harvard researchers who have written about filtering software's problems are themselves blocked.

"It might be difficult for an AI (artificial intelligence-based) system to figure out that this is a site about regulating pornography on the Internet instead of actual pornography," Edelman said.

Google's "SafeSearch Help" carries this disclaimer: "While no filter is 100 percent accurate, Google's filter uses advanced proprietary technology that checks keywords and phrases, URLs and Open Directory categories...Google strives to keep the filtering information as current and comprehensive as possible through continual crawling of the Web and by incorporating updates from user suggestions."
