November 27, 2001 12:45 PM PST

Google may let surfers rank search results

The arms race between search engines and traffic-hungry Web sites may be headed to a new level.

Battles have seesawed for years between search engines intent on providing relevant, unbiased listings and companies seeking top placement in results, no matter what. Now Google is engaged in a controversial experiment aimed at giving its users a say in ranking the sites--a move that could help the company cement its lead in the competitive Web-search market, or could potentially weaken its position.

Whether it works or not, Google's effort is one of the boldest attempts so far to combat the rising tide of commercialism among search engines and portals. Search is one of the most widely used tools on the Internet, topped only by e-mail.

Two weeks ago, Google began quietly testing a Web page voting system that, for the first time on a large scale, could eventually let Web surfers help determine the popularity of sites ranked by the company's search engine.

If successful, the tool would add a more democratic voice to Google's search technology, which has created one of the most successful search services on the Web. It uses secret mathematical formulas to automatically rank Web sites by their page content, link structure and importance relative to other sites.

The voting feature is still in the testing phase, but it has already sparked concerns among some search experts and Web publishers over the company's ability to deliver untainted results despite the increasing commercialism of search engines. People who specialize in pushing sites into the top rankings--a technique known as search engine optimization--say the company's success has made Google a new frontier to conquer. And they assert that its system, like any other, can be outsmarted.

"Once a search engine optimizer sets their sights on a search engine, it doesn't take them long to figure out its weaknesses," said Dana Todd, co-founder of interactive agency SiteLab, who says her company uses legitimate methods to get top billing for a site.

"Google has always tried to be user aware and...known for its relevant results," she added. "But now it's starting to get junky. And in order to manage that, they need help from users."

Google's quest for unbiased results comes as search engines evolve into one of the most sought-after marketing tools on the Net. Sites including Yahoo, America Online and Microsoft's MSN openly run paid listings along with their normal results--a practice that has been targeted in an investigation by federal regulators. Google also offers special placement on its search page for advertisements related to keywords, although the paid listings are clearly marked.

Meanwhile, "spammers," who aim to capitalize on popular search terms by misrepresenting the content of a site, and their more legitimate search-optimization cousins have become fixtures of the Internet search landscape. OneUpWeb Optimization, which helps online properties land prime real estate in search engines' rankings, has a lengthy client list that includes Symantec, Kimberly-Clark and Priceline.com, according to its Web site.

In the cloak-and-dagger world of search engines, companies can use various tricks to sway the automated agents, known as spiders, that search engines use to catalog the Web. For example, some site operators try to skew search results by including hidden text or misleading keywords on pages fed to the spiders. In extreme cases, people create hundreds of thousands of "doorway pages" to serve as entry points to a Web site and drive more traffic to the page. Others will create Web pages that are elaborately interlinked to get a higher ranking on Google and other search services that list sites by link popularity.

On guard
Search engine experts said that most search engines, including Google, have come up with defenses against many techniques for manipulating search results.

"Google is really good at spam detection," said Chris Sherman, editor of SearchDay, a newsletter published by Search Engine Watch. "If anyone tries to spam in the old-fashioned way, like cramming pages with metatags, that's not going to work."

But Sherman said search engines are still trying to counter new techniques. One method, called "cloaking," sets up a dummy page stuffed with information relevant to specific keywords. The cloaked page is fed to the search engine to boost a site's ranking for terms such as "games," "sports" or "books." When surfers follow the listing, however, they see a page that is different from the one indexed by the crawler.
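In outline, cloaking amounts to branching on who is asking for the page. The sketch below illustrates the idea with user-agent sniffing; the crawler names, file names and function are invented for illustration, and real cloakers may key on IP addresses or other signals instead.

```python
# Illustrative sketch of user-agent cloaking (names are hypothetical).
# A cloaked server returns a keyword-stuffed page to search-engine
# crawlers and a different page to human visitors.

CRAWLER_SIGNATURES = ("googlebot", "slurp", "scooter")  # example crawler names

def select_page(user_agent: str) -> str:
    """Return the keyword-stuffed page for crawlers, the real page otherwise."""
    ua = user_agent.lower()
    if any(sig in ua for sig in CRAWLER_SIGNATURES):
        return "optimized_page.html"   # dense with terms like "games" or "sports"
    return "visitor_page.html"         # what surfers actually see

print(select_page("Googlebot/2.1"))  # -> optimized_page.html
print(select_page("Mozilla/4.0"))    # -> visitor_page.html
```

Because the crawler and the surfer never see the same page, the technique is hard to detect from the index alone, which is why marketers say it also hides legitimate optimization work from competitors.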

Marketers say cloaking and other tactics can be useful and legitimate tools in certain cases. Some site operators use a cloaked page to get a highly relevant site a top slot in search results and to keep outsiders, or spammers, from understanding how it landed there. But those same techniques can be abused, they admit.

"In the wrong hands, search engine optimization (techniques) can be the most dangerous weapons for spammers," said Jessie Stricchiola, director of online marketing for the Chase Law Group, a Los Angeles-based law firm. "Search engine specialists research every day to figure out how to best promote a Web site...If pornographers see that everyone is searching for 'purple sneakers,' they're going to find a way to link their Web sites to 'purple sneakers.'"

Abuse management
Signs that spam has become a greater priority for Google include a service launched several weeks ago that lets people file e-mail complaints about alleged spamming abuse. A note on the Web site warns, however, that action may not be prompt.

That service was followed two weeks ago by a new beta version of Google's Web browser toolbar, which adds smiling and frowning face icons to the usual menu of buttons. People can click on one or the other to indicate whether they like or dislike a Web page.

Google posted a link to the download of the test version of the toolbar on a forum for Webmasters in a discussion thread titled "New way to report spam."

Votes tallied will not immediately influence search results on Google, which is merely evaluating the feature for now, according to a spokesman.

"This is at a very early stage; it's definitely an experiment," said Google's David Krane, who added that spam control is just one of the potential uses of the voting feature.

Currently, Google's proprietary system ranks sites primarily by words listed on the page, terms used in a page's title or similar factors. It also weighs a page's popularity, determined by the number--and importance--of sites linking to it. For example, a page linked to 100 times from a reputable newspaper's Web site would rank higher than a page linked to 500 times from a porn site.
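The newspaper-versus-porn-site example can be sketched as weighted link counting. Google's actual formula is secret, so the scoring function and importance values below are assumptions purely for illustration:

```python
# Hedged sketch of link-weighted popularity (not Google's actual formula).
# Each inbound link contributes the importance of the site it comes from,
# so fewer links from reputable sites can outweigh many links from
# low-importance ones.

def link_score(inbound_links):
    """inbound_links: list of (link_count, source_importance) pairs."""
    return sum(count * importance for count, importance in inbound_links)

# 100 links from a reputable newspaper (assumed importance 0.9) beat
# 500 links from a low-importance site (assumed importance 0.1).
newspaper_linked = link_score([(100, 0.9)])
porn_linked = link_score([(500, 0.1)])
assert newspaper_linked > porn_linked
```

The key design point is that importance propagates: a link is not a flat vote but a vote scaled by the reputation of whoever casts it.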

The new feature would give Google insight into the popularity of a site based on opinion, not link structure. For example, the search engine could surface a page that has few sites linking to it but is extremely popular with consumers.

"This gives ordinary Web users the ability to tell a search engine what they like and don't like, and we don't have that interaction now," said Danny Sullivan, editor of Search Engine Watch, a search engine industry newsletter and Web site.

But as Google toys with new, nonmathematical checks, it has unleashed concerns that it will simply open itself up to new problems.

"There are people that will go to great lengths to try to sway results and this gives them another method of doing that," Sullivan said. "People who like their Web sites are going to go to their own site and punch the happy-face button about a billion times, and people who want to hurt their competitors will go to their competitors' sites and punch the unhappy-face button."

He compared the beta technology to that of Ask Jeeves' DirectHit search service, which ranks Web pages by the number of clicks they receive. Some Web site operators claim they have developed technology to boost a site's ranking by engineering clicks.

Placing greater emphasis on human input could mark a significant turning point for Google, which until now has relied primarily on mathematical formulas and automation.

Google's Krane, however, noted that the company has solicited user feedback in the past in the form of focus groups, for example.

He added that the voting system takes into account abuses such as repeat clicks that attempt to "noodle" the system. Rather than using the votes to tinker with the specific rankings of particular pages or sites, he said, the feature would most likely be used to bolster the relevance of overall results.

"It will most likely have more of an aggregate impact," Krane said. "We have indexed more than 1.6 billion Web pages, so it is extremely inefficient to go after individual pages."
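Krane's aggregate-impact approach can be sketched as a tally that counts each user once per page, blunting the "billion happy-face clicks" attack Sullivan describes. Google has not disclosed its actual mechanism; the one-vote-per-user rule and the data shapes here are assumptions for illustration:

```python
# Hedged sketch of aggregate vote tallying that discounts repeat clicks.
# The one-vote-per-user-per-URL rule is an assumed anti-abuse policy,
# not Google's documented behavior.

from collections import defaultdict

def tally(votes):
    """votes: iterable of (user_id, url, value) where value is +1 or -1.
    Counts only each user's first vote on a given URL."""
    seen = set()
    totals = defaultdict(int)
    for user_id, url, value in votes:
        if (user_id, url) in seen:     # repeat clicks from the same user are ignored
            continue
        seen.add((user_id, url))
        totals[url] += value
    return dict(totals)

votes = [("u1", "a.com", 1), ("u1", "a.com", 1), ("u2", "a.com", -1)]
print(tally(votes))  # {'a.com': 0}
```

Under this scheme, a site owner hammering the happy-face button adds only one vote, while the aggregate signal across many distinct users still accumulates.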

 
