April 13, 2007 4:00 AM PDT

Bug hunters face online-apps dilemma


A case in point: Eric McCarty, a security professional, was sentenced in January to six months of house arrest and three years of probation, and was ordered to pay $36,761.26 in restitution to the University of Southern California. McCarty pleaded guilty to hacking USC's online application system but argued that he had acted to get the system secured.

In the U.K., Daniel Cuthbert was ordered to pay about $1,750 for breaking into a Web site collecting donations for victims of the 2004 Asian tsunami. Cuthbert said he decided to check the security of the site because he feared he had fallen for a phishing scam.

But not all Web site owners will report security researchers to law enforcement.

"White-hat hackers are generally doing us a service," said Christopher Blum, security director at NetSuite, a San Mateo, Calif.-based provider of online business applications. He offered a caveat, however: Those hackers are providing a service if a vulnerability is reported privately, the company's operations weren't disrupted and customer data wasn't exposed.

There have been two instances in which a researcher reported a vulnerability to NetSuite, Blum said. In both cases, the problems were fixed and the individual wasn't prosecuted. "Responsible disclosure is a good practice and leads to better quality software," he said.

Others, including Google and Yahoo, also support "responsible disclosure" of vulnerabilities. Under this approach, advocated by software and Web companies alike, researchers who uncover a flaw will not publicly disclose the problem. Instead, they contact the maker of the affected product and share details, so that the company can fix it.

The best way for a Web bug hunter to hack without fear is to ask a target company for permission, legal experts said. NetSuite's Blum said his company would likely grant such a request, though with some strings attached. Many other companies, however, may not be inclined to allow somebody to poke around, said Sima of SPI Dynamics.

"Security through obscurity is helpful," he said. "I am not just going to open things up and give hackers the ability to go through my application."

Web companies could set up a copy of their applications for ethical hackers to probe. That way the main system wouldn't be disrupted and real customer data would not be at risk. While Sima's objections also apply to this approach, some security researchers do like it.

"This is a great idea," said Billy Hoffman, a lead researcher at SPI Dynamics. "A properly isolated mirror doesn't expose them to a larger security risk. The costs to the company are reasonably small and potential gains are huge. The smart ones would even offer a bounty on bugs that were found and properly disclosed."

Web companies, like traditional software companies, do hire security firms to conduct audits. NetSuite, for example, uses Fortify Software's tools to scan its code for bugs and pays Ambiron Trustwave for a monthly scan of its Web site, Blum said. However, there are always more bugs to be found, experts noted.

While security researchers may have the law against them, other laws are helping security, Seltzer noted. Data breach notification laws in particular are forcing organizations to tighten their security, she said.

"Companies start to realize that part of what they are selling to users of their applications is trust," Seltzer said. "Things like data breach notification laws will shine more light on security problems and make security along with privacy an element of what users are considering when they are evaluating sites to do business with."

Still, because security researchers can't freely probe Web applications, consumers are at risk, Seltzer said.

"If the only people who can investigate security are the gangs of foreign teenagers stealing credit cards and not those in the U.S. who would like to help the credit card holders shield themselves against these thefts, there is an imbalance," she said.


1 comment

Learn the difference
There have always been hackers and there have always been crackers.

Hackers are the good guys while crackers are the bad guys.

But laws and regulations don't differentiate between them, putting the good guys on the same bandwagon as the bad guys.

When you start handcuffing the good guys out of consideration for the bad guys... you'll always end up deeper in the $#!% hole than you started!!!

It's just common sense.

FWIW
Posted by wbenton (522 comments)
