September 6, 2005 4:00 AM PDT

Bug hunters, software firms in uneasy alliance

Tom Ferris is walking a fine line. He could be Microsoft's friend or foe.

Ferris, an independent security researcher in Mission Viejo, Calif., found what he calls a serious vulnerability in Microsoft's Internet Explorer Web browser. He reported it to the software giant on Aug. 14 via the "secure@microsoft.com" e-mail address and has since exchanged several e-mail messages with a Microsoft researcher.

Up to that point, Ferris did everything according to Microsoft's "responsible disclosure" guidelines, which call for bug hunters to delay the announcement of security holes until some time after the company has provided a fix. That way, people who use flawed products are protected from attack, the argument goes.

Last weekend, however, Ferris came close to running afoul of those guidelines by posting a brief description of the bug on his Security Protocols Web site and talking to the media about the flaw. So far, the move has done little more than raise some eyebrows at Microsoft.

"I am walking a fine line, but I am doing it very carefully because I am not disclosing actual vulnerability details," Ferris said. "I do this to inform users that flaws still do exist in IE...I don't like it that Microsoft tries to give users a nice warm feeling that they are disclosing everything researchers report to them."

At issue is the push for "responsible disclosure" of software flaws by many industry players, including titans such as Microsoft, Oracle and Cisco Systems.

Microsoft publicly chastises security researchers who don't follow its rules. Also, those researchers won't get credit for their flaw discovery in Microsoft's security bulletin, which is published when the company releases a patch. Because Ferris did not disclose any actual vulnerability details, he's still on Microsoft's good side, a company representative said.

While many software makers promote responsible disclosure, it isn't universally backed by the security community. Critics say it could make software makers lazy about patching. Full disclosure of flaws is better, they say, because it turns up the heat on software makers to protect their customers as soon as possible.

How long is too long?
"Microsoft obviously takes way too long to fix flaws," Ferris said. "All researchers should follow responsible disclosure guidelines, but if a vendor like Microsoft takes six months to a year to fix a flaw, a researcher has every right to release the details."

By that time someone else, perhaps a malicious person, may also have found the same flaw and might be using it to attack users, Ferris said.

Often lambasted for bugs in its products, Microsoft is doing its best to win the respect of the security community. The company has "community outreach experts" who travel the world to meet with security researchers, hosts parties at security events and plans to host twice-annual "Blue Hat" events with hackers on its Redmond, Wash., campus. At Blue Hat, hackers are invited to Microsoft's headquarters to demonstrate flaws in Microsoft's product security.

"Security researchers provide a valuable service to our customers in helping us to secure our products," said Stephen Toulouse, a program manager in Microsoft's security group. "We want to get face to face with them to talk about their views on security, our views on security, and see how best we can meet to protect customers."

Many companies are getting better at dealing with security researchers, said Michael Sutton, director of iDefense Labs, which deals with researchers and software makers. "The environment has definitely changed from two or three years ago, though there are vendors who are going in the opposite direction," he said.

While Microsoft sometimes is still referred to as the "evil empire," it appears to be successfully wooing security researchers.

"We are at the point where all the obvious things we tell Microsoft to do, they already do it," Dan Kaminsky, a security researcher who participated in Microsoft's first Blue Hat event last March, has said.

Balancing act
Other technology companies still struggle with hacker community relations. Cisco especially has managed to alienate itself from the hacker community to the extent that T-shirts with anti-Cisco slogans were selling well at this year's Defcon event. Oracle also isn't a favorite, researchers said.

Cisco, along with Internet Security Systems, last month sued security researcher Michael Lynn after he gave a presentation on hacking router software at the Black Hat security conference. The company had previously tried to stop Lynn from giving his talk in the first place.

"It was definitely a surprise to see Cisco's reaction," iDefense's Sutton said. "I don't think that's the best approach. I do feel that it is happening less and that vendors are realizing that we don't want to work against them, but with them."

Cisco contends it doesn't have any beef with Lynn's discoveries...


8 comments

Excuses
Ferris says he has a right to break or bend the rules if Microsoft takes six months to fix a flaw, but he reported the one he is talking about on August 14. Must be a problem with my calendar.

Since Microsoft tries to issue patches once a month, perhaps six weeks would have been a reasonable wait before he used his discovery to promote his website.

Doesn't bother me anyway as I am using Opera!
Posted by Andrew J Glina (1673 comments )
Simple Common Sense Answer...
The negotiation of detente over who discloses information about a software flaw, and when, is an embarrassing waste of effort.

Software will always have some sort of flaws, be they mechanical, logical, or whatever. Since you know this a priori, and also that there's a tendency to exploit such flaws to undesirable ends, it behooves all involved to have in place a contingency for dealing with the exploit. It really doesn't matter whether it's brought to the fore by public disclosure in a news article, accidental disclosure in a web forum, or by release of self-propagating exploit code.

One need only assume the worst possible scenario and plan for that. This applies not only to software vendors, but also to consumers, who must be prepared to take things (apps and even hardware) off-line on a second's notice, if necessary.

The most dangerous exploits are not those that are publicized or cause obvious problems; they are exploits that operate by silently transmitting, redirecting, and modifying information. A company whose network is slowed to 10% of its speed is inconvenienced, but a company whose research is quietly transmitted to a third party for resale, without any hint, is more than inconvenienced.
Posted by Gleeplewinky (289 comments )
Need for independent (and secure) disclosure
We have a very tricky situation when it comes to security holes and their patches. If we don't watch it, there will be chaos in and through the patent office.

One hundred years ago, if you created a widget you could patent it. If someone found a way to improve your widget, they could patent the improvement. This allowed them to sell their improvement, but not the widget that it improves. (Check your landline telephone; it probably has a dozen patents listed.)

Now, in the land of software widgets, we see this occurring: a gentleman found a flaw and reported the fix to the company, which tried to take advantage of the good man by trying to patent his idea (the fix to their problem). It appears that persons who find flaws AND create repairs should patent the fix so the company doesn't screw them.

When Eli Whitney patented the cotton gin (an enGINe that removed seeds from cotton), others tried to make subtle changes that did the same thing. The courts ruled in favor of Mr. Whitney because all the other versions had the same "look and feel" (my interpretation).

Should anyone else try to make a patch, it surely will have the same "look and feel," and thus comes the quandary.

How do we keep holes hushed up until a fix is found but protect the rights of the discoverer and get the consumer the fix in the quickest possible way?

If we don't watch it, we will create an industry of patent fixes being sold to the 'fools' gullible enough to buy products that should be code-named "Swiss Cheese."

My thought is a non-profit organization that acts as a conduit for security problems and has a set schedule for disclosure. A problem that is submitted with an acceptable fix (how do we determine "acceptable"?) should have a faster release schedule than one without a fix (days instead of weeks). There should be few if any reasons to allow extending a set schedule of: report to the company; give two or three weeks to repair; publicly announce the need to patch; and a week later, explain the problem.
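The timeline above can be sketched in a few lines. This is a minimal illustration of the commenter's proposed schedule, not any real organization's policy; the function name and the exact day counts (one week with a fix, three weeks without, plus one week to full details) are assumptions drawn from the figures in the comment.

```python
from datetime import date, timedelta

def disclosure_schedule(reported: date, fix_included: bool) -> dict:
    """Compute disclosure milestones from the date a flaw is reported.

    A report that arrives with an acceptable fix gets a shorter repair
    window (days instead of weeks), per the proposal above.
    """
    repair_window = timedelta(days=7 if fix_included else 21)
    return {
        "reported to company": reported,
        "public 'patch needed' notice": reported + repair_window,
        "full details published": reported + repair_window + timedelta(days=7),
    }

# Example: a flaw reported Aug. 14 with no fix attached.
for milestone, when in disclosure_schedule(date(2005, 8, 14), fix_included=False).items():
    print(f"{milestone}: {when}")
```

Under this sketch, Ferris's Aug. 14 report would be publicly flagged as needing a patch in early September, with full details a week later, well short of the six months to a year he complains about.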

The public should also have access to the number of items in each category in real time, and even each company's turnaround time for repairs (two days is better than two weeks).

Unfortunately, it probably should be funded entirely by tax dollars (and I am a believer in smaller government), since it affects all Americans to some degree. Partial funding should be directed according to customers affected (problems times users = affected); this will push the companies toward better products. And since it is all software related, partial funding by taxing software itself is probable. Remember, even a game has the ability to affect security (though it's unlikely); this will hit consumers in proportion to their spending.

There could even be a rebate program for the companies with the fewest problems per customer, to encourage them to produce better products. (Fair implementation, or rather its inability to be fairly implemented, will probably kill this idea.)

The key here is independence. Underwriters Laboratories and the Federal Communications Commission (UL, found on all electrical equipment, and the FCC, regulating the airwaves) are two examples that come to mind. They (especially UL) regulate without dictating beyond their scope: UL prevents fire hazards without saying what you can manufacture, and the FCC keeps you from stomping on another's signal.

Likewise, we need an agency that encourages quick repairs while protecting everyone's rights.
Posted by qazwiz (208 comments )
Interesting...
So, if you want to sink <insert vendor name here>, look for bugs in their software, patent the fix, then refuse to license it. Either they don't fix the problem, or they willfully violate the patent and earn you triple damages. Suddenly the wisdom of software patents shines through...
Posted by Gleeplewinky (289 comments )
If I build you a house...
...and the wood I used has bugs and I knew it, it would be my responsibility to fix the problem.

If a software maker is creating software with the knowledge that there could be problems, then it is their responsibility to fix the problem.

I can't make the taxpayers fix your house because of my workmanship, so the same scenario works for software makers too.

Take responsibility for your problems; don't pawn them off on someone else.

~Justin~
Posted by OneWithTech (196 comments )
Software Makers Responsibility
When you create a piece of software, whether it be a database program or an operating system, it is the responsibility of the manufacturer to maintain updates and security fixes.

When an independent security researcher finds a flaw and reports it to the company, his job is done. Now it is up to the software manufacturer to ensure that their software is not compromising critical data.

In the case of Microsoft, all of this is true. For the largest software company in the world, with the smartest people in the world, you would think that they could find their own flaws and fix them within days or even hours.

But this is not the case. As stated in the article by more than one security researcher, software companies like Oracle and Cisco get angry with independent researchers for exposing the flaws publicly. Why?

These researchers are doing the job of the software manufacturers and getting little if any credit, and no monetary reimbursement for their time. Yet these independent researchers do this to ensure public safety.

Companies like Microsoft, Oracle, and Cisco have plenty of money and, once again, "the smartest people" working for them. Yet in the scurry to release the newest piece of software, they tend to forget the upkeep on titles they already own.

As it stands, the public release of software flaws is the only way to get these companies to spend time on patches to ensure the security of the program.

With that said, maybe it's time for Microsoft and the rest of them to start finding their own flaws and releasing patches before the general public finds out. To me this is just simple logic! To them, the software companies, it tends to be more about money than security.

Kudos to all the researchers who expose these flaws, because it's the only way that our society will be able to use technology without fear of losing your identity or, even worse, your life.

~Justin~
Posted by OneWithTech (196 comments )
Yahoo IE7 Bug Security Risk
I am not sure if this is a known issue yet, but I encountered it yesterday.

This is how I reproduce it:

I updated my browser to IE 7. I sign in using my login and password and check my Yahoo mail. I sign out using the sign-out link and get the "sign out complete" page, with a "return to Yahoo mail" link at the top. On clicking the "return to Yahoo mail" link, I am redirected to my inbox without re-logging in. This is a huge security concern: if someone thinks he is logged out and leaves the system, anyone can get back into his mail by clicking the "return to Yahoo mail" link.
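The behavior described reads like a classic logout bug: the "sign out" step redirects the browser but never invalidates the session on the server, so the old session token still works. Here is a minimal, hypothetical sketch of that failure mode (none of this is Yahoo's actual code; the function names and the in-memory session store are invented for illustration):

```python
# In-memory session store: token -> username.
sessions = {}

def sign_in(username: str) -> str:
    # Real systems issue an unguessable random token; this is a sketch.
    token = f"token-for-{username}"
    sessions[token] = username
    return token

def broken_sign_out(token: str) -> str:
    # Bug: only the browser is sent to a "signed out" page;
    # the server-side token is never invalidated, so it still works.
    return "sign out complete page"

def correct_sign_out(token: str) -> str:
    # Fix: destroy the session server-side as part of signing out.
    sessions.pop(token, None)
    return "sign out complete page"

def read_inbox(token: str) -> str:
    user = sessions.get(token)
    return f"{user}'s inbox" if user else "please log in"

token = sign_in("alice")
broken_sign_out(token)
print(read_inbox(token))   # still shows alice's inbox: the bug
correct_sign_out(token)
print(read_inbox(token))   # now asks for a login: the fix
```

If this is what is happening, the "return to Yahoo mail" link is just replaying a session token the server never killed, which would explain landing back in the inbox with no password prompt.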
Posted by realraghu (2 comments )