August 17, 2006 4:00 AM PDT

Flaw finders to software makers: It's payback time

Bug hunters are turning the tables on software makers in the debate over reporting flaws.

In recent years, software companies have hammered out rules with researchers on disclosure, which cover how and when vulnerabilities are made public. Now flaw finders want something in return: more information from software providers on what they are doing to tackle the holes the researchers have reported.

"We have gone from the old 'full disclosure' to 'responsible disclosure' debate, to a debate over 'The vendor has the information--what does it do with it?'" said Steven Lipner, senior director for security engineering strategy at Microsoft.

Software vendors need to establish protocols for interacting with researchers who share bug information, experts said. If they don't, they could risk losing the progress that has been made towards responsible disclosure of flaws.

Many bug hunters now understand and follow the "responsible disclosure" guidelines advocated by software companies. Under this approach, a researcher who uncovers a flaw will, as a first step, contact the maker of the affected software and share details of the vulnerability.

In the past, researchers tended to favor full disclosure, in which they would publish details of security flaws they had found on mailing lists or on security Web sites, regardless of whether a fix was available.

"Researchers want the vendors to be more aggressive, and the vendors want the researchers to show more discretion."
--Gartner analyst Paul Proctor

However, companies want to keep bug details under wraps at least until a patch is ready. They argue that with a patch, users of the flawed software can plug the hole and protect themselves against possible attacks. By contrast, with full disclosure vendors are sent scrambling to fix a flaw, while customers are exposed.

"The tension has always been the same," said Gartner analyst Paul Proctor, who moderated a panel discussion on disclosure at the recent Black Hat security conference. "Researchers want the vendors to be more aggressive, and the vendors want the researchers to show more discretion. While they both have the same goal of a more secure Internet, their perspectives are different."

Brick wall
While many researchers now follow responsible disclosure practice, some feel that their conscientiousness is not being reciprocated. In many cases, they say, they run into a brick wall or get a limited response from the software maker, which shows little respect for their work.

"There is nothing more frustrating then trying to help a vendor secure its product in good faith and not getting decent communication back in return," said Terri Forslof, security response manager at TippingPoint, which sells intrusion prevention systems. Forslof is responsible for sharing flaw details with vendors through TippingPoint's Zero Day Initiative bug bounty program. Others agree: Her comments echo the sentiments expressed by many researchers at the Black Hat panel discussion.

"An open line of communication is essential."
--Michael Sutton, director, VeriSign's iDefense

There is a simple recipe for satisfying flaw finders, Forslof said. A company should acknowledge the issue; provide ongoing information on the status of a fix; and be open with the researcher about the processes involved in producing an update.

"An open line of communication is essential," said Michael Sutton, one of the Black Hat panelists and director of VeriSign's iDefense, which deals with software makers and vulnerability researchers. "It is the vendor's responsibility to proactively update the researcher on a regular basis on the progress that is being made in patching the issue."

Much progress has been made, and security researchers and software makers are working better together today than ever before, said Proctor. However, many companies need better processes for dealing with bug hunters, he said.

"I would like to see the growth of aggressive, formalized programs to work with researchers who find vulnerabilities," Proctor said.

Flaw finders who contact software vendors are typically well-intentioned security professionals, or enthusiasts who like to test software for vulnerabilities. Several companies, including TippingPoint and iDefense, pay researchers for flaws they find and use the information in products to protect their clients' systems.

Adverse effect?
But complying with researchers' requests for more information is not that easy, John Stewart, chief security officer at Cisco Systems, said during the Black Hat discussion. Acknowledging a potential flaw might have an adverse effect on security, he said.

"We can create undue attention onto something that might hurt our customers," Stewart said. "If we know, to the best of our knowledge, that there is a weakness in our product, we're attempting not to draw further attention to it."

Companies all operate differently when it comes to dealing with bug hunters. Microsoft has set a good example, accepting that it needs to work with the security community, Proctor said. "Cisco is moving from anger to acceptance, and Oracle from denial to anger," he said.

Cisco has worked hard to get into the good graces of the hacker community. It threw a party at a Las Vegas nightclub for Black Hat attendees and sent senior security staff to the event. That's in contrast to the previous year, when the networking giant sued a security researcher and alienated itself from the community to the extent that T-shirts with anti-Cisco slogans sold well at the Defcon hacker event that follows Black Hat.

Oracle appears to be easing up a little on the security front. Its chief security officer is now blogging, and the enterprise software company is talking to the press about security topics. However, it is still often criticized for its unwillingness to deal openly with researchers.

Without communication, vendors risk losing the progress made toward responsible disclosure. Turned off by a cold response, bug hunters increasingly put pressure on software companies and go public with flaws, instead of going the responsible route, said Tom Ferris, an independent security researcher in Cupertino, Calif.

"I see more researchers not work closely with vendors and just giving them a 30-day grace period before going public with the flaws," Ferris said.


11 comments

Some points to consider.
You know, I've been reading and writing in News.com religiously for over 5 years now, and it's always the same **** from you security people. Let's get some things in perspective here for a minute.

1. These bug trackers like Terri Forslof do the charity work necessary to ensure that the software people use on a daily basis remains secure, so as not to give way to hackers, viruses, and spyware.

2. It's really some ******** when you say that the bug hunters should act with discretion when the people at Microsoft, Oracle, Cisco and the rest have been known to treat bug hunters like crap. Yet these bug hunters ensure that your clients remain safe in all realms of technology.

3. You know, I like to tell it to my clients this way: every bug that is uncovered and unpatched is like leaving a window open in your house while all the doors are locked. It just doesn't make sense. When you find a bug, you fix it! You don't just shun the people who found the flaw and then shove it under the rug (Mr. CSO John Stewart).

4. Companies that create consumer and private-use software need to abide by a stricter programming model and stop being so damn lazy!

5. The public exploitation of known software flaws will force companies like Microsoft and Cisco to release fixes for their products on a real-time basis, ensuring customer security.

The message from these companies has been the same for too long, and it's about time they start taking responsibility for the software that they sold to the consumers who made them billions!

J Gund
Tech01
www.Tech01.net
Posted by OneWithTech (196 comments )
A compromise
I am all for a compromise. Here is my solution: inform the vendor of the flaw, complete with exploit code so that they can see it is real. Let them know the timeframe they have to fix it. Personally I like 7 days, but if you are generous, maybe as much as 30 days. At the expiration, publish everything, including exploit code, regardless of whether or not a patch has been released.

This will force companies to get on the ball and fix their stuff, and to do it quickly to protect customers. There is a very good chance you are not the only one to find the bug, so if it goes unpublished, only the crackers have it.

The crackers will have it. Someone else will find it. If you really want to protect people, force the vendors to fix their products, but give them a little time to do it, so at least it isn't being hit by every script kiddie on the planet.
Posted by amadensor (248 comments )
In total agreement.
You've said everything I wanted to say, and then some (which I agree with as well).

I'm thrilled to see there's another clueful individual in the IT industry who -- like me -- is sick and tired of seeing companies avoid responsibility. Should consumers of buggy software sue their manufacturers? No (I am very much against lawsuits) -- but it's high time those manufacturers accept the fact that trying to run from the mistakes they make is simply wasting everyone's time and money (including their own!).

Fix the bugs, guys. Stop dropping QA headcount whenever it comes time for layoffs -- try hiring more security-oriented QA folks. Get knowledgeable employees, not post-dot-com schmoozeboys who know how to play their social cards right. Replace those who become lazy with those who want to solve problems... and ultimately, you'll find your customers are more willing to work with you privately whenever security holes are found. But as it stands today, turnaround times are 6-9 months in some cases -- utterly atrocious.

Bug hunters (myself included) are now saying "f*** 3 months -- you've got 48 hours, and that includes weekends". It's a dose of reality, and if you don't like it, get a helmet.
Posted by katamari (310 comments )
Fooling themselves
Full disclosure is the proper way to go. I have yet to see a vulnerability that was not already known to the "bad elements" before a patch was made, or even before researchers found it. With full disclosure, the end consumer has a chance to react, since at the least you understand what activity leaves you vulnerable. A flaw in IE? Well, don't surf for a while, etc.

Today I see more people being hit by hackers because they were unaware of the issue. When full disclosure was the norm, more people were prepared and fewer were affected. Nowadays I just ignore security researchers, when I used to rely on them. It is better to just visit the hacking websites these days, since they have no problem making these vulnerabilities well known.
Posted by umbrae (1073 comments )
Full Disclosure the Only Way
Vendors of flawed or tainted security products all want time to fix the flaw.

But IT managers want to block the flaw until a fix is available.

Sometimes there is no way to block the flaw until the vendor comes out with a patch... but at least the IT managers are aware and can watch extra closely for any suspicious activity that might show a possible breach of security before the patch comes out.

Disclosure to the vendor first, until a patch is available, has been dubbed "responsible disclosure," but there are numerous bad guys out there who also find these flaws... except they disclose their findings to the underground world. That's not responsible disclosure. And in the eyes of the IT manager... that IS THE BIGGEST threat.

So by using "responsible disclosure," the IT managers are being denied the info required for them to responsibly protect their networks.

Bottom Line: The more securely applications are created... the fewer security weaknesses, and the less chance that the bad guys will find the flaws first.

Open disclosure does tip bad guys off to certain flaws, but how many of those flaws were not already known to the bad guys from the get-go?

At least full disclosure WILL REQUIRE an ASAP response from the vendor to whatever flaw is openly disclosed.

But at the same time, IT managers have the information in their hands as soon as it's discovered, allowing them to at least monitor and attempt to thwart the security weakness.

Deciding Line: Responsible and secure application vendors will gain a better reputation for having the fewest bugs, as well as creating a new deciding line where those who patch fastest are, security-wise, better off.

That's the only responsible way about this. Succumbing to the whims and requests of weak security software vendors will only tie the hands of IT managers and various other sectors of business.

If Microsoft, Cisco, Oracle, IBM or anybody else doesn't like that... then they need to better secure the products they develop. That's the only REAL TRUE solution to this problem!

Walt
Posted by wbenton (522 comments )
Free market system
The endless parade of half-baked products and "referential" wannabes really must be de-stemmed. As with everything, h/w & s/w, the moment you release anything for sale is the minute you are saying to the world you have a good, solid, viable product, and consequently it should be subjected to international scrutiny and comment.

I do believe it is time to make vendors legally responsible for the troubles they foist on the public.
Posted by DiamondBridgeCenterfuge (9 comments )
Need a combination of both
Let's put things into perspective here:

1. Security Ignorant software manufacturers produce insecure products (poorly written code, lacking much security at all).

2. Thoughtful good guys, on the lookout for the holes these Security Ignorant software manufacturers create, get their traps shut by those same manufacturers for openly spouting the flaws they find in the manufacturers' software.

3. Thoughtful bad guys, also on the lookout for these holes, get to continue hacking into the holes they've found until the good guys catch wind of the problem and report it to the Security Ignorant software manufacturers.

4. The rest of the world just wants the Security Ignorant software manufacturers to create more Security Intelligent software so that the bad guys won't be able to hack as much.

5. The Security Ignorant software manufacturers continue to churn out Security Ignorant software, which the hackers continue to hack into, creating the havoc they do.

Irresponsibility needs to be prioritized here for those who've still yet to figure it out:

#1. Security Ignorant software manufacturers MUST create more secure code... until they do, they're going to be playing catch-up at best for all the past security-ignorant software they've already released. Their job should be to fix the currently released holes first (and ASAP). And next... create better code so that it's not as easily hacked. If they do this, both the good and the bad guys won't be able to find as many flaws... that will reduce the number of flaws the manufacturers have to fix, as well as the number of hacks that occur.

#2. The good guys need to keep doing their job, but they should be paid something for doing it so well. Basically put... the good guys are just responsible bad guys working for the betterment of the software the rest of the world uses.

#3. The bad guys need to get smacked upside the head (and forced to do serious jail time) for being irresponsible and using their skills for malicious hacking activities.

#4. The General public at large needs to come down on the Security Ignorant software manufacturers by threatening NOT to purchase future products from those companies, and by pressuring them to come out with a patch ASAP.

But the only responsible way to do all of the above is to have Full Open Disclosure.

That way everybody can see what's going on above the table and there's nothing to hide... except for the hackers, of course... which hard jail time should help resolve.

If the people aren't aware of the flaws in the products they use daily... they face getting hacked. If the Security Ignorant software manufacturers don't like being pressured to drop whatever they're currently doing and offer patches for their flaws... then they need to start creating software that poses less of a security threat.

But unless Full Open Disclosure is made... the General public is NOT aware of the dangers posed to them by the Security Ignorant software manufacturers and the hackers.

Bottom line: Security Ignorant software manufacturers and hackers are in the same boat. Neither is working FOR the General public at large, but both are avidly hiding what they do behind the scenes.

So it's up to the Full Open Disclosure good guys and the rest of the General public to place the blame where it belongs.

FWIW
Posted by wbenton (522 comments )