May 26, 2004 4:00 AM PDT

Will code check tools yield worm-proof software?

When Microsoft needed help in taming the large number of flaws that had crept into its Windows operating system, it looked to technology known as "static source code checkers" and a company called Intrinsa.

News.context

What's new:
Nearly 4,000 security flaws have been found in software during each of the last two years, but software developers still don't routinely do automatic checks for such vulnerabilities. Legal ramifications, however, could change that.

Bottom line:
Several companies have gone into the business of creating and providing "static source code checkers" to handle such spot-checks. While many agree that the time is right, some say the technology's not ripe--which could make for additional costs and distract from the checking already being done.


Intrinsa's product, known as PREfix, analyzed the code created by developers and flagged potential errors. The software giant found the program so helpful, it bought the company for $60 million in 1999. Today, a handful of other developers of similar products hope to convince customers that they should be using their programs to spot-check security.
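
To illustrate the kind of defect a path-checking tool such as PREfix is built to catch, here is a minimal C sketch (the function names are invented for illustration, not drawn from Microsoft's code): on the path where the allocation fails, the copy goes through a null pointer, and a static checker can report the problem without ever running the program.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical defect of the kind a static checker flags: on the path
   where malloc() fails, `copy` is NULL and the strcpy() dereferences it. */
char *duplicate_name(const char *name)
{
    char *copy = malloc(strlen(name) + 1);
    strcpy(copy, name);          /* flagged: copy may be NULL here */
    return copy;
}

/* Corrected version: the failure path is handled explicitly. */
char *duplicate_name_checked(const char *name)
{
    char *copy = malloc(strlen(name) + 1);
    if (copy == NULL)
        return NULL;             /* propagate the allocation failure */
    strcpy(copy, name);
    return copy;
}

int main(void)
{
    char *n = duplicate_name_checked("example");
    if (n != NULL) {
        printf("%s\n", n);
        free(n);
    }
    return 0;
}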

For Microsoft, such tools have become an integral part of its Trustworthy Computing Initiative, which aims to make Windows computers more reliable. The software maker trains 20,000 developers annually in secure programming, but the tools enforce discipline on a daily basis, said Michael Howard, security program manager for the company.

"We are not seeing the same (security) issues as five years ago," he said. "We have educated people, so they understand these issues, and the tools are a lot better. People are not writing bad code. They are writing better code in the first place."

A handful of other companies have started to sell tools similar to the static source code checker used by Microsoft. Although the tools have been developed mainly by academicians intent on collecting data about software flaws, these companies think the programs are mature enough for commercial applications. Moreover, with corporate information technology managers fed up with security flaws, many are ready to adopt the technology.

The spotlight on developers has intensified in recent months with the release of a technology industry plan for better development and a report from the Business Roundtable that castigated software makers for failing to produce reliable products. Companies are reliant on the Internet, whether they're selling online, connecting to partners or just using e-mail. Yet almost 4,000 flaws have been found in each of the last two years, according to the CERT Coordination Center.

"Most of the significant cyberincidents that have harmed American business and consumers over the past several years have had as their root cause defective and readily exploitable software code," the Business Roundtable, which includes 150 chief executives from large U.S. companies, said in a four-page "Framework for the Future." "Most software development processes used today do not incorporate effective tests, checks or safeguards to detect those software coding defects that result in product vulnerabilities."

Microsoft, more than any other company, has raised the ire of corporate America for flaws in its widely used Windows operating system. Although many might dispute how successful Microsoft has been in eradicating software flaws, fewer people are questioning the company's focus on security and its acquisition of tools to lock down code.

"Bill Gates has it right, with all due respect to those who want to bash Microsoft--there is nobody that doesn't have to deal with this issue," said Steve Orrin, chief technology officer for Sanctum, the maker of a tool to check Web applications for security holes. "There was no one forcing QA (quality assurance) to think of security. That is night and day, compared to what is happening now."

Driven by the concerns of corporate customers that fear the Internet's darker denizens, companies such as Sanctum see business booming, as more businesses look for ways to check the security of the software they rely on. Many hope to vet their in-house applications, but the majority want to check products that they will ship or software that is produced by outside partners.

Sanctum, which had originally focused on creating software that could act as a barrier between online attackers and Web servers, found the interest from developers in its software's security-auditing capabilities so high that it decided to target that market.

"We evolved our whole corporate strategy over the last year toward development," Orrin said. "We have been surprised at the acceleration of behavioral change that has occurred."

What's changed is that Internet-connected businesses can no longer afford to rely on software riddled with bugs, said Mike Armistead, founder and vice president of marketing at code analysis toolmaker Fortify Software.

"We all became interconnected, which has been a productivity boom, but no one thought that you would have so many people from the outside having access," he said.

Although developers test their software for flaws today, the testing is usually structured to determine whether the software works properly, not whether intentionally improper actions can make it fail.
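
The difference is easy to see in a small test sketch (C, with invented names such as set_username; this is illustrative, not any vendor's test suite): the first assertions check that valid input works, while the last one feeds deliberately oversized input and checks that the routine refuses it rather than overflowing its buffer.

#include <assert.h>
#include <stdio.h>
#include <string.h>

#define FIELD_LEN 16

/* Toy routine under test: copies a user-supplied name into a fixed field. */
int set_username(char *field, const char *input)
{
    if (strlen(input) >= FIELD_LEN)
        return -1;               /* reject input that would not fit */
    strcpy(field, input);
    return 0;
}

int main(void)
{
    char field[FIELD_LEN];

    /* Functional test: does the routine work on ordinary, valid input? */
    assert(set_username(field, "alice") == 0);
    assert(strcmp(field, "alice") == 0);

    /* Security-style test: does hostile, oversized input fail safely? */
    char hostile[128];
    memset(hostile, 'A', sizeof(hostile) - 1);
    hostile[sizeof(hostile) - 1] = '\0';
    assert(set_username(field, hostile) == -1);

    puts("all tests passed");
    return 0;
}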

According to Armistead, software developers say, "I am not going to catch everything, and (that's OK, because) it is accepted industry practice to ship the product and let people tell me what's wrong with it."

However, not all security researchers come forward with flaws that they find. Moreover, many security experts believe that developers could become legally liable for the software bugs they don't find, especially if the tools are available to detect those errors.

That's why new products to automatically find the errors are making headway. For example, @Stake, a company that had focused on security services, now sells a tool to scan a program's binary code so that any user can test software security. Another company, Reflective, applies several different analysis techniques to scan for flaws.

"Down the road, you want everyone to be using these tools in their compilers," said David Evans, assistant professor for computer science at the University of Virginia and the creator of some of the code analysis technology used by Reflective. "It is a real embarrassment to the industry that people still produce code with buffer overflows."

A buffer overflow is a common memory error that allows online attackers to run malicious code on other people's computers. The MSBlast and Sasser worms both used buffer overflows in Microsoft's Windows operating system to spread across the Internet. Yet buffer overflows aren't new--security researchers have known about them for three decades.
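
A minimal sketch of the flaw in C (the worms themselves exploited buffer overflows in Windows services, not this toy code): the unsafe version copies a request of any length into a 64-byte stack buffer, while the bounded version can never write past the end.

#include <stdio.h>
#include <string.h>

/* Classic stack buffer overflow: if `request` is longer than 63 bytes,
   strcpy() writes past the end of `name` and corrupts the stack. */
void handle_request_unsafe(const char *request)
{
    char name[64];
    strcpy(name, request);       /* no bounds check */
    printf("hello, %s\n", name);
}

/* Bounded version: the copy is truncated to fit the destination buffer. */
void handle_request_safe(const char *request)
{
    char name[64];
    snprintf(name, sizeof(name), "%s", request);
    printf("hello, %s\n", name);
}

int main(void)
{
    handle_request_safe("a perfectly ordinary request");
    return 0;
}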

Despite the potential for these code analysis tools to help alleviate such long-standing problems, not everyone believes the technology is ready for the real world.

Dave Aitel, principal security researcher and founder of security software maker Immunity, says he does not believe that the current crop of products is up to the task. The reason: Many pieces of code are falsely labeled as flaws by the tools. Such false positives can sidetrack the developers for a long time, reducing productivity, he said.

"If it finds 500 bugs, you have to go through those 500 bugs and fix them; any false-positive rate destroys the economics," Aitel said. "Maybe in three generations, it will be economically feasible for large code bases."

Yet Aitel acknowledges that such tools are needed.

"If you look at most corporate code, it is littered with easy bugs," he said. "A lot of these really big vendors do no checking at all. There is a big market out there for something that can shoot through 30 million lines of code and catch the obvious stuff."

Another supporter of source code analyzers, Dawson Engler, believes that the tools catch enough flaws to make them valuable today.

"I think we will get better and better at finding more and more holes," said Engler, a Stanford University computer science professor who has written much on the field. Engler started Coverity, a company selling source code analysis tools, with several graduate students.

Rival company Ounce Labs intends to put pressure on software developers by empowering their customers.

The company, which hopes to launch its code analysis product in June, announced on Tuesday that it had created a boilerplate contract addendum that holds software makers responsible for guaranteeing the security of their software. CEO Jack Danahy believes that if companies start adding the wording to contracts, developers will then proactively start checking their software for flaws. And that means more customers for those that make analysis tools.

"What happens is that I don't have to accept (the software) from you, unless you make sure it is secure," Danahy said. "Security now becomes a requirement."

9 comments

Microsoft needs to clean up its own act first
Microsoft is distributing Internet Explorer infected with the Spybot 'Alexa', and it routinely infects other programs with spyware which reports user data back to only Microsoft-knows-where. Why does anyone pay attention to this company's crocodile tears about security?
Posted by landlines (54 comments )
MS: Too Big to Bury
I'm no apologist for any software company that releases vulnerable code, including MS, but it's not that helpful to use examples of bad behavior (like the IE release cited) to blanket-condemn whole companies. It's going to take years - probably decades - to turn enormous code bases like MS' around. And that assumes they keep their eye on the ball, get and stay committed to secure coding practices across all the product lines. Something similar should be happening at all ISVs, though none get the attention/scrutiny of the largest.

Every software vendor, as well as every firm that uses packaged and custom developed apps to drive its operations, has a hell of a lot of work to do to begin to secure their apps.

Let's judge them by what they do in the aggregate over the next year or two and see if they can demonstrate some measurable gains.
Posted by (1 comment )
Security vs Functionality
I am sick and tired of business requirements that often contradict each other. Any programmer can tell you that it's extremely difficult to write a piece of software that's both powerful and secure.
It's like driving a racing car at 200 miles per hour and guaranteeing it will not crash in any situation.
Most of Microsoft's problems are actually design issues. They want to write powerful software to suit their business needs.
Static code analyzer tools can help programmers do their work. But the story totally misses the point about software security as a whole. Such tools will not fix the embarrassing security holes for business people. Quit dreaming!!
Posted by (6 comments )
Communications of the ACM - June 2004 Forum
"The Threat from Within

The special section "Homeland Security" (Mar. 2004) detailed external attacks on various computer systems but did not mention that threats also originate inside those systems.

The 9/11 attackers, for instance, were trained to fly inside the system and allowed to board U.S. aircraft by that same system. Protecting U.S. computer systems requires Americans to assume those who would harm them might already be inside their development domains.

Two promising ways to examine U.S. software for security risks are path coverage analysis and concordance analysis. Programmers who discover unused paths should perform further tests or remove the suspect code. Moreover, they should generate a concordance for the code whereby high-risk words are highlighted and reviewed."
Posted by m_sadler (5 comments )
www.d50.org
Posted by m_sadler (5 comments )
Coding is a problem...but it's not the real problem...
Yes, it's appalling that commercial software is released with so many coding errors. Our knee-jerk reaction, though, is to buy expensive code checkers to reveal these flaws instead of getting at the root of the problem...why competent programmers ignore the fundamentals they learned in Coding 101. There's a bigger problem here, but you have to look under the hood to understand it.

More than 50% of the top 10 recommendations conveyed on websites and in "secure coding" classes are about quality--not security (a short sketch of two of them follows the list):
-Bound and mask input fields
-Limit inputs to buffers
-Apply rigorous error handling
-Release threads
-Clear temp data/objects
-Remove unnecessary code
-Log and audit appropriately
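
A rough C sketch of the second and third items (limit inputs to buffers, apply rigorous error handling), with made-up names, just to show how little code they actually take:

#include <stdio.h>
#include <string.h>

#define VALUE_MAX 32

/* Bound the input to the buffer and fail loudly instead of overflowing. */
int read_value(const char *input, char *out, size_t out_size)
{
    if (input == NULL || out == NULL || out_size == 0)
        return -1;               /* reject bad arguments outright */
    if (strlen(input) >= out_size)
        return -1;               /* refuse input that exceeds the buffer */
    strcpy(out, input);          /* safe: length already checked */
    return 0;
}

int main(void)
{
    char value[VALUE_MAX];
    if (read_value("timeout=30", value, sizeof(value)) == 0)
        printf("accepted: %s\n", value);
    if (read_value("an input far too long to fit in the destination buffer",
                   value, sizeof(value)) != 0)
        printf("rejected oversized input\n");
    return 0;
}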

What we should be asking ourselves is not "Why don't programmers code for security?" Instead, we should be asking what causes programmers not to code for quality.

The answer's easy: until the risks of insecure software became apparent in dollars and cents, companies considered the trade-off of quality for functionality a risk they were willing to accept to bring products to market faster than their competitors. Them days is over, boys.

Developers aren't stupid. They know what they should be doing. It's executive management and PMs that too often don't. They don't understand that coding and testing for security adds significantly to the cost and time it takes to create quality software. The programmer can't just make sure the program does what it's supposed to do; they now need to make sure it does ONLY what it's supposed to do. They can't just make sure the program recovers quickly; they now also have to code to make sure it fails securely. They can't just make sure a message reaches its destination; they need to make sure the person sending it is legitimate. All this takes extra analysis and coding, but the budgets and timelines for projects aren't expanding to accommodate it.

Management also doesn't typically realize that the need for increased security in programs requires a shift in the job role. Management says "make it secure," but what does that mean? A program that handles rocket launch codes needs to be more secure than a giveaway product demo. The only people who can make this determination are the people in the company who judge corporate risk. That's not the programmers, or the testers, or even the PMs. It's management, and without management setting clear security objectives for each product based on relative corporate risk, our project teams are stabbing in the dark.

We don't need to send our programmers to secure coding classes, nor do we have to buy costly code sweepers (though they are a good idea anyway). What we need to do is convince management the problem really resides with them--not the programmers--and until they accept the fact that building a quality anything requires appropriate time, skills and funds, our programmers are going to continue turning out insufficient code. They don't need training in remedial coding. If anything, they--and their PMs--need classes in Risk analysis and Communications, so when they talk, somebody will listen.

Bar Biszick, cissp, csqa
www.qualityit.net
Posted by (1 comment )
Bang on point, mate.
Posted by Duchman98 (2 comments )
The bad guys have tools too
The CanSecWest (www.cansecwest.com) security conference revealed some of the tools available to the "Dark Side" -- and we should all be scared.
Posted by ttul (34 comments )
Try our new static code analyzer PVS-Studio for C/C++/C++11.
http://www.viva64.com/en/pvs-studio/
Posted by Karpov2006 (1 comment )
 
