August 14, 2006 4:00 AM PDT
Perspective: Why Internet security continues to fail
Indeed, free-market financial interests and an unhealthy complacency by vendors and customers alike continue to overpower sound security logic and practices. Even though many companies conduct cutting-edge research into technological security measures, the IT world continues to endorse a technology-centric approach to information protection. This has created a security planning problem for information-based organizations.
Customers forget that the technologies of protection are only as reliable and resilient as the underlying infrastructures they are meant to protect. Failing to acknowledge or fix an infrastructure plagued with problems raises serious doubts about any security product's ability to function on such a foundation. Placing more complexity on top of existing (and flawed) complexity does not increase protection; it merely fosters a false sense of increased protection.
While helping to define the software industry, software licensing provisions shield vendors from legal and fiduciary responsibility for problems arising from their products. They also offer corporate protection rarely found in other industries. The technology industry apparently is the only business engaged in regular commerce that's immunized against gross negligence in the design or manufacture of its products.
Yet tolerating and implementing "good enough" products from unaccountable vendors is the accepted norm--and any resulting loss incurred by customers is deemed the price of doing business in the Information Age. How such practices contribute to stronger Internet security remains unclear.
Concerns about protecting information helped establish several security evaluation criteria in recent years. From HIPAA and FISMA to SOX and GISRA, nothing signifies a successful information protection program today better than publicly trumpeting compliance with any of these government auditing and certification standards.
But if these standards are considered minimal requirements for security success, why are few--if any--security chiefs fired when their programs fail to stand up to challenge? (The current penalty for such failures seems to be little more than rhetorical admonitions to "do better next time"--as evidenced regularly during the annual Federal Computer Security Report Card evaluations.)
Today's technology procurement cycle requires customers to upgrade their products and remain current with their vendors' supported product lines (and revenue goals) by routinely replacing one "good enough" product with another of equal standing. The new features may be neither needed nor wanted. At that point, like it or not, customers are saddled with unfamiliar products and potentially significant, unknown costs to their networks and organizations.
Customers must reverse this practice by upgrading when they--not their vendors--deem it necessary. By doing so, they can also terminate their roles as unsuspecting beta testers for allegedly completed products and shake off a dependency relationship that's more akin to the psychological condition known as Stockholm Syndrome than to sound business practice. The time to test software is before, not after, customer deployment.
The current state of insecurity clearly is not acceptable, and customers should be demanding serious improvements. But if real change is to occur, we must look beyond technology innovations; what we know now is that the major obstacle to significant progress toward sound information security is cultural, not technical.
Richard Forno is a principal consultant for KRvW Associates, a Washington D.C.-area security consultancy. The views expressed in this article are his own.