Editors' note: This is part 2 in a series examining how Microsoft's security strategy has evolved over the past decade.
REDMOND, Wash.--A limo speeds away from Seattle's Pioneer Square carrying an unlikely party on an unusual quest.
A group of security researchers and a member of Microsoft's security response team have bonded, in search of--a haircut.
The expedition, held in September, was part of the Limo Races, a citywide scavenger hunt serving as the informal end to Blue Hat, the internal Microsoft security conference that started two years ago. The conference has become a twice-yearly event bringing some of the world's top hackers inside Microsoft's walls for two days of presentations before the software maker's executives and engineering ranks.
In the end, the team that included Microsoft's Andrew Cushman and IOActive's Dan Kaminsky failed in its mission. They found several tattoo parlors open for business, but no all-night barbers. None of them was really up for a buzz cut, anyway.
But while Cushman may have failed to win the Limo Races competition, he and his colleagues met a larger goal. Once again, Microsoft had succeeded in its twin aims for Blue Hat--becoming a more accepted part of the security community and ensuring that the people writing Microsoft's code are acutely aware of the threats facing its products.
The company has realized that security issues are about more than preventing buffer overruns and keeping up to date with the latest fuzzing tools.
"It is a really human problem," said George Stathakopoulos, the head of Microsoft's security response efforts. "The human element plays a massive role."
These days, Microsoft's security strategy is one that focuses on both people and technology. While Microsoft spends a fortune on automated testing and creating institutional processes to avoid bugs, it also spends money reaching out to its front-line engineers as well as to the security community that finds the bugs that Microsoft misses.
That attitude represents a sea change from where the company was a decade ago. At that time, Microsoft took a hands-off stance toward the security research community. In its earliest days of security issues, the company didn't even disclose the extent of vulnerabilities.
"We had almost a cold-shoulder approach," Stathakopoulos said. The idea of talking more to the outside world was controversial, prompting meetings with "many raised voices," he said.
Stathakopoulos admits his was one of the voices arguing against such transparency.
"People already think our products are bad and if we start talking about those issues more and more, people will think we are horrible," Stathakopoulos said he argued at the time. But his boss, Mike Nash, persisted, arguing that the move would pay big dividends over time.
Ultimately, Stathakopoulos was at the forefront when it came time to take the next step toward being more open. He was among those who pushed Microsoft to start working more closely with the community of security researchers known for poking holes in the company's products.
"This is where we did most of our soul-searching," he said. "Are those good guys? Are they bad guys? Are they helping or hurting?"
In the end, Microsoft concluded that while they might have different approaches, both the company and bug hunters shared the goal of a more secure computing world. Plus, Microsoft needed them as allies.
How Microsoft got its security groove on
CNET News.com's Ina Fried talks with Charles Cooper about Microsoft's long security journey.
The company relies on the outside world to report the bugs that have slipped through Microsoft's testing processes and made it into products. The company has advocated a process it calls "responsible disclosure," in which finders report flaws to a vendor and agree not to go public provided the vendor fixes the problem within a reasonable period of time.
That notion is still somewhat controversial and not universally accepted. There are some people who believe that once a bug is found it must be publicized so those at risk can protect their systems on their own, even if a patch from the vendor is not available. There are also frequent debates over what constitutes a reasonable amount of time.
And there is still something of a culture clash at Blue Hat. Inside Microsoft's conference center earlier this year, engineers sat quietly and tapped on their laptops as the invited researchers made their presentations. In the speakers' lounge, located in an adjoining building, the researchers grazed on snacks and watched their colleagues on a TV monitor.
Shane Macaulay spoke about ways of using Microsoft's new graphics engine to help visualize the way code interacts with data. At one point, he made a reference to "hookers" in the data. Although the term refers not to prostitutes, but to the practice of intercepting an application's call to the operating system, the word choice nonetheless prompted a chorus of snickers from the speakers' lounge.
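The hooking Macaulay was referring to can be shown in miniature. This sketch is purely illustrative and has no connection to Macaulay's tools or Microsoft's code: it hooks an ordinary Python function by swapping in a wrapper, the same wrap-record-forward pattern hooks use to intercept an application's calls to the operating system.

```python
# Illustrative sketch of "hooking": replace a target function with a
# wrapper that records the call, then forwards to the original.
import os

calls = []                      # log of intercepted calls

original_getcwd = os.getcwd     # keep a reference to the real target

def hooked_getcwd():
    calls.append("getcwd")      # interceptor runs first: record the call...
    return original_getcwd()    # ...then forward to the original function

os.getcwd = hooked_getcwd       # install the hook

cwd = os.getcwd()               # callers are unchanged, but now observed
os.getcwd = original_getcwd     # uninstall: restore the original
```

Callers never know the difference, which is exactly why hooking is useful both to security researchers instrumenting code and to malware hiding in it.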
Stathakopoulos wasn't among those cracking jokes, though. He was taking mental notes.
"Underestimating him would be a mistake," he whispered as Macaulay gave his talk.
Beginnings go back to Black Hat
Even though they were in separate rooms for the show, the fact the two groups are interacting at all is a sign of significant change. The seeds of today's collaboration can be traced back to 1997, when Microsoft first sent people to the Black Hat security show in Las Vegas. It took things a step further at Black Hat in 2003, renting out the Palms Hotel's ultra-hip Ghost Bar and inviting the attendees to toss back a few at Microsoft's expense.
At first, Stathakopoulos said, it was like a high school dance, with the hackers on one side and the Microsoft people on the other.
But gradually, folks began talking, and conversations continued beyond Las Vegas. Cushman said that by engaging with the security community, they were able to see that inside Microsoft there were people as passionate about security and as smart as those in the outside community. Without such interaction, it was far easier for outsiders to assume that Microsoft just didn't care and for those inside Microsoft to assume that the security community just wanted to find holes in Microsoft products that others could exploit.
Day 1: From pain to progress
Redmond's security practices have been transformed since threats like Slammer and Blaster first wormed their way onto the scene.
Day 2: Inviting the hackers inside
Aiming to be more open, company reaches out to the security research community it once kept at a distance.
Day 3: Emerging security threats
Forget widespread worms. Nowadays, limited-scale threats like targeted e-mail attacks are causing the most concern.
Editors: Anne Dujmovic, Mike Ricciuti
Design: Andrew Ballagh
Production: Kendra Dodds