October 21, 2002 4:00 AM PDT
IT's exercise in utilities
Until recently, most big companies didn't have a good economic reason for considering so-called "utility" computing services. Now, given the slender IT budgets and cost-cutting at many of those same companies, proponents say the concept is getting renewed attention.
Jim Mathison, vice president of IT for leasing company GATX Capital, has already taken the plunge. He gets a monthly statement from Hewlett-Packard that would seem familiar to most: It "looks like a phone bill," he said.
GATX outsources its data center to HP managed services, which oversees the bulk of the company's IT infrastructure.
Mathison said the decision to go with utility computing came down to economics: The company didn't want to pay to set up and maintain infrastructure that ran full tilt for two weeks, only to go silent for the rest of the month.
"I'm just trying to buy what I need when I need it," Mathison said. "And that goes for bandwidth and people."
Utility computing describes a system that lets companies pay for IT services as needed, much as GATX is doing. Rather than buying their own servers (or additional machines) and hiring the maintenance staff associated with them, a company simply purchases server time or processing power from another company, which also takes care of upkeep and other such concerns. Services companies expect to charge customers the same way an electric company does: When usage spikes, so does the bill.
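The electric-bill analogy can be made concrete with a small sketch. This is a hypothetical pricing scheme, not any vendor's actual rates: a flat base fee covers an included allowance of processing time, and usage spikes beyond the allowance are billed at a premium.

```python
# Hypothetical metered billing for utility computing (illustrative rates only):
# a flat base fee, a per-unit rate for usage within an included allowance,
# and a premium rate once monthly usage spikes past that allowance.
BASE_RATE = 500.00          # assumed flat monthly fee for managed services
RATE_PER_CPU_HOUR = 0.10    # assumed per-unit charge within the allowance
INCLUDED_CPU_HOURS = 1000   # assumed usage allowance covered by the base fee
OVERAGE_MULTIPLIER = 1.5    # assumed premium on usage beyond the allowance

def monthly_bill(cpu_hours: float) -> float:
    """Return one month's charge for the given metered usage."""
    within = min(cpu_hours, INCLUDED_CPU_HOURS)
    overage = max(cpu_hours - INCLUDED_CPU_HOURS, 0)
    return (BASE_RATE
            + within * RATE_PER_CPU_HOUR
            + overage * RATE_PER_CPU_HOUR * OVERAGE_MULTIPLIER)

# A quiet month versus a month with a two-week processing burst:
print(monthly_bill(400))    # 500 + 40 = 540.0
print(monthly_bill(2000))   # 500 + 100 + 150 = 750.0
```

Under this kind of scheme, a customer like GATX pays little in idle months and more in busy ones, instead of financing enough hardware to cover the peak year-round.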
Although analysts and technology buyers don't agree on when the utility computing movement will reach critical mass, most say the ball has at least begun to roll.
"We are starting to see a lot of action in this space," said Laurie McCabe, an analyst with Summit Strategies. "Companies are looking at their IT investment and wondering if they've gotten a payback. If the answer is no, on-demand computing becomes pragmatic."
Some companies are beginning to take the plunge. In its third-quarter earnings report last week, IBM--seen as the leader in commercializing the utility computing concept--said e-business hosting sales were up 20 percent, courtesy of its e-business on-demand products.
Big Blue says it now offers a wide range of utility computing options, such as Linux Virtual Services, where customers consolidate Unix and Intel systems on a network-based Linux server, paying only for what they use.
Another example is what IBM calls "Leveraged Procurement," where Big Blue procures parts and goods, and the customer pays by the purchase order. That way, customers don't have to build a procurement network of their own.
Earlier this month, IBM named a major customer for its Linux Virtual Services: Exxon Mobil's Mobil Travel Guide. IBM said customers such as American Express and Dow Chemical are also experimenting with the utility concept.
Other vendors are quickly moving toward utility computing. Sun Microsystems has outlined some milestones for its N1 software, which would create virtual mainframes out of a company's existing computer resources. Sun considers the strategy a component of utility computing.
In addition to providing services to companies such as GATX, Hewlett-Packard has signed a $1.5 billion outsourcing contract with Canadian Imperial Bank of Commerce that could lead to utility computing arrangements, said an HP representative.
What's still needed
Despite the recent advances, though, plenty of technology issues need to be worked out before large companies embrace the utility model, analysts said.
Like waves of industry advances that have come before it, utility computing has seen its early days overshadowed by differences in terminology and big promises. Tech vendors don't necessarily define utility computing the same way, and they're prone to put hype before reality.
"We've been hearing about this for a while, and now it seems like everyone has a strategy," said Galen Schreck, a Forrester Research analyst. "But the day I call IBM and tell them 'I need a few more MIPS (millions of instructions per second, a measure of raw computing power) on the grid' is still a ways away."
Also, some of utility computing's technological underpinnings are still in the early stages of acceptance by big companies. These notably include grid computing--a way to weave computing hardware together--and Web services, a flexible software-linking technology.
Furthermore, once everything is tied together, there will have to be industrial-strength management tools to monitor usage and move computing power to where it's needed most.
And those hurdles may be the easy part. Some chief information officers could be reluctant to outsource key business functions because they've heard the tech sector's big pitches before. After all, utility computing sounds a lot like the application service provider (ASP) movement--which was supposedly going to let customers use software whenever they needed it. IBM disputes the comparison, noting that utility computing is a much bigger concept.
"ASPs focused on the smallest part of the problem, running applications and hardware efficiently," said Devajit Mukherjee, VP of strategy and marketing for IBM's e-business on demand initiative. "Utility is about delivering a process and doing it at scale. It's about integration."
Nevertheless, analysts say the move to utility computing--like the adoption of most new technologies in the corporate computing world--is real, although it will be slow and deliberate.
Today, some customers, such as GATX, are leasing or buying hardware and paying based on what capacity they use. The bulk of on-demand computing deals fall into this category.
But such dressed-up server leasing may be just the beginning. It may open the door for the likes of Sun and HP to push managed services down the line, ultimately letting them truly function like a utility company.
Behemoths such as IBM and EDS are also bulking up with additional services to bolster their utility computing offerings. Both companies made big moves recently: IBM purchased PricewaterhouseCoopers' consulting arm to pump up its services division, and EDS licensed technology from Opsware (formerly Loudcloud) to manage what could be the beginning of utility services.
The utility computing pieces are falling into place, and many programs are in the testing phase. Analysts say there's a lot of interest, but no groundswell yet.
"It's debatable how quickly the market will shift," said Jeff Kelly, president of hosting services for EDS. "Customers are asking questions and appear to me to be evaluating when the time would be right (to adopt utility computing)."
IBM's Mukherjee said the future is closer than many realize. He said the IT bill of the future will look a lot like an electricity or phone bill--a standard rate for set services with itemized payments for things such as bandwidth and data center usage.
"There will be a phasing-in process," said Mukherjee. "But IT will be a centralized utility where the value...will come from plugging it all together."
Indeed, the phasing in has already begun because customers are increasingly torn between investing in new technology and cutting back on spending. According to Joe Hogan, VP of marketing for HP managed services, utility computing is likely to kick in as companies add new variable-pricing clauses to existing outsourcing contracts. "Things will be combined, with some old and some new," said Hogan.
"Most traditional outsourcing deals will have utility elements at some level," said EDS's Kelly. "It won't be in the true utility format, but it will be a start."
On the hardware side of the equation, sales are increasingly set up so customers can quickly scale hardware capacity up or down. With variable-pricing leasing, hardware can be a vendor's ticket to bigger services deals. "By giving flexibility in hardware you can eventually migrate toward more managed computing," said Hogan.
GATX's Mathison estimates that the IT utilities of the future will make money by running outsourced IT departments efficiently and by charging customers a premium. Customers won't mind, because they'll still be saving money by not maintaining their own computing infrastructure.
"We wanted full-time skills without full-time employees," said Mathison. "It was easier to rent part of the infrastructure, because they manage it better than we do."
One step at a time
GATX appears to be ahead of the game. Financial services companies, which invest heavily in IT and have standard business practices, are likely to try utility computing before other industries. Telecommunications companies are also expected to be big fans of utility computing, analysts said.
Most customers, however, will take a hybrid approach.
Memphis City Schools, a Tennessee district with 200 remote locations, most of them schools, found that managing its wide area network had quickly become overwhelming, said Linda Mainord, director of information technology.
Mainord said IBM's Tivoli software was being used to manage the network, but the set-up "lacked the human resources."
Now the school district rents Tivoli as a service from IBM, allowing Mainord to manage the network and roll the system out quickly. "If I tried to do it myself, it would have taken three years," she said.
The school district pays IBM a flat monthly fee, but anticipates more utility-like billing as the network expands.
Until technology buyers become more comfortable with the utility computing idea, many installations may look like the Memphis City Schools setup: piecemeal affairs, reminiscent of the ASP movement that came before.
"You will see a lot outsourcing in the future, but there are barriers to break down," said Forrester's Schreck. "Will companies outsource mission critical applications and share grids and computing power? Changing cultures will take years and years."
Ultimately, the success of utility computing will hinge on good service and trust. "A lot of this will depend on the service you get," said Mathison. "You want to know you will save 'x' amount of dollars without service degradation."