October 5, 2004 4:00 AM PDT

Where's the simplicity in Web services?

Has Web services, the technology intended to simplify programming, gotten too complex?

A debate is raging over whether the number of specifications based on Extensible Markup Language (XML), defining everything from how to add security to where to send data, has mushroomed out of control. Defenders of advanced Web services specifications say they are needed to ensure that new computing architectures are flexible enough to accommodate both sophisticated and smaller-scale applications. Detractors say that simpler application development methods are good enough.

The rallying cry for people who favor simplicity is a technology approach called REST, or Representational State Transfer, a method of building applications by sending XML documents over existing Internet protocols. This allows programmers to construct applications with existing tools and computing infrastructure, notably HTTP (Hypertext Transfer Protocol).
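The approach can be illustrated with a short sketch. The following Python fragment (the catalog.example.com URL and the element names are invented for illustration) fetches a resource as a plain XML document over HTTP and pulls a value out of it using nothing beyond the standard library--the "existing tools and computing infrastructure" REST proponents point to:

    import urllib.request
    import xml.etree.ElementTree as ET

    def fetch_price(isbn):
        # The resource is named by a URL; the request is an ordinary HTTP GET.
        url = "http://catalog.example.com/books?isbn=%s" % isbn
        with urllib.request.urlopen(url) as response:
            document = response.read()           # the reply is a plain XML document
        root = ET.fromstring(document)
        return root.findtext("price")            # assumes the reply carries a <price> element

    if __name__ == "__main__":
        print(fetch_price("0596007973"))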

News.context

What's new:
The complexity of Web services is raising hackles. A vocal minority wants the more complicated protocols to give it a rest in favor of a simpler approach.

Bottom line:
Analysts say REST, a Web services alternative that sends XML documents over existing Internet protocols, is suited for relatively simple applications. But businesses that want the benefits of the flexible systems design known as a service-oriented architecture should adopt Web services.


The long-running dispute has even drawn in some of the technological fathers of Web services. Tim Bray, co-inventor of XML and director of Web technologies at Sun Microsystems, said recently that Web services standards have become "bloated, opaque and insanely complex."

At stake is whether, or how quickly, customers will continue to invest in emerging Web services software--considered the foundation of modern computing systems--to replace older methods of wiring business applications together. Researchers at the Radicati Group last week forecast that the market for Web services-related software and services will balloon from $950 million this year to about $6.2 billion in four years.

An attempt at flexibility
The term "Web services" emerged about four years ago to describe a set of software specifications, or blueprints, designed to make incompatible programs communicate over Internet protocols. Heavy hitters IBM, Microsoft and others agreed to back the specifications rather than pursue differing approaches to software compatibility as they had done in the past.

In an effort to make these Web services systems as reliable as older computing systems, but more flexible, vendors have supplemented the initial basic Web services specifications with a number of extensions. Infrastructure software providers IBM, Microsoft, BEA Systems, Oracle and others have authored specifications to add security, reliability and other features on top of the basic Web services protocols, notably SOAP (Simple Object Access Protocol) and WSDL (Web Services Description Language).
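For comparison, here is a rough sketch of what that basic layer looks like on the wire: a SOAP 1.1 envelope posted over HTTP. The endpoint, the GetPrice operation and the urn:example:catalog namespace are hypothetical; in practice the operations and message types would be described by the service's WSDL.

    import urllib.request

    # A minimal SOAP 1.1 envelope; the operation and its namespace are invented.
    SOAP_ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <GetPrice xmlns="urn:example:catalog">
          <Isbn>0596007973</Isbn>
        </GetPrice>
      </soap:Body>
    </soap:Envelope>"""

    request = urllib.request.Request(
        "http://catalog.example.com/soap",        # hypothetical service endpoint
        data=SOAP_ENVELOPE.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": "urn:example:catalog/GetPrice",
        },
    )
    with urllib.request.urlopen(request) as response:
        print(response.read().decode("utf-8"))    # the reply is another SOAP envelope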

That ongoing process of specification development has caused consternation among some people, who claim that programmers and their employers cannot absorb the flow of new specifications. There are now more than 30 specs, comprising hundreds of pages of technical guidelines. IBM and Microsoft have fostered the development and publication of many of them, grouped collectively under the "WS-*," or "WS-star," rubric.

The concerns over undue Web services complexity grew louder in recent weeks with the publication of three new specifications. WS-Transfer and WS-Enumeration are protocols meant to give developers more control over data transfer between two programs, while WS-MetadataExchange provides a mechanism for sharing information about the capabilities of individual Web services.

Bray, among others, has voiced skepticism about the committee-driven specification development process, which has been dominated by large vendors such as IBM and Microsoft. His concerns also cut across corporate lines: his employer, Sun, is actively participating in Web services standards development and, along with IBM, Microsoft and others, holds a seat on the board of the Web Services Interoperability Organization (WS-I), a group formed to provide guidelines that keep standards-based applications compatible.

Other programmers share Bray's concerns. Rather than learn the latest specifications for doing Web services security, some developers claim that simply sending XML-formatted documents over existing Internet protocols can get most jobs done.

Independent software consultant Mike Gunderloy bemoans the complexity and growing number of Web services specifications. Having given up trying to keep pace with the steady stream of new specs--which are eventually meant to be submitted to standards bodies for ratification--Gunderloy recently wrote that other developers shouldn't "bother to learn all of this WS-stuff."

Some corporate customers are also cautious about embracing Web services technology and the standardization process. Many of them have stuck to the most basic Web services protocols rather than aggressively pursue the latest specifications.

Business process automation company Ultimus has decided not to use the Business Process Execution Language specification (BPEL) in its products because the industry already has the "right building blocks" in place, according to Hank Barnes, vice president of product marketing. "This focus on standards, particularly ones that are incomplete, is misplaced," Barnes said.

So what's the alternative?
Advocates of Representational State Transfer argue that REST allows the same application-to-application communication that Web services promises. One of the most successful public Web services using REST is offered by Amazon.com, which allows programmers to use Amazon's services in e-commerce applications.

But REST has its limitations, according to experts.

Michael Champion, a research and development specialist at Software AG, said "nasty enterprise integration problems" demand more sophisticated protocols and methods. In a blog posting, he called on backers of both approaches--Web services and REST--to make their cases as to why their approach is better.

Ron Schmelzer, an analyst at Web services research company ZapThink, said REST can indeed yield better results in isolated instances. But the method misses the overall point of Web services, where interoperability between products from different vendors is the ultimate goal.

"You can build whatever you want and optimize it behind your own firewall," Schmelzer said. "But if you want interoperability, then you have to agree to something. It's not meant to be optimal--it's what companies can agree to do together, given that they have very different products."

Schmelzer noted that the advanced Web services protocols now under development are designed for thorny computing problems that demand that level of sophistication. REST, for example, does not address security, reliable messaging or business process automation in a standardized way, he said.

Other Web services advocates argue that developers can be shielded from a great deal of complexity by the development tools they choose.

"If you don't understand all the specs, don't worry about it. Tools are being created by people everywhere to make it so you can just indicate the capabilities you need, and the rest will be done for you," said Matt Powell, a content strategist at Microsoft's developer network, in a posting.

Also, Web services advocates note that the specifications were designed so that the latest capabilities, such as reliable messaging and security, can work with applications that use simpler Web services standards, such as the core messaging protocol SOAP. Microsoft earlier this month published a white paper stating that Web service protocols are designed to be "autonomous," which allows developers to pick the level of sophistication they need.
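That composability can be sketched in miniature. In the hypothetical fragment below, an extension layers a header block onto a SOAP envelope while the message body is left untouched; the token shown is a placeholder, not the actual WS-Security schema, which real tooling would generate.

    import xml.etree.ElementTree as ET

    SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
    ET.register_namespace("soap", SOAP_NS)

    # The same basic envelope an unextended application would send.
    envelope = ET.fromstring(
        '<soap:Envelope xmlns:soap="%s">'
        '<soap:Body><GetPrice xmlns="urn:example:catalog">'
        '<Isbn>0596007973</Isbn></GetPrice></soap:Body>'
        '</soap:Envelope>' % SOAP_NS
    )

    # An extension inserts a Header block ahead of the Body; the Body is untouched,
    # so receivers that only understand the basic envelope still find what they expect.
    header = ET.Element("{%s}Header" % SOAP_NS)
    token = ET.SubElement(header, "{urn:example:security}UsernameToken")  # placeholder token
    ET.SubElement(token, "{urn:example:security}Username").text = "alice"
    envelope.insert(0, header)

    print(ET.tostring(envelope, encoding="unicode"))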

Randy Heffner, an analyst at IT consulting company Forrester Research, noted that REST-style development is suited to relatively simple applications. But ultimately, corporations that want to reap the benefits of the flexible systems design called a service-oriented architecture should adopt Web services based on SOAP.

Heffner draws a parallel to the early days of Web services, when SOAP was meant to be a simpler method than CORBA, an older programming standard that never reached market ubiquity, in part because of its complexity. But as Web services become more mainstream, companies will need to exploit the advanced protocols embedded in the latest products.

"Given REST?s simpler technology stack, it is reasonable that it should be faster," Heffner said. "The pursuit of development productivity...adds overhead to SOAP and, in most cases, the overhead will be worth it."


20 comments

Join the conversation!
Add your comment
Simpler wins...
Every time. Complexity benefits larger vendors, who can then charge for armies of consultants. But, like TCP/IP, Ethernet, HTML and other "simpler" protocols, REST will be what most "normal" people implement.

Good luck to the big, bloated, money-sucking multi-national corporations that need to "differentiate" and "innovate" through standards...and complexity.
Posted by ordaj (338 comments )
Not This Time
It isn't a contest, or a conspiracy, or even a weekend debate. Simpler doesn't always win. If it did, you would ride to work on a horse and be sending mail written with a quill pen.

Service-oriented architectures by means and method have distinct advantages, but to see that, you need to work in a content domain that is distributed, large, and varies by site. Web services aren't for one-off software or custom installations.

This is one where the web pundits are being overcome by events and requirements.
Posted by (101 comments )
Use Wisely
Where is it written that every "specification" has to be used? If it does not work, is poorly designed, too complicated, narrowly focused, unneeded, or just plain redundant, then DO NOT USE it.

Specifications can only be accepted by the development community based on their use. So choose and use wisely. Don't accept every answer.
Posted by Thomas, David (1947 comments )
information overload
Sounds like some IT people are grappling with the familiar information overload problem. We see this a lot... when an IT person has a commanding grasp of the field they are in, and suddenly things change so fast that they realize they cannot know it all... it is discouraging, but normal. One of the quotes in the article mentioned new tools, and I firmly believe in this concept. Nobody can know it all, but by using the proper tools, we build upon knowledge and technology that works well without an absolute need to understand every piece.
Posted by David Arbogast (1709 comments )
A Brief Conversation
Specs Vendor: We have this brand new specification that will revolutionize the way you do business.
Specs Customer: Really? Let me see it... Wow, this looks really complex.
Specs Vendor: We have a tool that removes all that complexity. Use it and everything just works.
Specs Customer: Really? How much?
Specs Vendor: (Evil Grin)
Posted by ferricoxide (1125 comments )
Why mutually exclusive?
Several Web services publishing companies, like Amazon and StrikeIron, are making it easy to use either SOAP or REST with their Web services, and in some cases it makes more sense to use one versus the other. People will use what they know, and anyone doing serious work in commercial Web services ought to make it easy to use both.
Posted by (2 comments )
Fallacy
The problem with the argument as currently posed is that it presumes that there are only two possible solutions. The reality is that the practical solution falls somewhere in between.

The more concerning issue surrounding web services is that the major players (IBM, Microsoft, et al.) all have patents on the underlying technologies. It makes one wonder what would have happened if a major vendor owned the underlying technology of a standard such as TCP/IP, i.e., would the Internet have experienced the explosive growth of the 1990s?

I'm all for making a buck, but the next corporate wave seems to be occurring in the courtrooms as major players beef up their patent portfolios to sue their competitors out of existence. This also has the effect of reducing or eliminating the need for innovation, a strong reason for eliminating software patents in the first place.

I also take issue with web services being touted as the foundation of modern computing systems. The author seems to make a broad, unsupportable generalization about technologies that by any standards have yet to be uniformly adopted in enough volumes to warrant the statement. It smacks of evangelism, not fact-based reporting and casts a shadow on the rest of the article.
Posted by (2 comments )
Right On Charles!
This is about much more than simple vs. complex ways of making use of the Internet. The bottom line here is Open vs. Proprietary. Is the Internet of the future going to be Open or Owned?

The WS-Star guys are using these proposals to claim ownership over core Open Internet protocols and methods important to Web 2.0. IMHO, the more important battleground here will not take place at some predator-friendly standards group like the recently pillaged OASIS, where WS-Security was ratified. No, the battle will more likely be fought between the Apache Open Source community and the WS-Star predators. (With IBM caught nervously in the middle, its Open Source reputation and investment at stake.)

An interesting question: how will the W3C play this titanic struggle for the future of the Internet?

The WS-Star predators failed in their attempt to get the W3C to accept their encumbered and permission-laden proposals. They failed to get the IETF to go along with a similar scheme involving an encumbered Sender ID proposal and the MARiD standards effort. So the gangsters went forum shopping and found an anxiously compliant and willing OASIS.

Will Apache turn to the W3C, or go it alone with the Apache License as the governing model? Or will the W3C step into the breach and take back the Open Internet?

Stay tuned, video at 11.
~ge~
Posted by gary.edwards (17 comments )
Some truth to the argument
There is some truth to the argument that Web services are getting too complex for customers to digest and that the problem is getting worse. There is definitely a problem here.

Maybe we can trace the roots of the problem through the history of Web services:

1. As the Web evolves, many different developers create their own home-grown XML over HTTP protocols just because it "made sense"
2. There is a proposal to standardize this function (combining those that exist) and make sure higher-level tools can be created.
3. The first specifications (SOAP, WSDL, UDDI) seem to be just what the doctor ordered and the developer community is jazzed.
4. Web services are hyped as the next big thing after all large vendors support it. Everybody predicts Web services everywhere.
5. The large vendors believe the hype and start spewing out standards _before_ they release tools to use them.
6. Customers start the fatal "wait for standards" and large companies start the "wait for customers" through three "year of Web service" predictions.
7. Finally somebody realizes this and writes an article about the complexity problem.

Meanwhile I believe the smaller startups have definitely fared better because they remained connected more closely to the customer and have a more pressing need to provide value.

Time will tell how this story will progress.
Posted by bmukund (12 comments )
balance?
I agree that some problems are more natural for REST-style solutions and others are more natural for RPC-style services like SOAP. So it's clearly not an exclusive choice.

However, I think that the overhead associated with SOAP is going way beyond the minimum required to make the basic RPC approach work effectively. I also think that the fundamental problem in many of these service architectures is a data representation one, not a protocol one, and it seems that REST and SOAP both have to deal with that sort of standardization.

My experience of using SOAP is that the extra structure of SOAP requests sometimes makes interoperation harder rather than easier. Paul Prescod's papers on REST (on the web, just search) present a very clear argument to that effect.

In particular, I think SOAP's linkage to XML Schema is a big part of the excess complexity in this area.
Posted by modified.dog (2 comments )
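To get a feel for what that linkage to XML Schema adds in practice, compare a bare XML payload with the schema a SOAP/WSDL toolchain would typically expect alongside it. The element names and the urn:example:catalog namespace below are invented, and the third-party lxml package is assumed only because the Python standard library has no schema validator.

    from lxml import etree    # third-party package; stdlib has no XML Schema validator

    # The payload a REST-style exchange could ship on its own.
    PAYLOAD = ('<Book xmlns="urn:example:catalog">'
               '<Isbn>0596007973</Isbn><Price>29.95</Price></Book>')

    # The extra schema machinery that describes the same three-line document.
    SCHEMA = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
        targetNamespace="urn:example:catalog"
        xmlns="urn:example:catalog" elementFormDefault="qualified">
      <xs:element name="Book">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="Isbn" type="xs:string"/>
            <xs:element name="Price" type="xs:decimal"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>"""

    validator = etree.XMLSchema(etree.fromstring(SCHEMA))
    print(validator.validate(etree.fromstring(PAYLOAD)))   # True: the instance matches the schema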
 
