XML complexity as war of attrition
by Simon St. Laurent
Ari Krupnikov sends a warning about the dangerous consequences of the ever-growing complexity of the XML family of standards.
People who have been working with XML long enough may remember that XML began as a simplification process, an effort to reduce SGML to a small enough feature set that it would be easy for developers to implement and humans to use. Although the original XML 1.0 acquired a few complexities of its own during its development cycle, it was still relatively simple to implement. (I've personally implemented about half of it for a J2ME parser, and I hope to finish creating a complete J2SE parser in the next couple of months.)
Since then, however, the specifications have grown and grown, far outpacing the SGML specifications that XML was designed to replace. Namespaces in XML was brief, but has spawned endless discussion. W3C XML Schema has been cursed for years as an over-complicated and seemingly contradictory contraption. While XSLT 1.0 and XPath 1.0 largely avoided major controversy, their bloated 2.0 descendants are enormous, picking up huge volumes of extra features from their W3C XML Query cousin and from the insistence that they support W3C XML Schema.
On the Web Services front, the arena where vendors push XML the hardest, things have reached the point where simply keeping up with the competing specs is a full-time job, and where some are suggesting "blackmailing" vendors (just with dollars) to ensure interoperability.
Krupnikov suggests that this complexity isn't an accident. Although some find his suggestions "conspiratorial", his scenario doesn't sound all that far from business as usual for large vendors forging ahead with business plans that they can, in the end, control. Driving competitors out of the market is, after all, how some organizations make money, and if they can do it by offering customers more and more attractive-sounding features, they've combined marketing power with significant difficulties for their competitors.
What Krupnikov doesn't say - perhaps it's obvious - is that this war of attrition may not be so profitable for the rest of us. From my perspective, XML is collapsing under the weight of these features, with many developers giving up on what seems like an infinitely complex system and turning to tools created by vendors who promise to ease the pain. As people turn to tools, they lose sight of the capabilities that were available in the simpler specifications, trapping themselves ever deeper in the webs of features cast by companies with a clever business plan. Larger webs catch more food, and may overshadow and starve out smaller webs.
Would you like some features with that?
I believe that such theories about the complexity of the later XML specs are quite justified. While the intent may not be quite so clear-cut, it is undeniable that companies such as Microsoft fully believe that easy-to-implement protocols are what make their Open Source/Software Libre competitors capable of competing. The Halloween documents are clear-cut proof of this.
me too :) serious comment below
As you say, if you have lots of developers, a good way to fend off the small upstarts is to make them spread their limited resources across a range of targets. If you can convince potential customers that these things are must-have features, then the small developers are always playing catch-up, implementing features just so that a customer can check a box on an RFQ for something they will never use.
A conspiracy? More like over want
Rather than subscribe to the conspiracy theory (although anything involving Redmond has to fit), I think what we are seeing happen to XML is no less than what happened to HTML: over-want for the protocol to do more than it was ever supposed to do.