Standardization not always an answer

by Simon St. Laurent

Miles Sabin brilliantly writes "All in all, I think the costs of standardization have been wildly
underestimated," without noting the corollary that the benefits of standardization are often elusive.

The benefits of XML's standardized syntax seem pretty clear at this point, despite some annoying aspects that just won't seem to settle. Markup syntax seems to belong to the side of technology that benefits strongly from network effects - things like low-level protocols. Beyond that, there are lots of questions, especially as the market for any given standard grows smaller.
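That syntax-level network effect is easy to see in practice: a single parser handles any vocabulary anyone invents, with no prior coordination between the parties. A minimal sketch, using Python's standard library and two made-up vocabularies (the element and attribute names here are hypothetical):

```python
import xml.etree.ElementTree as ET

# Two unrelated vocabularies -- a purchase order and a recipe.
# Neither author coordinated with the other, yet one generic parser
# reads both. That is the payoff of standardizing the syntax alone.
order = ET.fromstring("<order><item sku='42' qty='3'/></order>")
recipe = ET.fromstring("<recipe><ingredient name='flour' amount='2 cups'/></recipe>")

print(order.find("item").get("sku"))          # the order's item SKU
print(recipe.find("ingredient").get("name"))  # the recipe's first ingredient
```

The agreement here stops at angle brackets and attributes; what the names *mean* is exactly the part the syntax standard leaves open.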

For the past few years, any time questions arose about how to make XML useful, the usual answer was "well, everyone needs to get together and hammer out an agreement about what vocabulary to use." It sounds so simple, if you're fond of committees. Of course, gathering to produce agreements sets up the usual set of political arguments, as participants assume different roles, use their clout, often close the doors to outsiders, and make compromises that either leave much-desired features out or put too many features in.

Sometimes the process has a breakout hit - HTML is pretty widely understood, though even that has been a long slow process - but the notion of setting up an organization to determine the one true approach to a particular information representation problem seems, well, laughable. Once a proposal reaches a certain critical mass, adherence to standards is crucial to using them, but reaching that point doesn't seem to have anything directly to do with the standardization process.

Looking past the difficulties of getting a standard created, standards themselves create problems. Standards can be traps that keep organizations from applying their own understandings of problems, or burdensome contraptions that require organizations to provide information they may or may not consider important.

Another strange phenomenon is the pile-up of proposed standards that seems to be happening as more organizations join the Web Services universe. While Web Services in some ways reopen the world to anarchic diversity by letting service providers create services and publish descriptions, the organizations pushing this set of technologies seem bent on standardizing every possible tool-related aspect of this anarchy. SOAP begat WSDL and UDDI, and now we have piles of new proposals surfacing nearly weekly.

As automated processing moves further and further up the semantic ladder and standards address more and more specialized fields, it often becomes less clear whether the benefits of "everyone's doing it" outweigh the costs of "we all have to do it the same way."

Computing has traditionally been about creating information monocultures, fields of common structures where ambiguity is minimized and the simple logic that computers have traditionally provided is challenged as little as possible.

As the computing world starts to digest markup's combination of simple foundations and potentially endless labeled structure, it may be time to reconsider the creation of shared monocultures, and perhaps even jettison that process in favor of a wide variety of diverse information understandings that are processed locally - where developers have the clearest understanding of the work they are doing - with sharing handled on an as-needed basis rather than given first priority.
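One way to picture "local first, share as needed" is a small adapter at the boundary: each party keeps its own internal representation, and translation into an agreed exchange form happens only when exchange actually occurs. A hedged sketch - the record layout, field names, and `to_shared` mapping below are all hypothetical, invented purely to illustrate the shape of the idea:

```python
# A team's internal record, structured for its own work, not for a committee.
local_record = {"cust": "Acme", "when": "2002-03-01", "total_cents": 12500}

def to_shared(rec):
    """Map a local record onto an agreed exchange vocabulary,
    translating only the fields the partner actually needs.
    The target names here are hypothetical, not a real standard."""
    return {
        "customerName": rec["cust"],
        "orderDate": rec["when"],
        "totalAmount": rec["total_cents"] / 100.0,
    }

print(to_shared(local_record))
```

The agreement is negotiated at the seam, per relationship, rather than imposed on everyone's internals up front.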

Is this just a standard anti-standards rant?