Is XML Doomed?

by Kurt Cagle

Call it Len's Proposition ... I've been taken to task recently by Len Bullard for my unflagging support and belief in open standards in general and XML in particular. I respect the many voices here on XML.com, especially Len's, so when he starts saying the sky is falling I generally at least look up. But it occurs to me that this offers an opportunity for many of those same commentators to express their positions on the big issues surrounding XML. With AJAX and JSON in one corner, .NET and LINQ off in another, Java sitting impatiently in a third, not to mention a host of languages such as Scheme, Lisp, Haskell, etc., just waiting to get their boxing gloves on, XML's position may be far from secure. So I think the ultimate question I'd ask here is simple:

Is XML doomed? Is it fatally flawed, too weak, not weak enough, too abstract, too specific? Is the core philosophy that it enables, the principle of open standards, a far-left communist plot or the salvation of the computing world as we know it? Have we gone down the wrong path, and only determined action now will right that wrong?

I'll weigh in with my own opinions in a bit, but I open the floor to one and all ... was XML a mistake? If so, why, if not, why not? (You may use a #2 pencil for your answer).

Comments

John Flack
2007-03-14 10:10:23
XML is not doomed - it answers too many real needs for data that comes with its own metadata. However, it has been misused as well. For instance, XML is NOT a database. XML was not designed for fast random access and retrieval.
Joe Max
2007-03-14 10:31:54
Open standards are the salvation of the whole world and not just the computing world.


They unite us. They give us freedom. They will also give us bread. XML shows us the way and we will fight to the end.

Jason
2007-03-14 11:00:16
Open standards are the only effective means for the Internet to exist. They are also the only effective way to *educate* the next generation of developers without locking our data into proprietary formats. XML as it stands can be used or misused, just like any hammer.
W^L+
2007-03-14 11:12:38
XML, perhaps with some alteration (such as accepting JSON as merely an alternative representation of XML or, even better, allowing the encapsulation of JSON within XML), is going to be around for some time. I also agree with John and Joe that XML meets the needs of businesses for self-describing and transformable data. Open standards ARE the future.


That said, there will remain certain niches where there is not a way to support multiple vendors, where proprietary "standards" and formats will continue to thrive. In general, however, I believe that XML, JSON, and a large set of other open, standardized tools (including ECMAScript/JavaScript and of course Java) will remain for some time.
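[Ed. - To make the "alternative representation" idea above concrete, here is a rough sketch, with field names invented for illustration: the same simple record rendered as XML and as JSON. For flat data like this, the two forms carry identical information and can be mechanically converted in either direction.]

    <person><name>Ada</name><email>ada@example.org</email></person>

    {"person": {"name": "Ada", "email": "ada@example.org"}}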

Terry Laurenzo
2007-03-14 11:30:46
I would be skeptical of anyone who made a sweeping statement like "XML is Doomed". XML has turned into an ecosystem. I think some parts of that ecosystem are doomed. But just because some of us think that some parts of the ecosystem more resemble cancer than innovation doesn't mean that the standards in those spaces won't be dragged forward for a good deal longer by middleware companies trying to get rich off of selling bloated ideas to ignorant customers.


So to answer your question: XML was certainly not a mistake. The unthinking, crazed high that we all got from it for so long, however, was a mistake. But it was a mistake that made a lot of people a lot of money, so it was somewhat inevitable.

GreatWhiteDork
2007-03-14 11:33:21

Although I'm sure we all have our favorite stories of hideous XML misuse and contrasting stories of wild success, I really don't think XML is doomed any more than any other technology. No technology is truly "binary" in usefulness (I'll use it for everything or not at all!!).


Like any other language, human or computer, its value is all in how you use it.


As for open standards, if they are communist, then long live the revolution! They are the best way of ensuring that everyone can participate in innovation. There just need to be more of them. You just can't make one standard cover everything well. Multiple specs are how we end up with our Hegelian holy wars (e.g. ODF/OOXML) of diametrically opposed, or at least very significantly different, specs that end up converging to something we can live with. If there are a great many, the ones that suck don't get used and the ones that are good live on to change and spark new holy wars later on.

Sylvain Hellegouarch
2007-03-14 12:20:57
Kurt,


You know how much I respect your views, but how can you ask that question a week after your essay on XSLT 2's forthcoming success?


Mind you, because I do respect what you say so much, I would venture that your capacity to praise XSLT 2 while admitting the underlying technology might be flawed in some respects is evidence that you're not blind.

bryan
2007-03-14 12:23:35
XML was stabbed in the back by its schema language. Dark muttering conspiracy theories are our only hope now.
Devon Young
2007-03-14 12:48:28
XML was a work of art. It's only a 1.0 spec though, so I don't see how anyone could write it off as "fatally flawed" before saying "we need to improve it". I've always liked XML's draconian parsing standard, but recently I've begun to be convinced that maybe it would be a good idea to have some kind of error-handling rules instead of every error being fatal. I think XML could use some improvements, but it's definitely not fatally flawed.
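[Ed. - For readers who haven't hit the draconian rule in practice, here is a minimal Python sketch (document and tag names made up): a conforming parser must abort on the first well-formedness error rather than guess its way past it.]

    import xml.etree.ElementTree as ET

    ET.fromstring("<note><to>Tove</to></note>")   # well-formed: parses fine

    try:
        ET.fromstring("<note><to>Tove</note>")    # mismatched tag
    except ET.ParseError as err:
        print("fatal:", err)  # the parse dies here; no partial tree is returned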
Kurt Cagle
2007-03-14 13:59:43
Sylvain,


Not to get too far ahead of myself here, but I ask this purely in the spirit of inquiry. My personal feelings on this should be pretty clear - that as a standard it is useful for many things ... it has some significant shortcomings, but I have yet to see any technology that didn't. However, I think the question should be asked, because I've noted that the question is being asked at a fairly high level and the only way of remedying those shortcomings is to be open and "not blind" to the limitations.

Josh Peters
2007-03-14 14:00:07
The answer is "maybe, but probably not."


The two biggest issues with XML today seem to be the passé quality of angle brackets and the lack of a binary format.


The first issue is mainly stylistic and is subject to the whims of the crowds who care. RELAX NG has found a lot of popularity for being easy to write (I refer to its compact form), which leads me to believe that XML in general would benefit from a similar compact form (which has been discussed at length every so often in the past few years on this site). The wisdom here is that if XML is easier to read and write, it will be easier to maintain. The opposing viewpoint seems to believe that nobody writes XML by hand anyway; they use good tools. This argument does nothing to refute the first, so XML could adopt a compact syntax so long as it is fully compatible with available tools.
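[Ed. - As a quick illustration of the readability argument, here is one and the same made-up schema, first in RELAX NG's XML syntax, then in its compact syntax.]

    <element name="card" xmlns="http://relaxng.org/ns/structure/1.0">
      <element name="name"><text/></element>
      <element name="email"><text/></element>
    </element>

    element card {
      element name { text },
      element email { text }
    }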


The second issue is a sticking point whose value I still don't see. OpenDocument and OOXML seem to have dealt with the desire for a compact over-the-wire format by using zip (or gzip, I can never remember). Unless there is a desire by one camp to convert XML into some form of bytecode language, I'm not sure why binary proponents make as much noise as they do on this subject.
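[Ed. - A minimal sketch of the zip point, using Python's standard library and an invented payload: markup is so redundant that generic compression recovers most of the size a binary format would chase.]

    import gzip

    doc = ("<orders>"
           + "<order id='1'><sku>A-100</sku><qty>2</qty></order>" * 200
           + "</orders>").encode("utf-8")

    # The ~10 KB document shrinks to well under 1 KB.
    print(len(doc), "->", len(gzip.compress(doc)))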


The recent grumpiness towards the W3C (from the HTML people and others) seems to be a bigger issue in my mind than whether or not XML is doomed.

Sylvain Hellegouarch
2007-03-14 14:04:51
Kurt,


I see. I didn't mean to imply you weren't consistent, by the way ;)

Michael Day
2007-03-14 15:19:48
"With AJAX and JSON in one corner, .NET and LINQ off in another, Java sitting impatiently in a third, not to mention a host of languages such as Scheme, Lisp, Haskell, etc., just waiting to get their boxing gloves on, XML's position may be far from secure."


This is a category error: in what way does the existence of the .NET and Java virtual machines and associated programming languages threaten XML? Do they also threaten the existence of other data formats, like CSV, PDF, and PNG?


Similarly, functional programming languages like Scheme, Lisp and Haskell can be used to manipulate XML (and very nicely, too) but I don't see many people serving up Haskell on the web or writing books in S-expressions.


As for AJAX... I don't think that a browser programming paradigm affects XML to any great degree. JSON is the only actual competitor to XML that you listed, but like YAML and S-expressions it is unlikely to spread far outside of its current niche.


In my opinion XML has been a little bit too successful, leading to architectural missteps such as namespaces and XSD schema being enthusiastically adopted before they were fully cooked. The success of XML also thoroughly killed off SGML and put a huge damper on the development of new markup languages, which now have to ensure some level of compatibility with XML in order to be relevant.

M. David Peterson
2007-03-14 15:24:59
This one's too good to want to jump into ... I think I'll just watch.


And laugh.


And watch some more. ;-)

Rick Jelliffe
2007-03-14 17:28:23
I think it depends on your starting position. It has certainly got vendors onside, been embraced, stimulated a lot of libraries and products, and enabled the current decade of openness, collaboration, and standardization in file formats.


From the POV of the industrial document publishers who were using SGML, it probably has been a step sideways, if not backwards. I was talking to a mate in industrial publishing last week, who said that basically they were doing the same things with less capable (but better supported) tools that they were doing ten or even fifteen years ago: markup, transformation, large-scale link generation, validation, multi-publishing to lesser formats, re-combination for different products with different formats and on different media.

Ric
2007-03-14 20:22:56
My comment was too big, so I put it in my blog: http://www.json.com/2007/03/14/xml-is-dead-long-live-xml/
However, the pingback did not seem to work.
Kurt Cagle
2007-03-14 20:34:28

"This is a category error: in what way does the existence of the .NET and Java virtual machines and associated programming languages threaten XML? Do they also threaten the existence of other data formats, like CSV, PDF, and PNG?"


Um, this was meant primarily as a literary device, not an accurate description of the industry. Apparently an unsuccessful literary device ... oh, well, you win some and lose some.


As to your assessment of the impact of XML on SGML, I think you're completely correct there (not sure I completely agree on namespaces, though ... there are problems there, but I'm not sure any other solutions are all that much better). There are times when I wonder if the next stage in the process of XML is not in fact incremental improvement but rather the revivification of SGML. XML is a meta-language. SGML is a meta-meta-language, of which XML is one instance. SGML emerged in an era of extremely low bandwidth, minimal computational power (compared to today's processors) and a very document-centric approach predicated largely on its origins. If SGML were in fact re-envisioned today, I suspect that it would likely prove rather dramatically different, given the experience of a decade of XML and nearly four decades of SGML practice.

David Carver
2007-03-14 21:16:49
One thing I think many of us as architects tend to forget is that in the world of trying to get things accomplished, it's not necessarily the perfectly architected system that gets the job done. XML, while it has its flaws, is "good enough" to accomplish many such tasks. The flaws that show themselves tend to be in the corner cases - the areas where maybe XML isn't the best choice.


My own background is in data integration, not the SGML publishing arena where XML got its birth, and from that perspective XML has made integration in B2B scenarios easier to deal with. Even XML Schema is not perfect, and the few enhancements and tweaks that seem to be coming in the next release will alleviate some of the issues there, but it will still have flaws.


I do agree that XML in many cases has been used for the wrong things, but that is partly due to its flexibility and ease of implementation. It's this same ease of creation that is also its downfall to an extent, as it starts being used to store everything. I think over the years an equilibrium will emerge on where and when it should be used.


What I find more scary is the lack of knowledge among the people that are using it, who do not understand the underlying specification or why their tools work or generate XML the way they do. I see more issues from that than from XML itself.

Scott
2007-03-15 10:39:48
XML is not doomed because of two reasons:
1. DoD
2. Distance education


XML might fail in business applications. It's rumored the DoD uses XML as a basic document markup language, and SCORM 2.0's use of XML will secure its place as the metadata resource for educational development.

Ralphie
2007-03-15 12:28:57
With the coming (now!) revolution of open source, there are so many variables and distractions. The idea of any perceived product, code, or concept being doomed or successful is changed, hour to hour, by the rapid growth of technology and innovation. XML definitely has its place in the future, with (at least) perceived support.
Bill Donoghoe
2007-03-15 14:57:21
For XML to be doomed it needs a replacement which takes us forward. The major strength of XML is that it is a true standard that has commoditized data exchange formats. It is not perfect, no standard ever is, but it offers a stable platform to build upon. Thus, IMHO, XML will continue to be supported in the future, with newer standards needing to provide bridging functionality (like Unicode's support for ASCII).
ShelbyGT
2007-03-15 21:13:31
While I don't think that XML is doomed, I don't think it is living up to its potential at all. I think that a large part of the problem is the standards process itself - I need to be careful here, as one of my immediate coworkers coauthored XML. The result of the standards efforts has been a vast collection of disjoint standards, solving intellectually stimulating problems more often than conveying a true sense of purpose and benefit. As a result, few standards are implemented well, many standards simply die on the vine, and most developers remain totally ignorant of the true possibilities of XML.
Stormy
2007-03-16 23:23:51
XML is brilliant! I've recently built a database using XML, and if used correctly, it certainly has a wide swath of uses beyond what most people envision. It's also brilliant because it's not only helped standardize and "clean up" so many other recommendations (i.e. HTML and the web) but has allowed so many non-standard uses across a wide range of industries.


XML is NOT doomed at all, and to me is the "bedrock foundation" for a new Web and a new data exchange framework across the board, one that will no doubt take us into the next generation of standards, both online and offline. The only thing holding mankind back from moving aggressively forward with this new "family of standards" will be the proprietary prison we find ourselves in currently, all fueled by corporate greed. - Stormy

f.enzenhofer
2007-03-17 02:57:35
Once programmers thought everything should be done in XML; then a few programmers asked why. Now we realize that XML can be very useful, but that doesn't mean everything has to be XML.


XML isn't doomed, but the misuse of XML is, sooner or later, doomed.

Peter Hale
2007-03-17 05:38:35
XML is thriving as a document exchange framework and as the basis for ontologies. More work is needed to increase its use on the web, and on translating from one representation, whether XML-based or in another format, to the XML representation required for a particular purpose. There is a need for widespread development of tools that can save or convert a document or web page to XML, e.g. Amaya http://www.w3.org/Amaya/. Then lots of XML information would appear on the Web. With more XML-based information there is more incentive for the development of XML search and visualisation tools.


What we need is end-users who can work with XML without needing to know anything about it, and experts to provide them with the tools and applications they need - experts who must have the in-depth knowledge David Carver calls for.

Mikael Bergkvist
2007-03-17 06:26:57
In fact, not all the aspects of XML have been explored yet in regard to web applications, as this site, http://www.xindesk.com, and its blog, http://www.xindesk.com/blog, demonstrate.


It's using XML in a completely new way, and that in itself means that if all the aspects haven't even been explored yet, it's surely not over by a long shot.

len
2007-03-18 08:50:07
The easiest way to find out why people support a language is to take note of its imminent death. It is the sort of tactic that made Huey Long and even George Wallace famous. In the case of the latter, he made the proclamation, "Segregation now..." etc, was reelected multiple times, and in all of those terms was systematically dismantling the legal infrastructure of segregation.


That's politics.


XML won't die. VRML never died. X3D is better, with the exception of its XML Schema, which is byzantine and did not improve the grammar or the syntax. It replaced the curlies with the pointies and the squares with extra tags, and so on. It is easy to abuse any syntax if the opportunity to improve the syntax being replaced does not also improve the semantics of the language being upgraded.


If XML is replaced, it will be replaced with an object XML because the primary forces for XML are not documentation but programming. The abuse, misuse and outright funny things done with XML so far have mostly been done by programmers. Once SGML was effectively killed off, it was the uptake by the programming community that spawned all the mistakes.


I say that provocatively but sincerely. SGML was a document wonk's language. XML is a programmer's language. It will change, evolve or die based on the perceived needs and desires of a programming society at some point in time and it is the point in time that will be the crucial determinant of that event, the causes and the means.


As the person who wrote and published "Information Ecosystems" now over a decade ago, I was interested in the higher-level model, where technical interoperability is actually driven not by technology but by human will and needs. The OO-who'sXMLToday battles are a very illustrative example. That XML enables plurality WAS THE WHOLE POINT BECAUSE HTML CANNOT DO EVERYTHING AND ITS EVOLUTION WAS AT A DEAD END.


Now when plurality comes, legions step up to fight it as if all cars must be black and Fords.


You are a funny society. For all you think you achieve, you have produced a http-drivenXMLOntheWeb application ecosystem that is harder to use than the fat clients it replaces (try editing Google blogs and uploading photos), the largest mass rip off of copyright one can envision (and good can come of that because creativity is a mashup by nature), enabled both the China firewall and the internal revolution to heat up at once, and a list of other societal ills and benefits that may just bring it all to its knees faster than it would have happened otherwise.


But you are not yourselves changing very fast, and that may be the reason XML dies or deserves to. As the aliens say, "You are not ready for this technology..." and they are usually right in the movies. But this is not a movie. It is your life and the potential lives of your children's children.


You might want to work on the interoperability of the programmers. Compassion, tolerance, self-restraint... and the occasional bender when you just can't hold it in.


Otherwise, the web dies and takes XML with it. One doesn't think that can happen, but it really doesn't take a big catastrophe, just one that keeps amplifying by dint of unrestrained feedback.


XML is just stuff. The meaning is between the brackets.

Kurt Cagle
2007-03-18 13:28:41
Len,


I was hoping you would join the fray soon, and have been awaiting your post with a certain glee. It is, as always, erudite, salient, and thought provoking.


Your reference to George Wallace is intriguing, and I have to admit, particularly apt; for all that there are many aspects of Wallace I didn't care for, there is no question that the man was a brilliant politician dealing with an incredibly complex problem over decades and ultimately, I believe, finding resolutions that worked for the betterment of all, not just in Alabama but throughout the South (much of my childhood was in Montgomery, though at the time he was for me mostly background noise, but it helped make me aware of that period and the players involved).


Also, a brief followup on VRML. I like VRML, and like you I like X3D except for its XSD, which is byzantine. My comments previously concerning VRML had more to do with its timing: it was pushed as a standard prior to the development of the necessary infrastructure to support it, and as a consequence VRML gained a very bad reputation that will likely be hard to overcome, even though we are probably now well within the window where VRML is feasible. I also want to note that while the perceptions of VRML developers may have been for the production of an open 3D standard, in the realm of the communication interface - the articles, books and websites - it was definitely the Snow Crash view that predominated.


I think this leads to an interesting point, one that I see reflected in a number of comments here and that seems to be relevant to most standardization efforts. For all that I have occasionally been involved with the W3C, I see myself more as being part of the evangelism efforts on a lot of these techs, responsible both for getting the word out about technologies and for focusing on where such technologies are most and least appropriate.


One area that I have noticed that the W3C falls down on -- a great deal -- is in this evangelism role. There is a certain level of Field of Dreams belief here, that if you build a standard, people will recognize it as some kind of holy grail, will adopt it without question, and that there is no real need for promoting these technologies beyond their own fairly limited efforts.


Unfortunately, or fortunately as the case may be, this also means that technologies based upon these standards may very well use things in ways never intended (and in some cases completely counter to their intended uses) because there are no boundaries, no expectations about whether this is the "appropriate" approach to take.


The downside to this is that XML has had a tendency to be seen as the equivalent of superglue or putty, shaped into whatever was needed whether the use made any sense or not to the overall development process of XML (or to the user, for that matter). I personally believe that this is a self-correcting problem in the long term - the things that work will be remembered, the things that don't work will be remembered too, shaping the direction of the technology as appropriate.


XML itself involved a streamlining of SGML, ostensibly for bandwidth reasons, but I am beginning to realize for other reasons as well. Fielding's thesis boils down to the notion that you need to maintain simplicity at the lowest levels of a network, because the degree of variability at the lowest levels has an exponential impact upon the complexity at higher levels. Stephen Wolfram echoes this in A New Kind of Science, where he indicates that even very simple automata tend to be responsible for complexity and chaotic behavior. SGML is not all that complex, but it is sufficiently complex that it blurs the lines between abstraction layers fairly dramatically, and that in turn makes it more difficult to isolate design patterns.


My suspicion at this stage is that XML now has a life of its own, one that's only marginally controlled by the W3C. Certain constructs we've built around it to make it manipulable within imperative languages, such as the DOM 2.0 specs, are proving to be more like training wheels than necessary aspects of the language, and there are now efforts to rethink what exactly we mean by these actions (and like training wheels left on too long, they can hinder a person's ability to use the language to its fullest extent). Others, such as XSLT, are fairly brilliant (if occasionally flawed) realizations that it is in fact the nascent patterns within XML structures, rather than the imposed semantics, that will ultimately shape the language's use.


On the other hand, I do wish to argue one point you bring up:


"For all you think you achieve, you have produced a http-drivenXMLOntheWeb application ecosystem that is harder to use than the fat clients it replaces (try editing Google blogs and uploading photos)..."


Is this a cry against XML, or against web applications? We've disagreed on this point before, so it strikes me that trying to argue it here further may end up being frustrating for both of us. However, my primary contention is that this really comes down to the point in an application where the logic comes from the web rather than from a standalone environment. If I'm editing a blog, I really don't want to use Microsoft Word to do it - I'd rather use something small and lightweight and that can handle the display aspects as well as the editing aspects.


For instance, Performancing is a pretty cool app - a Mozilla extension - that I actually use now for most of my day-to-day writing, largely because my presentation layer is the web itself. Is it as full featured as Word? No, not by a long shot. However, this was more a conscious design decision rather than a limitation of the environment - I've built similar apps for clients that actually did have a significant portion of the Word namespace and that worked considerably better with XML requirements than I believe Word does.


On the other hand, I know other developers who use the XSLT facility of Open Office to generate both websites (not pages, sites) and standalone PDF. Both of these are "XML" applications, but it can be argued that neither one of them has a high dependency upon HTTP except at the actual time of data acquisition or publication.


Indeed, one of the things that I like so much about XML is that while it works quite well over HTTP, it doesn't have an explicit dependency upon it. HTTP is simply one transport vector among many, and understanding this can change the way that you work with XML.


I don't necessarily think that a formal methodology about XML has fully emerged yet, though it's getting there. This isn't surprising - it took nearly two decades of near constant use for OOP to gain such methodologies, and I consider XML to be at least one abstraction layer beyond OOP - not superior or inferior, just working at a different level of organization. To a cosmologist, the Sun is just a particularly well defined use case and the real interest lies in the galactic clusters and intergalactic bubbles, to a planetary astronomer the sun is immensely important, to a geologist the sun is largely an abstraction. Curiously enough, at small enough levels cosmology does indeed become important - but the sun is important primarily as a source of particles. Different levels of organization, each important in their own way, and at the appropriate layers one facet of this system - the Sun - has a paramount place.


Are the Luddites holding back the evolution of XML? I frankly think that the real Luddites in this case aren't necessarily those hoping to keep the standards pure (that sort almost never wins, no matter how noble their cause) but rather are the people who choose ignorance and staying with a technology that they are "comfortable with" even if it isn't appropriate.


An argument was made earlier in this thread that XML isn't a database. Actually, I tend to see XML as being a domain specific database at the appropriate levels. It's a way of storing state. If you move away from the notion that XML has to be in files, understand that "groves" are possible within an XML context as they are within an SGML context, and that the internal representation - the indexing scheme - does not in fact necessarily have to look anything like an XML document so long as the interfaces can be presented to make it look so, then I think XML can actually make for a very good database.


XML's an abstraction - a serialization face - and has the advantage of presenting information in a hierarchical format, something that is very intuitive to the human brain. SQL JOINs are fairly counterintuitive unless you're trained to use them. Directory-oriented XPath generally makes more sense, even with the notion of predicates (which SQL also requires).
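[Ed. - To make the comparison concrete with a made-up schema, here is the same question - which SKUs did Ada order? - in both idioms. SQL must reassemble the hierarchy through joins; XPath simply walks it.]

    SELECT i.sku
      FROM customers c
      JOIN orders    o ON o.customer_id = c.id
      JOIN items     i ON i.order_id    = o.id
     WHERE c.name = 'Ada';

    /customers/customer[name='Ada']/order/item/sku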


Most people use SQL because this is how they were taught, and because relational databases have existed since the late 1970s while even the oldest XML databases are less than a decade old. I think that XML databases are beginning to gain some serious exposure now (eXist, Sleepycat, X-Hive, others) and many relational databases now also sport XML-oriented interfaces. Thus education will come as the tools become available, affordable and sufficiently powerful to make their adoption attractive.


Are we ready for the Internet, for XML, for working with global systems? I frankly don't think that it matters. Human beings have a remarkable ability to foul their own nest, but at the very last minute to realize that they may be in trouble if they don't clean things up, and then they clean up. It never comes without cost, and sometimes those costs are horrendous, but it happens.


I also believe that the Internet is an incredibly disruptive technology, and while we are used to technology advances occurring at breakneck speed, society does not adapt that quickly and friction inevitably occurs. I foresee a number of economic and social crises taking place as the definitions of work, value, and social interaction get (repeatedly) redesigned. Having unleashed the genie, we can't put it back in the bottle, and in many ways I suspect this would be unwise even if we could. Society will adapt, it always has, and in the end it will be neither a perfect utopia nor a perfect dystopia, though it will almost certainly be both eerily familiar and inconceivably different from our own society.

len
2007-03-18 20:52:18
"Is this a cry against XML, or against web applications? We've disagreed on this point before, so it strikes me that trying to argue it here further may end up being frustrating for both of us. However, my primary contention is that this really comes down to the point in an application where the logic comes from the web rather than from a standalone environment. If I'm editing a blog, I really don't want to use Microsoft Word to do it - I'd rather use something small and lightweight and that can handle the display aspects as well as the editing aspects. "


1) I'm not crying against XML. XML is fine. SGML was fine. As to complexity and beginnings, I think you'll find some interesting writing at 3donthewebcheap.blogspot.com. As I've said for quite a while, complex beginnings can lead to more complexity, but Fielding is actually begging a question he himself does not know the answer to: what is the endgame design? In fact, one starts simple when one is experimenting, and Fielding's work as reflected in the web starts with his thesis, not his final design. So the results are the results of a learning curve, not experience.


In fact, VRML was created from a mature design and we knew that. That is why we are still using it. River of Life is a VRML97 world because I haven't hit a challenge that requires X3D. When I do, I will use it, but not before.


If you are for open standards then you support them in all cases, yes? Or do you make exceptions? What are your exceptions? Are they related to the source of the design or the problems the design solves? Can you identify all of those in advance? If you can't how can you be sure a standard is the right answer?


My problems are not with XML or with open standards. Both work in their respective ways for a range of problems. But so does Microsoft software. In fact, it is running on more machines than all of the open source combined at this time. When people pick up pitchforks and go after Microsoft for their hegemony, the reasons are as varied as there are people. Bray doesn't like them for various reasons, but one he has been public about was the MS protests of his editing the XML standard while he was a paid consultant for Netscape. When Rick Jelliffe became a paid editor for Wikipedia, he got to hear about the 'elephant in the room', but Netscape, then being touted as the 'MS Killer', turned out to be a mouse that roared. If it frightened the elephant, well, silly elephant if they didn't know a web browser could be built by any competent team. There is much about that time that is silly.


The current dustup is a little less silly. State governments are being told to use an 'open standard' because it is 'open', not because it is technically complete or widely implemented. The public is being told that a standards organization with over 50 years of experience should refuse to allow a standard that is on the majority of the world's desktops because it is not open.


I'll leave it to the lawyers to figure out what open means. I am told the best definition comes from Microsoft competitors. Qué bueno, but I do know that the fact of majority use means that NOT standardizing it does far more damage than the question of its openness. Standardization enables more open use than the question of openness itself.


So what I am taking you to task for is not your support but your motives. Who really will be harmed by the ISO standard and who will benefit? Will it really harm world trade? Are you sure about that?


What I believe we have spawned in these issues is not a question of XML's survival or ISO's endorsement. I think we are starting to see through the W3C myths, the XML myths, the Spy Vs Spy myths where we all get recruited to support a cause to burn down the mission but we don't pay much attention to what comes next.


Ecosystems in the wild, say Borneo, are more readily recognized for their plurality and diversity of wildlife in all its forms. It is when the organizers begin to cut down the wildlife, cut roads - in fact, try to manage it for the good of the people but really for the good of those who extract resources from it - that the wildlife begins to go extinct. There is no doubt that we benefit by the management and the extraction of the resources, but only if we are very very careful how much management we need and how much extraction we allow.


It is, in my opinion, the case that the ISO standard for the MS technologies is a greater good than a harm to the customers for MS products. If by these crusades it is your intent to do good, ask if it is also your right to tell those customers what is good for them and if by doing so, you may become precisely that which your values do not support.


It is easy to thresh wheat. It is much harder to plant it.


Pardon me if I take a while to respond to these. I am working very hard on the River of Life world, blogging the design and the backstory so kids can learn VRML and 3D design using royalty free open technologies not to stifle competitors but because this is what a kid can afford, as it was in the early days of the web. I am tired of talking about open standards and 3D on the web. I think I can do a lot more good by using them and then showing others how I did that.


I can talk a lot and easily. It won't do as much good as Walking Kamala.

len
2007-03-18 20:59:04
Oh yeah... Google blogging: I would much rather create the blog in MS Word, proof it and get the graphics lined up right, then bulk load it to blogspot. Having to load graphics one at a time, then copy the link provided at the top of the file to the position in the blog every time, plus try to fix the formatting to match what is actually displayed rather than what the preview shows, that is WAAAAY more productive than an AJAX app trying to do a job over a network never designed to support interactive editors.


XMLhttp is a good trick as we always knew it would be, but the web is a lousy infrastructure for real-time work and we all know that. We accept it because we need the URLs. Otherwise, we would have shot the inventor.


All the yakking we can do won't change the fact that statelessness and REST are very dumb ways to edit files.

Peter Brown
2007-03-19 07:18:58
XML is the first intelligent "middleware" between programmers and business users. If used intelligently, it allows both to come to an agreed, explicit and encapsulated understanding of the information assets, objects and types in use in an enterprise and to treat them as business assets. Without XML, programmers would be doing their usual stuff of running around talking to each other and pretending that the world is still flat if their data model still says it is.
Kurt Cagle
2007-03-19 10:20:26
This is fun!



"If you are for open standards then you support them in all cases, yes? Or do you make exceptions? What are your exceptions? Are they related to the source of the design or the problems the design solves? Can you identify all of those in advance? If you can't how can you be sure a standard is the right answer?"

I've spent a lot of time myself trying to figure out what exactly an open standard is ... like many words used casually, its precise definition tends to be much harder to pin down the deeper you look.


I think what I tend to fall back on is the following definition for open standard:
1) Royalty Free - It does not cost a royalty should you use the standard in your work. This rules out most RAND-based standards, no matter how low the "reasonable" bar is.
2) Patent Unencumbered - If you use the standard you are not violating any specific patents the infringement of which you could be held liable for some time in the future.
3) Transparent Development Process - The standard must produce periodic updates of progress, must be open to feedback from individuals and organizations, and the editorial list must be diversified across multiple organizations.


Those three criteria admittedly establish a spectrum of conformance - most of the W3C standards satisfy the first two criteria very well and the third fairly well (I think their process could be more transparent and I believe the membership fees are too high to enable participation by a broad enough coalition - this is a fairly common charge against them). OASIS is better in that regard, though not all of their standards are RF.


In general I feel that while the concept of open standards is profoundly important and should be supported, there are standards which I personally am more vested in than others, and there are some open standards that I readily see as having at best a niche market appeal. There are also "closed" standards which have obvious importance to computing technology - SQL and Unicode both come to mind - that I accept as a given either because these technologies predate the open source standards or because the cost and effort in "opening" them would prove prohibitive.


A standard represents a commonly agreed-upon set of rules. It is not, by itself, a technology, and there are times when standards have been pushed before the technology itself had reached a point where such standards were viable, and as such they ended up emphasizing the wrong aspects of the technology. WAP/WML provides a good use case there. In essence this implies that for a standard to be most effective, it needs to be written at a time when a consensus between various interests is beginning to appear, but after the technological underpinnings have cooled to a point of being malleable ... just as working iron that's too hot will likely fracture the metal, so too does trying to develop a standard for a technology that is too much in flux result in failure. You have to lead with innovation, follow with standards ... always.


I don't tend to see existing usage by itself as being that major a factor in determining whether a standard or marketing forces should be dominant. Most technologies are like a game of Go - market share can be high for a given agent, but if the right pieces are placed on the board, that market share can vanish practically overnight. Market position is a backward facing factor that usually tends to heavily weight toward first-to-market, not necessarily best.


I am not going to say here that I feel Microsoft is incapable of innovating - clearly they are capable, and just as clearly those innovations have the additional advantage of a powerful and effective marketing engine behind them. However, their position also gives them sizeable advantages that have nothing to do with their ability to innovate. They can keep OEMs from distributing other systems by creating fairly onerous licensing restrictions and by threatening to withhold sales. They can pressure hardware vendors to build only to their specifications, meaning that for most other operating systems there is a process of re-engineering that has to be involved, usually at no small expense. They can ensure that video and audio codecs remain proprietary, limiting the ability of alternative OSes to play back most media.


This is the real reason that I tend to be so much for open standards. They act as a leveling factor in the market; they act as a brake on the natural tendency in most capitalistic systems towards monopolization. I believe in the long run that open standards are more important than open source, because open standards fundamentally affect the ability of other companies to compete, even within technologies that themselves may (and should) be proprietary.


Microsoft could introduce, and by all indications is introducing, ODF into their Office products. That's great. It means that those people who like the Microsoft Office suite (which by all measures is one of the best) can continue to use it without significant impact upon their workflow. But it also means that other vendors can produce office products, whereas before they were basically locked out of the market by the dominance of legacy Word and Excel documents.


If those competitors can take advantage of it, then that's fantastic - if not, then at a minimum Microsoft has a common framework beyond its immediate control that can't be used to vendor-lock their customers again ... and they are forced to compete on quality, which is what Microsoft is claiming that it does by putting the marketing emphasis on innovation. I see no downside there for anyone.


Re: editors - Word still uses the same sockets, the same communication protocols and so forth to post content onto the web that "online" editors do - it's just a question of whether they do it over HTTP or FTP in most cases. You edit locally - I think that is almost inevitably the case - and you publish.


Re: River of Life - please post a URL here when it's ready ... I liked what I saw earlier of it, and think it can very nicely showcase what VRML can do to a generation that thinks Second Life is new and cool.




len
2007-03-20 05:45:16
"3) Transparent Development Process - The standard must produce periodic updates of progress, must be open to feedback from individuals and organizations, and the editorial list must be diversified across multiple organizations."


The problem is this: if you really want royalty free and unencumbered, transparency and open access hit the IP wall like a car going at high speed with the driver looking over their shoulder instead of straight ahead. X3D/VRML meets your first conditions, but to our chagrin and frustration, we found that if we left the process open to the public without first getting those membership agreements that obligated the contributors to the conditions you cite, we could not guarantee that we would produce a product that met those first conditions.


This is an excellent example of what goes wrong with the mythos of open source. What should be a process for open, participative software development has become a cause célèbre of the legally naive. We could make it work with XML by closing the inner group. Other efforts appeared to have worked until the encumbrances were discovered later. Note Tim Bray's recent blog about the legal hoops Sun puts up for starting such projects.


So the learning curve - experience - is showing that open source is, in the final analysis, a legal set of conditions for contribution and distribution which the contributors and distributors must publicly, and by publicly recorded authority, acknowledge and obligate themselves to.


Everything else that follows is a political game. The ISO endorsement is a legal process even if standards are not law. The campaign on the web to kill off the Microsoft initiative is a political campaign that some consider moral, others market-wise, but so far too few have considered it in terms of the Microsoft customers - a very considerable majority of web users - who can benefit by it; as if their good does not matter because they were not smart enough to cobble together open source systems for their own use, or ready to pay an open source company to do so.


The pay me now or pay me later strategy is used by every school yard bully and every corrupt politician. That it has become the tactic favored by the open source community in part is to be expected given its weak acknowledgement of the processes of law in governance of the marketplace where those processes contradict its myths and goals.


By the way, the Infoset is more complex than the SGML Declaration. The weakness of Fielding's thesis is that it is itself simplistic. In a scaling system, it is a good rule of thumb to keep initial conditions simple, yes. But complexity is a quality of compounding choices in which the signal is being modified but the controls (the choices and the relationships among choices, or the choosers of choice) evolve. In complex systems, oversimplification of controls creates complexity at higher levels of the system. It can be the case that low in the stack one needs a more complex control to prevent complex conditions or less interoperable controls at the higher levels.


Again, naivete is the order of the day.


River of Life is not on the web yet. This is the sort of work one doesn't do online. One publishes it. I put the world version at ABNet to enable Rick Kimball to help debug it and determine if the world shell was suitable for MU and chat. It is. The version I am working on now has a reactive character, Kamala, and avatar animation is long, tedious work when done for a reactive character. Those interested in what this is - screen shots, code examples, etc. - are encouraged to visit


http://3donthewebcheap.blogspot.com


where I am documenting it and sharing the code online.

David21001
2007-03-20 06:03:00
I think people see XML as the silver bullet, the end-all-be-all. IMHO, people hype this technology to the point that unconscionable amounts of Federal and State government investment are made, i.e. your hard-earned tax dollars invested in vapor-ware. Some folks, in fact, try to shoe-horn the technology into their products (or the other way around). I think we should embrace the technology for what it is, and carefully think about how we can optimally use it. Steady as she goes; fair winds and following seas, my friends.
Bdub
2007-03-20 06:31:33
Certainly XML is an enabler. Is there work to do in this space still? Yes, of course. It's going to take creativity, just like anything else, to make XML's virtues obvious. It allows us to exploit XML in a way that we see fit. Simple rules that allow for powerful results (when done right).
One of the soundest virtues of XML is the integration aspect, allowing different technology stacks to communicate. Are we ready to trash that? I hope not; it has been a lifesaver for me to this point.
Some will say that creativity has been happening and we are at the end of the road, but that is just not the case. I often view XML like the kids' toy Play-Doh: you mold it into the shape you want now, then later you're able to mold it again into a new shape. What is the alternative?
len
2007-03-20 09:33:28
A clarification: transparency is required. Without it, any process or organization is subjected to numerous conspiracy charges, with or without credible evidence. Yet to be manageable and protected from abuse, transparent process governed by transparent policy is required. Some, such as David Megginson, who occupy even more prominent positions in the open source and development communities, have decried process. Process can certainly slow down development and can even be used to derail it; thus transparent process and clear policy are the sine qua non. On the other hand, there are the Raph Kosters of the world who claim that only the speed of the reflexes of the commercial formats produces the survival-worthy formats. The existing success of XML says that argument is false if applied uniformly. On the other hand, if we wait until all unknowns are known, we will never start. So as in all demagogic debates, a scintilla of truth can be amplified to become an Occam's Razor where in fact no cut is necessary.


Today we have more interesting issues than these. For example, is the HTML browser really The Web Browser or is the web in fact, mostly the machinery of URI resolution? For the plugin languages, the incompatible market strategies, development strategies, licensing of languages and so on have proven deadly. The page metaphor was already aged and vulnerable when TimBL rediscovered it and made it popular again. This retrograde design is blocking progress. XMLhttpRequests don't require XML and the assumption they do makes it difficult to innovate in AJAX. If AJAX assumes HTML, that is another roadblock.


It is past time to consider the web URI as address plumbing, the HTML web browser as only one possible means of rendering, and to quit thinking of the browser as an operating system, but really as an application using the services of an operating system. If that operating system metaphor means extending it to distributed systems, we should pick another term and reconsider how the terminology improves or degrades the innovation of the distributed interoperating technologies for communication.


We suffer the myths gladly when they serve us. When they do not, we become passengers on that vessel from the Twilight Zone whose pilots presented the book to the UN: "To Serve Man", which was accepted at face value before it was finally translated.


This is true of so many levels of these debates about standards, openness, and appropriate technologies that it should be obvious to all. If it isn't, then we want things hidden and are afraid to talk about them openly. If so, openness is a sham, or just an excuse for a minority right more important than a majority necessity.


Choose wisely.

Michael
2007-03-27 20:57:54
I come from a document-centric, rather than data-centric, use of XML. So I want to capture structure in irregular documents, structure that cannot elegantly be mapped to a relational data structure. It seems to me that anything that allows me to do that and remains readable in plain text (so that I am not 'hiding' it in an array, for instance, which would depend on a certain language environment) will be XML by any other name, whether it uses angle brackets or some other convention. So, if there is a need in 100 years to capture meaning in documents, why would XML be replaced by something else, other than a new notation?
Patrick E Connors
2008-03-19 04:47:50
Well, you're much more knowledgeable on this subject than I. That said, from my point of view XML solves the old copybook problem for small files. That is an old view; however, many copybooks still exist.
I am replacing them as I work at different client sites.
XML also allows one parameter to be passed across all platforms (z/OS, Windows, Linux, VM).
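[Ed. - For readers who never met one: a COBOL copybook fixes each field by position and length, so every program touching the record must share the same layout definition, while XML carries the field names and boundaries in the data itself. A rough sketch, with an invented layout:]

    01 CUSTOMER-REC.
       05 CUST-NAME PIC X(20).
       05 CUST-QTY  PIC 9(4).

    ADA LOVELACE        0002

    <customer><name>Ada Lovelace</name><qty>2</qty></customer>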