XHTML 2.0 and the health of the Web

by Simon St. Laurent

Related link: http://aspn.activestate.com/ASPN/Mail/Message/xml-dev/1440387



Andrew Watt wrote to xml-dev asking "What are the strongest arguments for continuing development of XHTML 2.0?". Keeping the Web alive seems the most important to me.




This was my reply, mildly edited to free it from the xml-dev context:


---------------------------


[Note that all statements about corporate or consortium behavior are based on sheer Kremlinology, and may not in fact represent the way those organizations see their own behavior. I'm sure many people will disagree with that perspective.]



XHTML 2.0 feels like a crucial ingredient of the browser conversation to me. That conversation has pretty much stalled over the last few years. Why?



Netscape has vaporized into AOL, an organization that hasn't exactly loved the Web, and while Mozilla is perhaps the torch-bearer par excellence for the 'traditional Web', it lacks the "installed with everything" advantage.



Microsoft appears to have lost interest in Internet Explorer since "winning" the Browser Wars. The upgrades to IE appear to be largely outside the browser engine, and improvements are clearly incremental. In
a lot of ways, it feels like the last major change to IE was in IE4, when they rebuilt the object model - and bonded it tightly to HTML.



There are other companies and projects in the browser space, and I'm quite fond of Opera, but I don't see great hope for a new browser reinvigorating the overall market.

Meanwhile, the W3C as an organization appears to have lost interest in HTML. The leadership appears distracted by RDF and (to a lesser extent) XML, while the membership appears distracted by Web Services, a toolkit with only remote connections to its supposed "Web" origins. The HTML WG seems to spend a lot of time fending off other parts of the W3C who think they know better than what mere HTML culture can produce.



The HTML WG seems to have suffered a cursed existence for the past few years. Namespaces were supposedly going to help us integrate other XML vocabularies (SVG, SMIL, MathML, etc.) but proved a huge nuisance for XHTML's three-forms-one-vocabulary approach. XML 1.0 DTDs might have been all right for XHTML modularization, but wow, the namespace interactions were awful. W3C XML Schema didn't help on that front much either, as the backers of RELAX NG have been happy to demonstrate.
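(For readers who haven't waded through the specs, the integration story goes roughly like this: a namespace declaration lets an XHTML document carry elements from another vocabulary alongside its own. The sketch below is purely illustrative - the namespace URIs are the real ones, but the fragment isn't taken from any spec:

    <html xmlns="http://www.w3.org/1999/xhtml"
          xmlns:m="http://www.w3.org/1998/Math/MathML">
      <body>
        <!-- an ordinary XHTML paragraph carrying an inline MathML expression -->
        <p>The area of a circle is
          <m:math>
            <m:mrow>
              <m:mi>&#x3c0;</m:mi>
              <m:msup><m:mi>r</m:mi><m:mn>2</m:mn></m:msup>
            </m:mrow>
          </m:math>.
        </p>
      </body>
    </html>

The catch is that an XML 1.0 DTD has no graceful way to say "and any element from the MathML namespace is welcome here" - the prefixes have to be hard-wired into the DTD itself - which is a large part of why the modularization work turned into such a slog.)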



Given all of that, the sheer perseverance of the HTML WG deserves applause. <pause />



Despite that perseverance, the forward development of HTML has suffered about the same lack of momentum that has plagued Mozilla. Delays caused by internalizing these externally-inflicted issues, and a general drift away from HTML as the central activity at the W3C, pretty much mean that the "traditional Web" with its HTML and browser-based foundations is in trouble.



There have been a few nice pieces of work to come out of the XHTML Working Group in that period (XHTML Basic is a favorite of mine), but it's only fairly recently (the last six months?) that I've really begun to see XHTML discussion on Web developers' mailing lists. (XHTML-L doesn't count for that, though its membership has grown slightly.)



XHTML 2.0 is an opportunity to break out of the stall. I see XForms in particular as crucial to keeping the Web from being eaten by the Web Services and "Rich Internet Applications" that have been nibbling around its edges for a while. The Web has to move forward if it hopes to survive, as its promise of cheap interoperability (which it delivered remarkably well, even through the Browser Wars) is under assault once more from vendors who have a lot to gain by fragmenting the Web and Web applications into proprietary pieces under their control.



XForms is especially critical because it addresses probably the weakest part of the HTML infrastructure. Forms have barely evolved since their first appearance, and they're pretty much a nuisance. They're one of the few places where I find information validation genuinely powerful, and current solutions on both the server and the client are pretty ugly. XHTML 2.0 looks overall like an effort to make XHTML a cleaner environment in which to work, and that seems worth pursuing as a first step toward putting XHTML back on track.
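To make the forms point concrete, here's a rough sketch of the sort of thing the XForms drafts describe. The syntax was still in flux when this was written, so treat the details - the namespace URI, the bind and submission attributes, and the placeholder submission address - as illustrative rather than gospel. The point is that the datatypes and required-ness live declaratively in the form's model instead of in a pile of JavaScript and server-side re-checks:

    <html xmlns="http://www.w3.org/1999/xhtml"
          xmlns:xforms="http://www.w3.org/2002/xforms"
          xmlns:xsd="http://www.w3.org/2001/XMLSchema">
      <head>
        <title>Order form</title>
        <xforms:model>
          <xforms:instance>
            <order xmlns="">
              <quantity/>
              <shipdate/>
            </order>
          </xforms:instance>
          <!-- the validation rules live in the model, not in script -->
          <xforms:bind nodeset="/order/quantity" type="xsd:integer"
                       required="true()"/>
          <xforms:bind nodeset="/order/shipdate" type="xsd:date"/>
          <xforms:submission id="submit-order" method="post"
                             action="http://example.com/orders"/>
        </xforms:model>
      </head>
      <body>
        <xforms:input ref="/order/quantity">
          <xforms:label>Quantity</xforms:label>
        </xforms:input>
        <xforms:input ref="/order/shipdate">
          <xforms:label>Ship date</xforms:label>
        </xforms:input>
        <xforms:submit submission="submit-order">
          <xforms:label>Place order</xforms:label>
        </xforms:submit>
      </body>
    </html>

A conforming processor is supposed to refuse to submit until the instance data satisfies those binds, which is exactly the kind of validation that currently gets reinvented, badly, on every server and in every browser.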



While it's not at all clear that Microsoft or the other browser vendors have the time or the interest to improve the Web, the only way forward seems to be to try, and (X)HTML appears to be the right place to do it. It has a wide base, a lot of committed developers who have been starved for features over the past few years, and foundations that are both well-understood and reasonably extensible.



Will XHTML 2.0 sweep the world? It's too soon to tell. Is XHTML 2.0 worth the effort? I think the answer to that is an obvious yes, and not because XHTML needs all the latest XML gimmicks. The Web needs some progress in XHTML to stay alive. I worry that at current rates of progress the Web will have disappeared by the time the Semantic Web is ready.




Is the Web fading? Can XHTML 2.0 help?


2 Comments

anonymous2
2002-11-22 06:30:11
Much Needed
Yes - as both the Gecko and Mozilla engines are coming up to the level of standards support they should have reached two years ago, I think it is strongly important that we start looking at XHTML 2.0 and make some improvements on where we would like to see the web go, visually. While my designer mind would like to see Flash improve, my programmer mind knows that can't be the future, and we need to start creating new standards to allow developers to better control the virtual environment which they are creating. XForms would be a great start... and hopefully the start of a new beginning.
anonymous2
2002-11-24 19:27:59
Out of touch?
As a sometime Mozilla QA flunky, I must say that I sometimes feel that the HTML and CSS working groups are the only ones in touch with reality. I read www-tag occasionally, where there's always a tremendous amount of heat and light being generated over various abstractions: RDF! bags! namespaces! RDDL! what's an URI? REST! SOAP! XPointer, XML Schema, Cthulhu ftagn (as Joe English would say). Then I go triage Bugzilla bugs, and we're still trying to get people to close their tags so we don't pop a stack. The HTML WG is making a valiant effort to clear up the masses of cruft that have accumulated on top of HTML (which didn't exactly start as a rich semantic language), and the CSS WG is developing CSS3 and whittling CSS2 down to a commonly implemented subset. All the X* stuff has its uses, and I'm sure XSLT, etc. are holding up the back ends of some web sites, providing web services, and so forth, but the-Web-as-it-is-browsed is only taking the first little steps towards being XML. We need a well-designed language expressing basic web semantics, so that we can crawl, before we can stand up and walk with ontologies and the Semantic Web.