XML Outlook for 2008

by Kurt Cagle

Once again into the breach, dear friends. I've made it a habit over the last several years to put together a list of my own forecasts for the upcoming year, both in the realm of XML technologies and in technology in general, and, with some misgivings, to see how far off the mark I was the LAST time I made such a list. The exercise is useful, and should be a part of any analyst's toolkit, because it forces you to look both at the trends and at potential disruptors to trends (which I've come to realize are also trends, albeit considerably more difficult to spot).

So, without further ado, I bring up the list from last year ...


13 Comments

Rams
2007-12-30 04:34:33
Check out Flower - http://www.basiscraft.com/


"Flower is a new kind of user programmable web service, especially well suited for applications which process, store, and query XML data sets. Clients of a flower web service interactively modify and extend the code the server runs. This is the ordinary way to build new flower applications. Flower is a true web operating system in the sense that it forms a self-contained, web-addressable computing environment."

len
2008-01-02 14:52:29
You have to ask yourself where standards make good sense and where they are just service vendors looking to absorb smaller market segments into their own master plans.


What we're seeing where we look is that X3D is not yet growing for virtual world applications. The reasons are basic: VWs as MMOs are server-side applications with little for the client to do but pick and render, because security is much better with that choice and it is easier to maintain a verisimilitude of synchronicity. On the other hand, with the Christmas release of the Bit Management implementation of the network sensor node and the market beginning to notice the difference between a 4MB client download vs. a 40MB download for SL, some serious heads are turning. Growth of X3D in infrastructure applications has been increasing. Think collaborative CAD online. Think asset interchange formats. So you may be right, but only by not looking where the growth is.


The market fact that seems to be on the point of wide acceptance is that a metaverse is not only not happening, it is futile to attempt this. IBM simply got it wrong. Games make up the biggest and most lucrative market and these vendors have little use for interchange standards. Business markets such as collaborative applications do and that is where Bit Management, Octaga and ParallelGraphics are doing well.


Here is the nasty bit. The death of VRML was largely an American and Canadian funeral. The Europeans kept going and have solid businesses. The American/Canadian markets are extremely fractured. Second Life growth has slowed and small footprint platforms (see Ogoglio) are positioning themselves to pick up the business markets where LL's declining reputation is losing business. The business and infrastructure markets predictably need security and the ability to host locally or to rent from high reliability server farms. In the end, the VW market is a server farm market (it isn't simple to host a scaling VW). The infrastructure markets don't always need or want that scaling because they are not open to the public.


I think you will find X3D growth to be slow but steady. What has to be achieved here relies not on the graphics technology (basically, it just works) but on convergence on messages that need to be passed among the just-in-time sims.

Kurt Cagle
2008-01-02 16:03:04
Len,


Thanks for the insight on this - and overall what I've seen seems to corroborate what you've indicated as well. I think that Canadians are generally more attuned to standards than Americans are; the big question is just having enough people playing in these spaces in a meaningful fashion to be able to push the standards at the international level.


I rather despair about the shrinking American presence wrt most otherwise global standards. This indifference to compelling standardization put the American mobile market years behind its European, Japanese and Korean counterparts; the American presence within the W3C and OASIS has been shrinking steadily for years; the US is significantly underrepresented on a per-capita basis in the open source realm; and this attitude appears to be more symptomatic of a creeping isolationism in general than of American know-how and ingenuity trumping global community-driven development, though that is often the rationale that is used.


I'm not necessarily a huge believer in standards in their own right, though I do believe that consensual standards are important. However, what is often touted as independence masks what is increasingly seen both at home and abroad as laziness and arrogance. Perhaps as an American ex-pat living in Canada it's more obvious to me than it is to most people, but it is disturbing nonetheless.


len
2008-01-03 07:15:13
Possibly the Americans only use the standards when they need them but otherwise resist using them if they offer no competitive advantage. A lot of what you are witnessing is Silly Valley behavior where VC-led companies need/must have IP to make good on their debts. Otherwise, some of this is technological churn. As I said, client-side 3D languages have to adapt to the scaling problems brought about by real-time synchronization where in an odd twist, the more you want the same event at the same time in multiple locations, the more you have to rely on broadcasting state over sharing information. Thus the TV is the perfect VW device and the peer-to-peer client is the worst. I've been wrong about that because I favor independence of the local system but the market wants simultaneity. TVs win.


On the other hand, take a look at Tim O'Reilly's blog on the topic of Wall Street firms and Google investing against their own clients. This is precisely the behavior I've been predicting for a decade now as the web became a dominating means of transacting information. O'Reilly is still stuck on the ecosystem metaphors (they feel good) but the behavioral patterns do exactly what the cyberneticists and complexity theorists say they will do. My current image is black hole formation. I note this in IBM's incursion into the 3D market. It isn't increasing sales for the clients or partners. It speeds up their orbits and makes them noticeable, but then it absorbs their mass and stops their signal output. Attractor formation in information systems is entirely predictable. Give-away economy models tend to drown out competition for profits by organizing transactions around the attractor. Socialist systems fail.


I'm mystified that Radar et al are just noticing this. Did they really believe in the web kumbayah? Unfortunately, standards based on simplified assumptions accelerate this process. Then the very people who supported this come back to tell everyone about the dangers. This is a self-preserving A-lister behavior and the same class of attractor formation.


So we have to be careful. Survival is the first imperative and in a world where there is an oldest profession, there is a second oldest profession as well. Someone manages their business for them. I'm looking for some velvet clothes and dice for my Caddy.

Lars
2008-01-03 08:28:19
"I predicted another bad hurricane season, but that didn’t come to pass, however enough nasty weather did hit both the US and Canada that there’s become very little doubt that global warming is real."

Kurt, do you really want to open that can of worms (again)? It kind of sounds like you're saying that the failed predictions of global-warming enthusiasts nevertheless confirm their theories.
To give this topic the treatment it deserves would take far more space than you probably want to devote on your XML blog, and to give the topic short shrift will raise plenty of hackles.
"Still, I should not be thinking about a career as a meteorologist."

Yes, let's please keep that battle separate from XML issues if you don't mind.
Kurt Cagle
2008-01-03 12:39:29
I don't see that much difference between the ecosystem and the attractor models - they are simply different expressions of the same underlying chaotic system dynamics. You're still talking ultimately about energy flows, where the ecosystem energy flows involve predation on high-volume/low-energy density converters (most herbivores) vs. low-volume/high-energy density converters (most carnivores), or high-volume/low-energy density converters (most startups) vs. low-volume/high-energy density converters (Microsoft, IBM, et al). It's only in countries where most people's typical exposure to wilderness involves zoos and highly structured parks that people believe that most ecosystems are comparatively benign for the participants.


However, shifting to the galactic metaphor for a second ... any sufficiently large stellar object will exhibit the same behavior - gravity sucks. A company absorbed by IBM (or Microsoft or Google) will eventually show up in a fairly different form (mass converted to energy) ... even with a black hole, an absorbed object would show up as a micro-nova in the ambient magnetic fields before disappearing. If the conditions are right, however, the most important and energetic parts of such systems (the key programmers, for instance) may end up escaping the maw of the system before the final denouement.


However, back to the comments you made before the diversion into metaphor - I've long made the assumption that the reason that the open source community remains vital has far more to do with economics than it has to do with enlightened cooperation, no matter how self-serving that cooperation may be. I think that there's a systolic/diastolic feedback system here ... open source thrives when the ability of the market to monetize information technology is weakest (diastolic), and wanes when the market trends towards its maximum point of monetization (systolic). I suspect that this has to do with the creation of network connections within the innovative community, which in turn spawns the innovation of new ideas (and products) which in turn makes the harvesting of these ideas (either by a company growing successfully or by absorption by a larger company) more efficient. In time, this reduces the total number of viable ideas, innovation wanes and the ability of the large companies to retain the most energetic idea producers diminishes to the point where they escape.


I'm not sure "idea" is quite the term I'm wanting here, but in essence it is the basis on which products and services are built. Perhaps "value" ... I'm not sure. Certainly I agree with you that socialist economies in general do not work, but neither do purely capitalistic systems (the US economy at the transnational level is a functional oligarchy, despite the protestations of free market enthusiasts, one where government works heavily to reinforce the power of the largest corporations, usually at the expense of smaller players - in a purely capitalist society, Citicorp would have closed their doors by now).

Kurt Cagle
2008-01-03 12:57:43
Lars,


My personal belief is that the solar system has likely entered the very outer edges of an "anomalous" magnetic artifact from a supernova eruption in the Milky Way from perhaps 8-10 billion years ago. This can be inferred from the distortions to the solar bow wave envelope that were picked up by Voyager, by the increase in solar magnetism as evidenced by sunspot activity, by the formation of thin atmospheres on the Moon and Titan, the expansion of the atmospheres on Mars, Jupiter and Venus, and the increase in electromagnetic distortions and the number of lightning strikes on the Earth and on Jupiter. It may also be the reason for what appears to be a shift in the Earth's magnetic poles.


There is little doubt in my mind that man-made carbon emissions play a part in the process of reinforcing a feedback loop, but that the energy itself is exogenous - and that ultimately there may be remarkably little that we can do about it but ride through it. There are any number of good reasons for getting off of a carbon-based economy (delaying the inevitable, if you will, being a big one) but I'm also not at all convinced that reducing carbon emissions will in fact do much at all for what is, after all, a galactic phenomenon.

len
2008-01-03 16:01:32
The difference between pure attractors and ecosystems is typically that ecosystems *feel* effects and attractors just 'affect'. Other than that, proof of intent is quite difficult unless one can prove action/reaction coupling is strong. Otherwise, accidents happen. OTOH, the problem with attributing anything to an accident and saying that this is not intent or intelligence is that it would not be an accident if it didn't disrupt an intention. The fact of an accident proves intent exists. It doesn't prove what was intended.


The trick of the metaphors is revealing. Google is not absorbing the companies. It is absorbing their energy, by effect, the information they intermediate by redirecting the searches to their own resources/clients. This is the gamed system I said would come about without some kind of intervening force (say regulations). A-listers can harvest the ideas of the B-listers and wait until the verifying event is imminent then step forward in the same way the Google search redirects.


We used to call this "White Pigging" in honor of the Animal Farm characters. The A-listers would call it 'uncredited research' and so the attractor spins.


That attractors emerge in open flat systems is inevitable. Then it is a game of disrupting them which can be difficult and expensive and isn't guaranteed to work. See OOXML vs ODF.


So the game played is brand dissolution. Capturing and redirecting signal weakens the brand. It is as was described of Rove's strategy, addition by division, where all one needs is small percentages at or near the amplifying node to create mass effects. Timing and proximity are key.


Of late I have been considering the "unreality of the conversation," where the open exchange of ideas results in just as much or more superstition than fact. The fact that superstition is motivating creates the illusion of the 'wisdom of crowds' when, by comparison, there are better and more efficient ideas possible that get drowned in noise until the tide of immediate attention goes out. I think that is what the political analysts call 'the spin zone'.


IOW, there is not much profit in long range prophecy but quite a lot of profit in just-in-time predictions. If one can arrange the fait accompli, one can get credit too.

Mike Champion
2008-01-03 20:27:45
Very thought provoking post - I agree with much of it, disagree with some, and plan to address this and other XML end-of-year wrapups in a blog post of my own. In this comment I will just take issue with some assertions and assumptions about Microsoft which aren't consistent with my experience:


"Microsoft had to rebrand XAML/WPF as Silverlight" is not right; WPF/XAML is an integral part of Vista, and Silverlight is a technology to allow one to build web applications with WPF. Something called WPF/E was rebranded as Silverlight


"Vista has not been the great success that Microsoft had been hoping". It is certainly true that it has not been universally well received, but it doesn't take much psychic ability to predict that a major release from Microsoft will cause a lot of grumbling. It has been a considerable financial success. The geeks may hate it, but the stockholders definitely do not. I'm not so sure that Apple's recent success has much to do with Vista disappointment. Apple has done a phenomenal job of designing hardware to appeal to a market willing to pay a premium price and tailoring their OS to meet those needs. Give them credit for their own success. [I'm typing this on a Mac running Vista -- in general, Intel Macs make excellent Vista boxes :-) ].


"[T]he biggest detractors of the [ECMAscript 4] standard are the predictable ones - Microsoft, who see it as a credible alternative to their Linq language (which has yet to really gain much traction)." You seem to have a touching faith that Microsoft is more organized than it is, if you really believe that the Silverlight or LINQ people tell IE what to think about Javascript. Believe Chris Wilson http://blogs.msdn.com/cwilso/archive/2007/11/02/my-opinion.aspx on why there is pushback from MS on ECMAscript 4. As for LINQ getting traction, note that it has not actually been formally launched yet, so it's a bit early to expect traction. Let's talk in 3-5 years.


"[T]here are actually some significant arguments within the Microsoft campus as to which particular strategy to follow". That is a bit of an understatement! Was it not like that when you worked at Microsoft? The book THE STRATEGY PARADOX argues that it is an intrinsic part of the culture, and a key reason for its success because if a bet fails (e.g. "Blackbird" as a proprietary alternative to the Web) there is a rival team waiting in the wings to offer something different. Sheesh, there were rival teams making different strategic bets within the LINQ project http://blogs.msdn.com/mattwar/archive/2007/05/31/the-origin-of-linq-to-sql.aspx


"Internet Explorer is a somewhat troublesome product for Microsoft — they would prefer that it go away". See above; some would like to see it go away, some would like to see it fully standards compliant and modernized, and the customers want it to be both backwards-compatible and standards compliant by default (which is a difficult trick to pull off without a telepathy co-processor to read the user's mind). There is no grand plan, other than to experiment and keep options open.


"It has become obvious from a fair number of reports in recent months that LINQ has become the favored child of Microsoft’s XML technologies, and as such has largely pushed off the development map completely any native XSLT 2.0 or XQuery 1.0 processors (outside of SQL Server) from the roadmap." That's not obvious to me. LINQ to XML did indeed replace XQuery as the presumably mainstream-friendly *alternative* to XSLT in .NET 3.5, but the resources that might have been devoted to XSLT 2.0 went to building XSLT tools (e.g. the new debugger), not to LINQ. Those folks are now working on XSLT2, and it is reasonable to expect it in the next major .NET release after 3.5. (Just my guess as a former team member!). BTW, the original LINQ team members have mostly moved on from the XML group because of a *lack* of parental interest, so the "favored child" bit is rather painfully untrue.


"[LINQ] doesn’t deal well with complex transformations in particular, an area where XSLT2 is especially strong ". Exactly, and this is why I've always argued that LINQ and XSLT are complementary more than competitive. The shipping .NET 3.5 version of LINQ to XML supports XSLT transforms over a LINQ tree and from text or a DOM tree into a LINQ tree, to support scenarios where XSLT's portability and superior ability to handle loosely structured documents is relevant.

Kurt Cagle
2008-01-05 02:33:33
Mike,


You keep taking the fun out of bashing Microsoft! Some very good points. Individual comments follow:


"Microsoft had to rebrand XAML/WPF as Silverlight" is not right; WPF/XAML is an integral part of Vista, and Silverlight is a technology to allow one to build web applications with WPF. Something called WPF/E was rebranded as Silverlight"


One thing that does emerge from this is that the message is rather muddled. Especially for someone who does not follow Microsoft as closely as I probably should (though that may be changing in the next couple of weeks), the confusing welter of acronyms - things like WPF vs. WPF/E - can make it difficult to tell which technology is which. Granted, this isn't exclusively a Microsoft problem (I've already castigated certain people on the W3C for similar issues wrt CDF vs. WICD).


"Vista has not been the great success that Microsoft had been hoping". It is certainly true that it has not been universally well received, but it doesn't take much psychic ability to predict that a major release from Microsoft will cause a lot of grumbling. It has been a considerable financial success. The geeks may hate it, but the stockholders definitely do not. I'm not so sure that Apple's recent success has much to do with Vista disappointment. Apple has done a phenomenal job of designing hardware to appeal to a market willing to pay a premium price and tailoring their OS to meet those needs. Give them credit for their own success. [I'm typing this on a Mac running Vista -- in general, Intel Macs make excellent Vista boxes :-) ]."


I work with Vista on one of my home systems and have tested it on others. Overall, my own experience has not been that positive, and it becomes easy to see where much of the grumbling comes from. Given that most Vista sales are due to OEM sales rather than independent sales of boxed copies (which I have heard have been quite low), I think it's rather difficult to correlate big computer sales in general with Vista adoption. I've also been following enterprise adoption numbers, and there is considerably less interest in upgrading to Vista until at least a solid and well-tested SP1. Long term, of course, Vista sales will be solid - not spectacular, but not a disaster, simply because of the nature of the distribution chain - but certainly not living up to the hype and the LONG development time.


"[T]he biggest detractors of the [ECMAscript 4] standard are the predictable ones - Microsoft, who see it as a credible alternative to their Linq language (which has yet to really gain much traction)." You seem to have a touching faith that Microsoft is more organized than it is, if you really believe that the Silverlight or LINQ people tell IE what to think about Javascript. Believe Chris Wilson http://blogs.msdn.com/cwilso/archive/2007/11/02/my-opinion.aspx on why there is pushback from MS on ECMAscript 4. As for LINQ getting traction, note that it has not actually been formally launched yet, so it's a bit early to expect traction. Let's talk in 3-5 years."


I understand Chris's reasoning on ES4. I just don't agree with it. JScript is lagging further behind the state of the art every year, and it is becoming harder to support AJAX platforms that work everywhere EXCEPT IE. Admittedly, I'm more inclined to take Brendan Eich's viewpoint on this because it's something I've felt for years myself. We're building complex apps on the web - SO LONG as the language is fully backwards compatible with ES3 (which, by my reading of ES4 anyway, it is), it makes sense to start introducing higher-order programming capabilities into the system, and frankly IE8 would be the logical place for this to happen on the Microsoft side. Now there are of course version control issues that need to be resolved here, but that's been true of the web from its inception. That's one of the reasons that you have the type attribute on script blocks, so that you CAN go in and say "this is running in an ES4 context, this is running in an ES3 context" and so forth. This is true regardless of whether you're talking minor changes or major ones.
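To illustrate the idea - note that the version-bearing media type below is hypothetical, reflecting proposals discussed at the time rather than any shipped mechanism:

```html
<!-- ES3 path: runs in any current engine -->
<script type="text/javascript">
  var greet = function (name) { return "hello, " + name; };
</script>

<!-- Hypothetical ES4 path: an engine that does not recognize the
     type value simply skips the block -->
<script type="application/ecmascript;version=4">
  // higher-order / class-based ES4 code would go here
</script>
```

An engine could then dispatch on the type attribute block by block, which is exactly the kind of opt-in versioning described above: old engines ignore what they don't understand, new engines get the richer language.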


Concerning LINQ, one of the problems in being a specialist in a given area is that you often end up seeing (and understanding the relevance of) certain technologies long before the average developer does. I've been working with E4X for nearly two years now, but it's only now beginning to penetrate the mindset of fairly advanced web devs, so the obvious similarities (and differences) to LINQ mean that I'm probably more aware of that technology than most people are.


Personally, there's a lot to like about LINQ. I think that Erik Meijer is brilliant, and the work that he's tied into LINQ from Haskell in particular (especially monads and closures) is ground-breaking. Having said that, I am aware of the somewhat glacial adoption speeds for anything having to do with XML technology, and even though LINQ obviously has significant use as a generalized data abstraction object, getting people to adopt it will take time. I would tend to agree with your time frame of 3-5 years, though that also tends to be the window in which non-performing tech disappears at Microsoft, so I don't see it as a sure thing.


"[T]here are actually some significant arguments within the Microsoft campus as to which particular strategy to follow". That is a bit of an understatement! Was it not like that when you worked at Microsoft? The book THE STRATEGY PARADOX argues that it is an intrinsic part of the culture, and a key reason for its success because if a bet fails (e.g. "Blackbird" as a proprietary alternative to the Web) there is a rival team waiting in the wings to offer something different. Sheesh, there were rival teams making different strategic bets within the LINQ project http://blogs.msdn.com/mattwar/archive/2007/05/31/the-origin-of-linq-to-sql.aspx"


If you've been a part of Microsoft the struggles are a daily part of life, but for a lot of people who've never been a part of the company, the sometimes violent internecine warfare can seem counterproductive if not irrational compared to the way that other companies work. Does it produce best of breed products? Sometimes, in the hands of a good manager who knows how, at the end of the day, to cool down heated arguments and who has the strength (and authority) to become a necessary arbiter. In the hands of a bad manager, though, this kind of strategy can burn out developers unnecessarily, can cause a high rate of attrition, and can oftentimes force out good ideas with inferior political backing in favor of bad ideas backed by those more entrenched in the power structure. It also means that, for those people who do fall victim to such strategies and are either forced out or leave of their own volition, the likelihood is high that they will become fairly vocal about the company's failures and faults.


"Internet Explorer is a somewhat troublesome product for Microsoft — they would prefer that it go away". See above; some would like to see it go away, some would like to see it fully standards compliant and modernized, and the customers want it to be both backwards-compatible and standards compliant by default (which is a difficult trick to pull off without a telepathy co-processor to read the user's mind). There is no grand plan, other than to experiment and keep options open."


I still think that the best strategy to do that would have been to implement a separate CSS rendering engine for application/xhtml+xml that would have been ACID2 compliant; as most legacy content is still served up as text/html, this would have been a clear differentiator. I was pleased to hear about ACID2 compliance with the IE8 pre-alpha, as I think it lays the argument about customer resistance to change to rest. However, ultimately, I think that a decision SHOULD be made with regard to IE - it's too integral a technology to Microsoft to be left in limbo. (And yes, we both know at one point I had the opportunity to make that decision - I was not the person for that role, however.)


"LINQ to XML did indeed replace XQuery as the presumably mainstream-friendly *alternative* to XSLT in .NET 3.5, but the resources that might have been devoted to XSLT 2.0 went to building XSLT tools (e.g. the new debugger), not to LINQ. Those folks are now working on XSLT2, and it is reasonable to expect it in the next major .NET release after 3.5. (Just my guess as a former team member!). BTW, the original LINQ team members have mostly moved on from the XML group because of a *lack* of parental interest, so the 'favored child' bit is rather painfully untrue."


That's a mix of good and bad news. XSLT2 has some obvious benefits for MS, though given its own extension mechanisms perhaps not so much as is true for other XSLT1 implementations. I've heard different news from M. David Peterson on whether XSLT2 is still under development, but I'll readily concede that this is also likely one of those day-to-day struggle types of things.


Overall, I like some of the things that I see coming from MS, but as an analyst, it is difficult to make the case that there's progress given the way that things work there until there is literally a product in MSDN to point to. Thanks for the comments.

Mike Champion
2008-01-05 18:01:27
I wouldn't dispute that the internal competition for resources to fund alternative visions has plenty of downsides and that it's hard to manage. My point is that it is usually pointless and misleading to construct elaborate conspiracy theories to account for alternative approaches to the same basic goal (e.g. AJAX vs Silverlight or XSLT vs LINQ). The best explanation for why something is not happening is usually that a product team is more worried about annoying existing users with "breaking changes" than enabling new use cases with an updated standard such as ECMAScript4 or XSLT2. That may or may not be short sighted; obviously IE's stagnation was a terrible idea. On the other hand, sometimes it is a useful stabilizing force, e.g. it kept XML 1.1 from getting traction, which motivated W3C to devise (so far tentatively) a less "breaking" way to improve Unicode support.


Another point to keep in mind when trying to deconstruct why Microsoft does or does not do something related to a standard is that developers (and testers, PMs, etc.) vote with their feet by moving from team to team fairly freely. For example, it's pretty hard to keep developers on the DOM farm when the bright lights of LINQ are calling. More generally, people aren't exactly clamoring for the opportunity to dig into the nasty details of XML these days, and opportunities up the stack in WCF, WPF, "Astoria", etc. tend to attract the people who have served their time in the XML barns. Again, that's not necessarily a Good Thing for MS or the customers, but it does indicate where the interesting problems and lucrative opportunities are perceived to be these days ...

Taylor
2008-01-06 21:53:58
I think you downplayed the success of Microformats in 2007. I don't see RDFa making much headway. It's very easy to GRDDL or parse out the MF and convert to triples. The problem with RDFa (and rdf) is that no one is specifying in a clear way how to represent common data elements (like hCard, hCalendar, etc). I'm thinking the web will tend towards Atom and Microformats...that alone is enough to get semantics on the pages.
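The "parse out the MF and convert to triples" step really is mechanically simple, and can be sketched in a few lines of Python. This is an illustration, not any shipping GRDDL implementation: the hCard property names are real, but the blank-node subject and the flat triple shape are improvised for the example.

```python
from html.parser import HTMLParser

class MicroformatParser(HTMLParser):
    """Minimal hCard-ish extractor: collect text inside elements whose
    class matches a known microformat property, emit naive triples."""

    PROPS = {"fn", "org", "locality"}  # a few hCard property names

    def __init__(self):
        super().__init__()
        self.stack = []      # per open element: matched property or None
        self.triples = []

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "") or ""
        prop = next((c for c in classes.split() if c in self.PROPS), None)
        self.stack.append(prop)

    def handle_endtag(self, tag):
        if self.stack:
            self.stack.pop()

    def handle_data(self, data):
        # Only text directly inside a property-bearing element counts
        if self.stack and self.stack[-1] and data.strip():
            # Subject is a fixed blank node for the sketch; a real
            # GRDDL pass would mint proper URIs
            self.triples.append(("_:card", self.stack[-1], data.strip()))

html = ('<div class="vcard"><span class="fn">Kurt Cagle</span>, '
        '<span class="locality">Victoria</span></div>')
p = MicroformatParser()
p.feed(html)
print(p.triples)
```

Running this prints two triples for the embedded hCard; the point is that the extraction needs nothing more exotic than an HTML parser and a list of property names, which is exactly why microformats have such a low barrier to entry.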
Kurt Cagle
2008-01-07 00:32:59
Taylor,


"I think you downplayed the success of Microformats in 2007. I don't see RDFa making much headway. It's very easy to GRDDL or parse out the MF and convert to triples. The problem with RDFa (and rdf) is that no one is specifying in a clear way how to represent common data elements (like hCard, hCalendar, etc). I'm thinking the web will tend towards Atom and Microformats...that alone is enough to get semantics on the pages."


I agree with you, and looking at my comments, I'd say you're right about me downplaying microformats. My biggest problem with microformats as they exist right now is that they presume that you won't have namespace collisions and that you can get by with a single tagging mechanism for identifying a given taxonomy. Yet at this point I see both XBRL and HL7 talking about their own microformats, and before you know it you're in the same boat as mime-types, where you have hundreds of taxonomies in play and no clear way of differentiating commonly named ones except through a centralized authority (which just does not work well on the web, as has been proven again and again).


I like Mark Birbeck's proposal on CURIEs, which provide a way of binding namespaces into taxonomic terms. I personally feel a lot more comfortable being able to say:


<html xmlns:gml="http://www.opengis.net/gml">
...
<body>
<div class="gml:geo">
<span class="gml:latitude">45</span>,
<span class="gml:longitude">120</span>
</div>
...
</body>
</html>


than I do on hoping that no one else decided to collide with my "geo" term. Of course, this opens up a general rant that I have about a lot of web developers' dislike for namespaces. To me, namespaces are a necessary evil. Without namespaces, you can only have scope for one microformat active at any given time. With namespaces, you can have intertwined microformats that can still be easily parsed. With namespaces, you even have the potential of defining ad-hoc microformats dynamically, something that I think will be of enormous value.
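To make the collision point concrete, here is a minimal resolver sketch in Python. The prefix map is hand-coded for illustration (and the fragment-terminated URIs are my own convention); a real CURIE processor would harvest these bindings from the xmlns: declarations on the document element.

```python
# Hypothetical prefix bindings; a CURIE-aware processor would harvest
# these from xmlns: declarations rather than hard-coding them.
PREFIXES = {
    "gml": "http://www.opengis.net/gml#",
    "vcard": "http://www.w3.org/2006/vcard/ns#",
}

def expand_curie(term, prefixes=PREFIXES):
    """Expand 'gml:geo' to a full URI. Unprefixed terms come back
    unchanged - which is exactly where collisions live."""
    prefix, sep, local = term.partition(":")
    if sep and prefix in prefixes:
        return prefixes[prefix] + local
    return term

# Two vocabularies can now both use a "geo" term without ambiguity:
print(expand_curie("gml:geo"))    # expands against the gml binding
print(expand_curie("vcard:geo"))  # same local name, different URI
print(expand_curie("geo"))        # bare term stays bare, and ambiguous
```

The design choice being illustrated: the expanded URIs for gml:geo and vcard:geo differ even though the local name is identical, while the bare "geo" term has no way to say which taxonomy it belongs to - so two microformats using it collide silently.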


Whether RDFa will itself fly is debatable - I think that triple coercion is hard enough when done by people who understand RDF, and I suspect that most web devs don't. However, I remain adamant about CURIEs.