The Software World of 2010: It's about the Suite

by Rick Jelliffe

Nick Carr (the Australian XML éminence grise, not the US journalist) asked me whether the software world was fragmenting. Here's my answer, with some diagrams: software seems to be organizing itself into three layers (support, runner, plug-ins), where each of those layers is in turn organized the same way: support/runner/plug-ins.
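
As a rough sketch of that recursive shape (a minimal model; the names here are illustrative, not any real product's API):

    # Each layer of a system (support, runner, plug-ins) can itself be
    # decomposed the same way: a small recursive data model of that claim.
    from dataclasses import dataclass, field

    @dataclass
    class Layer:
        name: str
        support: list["Layer"] = field(default_factory=list)  # libraries, services
        runner: "Layer | None" = None                         # the engine in the middle
        plugins: list["Layer"] = field(default_factory=list)  # extensions on top

    # A suite, with its runner decomposing the same way one level down.
    suite = Layer(
        "office suite",
        support=[Layer("shared component libraries")],
        runner=Layer(
            "application shell",
            support=[Layer("rendering engine")],
            runner=Layer("event loop"),
            plugins=[Layer("macro engine")],
        ),
        plugins=[Layer("spellchecker"), Layer("XML import/export filter")],
    )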

5 Comments

len
2006-05-08 08:00:25
Yep.


If you are predicting futures based on this model, you look for the locales that control spending on the platforms; those are the spaces that transform the others.


A smart XML company looking for pure media plays looks for fat data-transform pipelines with a reasonably frequent update rate. Then you factor in the complexity of creating the updates.


Some media models don't rely on XML or transformational economics. The porn model worked for those who understood that it is an uncomplicated medium with a limited range of expression. So in terms of cheaper/faster/better, one can only do cheaper and faster. A good game costs what a major motion-picture release costs and becomes stale in a single release cycle. Second Life? That is a very different model, and interesting to compare to the others.


For a media company, the cost of creating and owning the content BY MEDIA AND FORMAT TYPE is the limit on profitability. That is why open format standards win in the long run and proprietary standards win in the short run.


To a media company, platforms are stuff you need to host content, but otherwise are just stuff. As a result, the period in which the technological aspects of the web fascinated the press and public is coming to a close.


len
2006-05-09 13:33:00
As ever, domain specifics will always lead the way... not VS200x but perhaps more like OpenLaszlo or even (god forbid) license-free Flex beta 3.

Kurt Cagle
2006-05-10 19:47:55
Very salient essay, and I largely agree with the decompositions you've laid out. I think there are some aspects, though, that are creating alternative models that haven't completely settled out yet. Len's comments on transform pipelines bring up the whole notion of bindings within bindings within bindings, where each binding layer effectively creates a map between an underlying data abstraction at one end and an application representation at the other.


Case in point: a set of business rules and ACLs binds against schemas and templates to generate a core model, mapping into components as an XHTML+XForms entity. This in turn reaches the client, which instantiates the mappings into free-standing models, which in turn drive XForms content. Add to that XBL bindings which turn custom intermediate markup into additional XForms content, pulling from the data model via a CSS binding layer, and tie that into posted XML that in turn gets passed into a second rules-processing and validation engine to generate a data object which pushes back into the data store.
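

Reduced to a schematic, each of those binding layers is just a map from a data abstraction to a representation; here is a toy Python sketch (the stage names are stand-ins of mine, not real XForms/XBL machinery):

    # Toy sketch: binding layers as composed functions. Real XForms/XBL/CSS
    # bindings are declarative; plain callables stand in for them here.

    def bind(*stages):
        """Compose binding layers: data abstraction in, representation out."""
        def bound(data):
            for stage in stages:
                data = stage(data)
            return data
        return bound

    # Outbound half of the chain described above (illustrative stage names):
    outbound = bind(
        lambda d: {"core_model": d},         # rules + ACLs against schemas/templates
        lambda d: {"xhtml_xforms": d},       # core model into an XHTML+XForms entity
        lambda d: {"client_models": d},      # client instantiates free-standing models
        lambda d: {"rendered_xforms": d},    # XBL + CSS bindings render the controls
    )

    # Return trip: posted XML through a second rules/validation engine.
    inbound = bind(
        lambda d: {"validated": d},          # rules processing and validation
        lambda d: {"data_object": d},        # data object pushed back to the store
    )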


I see these playing out right now primarily in the application space, but it's easy to see them pushing further and further down the stack over time. They are (mostly) platform agnostic, they increasingly underlie low-level services (such as file metadata and access), and they do not really differentiate strongly between local and remote resources.


I suspect that much of the fragmentation that is becoming apparent is thus also due to this recalibration around an XML paradigm.


-- Kurt

Rick Jelliffe
2006-05-10 21:34:58
For different architectures (SOA, etc.) I would model these as different arrangements of the 3-layer systems.


For pipelines, these are runners (the middle layer) and the functions are plug-ins. So there could be pipelines at the Operating System level, at the Platform level, and at the Suite level. And because the plug-ins (and the others) can be modeled (to varying degrees of acceptability) with the same 3-layer model, you could have pipelines inside plug-ins, and so on, too.
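

A toy sketch of that nesting (illustrative only, no real API): a pipeline is a runner whose stages are plug-ins, and since a stage only needs to be callable, a pipeline can itself serve as a plug-in inside another pipeline.

    # A pipeline as a runner (middle layer) whose stages are plug-ins.
    class Pipeline:
        def __init__(self, *plugins):
            self.plugins = list(plugins)    # the plug-in layer

        def __call__(self, doc):            # the runner threads doc through the stages
            for plugin in self.plugins:
                doc = plugin(doc)
            return doc

    inner = Pipeline(str.strip, str.lower)          # a pipeline...
    outer = Pipeline(inner, lambda s: s.split())    # ...used as a plug-in
    print(outer("  Hello Pipelines  "))             # ['hello', 'pipelines']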


What about the brave new world of Web Operating Systems? Remotely hosted disk space fits into the model as a (virtual) device driver, and I bet at the other end there is enough complexity to model with this kind of three-layer model too. It would be interesting to see where, or whether, Google's MapReduce libraries fit in: suite, platform or OS.
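

For the record, the MapReduce shape itself is small enough to sketch; this toy word count shows only the idea, not anything of Google's actual library:

    # Toy word count in the MapReduce shape: map emits (key, value) pairs,
    # the framework groups them by key, and reduce folds each group.
    from collections import defaultdict

    def map_stage(doc):
        for word in doc.split():
            yield (word, 1)

    def reduce_stage(word, counts):
        return (word, sum(counts))

    docs = ["the suite", "the platform", "the os"]
    groups = defaultdict(list)
    for doc in docs:
        for key, value in map_stage(doc):
            groups[key].append(value)

    print(dict(reduce_stage(k, v) for k, v in groups.items()))
    # {'the': 3, 'suite': 1, 'platform': 1, 'os': 1}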


I suppose my point is not that everything can be squeezed into this simplistic model (things have libraries, things have plug-ins, things have the bits that tie them together, big deal, ne?), but more that over time systems mature/evolve/grow into explicitly following this arrangement. Whether this means that systems should start off being designed using this pattern or not, I wouldn't like to say; however, presumably there is a sweet spot somewhere between premature frameworkitis and inadequate top-level design.

Rick Jelliffe
2008-04-07 00:20:49
[UPDATE] On the move to make "Kernel and User Services" a distinct layer: it seems that MS is indeed moving to a more modular construction, making a virtue of the modularity that virtualization requires.

See MinWin at
http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9043359