Merger-mania

by Kurt Cagle

Merger-mania is in full swing of late, which is rather astonishing given the current credit market problems. Oracle, after months of trying, finally managed to snare business services provider BEA, and Sun's purchase of MySQL was both hailed as a masterstroke and derided as a last-gasp hope by a fading giant to bolster its database claims. Today the announcement hit the fan that Microsoft had made a $45 billion bid for Yahoo, something that's made Wall Street happy but that has people in Silicon Valley scratching their heads.

I am not a CEO, nor am I a big investor ... and I'm generally not a big fan of mergers and acquisitions (M&As), because, especially when the mergers involve two reasonably large, well-established companies, the results in terms of performance seldom justify the costs involved. This is especially true of tech sector companies, where so many of the assets of the companies are not tied up in physical capital but rather in abstract ideas carried around in smart people's heads.

The overtures that Microsoft has been making to Yahoo have been obvious for some time. The integration of Yahoo's and Microsoft's IM formats, for instance, hinted that the two were playing footsie under the table, and certainly the announcements that Yahoo would be utilizing Microsoft technology (and Microsoft's subsequent PR blitz to that effect) at least indicated that the relationship was serious. Thus, the signs have been around for a while, and both the tech and Wall Street press have generally been playing the role of matchmakers. Yet there are more than a few signs that this particular marriage, if consummated, may end up in divorce court nonetheless.

3 Comments

len
2008-02-05 14:17:23
Because of mergers and other forces, we should all be rethinking the costs of launching or selling applications based on third-party services. This should become more evident in the GIS markets, where the GIS is almost always going to be a base layer with the value derived in the feature layers. Keeping all of that up to date with high precision isn't cheap. Assuming a free service will remain free without taking a hard look at the terms of service is a pretty obvious mistake. Failing to obtain indemnity for events such as mergers is an equally obvious mistake.


Yet people continue to blithely assume these services that start as freebies will continue to be.


A suggestion for an article: one that illustrates the problems of creating a business based on different offerings from the GIS web services. Typically, a GIS application is only one module in such a business, but it should be a core application that is customized for each market segment. Say you are brought in as a consultant for a customer of such vendors. What would you tell them to be aware of when procuring the product?

Kurt Cagle
2008-02-05 23:51:50
Len,


I absolutely agree with you here, and I think you've hit the nub of why the SOA model is going to require a considerable balancing act to make it viable. GIS is in fact a remarkably good model for exploring this, as it so clearly articulates the various facets of SOA, but I think it can also serve as a proxy for nearly all other vertical web services.


As I see it, GIS applications fall into one of three types:


1) Entity transients - keeping track of changed towns, streets, the location of various resources, boundaries and so forth. Obviously, while this information is vital to GIS, much if not most of it is basically a by-product of local, regional or national cartographic efforts, and as such it forms the substrate on which all other GIS applications are ultimately based. Once you standardize on an internal set of geoinformatics descriptive languages (shapefiles in the pre-XML days, GML in its more recent incarnations), much of this information falls within the direct province of government-generated data.


2) Geo-informatics - this is the process of mapping scalar or vector fields onto the entity transients, and this is the province that many GIS vendors have targeted as the most profitable layer. Typically the role of the geoinformatics vendor is to act as the conduit for aggregating this data, which falls more properly into the realm of data research, marketing or polling, and as such represents a value-added component that can be paid for. Note that the software for geoinformatics generally will not be the value-add component - if I have a database of household incomes by zipcode, mapping that content onto layer #1 is trivial (a rough sketch of that join follows this list), but gathering that information is not. Geoinformatics is also critical because such information has a distinct temporal component - a decay rate for the data based upon freshness.


3) Geo-modeling and analysis - informatics involves the accrual, aggregation and convolving of data-sets; geo-modeling, on the other hand, involves making and testing assertions on that data and using it to predict future behavior. Realistically, these are the consulting services that you allude to in your post - the ability to take a set of maps over time and turn that into a predictive model is something that generally requires both systems theorists and domain experts, and as such the fees can be fairly high. It is possible to build AI systems that perform at least some of this analysis (I'm aware of a number of petroleum modeling systems that do precisely that), and in that case the development of the AI system represents the primary "tech" component, but otherwise this is less a programming problem and more of an analyst's problem.
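To make #2 concrete, here is a minimal sketch of the kind of join I mean - plain Python, with made-up zipcode boundaries and income figures, so every identifier and number below is hypothetical. The point is simply that a scalar field keyed by zipcode drops onto the base cartographic layer almost for free once both share a key:

```python
# Hypothetical sketch: joining a scalar field (median household income)
# onto base-layer boundary features keyed by zipcode.
# All of the data below is invented for illustration.

# Layer #1: entity transients - boundary geometry per zipcode,
# typically a by-product of government cartographic data (GML, shapefiles).
base_layer = [
    {"zipcode": "98101", "boundary": [(0, 0), (0, 1), (1, 1), (1, 0)]},
    {"zipcode": "98102", "boundary": [(1, 0), (1, 1), (2, 1), (2, 0)]},
]

# Layer #2 input: the expensive-to-gather part - household income by zipcode,
# timestamped because the data decays as it ages.
income_by_zip = {
    "98101": {"median_income": 61000, "as_of": "2007-12"},
    "98102": {"median_income": 54000, "as_of": "2007-12"},
}

def build_informatics_layer(base, field):
    """Join a keyed scalar field onto the base layer; the join itself is trivial."""
    layer = []
    for feature in base:
        attrs = field.get(feature["zipcode"])
        if attrs is None:
            continue  # no data for this zipcode; skip rather than guess
        layer.append({**feature, **attrs})
    return layer

if __name__ == "__main__":
    for feature in build_informatics_layer(base_layer, income_by_zip):
        print(feature["zipcode"], feature["median_income"], feature["as_of"])
```

The join is a handful of lines; the cost sits entirely in gathering and refreshing income_by_zip, which is exactly where the value-added component lives.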


I see this breakdown occurring not just within the GIS sector but pretty much anywhere that you end up with a service-oriented architecture (whether SOAP or REST based is largely immaterial here, though the data model processes obviously will be shaped by the nature of those services). What this points to, though, is that the monetization aspect of SOA generally lies not in the parts that can be automated but specifically in the parts that can't. Right now, distributed GIS is still relatively young, so a great deal of work is still being focused on building the automation layer for #1 (such as automating construction permit processing so that the information within those permits can feed into lot dimensions, zoning utilization and the like). However, my suspicion is that this is itself a very temporary state that will play itself out over the course of the next decade. The real value will be in #2 and (especially) #3, but in both cases the IT sector is subordinated to the analysis aspect.
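For what it's worth, here is an equally hypothetical sketch of that automation layer for #1: a construction permit record folded back into the base parcel layer. The permit fields and parcel attributes are invented purely for illustration, not taken from any real permitting system.

```python
# Hypothetical sketch of the "automation layer for #1": turning a construction
# permit record into updates on the base parcel layer. Field names and values
# are invented for illustration.

parcels = {
    "LOT-0042": {"zoning": "R-1", "footprint_sqft": 0, "last_permit": None},
}

def apply_permit(parcels, permit):
    """Fold the information carried by a permit back into its parcel record."""
    parcel = parcels.get(permit["parcel_id"])
    if parcel is None:
        raise KeyError(f"unknown parcel {permit['parcel_id']}")
    parcel["footprint_sqft"] = permit["proposed_footprint_sqft"]
    parcel["zoning"] = permit.get("rezoned_to", parcel["zoning"])
    parcel["last_permit"] = permit["permit_id"]
    return parcel

if __name__ == "__main__":
    permit = {
        "permit_id": "P-2008-0117",
        "parcel_id": "LOT-0042",
        "proposed_footprint_sqft": 2400,
        "rezoned_to": "R-2",
    }
    print(apply_permit(parcels, permit))
```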


-- Kurt

len
2008-02-06 07:53:14
GIS as a service will have different QoS requirements depending on the application. This isn't complicated. Some critical requirements are dimensions supported, rate of change, reliability of access, and historical series. Analytics (e.g., epidemiological analysis) rely on historical series, so the rate of change of information is important but not a deal-breaker, depending on the analysis. Also, a 3D system isn't as necessary. An EOC (Emergency Ops Center) application needs very high reliability of access and rates of change, but the historical series isn't as important. The 3D system can be necessary if the function is also command and control, where positions of assets in 3D space and in time are critical.
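Put as a rough requirements profile (the field names and numbers here are invented just to illustrate the dimensions, not drawn from any real procurement), the two cases might look something like this:

```python
# Hypothetical sketch of the QoS dimensions above as a requirements profile.
# The field names and the two example profiles are invented for illustration.
from dataclasses import dataclass

@dataclass
class GisQosProfile:
    dimensions: int                # 2 or 3 spatial dimensions required
    max_update_lag: str            # how stale the data may be before it's unusable
    availability: float            # required reliability of access (fraction of time)
    needs_historical_series: bool  # whether time series matter to the application

# Epidemiological analysis: history matters; 3D and freshness matter less.
epidemiology = GisQosProfile(
    dimensions=2,
    max_update_lag="weeks",
    availability=0.99,
    needs_historical_series=True,
)

# Emergency Ops Center / command and control: freshness and uptime dominate,
# 3D asset positions may be required, history is secondary.
eoc = GisQosProfile(
    dimensions=3,
    max_update_lag="seconds",
    availability=0.9999,
    needs_historical_series=False,
)

if __name__ == "__main__":
    for name, profile in [("epidemiology", epidemiology), ("EOC", eoc)]:
        print(name, profile)
```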


Sources vary. Some prefer GDT or TIGER data and, as you say, government systems can provide these. However, coverage isn't always complete or timely. Google provides excellent satellite coverage, but availability depends on the terms of service. The same is roughly true of Microsoft. In both of those, 3D coverage is possible, but someone has to spend a lot of time making those models and dropping them on the shared maps. Even then, these tend to be 3D baubles without the kinds of detail required. Any server-farm vendor presuming to play in these markets clearly has to pay more attention to standards-based integration of data from multiple sources. So far, I don't see Google or Microsoft playing well or understanding the responsibilities they would assume if they did.


SOA 'appropriateness' - whether a given class of service actually has the requisite data quality - has to be considered. Right now, the source has the power and the mashup makes do with whatever is provided. That can't be the case for serious applications such as Homeland Security (not just terrorism, but public safety, health and welfare). I've seen some loudly touted but considerably underpowered attempts. I believe that after the next election in the US, there will be a renewed focus on acquiring 'appropriate' infrastructure. Current models from DHS and Justice include SOA concepts. What I am not yet seeing is attention to the workability or appropriateness of the services being offered.


Oddly or not so oddly, not nearly enough attention is paid in these markets to keeping the systems 'appropriate' yet 'simple'. We can toss a lot of tech at them, but making sure it is useful remains a design challenge.