XML and Government Schizophrenia

by Michael C. Daconta

The U.S. Government is very leery of technology "fads" and that is why it often has a love/hate relationship with XML. You must remember that at any time, for EVERY technology that exists - the government has a huge legacy investment. So, while the corporate world may turn on a dime and quickly adopt the latest and greatest thing ... the government MUST contend with HUGE legacy issues, a 2-year (minimum) budget planning cycle, and a horde of technologists actively engaged and personally invested in that legacy technology that you want to throw away!

So, in this blog, I hope to engage with you in many discussions about how the government is using XML technology, how they should use XML technology and everything in between!

So, bring me your Government XML stories and I will bring you mine ... today, let me briefly discuss a program that I initiated while working for the Department of Homeland Security (DHS). The National Information Exchange Model (NIEM) started as a joint venture between DHS and the Department of Justice (DOJ) to harmonize and speed up information sharing between the federal government and state, local, and tribal governments. There is a website where you can learn about and experiment with this XML data model.

The basic idea is that NIEM combines a registry of standard data objects (modeled via XML Schema), a process for quickly producing an exchange message, a governance process for the model, and robust tool support. The model leveraged and extended an existing model, the Global Justice XML Data Model (GJXDM), which is widely used by law enforcement at all levels of government and is now also being widely used at DHS. It has multiple success stories behind it, including the Amber Alert and the national sex offender registry. I highly encourage everyone to look at it and help make it better.

So, what does this mean for Government Schizophrenia? For information sharing, XML is a favorite but is attacked continuously for weak data modeling support, weak encoding of binary objects, performance issues, and more. Remember, the roar of legacy systems has a long tail ...
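To give a rough feel for the pattern described above, here is a sketch of how an exchange schema might reuse entities from a shared "core" model. All namespaces, file names, and element names below are illustrative placeholders, not actual NIEM components:

```xml
<?xml version="1.0"?>
<!-- Hypothetical sketch of the NIEM pattern: a "core" schema of reusable
     entities is imported by each exchange schema, which assembles just the
     elements its message needs. Names here are illustrative only. -->
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns:core="http://example.org/core"
            targetNamespace="http://example.org/exchange/amber-alert"
            elementFormDefault="qualified">

  <!-- Pull in the shared entity definitions (Person, Vehicle, ...) -->
  <xsd:import namespace="http://example.org/core"
              schemaLocation="core.xsd"/>

  <!-- The exchange message references core entities rather than redefining them -->
  <xsd:element name="AmberAlertMessage">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element ref="core:Person"/>
        <xsd:element ref="core:Vehicle" minOccurs="0"/>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>

</xsd:schema>
```

The point of the registry is exactly this reuse: two different exchanges that both carry a Person carry the *same* Person, which is what makes cross-agency sharing tractable.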

Until next time, see you in the XML trenches... :)


Len Bullard
2008-04-05 14:28:02
CAP works. We may want to reexamine its points of extensibility.

GJXDM is overbuilt for the job at hand. We need lighter specifications. We need to look closely and perhaps a little cynically at the level and depth of integration really required for Homeland Security systems. A thinner horizontal communication layer over vertical deep enterprise layers is our best bet for the next stages of integrated public safety. We need to look critically at the methodologies used to derive the XML schemas and ask where we can apply lighter approaches.

Michael Daconta
2008-04-05 15:52:29
Hi Len,

CAP (the common alerting protocol) is a good standard and certainly has its uses; however, I think you are making an apples to oranges comparison.

The goals of CAP are different from the goals of NIEM or GJXDM.
One is a specific message for a specific purpose and the other is a reusable set of entities and a process to build messages. In other words, one is a particular car and the other is a car factory.
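To make the contrast concrete, here is a minimal CAP-style alert. The element names follow the OASIS CAP 1.1 vocabulary, but this fragment is trimmed and illustrative rather than a guaranteed-valid CAP document; a NIEM or GJXDM exchange, by contrast, would be assembled from a shared model's reusable components rather than defined as one fixed message:

```xml
<alert xmlns="urn:oasis:names:tc:emergency:cap:1.1">
  <identifier>example-2008-001</identifier>
  <sender>dispatch@example.gov</sender>
  <sent>2008-04-05T15:00:00-05:00</sent>
  <status>Exercise</status>
  <msgType>Alert</msgType>
  <scope>Public</scope>
  <info>
    <category>Security</category>
    <event>Child Abduction</event>
    <urgency>Immediate</urgency>
    <severity>Severe</severity>
    <certainty>Observed</certainty>
  </info>
</alert>
```

The whole message is fixed in advance: the "car". A model-driven exchange is generated from the registry for each new sharing need: the "car factory".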

As to lightweight approaches, I think you have a good point there ... I certainly always feel that Einstein's quote on simplicity is important to keep in mind. Thus, there is a certain level of necessary complexity for the task at hand, but you want to beware of crossing the line into unnecessary complexity. From your characterization, I would safely assume you think that GJXDM has crossed that line. I think the tool support in GJXDM has worked to ease the implementation difficulties while not compromising the goals of the project.

Not a perfect effort but an improving one ...

Best wishes, - Mike

Len Bullard
2008-04-05 17:21:30
Hi Mike:

I didn't think I would be doing this work again but oh well... here we are on Saturday night at the office. ;-)

I mention CAP because it is an example of building a message standard rather than deriving it from an uber-schema. CAP mapped easily to the database, then the RAD tool made a good first pass. We don't like to duplicate entities, and that is what occurs naturally in bottom-up integration, but top-down integration at scale introduces complexities at all levels, and they are expensive. A factory approach has its uses, but we don't all drive the same car from the same car manufacturer.

A lumpy information system isn't all bad either.

It may be we have overemphasized messaging as a means of creating a level playing field for the vendors. While it certainly is the right approach where there are multiple vendors for different pieces of the same system, we have to ask at what enterprise levels we are using that versus straightforward database-level integration: shared state server systems with distributed pages and role-based security doing the work of segregating processes and data sharing. In that scenario, XML gets much less attention.

Abstractions aside, the problem we may be facing is logistical, Mike. As the political climate changes, the emphasis and the funding shifts to getting more public safety infrastructure up and running faster and cheaper. A perfect solution takes a long time and a lot of funding. We failed at that in CALS. Prioritization is called for because economically, we have some big challenges ahead.

The simple case of figuring out how a health incident, a law enforcement incident, a fire incident, an emergency responder (say, an ambulance) and a hazmat incident are all related is enough to make us reach for the ibuprofen. Then we have the case management systems, which are horrifically variant by type.

So again, we may want to shift the emphasis to the thinner communication layer with lighter messaging for command and coordination. The deeper layers favored by deep analysis can wait until we get the ports and the borders secured for command and control. If we get the IEPDs for the most useful message sets in place, we get more done faster.

Thanks for the blog, Mike. I look forward to this conversation.



Michael Daconta
2008-04-06 04:47:50
Hi Len,

You bring up some good points ... I actually would love to ferret out the differences and commonality between the various incident types. The semantic differences are important to trigger different responses. While we may both reach for the ibuprofen - I would do so with a smile on my face. :)

You know that I have always pushed for greater levels of semantic fidelity ... your point on budgets and funding shifts though is a strong counterpoint.

I do agree that finding lighter and simpler ways is well worth the effort - as simple as possible, but no simpler. I am also looking forward to our continuing dialogue...


- Mike

Samuel B. Quiring
2008-04-07 15:23:51
One area where XML is being used successfully in the government is Grants.gov. With a published XML schema, commercial entities like Cayuse and institutional partners like universities can communicate electronically with the government on funding opportunities.


Michael Daconta
2008-04-09 03:24:29
Thanks Sam ... I'll check out grants.gov! I have heard good things about it but had not dug into the details... - Mike


2008-04-09 16:12:40
Please, please, please stop using the term 'schizophrenia' to refer to (a) split-personality (to which it is unrelated), and more generally (b) something of which you disapprove.

Schizophrenia is a bloody terrible disease, afflicting millions of people. Your misuse of a clinical label describing their condition as a term of abuse is (I have no doubt unintended) just another kick in the teeth.

Michael Daconta
2008-04-10 14:33:40
Hi CB,

I apologize. I did not mean to offend in the use of the term. I will be more careful in the future with such analogies.


- Mike

2008-04-10 15:30:02
Michael -- of course, I realise you didn't intend anything offensive. This use of 'schizophrenic' is in the culture at large, but we wouldn't accept it with other terms for disadvantage, so I think it's worth pointing out in the hope that it drops away. Anyway, I'll let you get back on-topic now!


David Webber
2008-05-01 11:30:35

Job done! I will be releasing open source tools that address exactly this next week. We're just doing final testing and refining right now. We've improved on the wantlist concept that NIEM uses - and bridged the gap down to the actual implementation XML, documenting the IEPD using a killer package of XSLT combined with some OASIS standards work on templates. We're generating test case examples as well.

I think you will find this is the last mile piece you were looking for.

Now if I can just get the NIEM upgraded to an ebXML Registry with a REST interface...

Cheers, DW

David Webber
2008-05-01 11:35:12

Not sure I'd hold out the Grants.gov XSD as a shining example any time soon. I worked on that project for two years, and those schemas are a bear. It definitely fits Len's request for thinner ways to do this. And even though folks have eventually made it work, that was not without tangible cost in pain, time, and effort.

There are definitely lighter ways to get at this - and I'll definitely look at applying lessons learned here on the NIEM side to the GGov XSD at some point...

Dan McCreary
2008-05-12 18:57:09
Hi Michael,

I have successfully used both GJXDM and NIEM as a foundation to build K-12 education and property tax metadata registries. NIEM is a wonderful foundation that all state and federal agencies should build upon. The subschema generation tools are a huge time saver, and blending NIEM with local registries needs to be a top issue for future developers! I am looking forward to your insights in the column and hope to share my stories if you are interested.

- Dan McCreary