The National Defense Industrial Association’s Integrated Program Management Division (NDIA IPMD) just had its quarterly meeting here in sunny Orlando where we braved the depths of sub-60 degrees F temperatures to start out each day.
For those not in the know, these meetings are an essential coming together of policy makers, subject matter experts, and private industry practitioners regarding the practical and mundane state-of-the-practice in complex project management, particularly focused on the concerns of the federal government and the Department of Defense. The end result of these meetings is to publish white papers and recommendations regarding practice to support continuous process improvement and the practical application of project management practices–allowing for a cross-pollination of commercial and government lessons learned. This is also the intersection where innovation among the large and small is given an equal vetting and an opportunity to introduce new concepts and solutions. This is an idealized description, of course, and most of the petty personality conflicts, competition, and self-interest that plague any group of individuals coming together under a common set of interests also play out here. But the days are long, and the workshops generally produce good products that become the de facto standard of practice in the industry. Furthermore, what keeps the more ruthless personalities in check is the fact that, while it is a large market, the complex project management community tends to be a relatively small one, which reinforces professionalism.
The “blues” in this case is not so much born of frustration or disappointment as of the long and intense days that the sessions entail. The biggest news from an IT project management and application perspective was twofold: the data stream that industry uses to share information in an open systems manner will be simplified, and the technology used to communicate that data will move from XML to JSON.
Human-readable formatting to data-focused formatting. Under Kendall’s Better Buying Power 3.0, the goal of the Department of Defense (DoD) has been to incorporate better practices from private industry where they can be applied. I don’t see initiatives for greater efficiency and reduction of duplication going away in the new Administration, regardless of what a new initiative is called.
In case this is news to you, the federal government buys a lot of materials and end items–billions of dollars’ worth. Accountability must be put in place to ensure that the money is properly spent to acquire the things being purchased. Where technology is pushed and where there are no commercial equivalents that can be bought off the shelf, as in the systems purchased by the Department of Defense, there are measures of progress and performance (given that the contract is under a specification) that are submitted to the oversight agency in DoD. This is a lot of data, and, to be brutally frank, the method and format of delivery have been somewhat chaotic, inefficient, and duplicative. The Department moved to address this with a fairly modest requirement: open systems submission of an application-neutral XML file under the standards established by the UN/CEFACT XML organization. This was called the Integrated Program Management Report (IPMR). The move garnered some improvement where it has been applied, but contracts are long-term, so incorporating improvements through new contractual requirements tends to take time. Plus, there is always resistance to change. The Department is now moving to accelerate the elimination of these inefficiencies in its data streams by dropping the unnecessary overhead associated with formatting data for paper forms and dealing with data as, well, data. Great idea and bravo! The rub here is that, in making the change, the Department has proposed dropping XML as the technology used to transfer data and moving to JSON.
XML to JSON. Before I spark another techie argument about the relative merits of each, there are some basics to understand here. First, XML is a language; JSON is simply a data exchange format. This means that XML is specifically designed to deal with hierarchical and structured data that can be queried, and where validation and fidelity checks within the data are inherent in the technology. XML is also known to scale while maintaining the integrity of data intended for use in relational databases. Finally, XML is hard to break: it is meant to be edited and will maintain its structure and integrity afterward.
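To make that validation point concrete, here is a minimal sketch in Python using the lxml library (my choice of tooling, not anything mandated by the Department). The record and schema are invented for illustration; they are not the actual IPMR or UN/CEFACT schema. The point is that the bad value is rejected by the schema itself, with no custom checking code.

```python
# A minimal sketch of XML's built-in validation, using the lxml library.
# The schema and record are invented for illustration only.
from lxml import etree

XSD = b"""<?xml version="1.0"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="CostRecord">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="WBSElement" type="xs:string"/>
        <xs:element name="BCWS" type="xs:decimal"/>
        <xs:element name="BCWP" type="xs:decimal"/>
        <xs:element name="ACWP" type="xs:decimal"/>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>"""

RECORD = b"""<CostRecord>
  <WBSElement>1.2.3 Airframe</WBSElement>
  <BCWS>1500.00</BCWS>
  <BCWP>1450.00</BCWP>
  <ACWP>not-a-number</ACWP>
</CostRecord>"""

schema = etree.XMLSchema(etree.fromstring(XSD))
doc = etree.fromstring(RECORD)

# The bad ACWP value is caught by the schema itself, no custom code required.
print(schema.validate(doc))          # False
print(schema.error_log.last_error)   # describes the type violation
```

That enforcement travels with the format; the receiving system does not have to reinvent it for every supplier.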
The counterargument one encounters is that JSON is new! and uses fewer characters! (which usually turns out to be inconsequential), and people are talking about it for Big Data and NoSQL! (though this came after the fact, and the reason for shoehorning it this way is discussed below).
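On the character-count point, a toy comparison makes it concrete. The record below is invented, but it is the same shape as above; the savings are real but modest, and ordinary compression in transit narrows the gap further.

```python
# A toy size comparison of the same invented cost record in XML and JSON.
import json

xml_record = (
    "<CostRecord>"
    "<WBSElement>1.2.3 Airframe</WBSElement>"
    "<BCWS>1500.00</BCWS><BCWP>1450.00</BCWP><ACWP>1600.00</ACWP>"
    "</CostRecord>"
)

json_record = json.dumps({
    "WBSElement": "1.2.3 Airframe",
    "BCWS": 1500.00,
    "BCWP": 1450.00,
    "ACWP": 1600.00,
})

# Prints the two lengths: roughly 124 vs 80 characters for this toy record.
print(len(xml_record), len(json_record))
```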
So does it matter? Yes and no. As a supplier specializing in delivering solutions that normalize and rationalize data across proprietary file structures and leverage database capabilities, I don’t care. I can adapt quickly and will have a proof-of-concept solution out within 30 days of receiving the schema.
The risk here, which applies to both DoD and industry, is that the decision to go to JSON is being made only because it is the shiny new thing used by gamers and social networking developers. There has also been a push to adapt it to other uses because of the history of significant security risks found in Java, so much so that an entire Wikipedia page is devoted to them. Oracle just killed off Java applets, though Java itself hangs on. JSON, of course, isn’t Java, but it was designed from birth as JavaScript Object Notation (hence the acronym), with the purpose of handling relatively small bits of data across web servers in a number of proprietary settings.
To address JSON’s deficiencies relative to XML, a number of tools have been and are being developed to replicate the fidelity and reliability found in XML. Whether this is sufficient to be effective against a structured LANGUAGE remains to be seen. Much of the overhead that techies complain about in XML comes from the native functionality that gives it its power. No doubt a bicycle is simpler than a Formula One racer–and that is an apt comparison. Claiming “simpler” doesn’t pass the “So What?” test once you know the business processes involved. The technology needs to fit the solution. The purpose of data transmission using APIs is not only to make the data easy to produce but also to–you know–achieve the goals of normalization and rationalization so that it can be used on the receiving end, which is where the consumer (whom we usually consider to be the customer) sits.
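One of the tools being leaned on to close that gap is JSON Schema. Here is a hedged sketch using the third-party Python jsonschema package and the same invented record shape as above; it recovers type and structure checks, but only if a strong schema is actually published and enforced.

```python
# A sketch of recovering some of XML Schema's fidelity checks on the JSON side,
# using the jsonschema package and the same invented record shape.
from jsonschema import validate, ValidationError

cost_record_schema = {
    "type": "object",
    "properties": {
        "WBSElement": {"type": "string"},
        "BCWS": {"type": "number"},
        "BCWP": {"type": "number"},
        "ACWP": {"type": "number"},
    },
    "required": ["WBSElement", "BCWS", "BCWP", "ACWP"],
    "additionalProperties": False,
}

record = {
    "WBSElement": "1.2.3 Airframe",
    "BCWS": 1500.00,
    "BCWP": 1450.00,
    "ACWP": "not-a-number",   # wrong type, should be rejected
}

try:
    validate(instance=record, schema=cost_record_schema)
except ValidationError as err:
    print("rejected:", err.message)
```

Note that nothing in the JSON format itself requires this step; it is a discipline that the schema owner and the suppliers have to agree to.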
At the end of the day, the ability to scale and to handle hierarchical, structured data will rest on the quality and strength of the schema and on the tools published to enforce its fidelity and compliance. Otherwise, consuming organizations will be receiving a dozen different proprietary JSON files, and that does not address the present chaos; it simply adds to it. These issues were aired out during the meeting, and it seems that everyone is aware of the risks and believes they can be addressed. Furthermore, as the schema is socialized across solution providers, it will become apparent early on whether the technology can handle the project performance data resulting from the development of a high-performance aircraft or a U.S. Navy destroyer.
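For what it is worth, here is a rough sketch of what that downside looks like if the schema discipline does not materialize: the consuming organization ends up writing mapping code for every proprietary layout it receives. All of the layouts and field names below are hypothetical.

```python
# A sketch of normalization on the consuming side: two invented "proprietary"
# JSON layouts mapped into one common record so downstream tools see one shape.

def normalize(raw: dict) -> dict:
    """Map either hypothetical vendor layout to a single normalized record."""
    if "costData" in raw:                      # hypothetical vendor A layout
        cost = raw["costData"]
        return {
            "wbs": cost["wbsId"],
            "bcws": float(cost["planned"]),
            "bcwp": float(cost["earned"]),
            "acwp": float(cost["actual"]),
        }
    if "evm" in raw:                           # hypothetical vendor B layout
        evm = raw["evm"]
        return {
            "wbs": evm["element"],
            "bcws": float(evm["BCWS"]),
            "bcwp": float(evm["BCWP"]),
            "acwp": float(evm["ACWP"]),
        }
    raise ValueError("unrecognized layout")

vendor_a = {"costData": {"wbsId": "1.2.3", "planned": 1500, "earned": 1450, "actual": 1600}}
vendor_b = {"evm": {"element": "1.2.3", "BCWS": 1500, "BCWP": 1450, "ACWP": 1600}}

# Both layouts collapse to the same normalized shape downstream.
print(normalize(vendor_a) == normalize(vendor_b))  # True
```

That mapping burden is exactly the chaos a strong published schema is meant to prevent.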