Probably the biggest news out of the NDIA IPMD meeting this past week was the unofficial announcement by Frank Kendall, the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)), that the threshold for mandatory detailed surveillance of programs would be raised from the present $20M to $100M. Earned value management implementation and reporting will still be required on programs based on dollar value, risk, and other key factors, especially the $20M threshold for R&D-type projects, but the higher threshold for mandatory surveillance reviews was seen as good news all around for reducing some of the regulatory burden. The big proviso in this announcement, however, is that it goes into effect later this summer and that, if the data in reporting submissions show inconsistencies and other anomalies that call into question the validity of performance management data, then all bets are off and the surveillance regime is once again imposed, though by exception.
The Department of Defense, especially under the leadership of SecDef Ashton Carter and Mr. Kendall, has been looking for ways to provide more flexibility in acquisition so that new technology can be more easily leveraged into long-term, complex projects. This effort is known as the Better Buying Power 3.0 Initiative. It is true that surveillance and oversight can be restrictive to the point of inhibiting industry from concentrating on the business of handling risk in project management, diverting resources to procedural and regulatory issues that do not directly affect whether the project will achieve its goals within a reasonable range of cost and schedule targets. Furthermore, the enforcement of surveillance has often been inconsistent and, in the worst cases, contrary to the government's own guidance, owing to uneven expertise and training. The change maintains a rigorous regulatory environment for the most expensive and highest-risk projects while reducing unnecessary overhead and allowing more process flexibility for those below the threshold, provided that industry's best practices are effective in exercising project control.
So the question that lay beneath the discussion of the new policy coming out of the meeting was: why now? The answer is that technology has reached the point where DoD and other large organizations can effectively use Big Data to detect patterns that suggest systemic issues, and that capability has changed both the regulatory and procedural landscape.
For many years as a techie I have heard the mantra that software is a nice reporting and analysis tool (usually looking in the rear-view mirror), but that only good systems and procedures will ensure a credible and valid system. This mantra has persisted despite the fact that projects have failed at the usual rate while possessing all of the expected artifacts that define an acceptable project management system. Project organizations' systems descriptions have been found acceptable; work authorization, change control, control account plans, performance measurement baselines (PMBs), and integrated master schedules (IMSs) have all passed muster; and yet projects still fail, oftentimes with little advance warning of the fatal event or series of events. More galling, the same consultants and EVM “experts” can be found across organizations without changing the arithmetic of project failure.
It is true that there are specific causes for this failure: the inability of project leadership to note changes in framing assumptions, the inability of our systems and procedures to incorporate technical performance into overall indicators of project performance, and the inability of organizations to implement and enforce their own policies. But in the last case it is not clear that the failure to follow controls had any direct impact on the final result; such lapses contributed to failure but were not its main cause. It is also true that successful projects have experienced many of the same discrepancies in their systems and procedures. This is a good indication that something else is afoot: that there are factors not being registered when we assess project performance, and that we have an issue in defining “done”.
The time has come for systems and procedural assessment to step aside as the main focus of compliance and oversight. It is not that systems and procedures are unimportant. It is that data-driven assessment, and only data-driven assessment, is powerful enough to quickly and effectively identify issues within projects that would otherwise go unreported. For example, if we pull detailed data from the performance management systems that track project elements of cost, the rollup should, in theory, match the summarized data at the reporting level. But this is not always the case.
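As a minimal sketch of what such a check might look like, the following assumes the detail and summary extracts are available as flat files with hypothetical column names (wbs, actual_cost, reported_cost); any real implementation would map to the organization's own data model:

```python
import pandas as pd

# Hypothetical extracts: detail-level costs by reporting element (wbs) and
# the summarized report produced at the same level. Column names are assumed.
detail = pd.read_csv("cost_detail.csv")    # columns: wbs, task, actual_cost
summary = pd.read_csv("cost_summary.csv")  # columns: wbs, reported_cost

# Roll the detail data up to the reporting level.
rollup = detail.groupby("wbs", as_index=False)["actual_cost"].sum()

# Join the rollup against the reported summary; in a valid system the two
# should match within a small tolerance.
check = rollup.merge(summary, on="wbs", how="outer")
check["difference"] = check["actual_cost"] - check["reported_cost"]
check["pct_diff"] = 100 * check["difference"].abs() / check["reported_cost"].abs()

# Surface the largest discrepancies first.
print(check.sort_values("pct_diff", ascending=False).head(10))
```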
There are two responses to this condition. The first is that if the variations are small, that is, within 1% or 2% of the actuals, we must remember that earned value management is a project management system, not a financial management system, and need not be exact. This is a strong and valid assertion. The second is that the proprietary systems used for reporting have inherent deficiencies in summarizing data. Should the differences once again not be significant, then this too is a valid assertion. But there is a point at which these assertions fail. If the variation from the rollup is greater than (I would suggest) about 2%, then there is a systemic issue with the validity of the data that undermines the credibility of the project management systems.
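To make that tolerance logic concrete, here is a hedged sketch: variances within roughly 2% are treated as acceptable noise in a project management (rather than financial) system, while anything larger is flagged as a potential systemic data-validity issue. The cutoff follows the suggestion above, and the sample values are illustrative only:

```python
def classify_variance(pct_diff: float, tolerance: float = 2.0) -> str:
    """Classify a rollup-vs-report variance as noise or a systemic issue.

    EVM is a project management system, not a financial one, so small
    variances are tolerable; the ~2% cutoff is the suggestion in the text.
    """
    if abs(pct_diff) <= tolerance:
        return "within tolerance"
    return "systemic discrepancy -- investigate data validity"

# Illustrative values only.
for wbs, pct in [("1.1", 0.4), ("1.2", 1.8), ("1.3", 5.2)]:
    print(f"{wbs}: {pct:.1f}% -> {classify_variance(pct)}")
```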
Checking off compliance with the EIA-748 criteria will not surface such discrepancies. A robust software solution that can handle such big data, apply the analytics to identify such discrepancies, and flexibly identify patterns and markers in the data that give early indication of project risk manifestation will address the problem at hand. The technology now exists to perform this operation, and to do so at the level of performance expected in desktop operations. This type of solution goes far beyond EVM tools or EVM engines. The present generation of software possesses not only hardcoded solutions out of the box, but also the ability to configure objects, conditional formatting, calculations, and reporting from the same data, introducing leading indicators across a wider array of project management dimensions than just cost and schedule.
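One way to picture such configurable objects and calculations is as a set of named indicator rules applied uniformly to the same underlying data. The sketch below is illustrative, not a description of any particular product; the column names (wbs, cpi, bei, tpm_achieved, tpm_planned) are assumptions:

```python
from dataclasses import dataclass
from typing import Callable

import pandas as pd

@dataclass
class Indicator:
    """A configurable leading-indicator rule applied to project data."""
    name: str
    dimension: str  # e.g. "cost", "schedule", "technical"
    test: Callable[[pd.DataFrame], pd.Series]  # boolean flag per row

# Illustrative rules over assumed column names; a real configuration would
# be driven by the organization's own data model and thresholds.
indicators = [
    Indicator("CPI erosion", "cost", lambda df: df["cpi"] < 0.95),
    Indicator("Baseline execution slippage", "schedule",
              lambda df: df["bei"] < 0.90),
    Indicator("Technical performance shortfall", "technical",
              lambda df: df["tpm_achieved"] < df["tpm_planned"]),
]

def evaluate(df: pd.DataFrame) -> pd.DataFrame:
    """Apply every configured rule and collect the elements each one flags."""
    flagged = []
    for ind in indicators:
        for _, row in df[ind.test(df)].iterrows():
            flagged.append({"wbs": row["wbs"],
                            "dimension": ind.dimension,
                            "indicator": ind.name})
    return pd.DataFrame(flagged)
```

The point of the design is that new indicators, technical as well as cost and schedule, are added as configuration against the same data rather than as new hardcoded reports.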