Walk This Way — DoD IG Reviews DCMA Contracting Officer Business Systems Deficiencies

The sufficiency and effectiveness of business systems are essential elements in the project management ecosystem.  Far beyond performance measurement of the actual effort, the sufficiency of the business systems that support the effort is essential to its success.  If the systems in place do not properly track and record the transactions behind the work being performed, the credibility of the data is called into question.  Furthermore, support and logistical systems, such as procurement, supply, and material management, contribute in a very real way to work accomplishment.  If that spare part isn’t in-house on time, the work stops.

In catching up on reading this month, I found that the DoD Inspector General issued a report on October 1 showing that in all 21 audits demonstrating business system deficiencies, contracting officers failed to meet DFARS deadlines at various milestones.  For example, in 17 of those cases Contracting Officers did not issue final determination letters within 30 days of the report, as required by the DFARS.  In eight cases required withholds were not assessed.
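
To make the timeliness finding concrete, here is a minimal sketch, in Python, of the kind of deadline check the findings imply.  The 30-day final determination deadline is the DFARS requirement cited in the report; the contractors, dates, and record layout are hypothetical.

```python
from datetime import date, timedelta

# The 30-day deadline for final determination letters is the DFARS
# requirement cited in the IG report; everything else here is illustrative.
DEADLINE = timedelta(days=30)

# (contractor, date audit report received, date final determination issued or None)
audits = [
    ("Contractor A", date(2014, 3, 1), date(2014, 3, 25)),
    ("Contractor B", date(2014, 4, 10), date(2014, 6, 2)),
    ("Contractor C", date(2014, 6, 1), None),  # still open
]

for contractor, received, issued in audits:
    due = received + DEADLINE
    if issued is None:
        status = "OPEN, past due" if date.today() > due else "OPEN"
    elif issued > due:
        status = f"LATE by {(issued - due).days} days"
    else:
        status = "on time"
    print(f"{contractor}: determination due {due.isoformat()}, {status}")
```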

For those of you who are unfamiliar with the six business systems assessed under DoD contractor project management, they consist of accounting, estimating, material management, purchasing, earned value management, and government property.  The greater the credibility and fidelity of these systems, the greater the level of confidence the government can have in the data it receives on the execution of public funds under these contracts.

To a certain extent the deadlines under the DFARS are so tightly scheduled that they fail to take into account normal delays in operations.  Heaven forbid that the Contracting Officer may be on leave when the audit is received or is engaged in other detailed negotiations.  In recent years the contracting specialty within the government, like government in general, has been seriously understaffed, underfunded, and unsupported.  Given that oftentimes the best and the brightest soon leave government service for greener pastures in the private sector, what is often left are inexperienced and overworked (though mostly dedicated) personnel who do not have the skills or the time to apply systems thinking to the noted deficiencies in these systems.

This pressure for staff reduction, even in areas that have been decimated by austerity politics, is significant.  In the report I could not help but shake my head when an Excel spreadsheet was identified as the “Contractor Business System Determination Timeline Tracking Tool.”  This reminds me of my first assignment as a young Navy officer, working as a contract negotiator, where I also performed collateral duties building simple automated tools.  (This led to my later assignment as the program manager of the first Navy contract and purchase order management system.)  That very first system I built, however, tracked contract milestone deadlines.  It was done in VisiCalc and the year was 1984.

That a major procurement agency of the U.S. Department of Defense is still using a simple and ineffective spreadsheet tracking “tool” more than 30 years after my own experience is both depressing and alarming.  There is a long and winding history behind how the agency found itself in this condition, but some additional training, which was the agency’s response to the IG, is not going to solve the problem.  In fact, such an approach is so ineffective that it’s not even a Band-Aid.  It’s the bureaucratic function of answering the mail.

The reason why it won’t solve the problem is that there is no magic wand to get those additional contract negotiators and contracting officers in place.  The large intern program of recruiting young people from colleges to grow talent and provide people with a promising career track is long gone.  Efforts to build the interdisciplinary and cross-domain expertise required in today’s world to reflect the new realities of procuring products and services are not in the works.  In places where they are being attempted, outmoded personnel classification systems based on older concepts of division of labor stand in the way.

The list of systemic causes could go on, but in the end it’s not in the DCMA response because no one cares, and if they do care, they can’t do anything about it.  It’s not as if “BEST TALENT LEAVES DUE TO PUBLIC HOSTILITY TO PUBLIC SERVICE” were a headline of any significance.  The Post under Bezos is not going to run that one anytime soon, though we’ve been living under it since 1981.  The old “thank you for your service” line for veterans has become a joke.  Those who use this line might as well say what it really means, which is: “I’m glad it was you and not me.”

The only realistic way to augment an organization in this state, and to break the cycle, is to automate the system–and to do it in a way that ties together the entire system.  When I run into my consulting friends and colleagues and they repeat the mantra “software doesn’t matter, it’s all based on systems,” I can only shake my head.  I have learned to be more tactful.

In today’s world software matters.  Try doing today what we used to do with slide rules, scientific calculators, and process charts, absent software.  Compare organizations that use the old division-of-labor, “best of breed” tool concept against those that have integrated their systems and use data across domains effectively.  Now tell me again why “software doesn’t matter.”  Not only does it matter, but “software” isn’t all the same.  Some “software” consists of individual apps that do one thing.  Some “software” is designed to address enterprise challenges.  And some “software” is designed not only to address enterprise challenges, but also to maximize the value of enterprise data.

In the case of procurement and business systems assessment, the only path forward for the agency will be to apply data-driven measures to the underlying systems and tie those assessments into a systemic solution that includes the contracting officers, negotiators, administrators, contracting officer representatives, auditors, analysts, and management.  One can see, just in writing that one line, how much more complex the requirements are for any automated replacement for the “Contractor Business System Determination Timeline Tracking Tool.”  Is there any question why the “tool” is ineffective?

If this were the 1990s, though the practice still persists, we would sit down, perform systems analysis, outline the systems and subsystem solutions, and then, through various stages of project management, design the software system to reflect the actual system in place, as if organizational change did not exist.  This is the process that has a 90% failure rate across government and industry.  The level of denial of this figure is so great that I run into IT managers and CIOs every day who fail to know it or, if they do, believe that it will not apply to them–and these are brilliant people.  It is selection bias and optimism, with a little (or a lot) of narcissism, run amok.  The physics and math on this are so well documented that you might as well take your organization’s money and go to Vegas with it.  Your local bookie could give you better odds.

The key is risk handling (not the weasel word “management,” not “mitigation,” since some risks must simply be accepted, and certainly not the unrealistic term “avoidance”), and the deployment of technology that provides at least a partial solution to the entire problem, augmented by incremental changes to incorporate each system into the overall solution.  DeLong and Froomkin’s seminal paper on what they called “The Next Economy” holds true today.  The lack of transparency in software technologies requires a process whereby the market is surveyed, vendors go through a series of assessments and demonstration tests, and the selected technology then passes through stage gates: proof-of-concept, pilot, and, eventually, deployment.  Success at each gate is rewarded with proceeding to the next step.
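
As an illustration of that stage-gate logic, here is a minimal sketch in Python.  The gate names come from the paragraph above; the evaluation function and vendor name are placeholders for the assessments and demonstration tests it describes.

```python
# Stage gates from the text: proof-of-concept -> pilot -> deployment.
GATES = ("proof-of-concept", "pilot", "deployment")

def passes(gate: str, vendor: str) -> bool:
    """Placeholder: score the vendor against the documented criteria for a gate."""
    return True  # a real implementation would evaluate demonstration results

def run_stage_gates(vendor: str) -> str:
    # Success at each gate is the condition for proceeding to the next.
    for gate in GATES:
        if not passes(gate, vendor):
            return f"{vendor} stopped at the {gate} gate"
    return f"{vendor} cleared all gates and proceeds to full deployment"

print(run_stage_gates("Vendor X"))
```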

Thus, ideally the process includes introducing into the underlying solution the specific functionality required by the organization through Agile processes, where releasable versions of the solution are delivered at the end of each sprint.  One need not be an Agile Cultist to do this.  In my previous post I referred to Neil Killick’s simple checklist for determining whether you are engaged in Agile.  It is the best and most succinct distillation of both the process and the value inherent in Agile that I have found to date, with all of the “woo-woo” taken out.  For an agency as Byzantine as DCMA, this is really the only realistic and effective approach.

DCMA is an essential agency in DoD acquisition management, but it cannot do what it once did under a more favorable funding environment.  To be frank, it didn’t do its job all that well even when conditions were more favorable, and that is also a factor in why it finds itself in its current state: it was punished for its transgressions, perhaps too much.  Several waves of personnel cuts, staff reductions, and losses of domain and corporate knowledge, on top of the general trend, have created an agency under siege.  As with any organization under siege, backbiting and careerism among the few who remain are rewarded.  Iconoclasts and thought leaders stay for a while before being driven away.  They are seen as too risky.

This does not create the conditions for an agency ready to accept or quickly execute change through new technology.  What it does do is allow portions of the agency to engage in cargo cult change management.  That is, it has the appearance of change but keeps self-interest comfortable and change in its place.  Over time–several years–with the few remaining resources committed to this process, they will work the “change.”  Eventually, they may even get something tangible, though suboptimized to conform to rice bowls, and preferably after management has its retirement plans secured.

Still, the reality is that DCMA must be made to do its job because it is in the best interests of the U.S. Department of Defense.  The panacea will not be found through “collaboration” with industry, which consists of the very companies DCMA is tasked with overseeing and regulating.  We all know how well deregulation and collaboration have worked in the financial derivatives, banking, mortgage, and stock markets.  Nor will it come from organic efforts within an understaffed and under-resourced agency that will be unable to leverage the best and latest technology solutions under the unforgiving math of organic IT failure rates.  Nor will it come from the long outmoded approach of deploying suboptimized “tools” to address a particular problem.  The proper solution is to leverage effective COTS solutions that facilitate the challenge of systems integration and systems thinking.

Driver’s Seat — How Software Normalization Can Drive Process Improvement

Travel, business, and family obligations have interrupted regular blogging–many apologies but thanks for hanging in there and continuing to read the existing posts.

Over the past couple of weeks I have taken note of two issues that regularly pop up: the lack of consistency in how compliance is applied by oversight organizations within both industry and government, especially in the case of government agencies with oversight responsibility for project management; and the lack of consistency in the data and information that inform project management systems.

That the same condition exists in both areas is not, I believe, a coincidence, and points to a great deal of hair-pulling, scapegoating, and finger-pointing that could otherwise have been avoided over the years.  I am not saying that continual process improvement absent technology is inessential, but it is undoubtedly true that there is a limit to how effectively information can be processed and consumed in a pre-digitized system compared to a post-digitized one.  The difference is in multiples, not only in the amount of data, but also in the quality of the data after processing.

Thus, one insight I have observed as we apply new generations of software technology to displace the first and second waves of project management technologies is an improved ability to apply consistency and standardization in oversight.  Depending on where you stand, this is either a good or a bad thing.  If you are a traditional labor-intensive accounting organization that expects a team of personnel to disrupt the organization in going through the details of a paper trail, then you are probably unhappy, because your business model will soon be found to be obsolete (actually, it already is, depending on where your customer sits on the scale of digitization).  If, however, you are interested in the optimization of systems, then you are probably somewhere on the positive end of that scale.

Software companies are mainly interested in keeping their customers tied to their technology.  For example, try buying the latest iPhone if you have an existing plan with a carrier but want to switch to someone else.  This is why I am often puzzled that anyone in the economics or political science professions cannot understand why we have new technological robber barons, with the resulting concentration of wealth and political power.  One need only look at how the railroads and utilities tied entire swaths of the country into knots well into the 20th century, prior to anti-trust enforcement.  The technology is new but the business model is the same.

The condition of establishing technological islands of code and software is what creates diseconomies in processes.  Maintaining multiple applications to address different business processes increases costs and reduces efficiency, not only because proprietary idiosyncrasies create duplicative training, maintenance, and support requirements, but also because of the costs associated with the reconciliation and integration of interrelated data, usually accomplished manually.  On the systems validation and oversight side, this lack of consistency in data drives inconsistency in the way the effectiveness of project management systems is assessed.

Years of effort, training, policy writing, and systems adjustments have met with the law of diminishing returns while ignoring the underlying and systemic cause of inconsistency in interdependent factors.  Yet, when presented with a system in which otherwise proprietary and hard-to-reconcile data is normalized to ensure not only consistency but quality, the variation in how the data is assessed and viewed diminishes very quickly.  This should be no surprise, but despite the obvious advantages and economies being realized, resistance still exists, largely based in fear.
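
Before turning to that fear, it is worth making “normalized” concrete.  Here is a minimal sketch in Python: two hypothetical proprietary exports of the same cost record, each with its own field names and units, are mapped to a single schema so that downstream assessments see identical data regardless of the source tool.  All names and values are invented for illustration.

```python
# Two hypothetical proprietary exports of the same cost record, with
# different field names and units ($ vs. $K). All values are invented.
system_a = [{"wbs_id": "1.1", "actual_cost_usd": 125000.0}]
system_b = [{"element": "1.1", "acwp_k": 125.0}]  # reported in thousands

def normalize_a(rec: dict) -> dict:
    return {"wbs": rec["wbs_id"], "acwp": rec["actual_cost_usd"]}

def normalize_b(rec: dict) -> dict:
    return {"wbs": rec["element"], "acwp": rec["acwp_k"] * 1000.0}

# After normalization every record shares one schema and one unit, so any
# assessment works from consistent data regardless of the source tool.
normalized = [normalize_a(r) for r in system_a] + [normalize_b(r) for r in system_b]
for rec in normalized:
    print(rec)
```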

The fear is misplaced only because it lies in the normal push and pull of management/employee and customer/contractor relations.  Given more information, and more qualitatively insightful information, the argument goes, the more disruptive oversight will become.  That this condition exists today because of sub-optimization and lack of consistency does not seem to occur to the proponents of this argument.  Both sides, like two wrestlers locked in a hold that cannot produce a decision, are each loath to ease their grip for fear that the other will take advantage of the situation.  Yet technology will be the determining factor as the economic pressures become too hard to resist.  It is time to address these fears and reestablish the lines of demarcation in our systems based on good leadership and management practices–skills that seem to be disappearing as more people and companies become focused on 1s and 0s.

Note: The post has been modified to correct grammatical errors.  Travel took its toll on the first go-round.

Don’t Be Cruel — Contract Withholds and the Failure of Digital Systems

Recent news over at Breaking Defense headlined a $25.7M withhold against Pratt & Whitney for the F135 engine.  This is the engine for the F-35 Lightning II aircraft, also known as the Joint Strike Fighter (JSF).  The reason for the withhold in this particular case was an insufficient cost and schedule business system that the company has in place to support project management.

The enforcement of withholds for deficiencies in business systems was instituted in August 2011.  These business systems include six areas:

  • Accounting
  • Estimating
  • Purchasing
  • EVMS (Earned Value Management System)
  • MMAS (Material Management and Accounting System), and
  • Government Property

As of November 30, 2013, $19 million had been held back from BAE Systems Plc, $5.2 million from Boeing Co., and $1.4 million from Northrop Grumman Corporation.  These were on the heels of a massive $221 million held back from Lockheed Martin’s aeronautics unit for its deficient earned value management system.  In total, fourteen companies were affected by withholds last year.

For those unfamiliar with the issue, these withholds may seem to be a reasonable enforcement mechanism for ensuring that sufficient business systems are in place, so that there is traceability in the expenditure of government funds by contractors.  After all, given the disastrous state of affairs in Iraq and Afghanistan, where there was a massive loss of accountability by contractors, many senior personnel in DoD felt that contracting officers needed to be given teeth, and what better way to do this than through financial withholds?  The rationale is that if the systems are not adequate, then the information originating from those systems is not credible.

This is probably a good approach for the acquisition of wartime goods and services, but it doesn’t seem to fit the reality of the project management environment in which government contracting operates.  The strongest objections to the rule, I think, came from the legal community, most notably from the American Bar Association’s Section of Public Contract Law.  Among these was that the amount of the withhold is based on an arbitrary percentage set in the DFARS rule.  Another point made was that the defects in the systems are in most cases disconnected from actual performance, and so redirect attention and resources away from the contractual obligation at hand.
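
Since the objection turns on the rule’s percentage-based withhold, a quick sketch shows the sums at stake.  The percentages used here (5% of payments per deficient system, a 10% cumulative cap, reduced to 2% once a corrective action plan is accepted) are the figures commonly cited for the DFARS rule, and the billing amount is hypothetical; treat both as assumptions rather than a legal reading.

```python
# Illustrative withhold arithmetic. The percentages are the commonly cited
# DFARS figures and are assumptions here, not a legal reading.
def withhold_rate(deficient_systems: int, corrective_plan_accepted: bool = False) -> float:
    if deficient_systems == 0:
        return 0.0
    if corrective_plan_accepted:
        return 0.02  # reduced rate after an accepted corrective action plan
    return min(0.05 * deficient_systems, 0.10)  # 5% per system, 10% cap

monthly_billings = 10_000_000  # hypothetical monthly billings in dollars
for n in (1, 2, 3):
    held = monthly_billings * withhold_rate(n)
    print(f"{n} deficient system(s): ${held:,.0f} withheld per month")
```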

These objections were made prior to the rule’s acceptance.  But now that the rule is being enforced, the more important question is the effect of the withholds on project management.  My own anecdotal experience from having been a business manager on a program management staff is that the key to project success is oftentimes determined by cash flow.  While factors internal to the project, such as the effective construction of the integrated master schedule (IMS), the performance management baseline (PMB), risk identification and handling, and performance tracking against these plans, are the primary focus of project integrity, all too often the underlying financial constraints within which the project must operate are treated as a contingent factor.  If the financial constraints on our capabilities are severe, then the best plan in the world will not achieve the desired results, because it fails to be realistic.

The principles that apply to any entrepreneurial enterprise also apply to complex projects.  It is true that large companies have significant cash reserves and that these reserves have grown significantly since the 2007-2010 depression.  But a major program requires a large infusion of resources that constitutes a significant barrier to entry, and such reserves contribute to the financial stability necessary to undertake these efforts.  Profit is not realized on every project.  This may sound surprising to those unfamiliar with public administration, but it is the case because it is sometimes worth breaking even, or taking a slight loss, so as not to lose essential organizational knowledge.  It takes years to develop an engineer who understands the interrelationships of the key factors in a jet fighter: the tradeoffs among speed, maneuverability, weight, and stress from the operational environment, like taking off from and landing on a large metal aircraft carrier that travels on salt water.  This cannot be taught in college, nor can it be replaced if the knowledge is lost to budget cuts, pay freezes, and sequestration.  Oftentimes, because of their size and complexity, project start-up costs must be financed using short-term loans, adding risk when payments are delayed and work is interrupted.  The withhold rule adds an additional, if not unnecessary, dimension of risk to project success.

Given that most of the artifacts deemed necessary to handle and reduce risk are produced in a collaborative environment by the contractor-government project team through the Integrated Baseline Review (IBR) process and system validation–as well as pre-award certifications–it seems that there is no clear line of demarcation that places the onus for inadequate business systems on the contractor alone.  The reality of the situation, given cost-plus development contracts, is that industry is, in fact, a private extension of the defense infrastructure.

It is true that a line must be drawn in the contractual relationship to provide the checks and balances necessary to guard against fraud, waste, or a race to the lowest common denominator that undermines accountability and execution of the contractual obligation.  But this does not mean that the work of oversight requires another post-hoc layer of surveillance.  If we are not getting quality results from pre-award and post-award processes, then we must identify which of them are failing us and reform them.  Interrupting cash flow over inadequately assessed business systems may simply be counter-productive.

As Deming would argue, quality must be built into the process.  What defines quality must also be consistent.  That our systems are failing in that regard is indicative, I believe, of a failure of imagination in our digital systems, on which most business systems rely.  It was fine, in the first wave of microcomputer digitization in the 1980s and 1990s, simply to design systems that mimicked the structure of pre-digital information specialization.  Cost performance systems were built to serve the needs of cost analysts, scheduling systems were designed for schedulers, risk systems for a sub-culture of risk specialists, and so on.

To break these stovepipes, the response of the IT industry was twofold, and it constitutes the second wave of the digitization of project management business processes.

One response was in many ways a step back.  The mainframe culture in IT had been on the defensive since the introduction of the PC and “distributed” processing.  Here was an opening to reclaim the high ground, and so expensive, hard-coded ERP, PPM, and BI systems were introduced.  The lack of security in the quickly deployed systems of the first wave also provided this community with a convenient stalking horse, though even the “new” systems, as we have seen, lack adequate security in the digital arms race.  The ERP and BI systems are expensive and require specialized knowledge and expertise.  Solutions are hard-coded, require detailed modeling, and take a significant amount of time to deploy, supporting a new generation of coders.  The significant financial and personnel resources required to acquire and implement these systems–and the management reputations on the line for the decision to acquire them in the first place–have become a rationale for their continued use, even when they fail at the same high rate as all IT development projects.  Thus, a tradeoff analysis between sunk costs and prospective costs is rarely made in determining their sustainability.

The other response was to knit together the first wave of specialized systems in “best-of-breed” configurations.  In this case data is imported and reconciled between specialized systems to achieve the integration needed to service the cross-functional nature of project management.  Oftentimes the estimating, IMS, PMB, and qualitative and quantitative risk artifacts are constructed by separate specialists with little or no coordination or fidelity.  These environments are characterized by workarounds, resource-heavy reconciliation teams dedicated to verifying data between systems, the expenditure of resources on fixing errors after the fact, and the development of Access- and MS Excel-heavy one-off solutions designed to address deficiencies in the underlying systems.
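
To illustrate the reconciliation burden those teams carry, here is a minimal sketch in Python comparing the same control accounts as exported from a cost tool and a scheduling tool.  The account numbers and dollar values are invented; real reconciliations run across thousands of records and many more fields.

```python
# A hypothetical slice of manual reconciliation work: the same control
# accounts exported from two tools, compared to surface mismatches.
cost_tool = {"1.1.1": 450_000.0, "1.1.2": 300_000.0, "1.1.3": 125_000.0}
schedule_tool = {"1.1.1": 450_000.0, "1.1.2": 310_000.0}  # 1.1.3 never made it across

for account in sorted(set(cost_tool) | set(schedule_tool)):
    c = cost_tool.get(account)
    s = schedule_tool.get(account)
    if c is None or s is None:
        print(f"{account}: present in only one system")
    elif abs(c - s) > 0.01:
        print(f"{account}: mismatch, ${c:,.0f} (cost tool) vs ${s:,.0f} (schedule tool)")
```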

That the deficiencies found in the solutions described above are mirrored in the findings of deficiencies in business systems marks the underlying information systems as the main culprits.  The solution, I think, is going to come from those portions of the digital community where the barriers to entry are low.  The current crop of software in place from the first and second waves is reaching the end of its productive life.  Hoping to protect market share and stave off the inevitable, entrenched software companies are deploying new delivery and business models, though they have little incentive to drive the industry to the next phase.  Instead, they have been marketing SaaS and cloud computing as the panacea, though the nature of the work tends to militate against acceptance of external hosting.  In the end, I believe the answer is to leverage new technologies that eliminate the specialized, hard-coded nature of the first example, but achieve integration, while leveraging the historical data that exists in great abundance in the second.

Note: The title and some portions of this post were modified from the original for clarity.