Walk This Way — DoD IG Reviews DCMA Contracting Officer Business Systems Deficiencies

The sufficiency and effectiveness of business systems are essential elements in the project management ecosystem. Far beyond performance measurement of the actual effort, the sufficiency of the business systems supporting that effort is essential to its success. If the systems in place do not properly track and record the transactions behind the work being performed, the credibility of the data is called into question. Furthermore, support and logistical systems, such as procurement, supply, and material management, contribute in a very real way to work accomplishment. If that spare part isn't in-house on time, the work stops.

In catching up on reading this month, I found that the DoD Inspector General issued a report on October 1 showing that, across 21 audits demonstrating business system deficiencies, contracting officers failed to meet DFARS deadlines at various milestones in every case. For example, in 17 of those cases Contracting Officers did not issue final determination letters within 30 days of the report, as required by the DFARS. In eight cases, required withholds were not assessed.

For those of you who are unfamiliar with the six business systems assessed under DoD contractor project management, they consist of accounting, estimating, material management, purchasing, earned value management, and government property. The greater the credibility and fidelity of these systems, the greater the level of confidence the government can have in the data it receives on the execution of public funds under these contracts.

To a certain extent the deadlines under the DFARS are so tightly scheduled that they fail to take into account normal delays in operations. Heaven forbid that the Contracting Officer be on leave when the audit is received, or be engaged in other detailed negotiations. In recent years the contracting specialty within the government, like government in general, has been seriously understaffed, underfunded, and unsupported. Given that oftentimes the best and the brightest soon leave government service for greener pastures in the private sector, what is often left are inexperienced and overworked (though mostly dedicated) personnel who do not have the skills or the time to apply systems thinking to the deficiencies noted in these systems.

This pressure for staff reduction, even in areas that have already been decimated by austerity politics, is significant. In the report I could not help but shake my head when an Excel spreadsheet was identified as the "Contractor Business System Determination Timeline Tracking Tool." This reminds me of my first assignment as a young Navy officer, working as a contract negotiator, where I also performed collateral duties building simple automated tools. (This led to my later assignment as the program manager of the first Navy contract and purchase order management system.) That very first system I built, however, tracked contract milestone deadlines. It was done in VisiCalc, and the year was 1984.

That a major procurement agency of the U.S. Department of Defense is still using a simple and ineffective spreadsheet tracking "tool" more than 30 years after my own experience is both depressing and alarming. There is a long and winding history behind how the agency found itself in this condition, but some additional training, which was the agency's response to the IG, is not going to solve the problem. In fact, such an approach is so ineffective that it's not even a Band-Aid. It's a bureaucratic exercise in answering the mail.

The reason it won't solve the problem is that there is no magic wand to get those additional contract negotiators and contracting officers in place. The large intern programs that recruited young people from colleges to grow talent and provide a promising career track are long gone. Efforts to build the interdisciplinary and cross-domain expertise required to procure products and services under today's new realities are not in the works. In the places where they are being attempted, outmoded personnel classification systems based on older concepts of the division of labor stand in the way.

The list of systemic causes could go on, but in the end none of this appears in the DCMA response, because no one cares, and those who do care can't do anything about it. It's not as if "BEST TALENT LEAVES DUE TO PUBLIC HOSTILITY TO PUBLIC SERVICE" was ever a headline of any significance. The Post under Bezos is not going to run that one anytime soon, though we've been living under it since 1981. The old "thank you for your service" line for veterans has become a joke. Those who use this line might as well say what it really means: "I'm glad it was you and not me."

The only realistic way to augment an organization in this state, in order to break the cycle, is to automate the system, and to do it in a way that ties together the entire system. When I run into my consulting friends and colleagues and they repeat the mantra "software doesn't matter, it's all based on systems," I can only shake my head. I have learned to be more tactful.

In today's world software matters. Try doing today what we used to do with slide rules, scientific calculators, and process charts, absent software. Compare organizations that use the old division-of-labor, "best of breed" tool concept against those that have integrated their systems and use data effectively across domains. Now tell me again why "software doesn't matter." Not only does it matter, but "software" isn't all the same. Some "software" consists of individual apps that do one thing. Some "software" is designed to address enterprise challenges. And some "software" is designed not only to address enterprise challenges, but also to maximize the value of enterprise data.

In the case of procurement and business systems assessment, the only path forward for the agency will be to apply data-driven measures to the underlying systems and to tie those assessments into a systemic solution that includes the contracting officers, negotiators, administrators, contracting officer representatives, auditors, analysts, and management. One can see, just in writing that one line, how much more complex the requirements are for any automated solution intended to replace the "Contractor Business System Determination Timeline Tracking Tool." Is there any question why the "tool" is ineffective?
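To make the point concrete, here is a minimal sketch, in Python, of what even the first thin slice of such a data-driven approach might look like: automatically flagging final determination letters that are late against the 30-day DFARS window noted in the IG report. The record layout, contract numbers, and dates are hypothetical and purely illustrative; a real solution would tie this data to the contract writing, audit, withhold, and management reporting processes rather than live in a standalone spreadsheet.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Hypothetical record layout -- illustrative only, not DCMA's actual data model.
@dataclass
class DeficiencyReport:
    contract_id: str
    business_system: str                        # e.g. "accounting", "purchasing", "EVMS"
    report_received: date                       # date the audit/deficiency report was received
    final_determination: Optional[date] = None  # date the CO issued the final determination letter
    withhold_assessed: bool = False

DETERMINATION_WINDOW = timedelta(days=30)       # DFARS: final determination due within 30 days

def is_overdue(r: DeficiencyReport, as_of: date) -> bool:
    """True if the final determination letter is late, or still missing past the 30-day window."""
    due = r.report_received + DETERMINATION_WINDOW
    if r.final_determination is None:
        return as_of > due
    return r.final_determination > due

# Usage: flag overdue determinations across open deficiency reports (hypothetical data).
reports = [
    DeficiencyReport("HQ0339-14-C-0001", "EVMS", date(2014, 9, 1)),
    DeficiencyReport("HQ0339-14-C-0002", "purchasing", date(2014, 8, 15),
                     final_determination=date(2014, 9, 10)),
]
print([r.contract_id for r in reports if is_overdue(r, date(2014, 10, 20))])
# ['HQ0339-14-C-0001'] -- the first report has no determination past its 30-day due date.
```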

If this were the 1990s (though the practice still persists), we would sit down, perform systems analysis, outline the system and subsystem solutions, and then, through various stages of project management, design the software system to reflect the actual system in place, as if organizational change did not exist. This is the process that has a 90% failure rate across government and industry. The level of denial about this figure is so great that I run into IT managers and CIOs every day who do not know it or, if they do, believe that it will not apply to them, and these are brilliant people. It is selection bias and optimism, with a little (or a lot) of narcissism, run amok. The physics and math on this are so well documented that you might as well take your organization's money and go to Vegas with it. Your local bookie could give you better odds.

The key is risk handling (not the weasel word "management," not "mitigation," since some risks must simply be accepted, and certainly not the unrealistic term "avoidance"), and the deployment of technology that provides at least a partial solution to the entire problem, augmented by incremental changes that incorporate each system into the overall solution. DeLong and Froomkin's observations in their seminal paper on what they called "The Next Economy" hold true here: the lack of transparency in software technologies requires a process whereby the market is surveyed, vendors go through a series of assessments and demonstration tests, and the selected technology then proceeds through stage gates: proof-of-concept, pilot, and, eventually, deployment. Success at each level is rewarded with proceeding to the next step.
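The logic is simple enough to sketch. Here is a rough illustration of the "success earns the next step" progression, with hypothetical gate names, offered only to make the sequence explicit rather than to prescribe any agency's actual workflow:

```python
from enum import Enum

# Hypothetical gate names -- illustrative only.
class Gate(Enum):
    MARKET_SURVEY = 1
    VENDOR_DEMONSTRATION = 2
    PROOF_OF_CONCEPT = 3
    PILOT = 4
    DEPLOYMENT = 5

def advance(current: Gate, succeeded: bool) -> Gate:
    """Success at a gate is rewarded with the next step; failure keeps (or stops) the effort where it is."""
    if not succeeded or current is Gate.DEPLOYMENT:
        return current
    return Gate(current.value + 1)

# Usage: a candidate technology earns its way forward one gate at a time.
stage = Gate.MARKET_SURVEY
for outcome in (True, True, True, False):   # hypothetical gate results
    stage = advance(stage, outcome)
print(stage)  # Gate.PILOT -- it failed at the pilot gate, so it never reaches deployment
```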

Thus, ideally, the process includes building the specific functionality required by the organization on top of that underlying solution through Agile processes, with releasable versions delivered at the end of each sprint. One need not be an Agile Cultist to do this. In my previous post I referred to Neil Killick's simple checklist for determining whether you are actually engaged in Agile. It is the best and most succinct distillation of both the process and the value inherent in Agile that I have found to date, with all of the "woo-woo" taken out. For an agency as Byzantine as DCMA, this is really the only realistic and effective approach.

DCMA is an essential agency in DoD acquisition management, but it cannot do what it once did under a more favorable funding environment. To be frank, it didn't do its job all that well even when conditions were more favorable, though things were better. But this is also a factor in why it finds itself in its current state. It was punished for its transgressions, perhaps too much. Several waves of personnel cuts, staff reductions, and losses of domain and corporate knowledge, on top of the general trend, have created an agency in a condition of siege. As with any organization under siege, backbiting and careerism among the few who remain are rewarded. Iconoclasts and thought leaders stay for a while before being driven away. They are seen as too risky.

This does not create the conditions for an agency ready to accept or quickly execute change through new technology. What it does do is allow portions of the agency to engage in cargo cult change management: something that has the appearance of change but keeps self-interest comfortable and keeps change in its place. Over time, over several years, with the few remaining resources committed to this process, they will work the "change." Eventually, they may even get something tangible, though suboptimized to conform to rice bowls, and preferably after management has its retirement plans secured.

Still, the reality is that DCMA must be made to do its job, because doing so is in the best interests of the U.S. Department of Defense. The panacea will not be found through "collaboration" with industry, which consists of the very companies DCMA is tasked with overseeing and regulating. We all know how well deregulation and collaboration have worked in the financial derivatives, banking, mortgage, and stock markets. Nor will it come from organic efforts within an understaffed and under-resourced agency that will be unable to leverage the best and latest technology solutions under the unforgiving math of organic IT failure rates. Nor will it come from the long outmoded approach of deploying suboptimized "tools" to address one particular problem at a time. The proper solution is to leverage effective COTS solutions that facilitate systems integration and systems thinking.


Family Affair — Part III — Private Monopsony, Monopoly, and the Disaccumulation of Capital

It's always good to be ahead of the power curve. I see that the eminent Paul Krugman had an editorial in the New York Times about the very issues that I've dealt with in this blog, his example in this case being Amazon. This is just one of many articles written about monopsony power as a result of the Hachette controversy. In The New Republic, Franklin Foer also addresses this issue at length in the article "Amazon Must Be Stopped." In my last post on this topic I discussed public monopsony, an area in which I have a great deal of expertise. But those of us in the information world who are not Microsoft, Oracle, Google, or one of the other giants also live in the world of private monopsony.

For those of you late to these musings (or who skipped the last ones), this line of inquiry began when my colleague Mark Phillips made the statement at a recent conference that, while economic prospects for the average citizen are bad, the best system that can be devised is one based on free market competition, misquoting Churchill. The underlying premise of the statement, of course, is that this is the system we currently inhabit, and that it is the most efficient way to distribute resources. There is also usually an ideological component involved: some variation of free market fundamentalism and the concept that the free market is somehow separate from, and superior to, the government issuing the currency under which the system operates.

My counter to the assertions found in that compound statement is to prove that the economic system we inhabit is not a perfectly competitive one, and that there are large swaths of the market that are dysfunctional and that have given rise to monopoly, oligopoly, and monopsony power. In addition, the very recent ideological belief that private economic activity arose almost spontaneously, with government as a separate component that can only be an imposition, is also false, given that nation-states and unions of nation-states (as in the case of the European Union) are the issuers of sovereign currency, and so choose through their institutions the amount of freedom, regulation, and competition that their economies foster. Thus, the economic system we inhabit is the result of political action and public policy.

The effects of the distortions of monopoly and oligopoly power in the so-called private sector are all around us. But when we peel back the onion we can clearly see the interrelationships between the private and public sectors.

For example, patent monopolies in the pharmaceutical industry allow prices to be set not at the marginal value of the drug that a competitive market would establish, but according to the impulse for profit maximization. A recent example in the press, critiqued by economist Dean Baker, concerns the hepatitis-C drug Sovaldi, which goes for $84,000 a treatment, compared with markets in which the drug has not been granted a patent monopoly, where the price is about $900 a treatment. Monopoly power, in Baker's words, imposes a 10,000 percent tariff on those who must live under that system.

This was one of the defects I wrote about in my blog posts regarding tournaments and games of failure, though in pharmaceuticals the comparison seems more in line with gambling and lotteries. Investors, who often provide funds based on the slimmest thread of a good idea and talent, are willing to put great sums of money at risk in order to strike it rich and realize many times their initial investment. The distorting incentives of this system are well documented: companies tend to focus on those medications and drugs with the greatest potential financial rate of return guaranteed by the patent monopoly system*; drug trials downplay the risks and side effects of the medications; and prices are set at so high a level as to place treatment out of reach of all but the richest members of society, since few private drug insurance plans will authorize such treatments given the cost, at least not without a Herculean effort on the part of individual patients.
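As a quick back-of-the-envelope check on Baker's figure, using the two prices cited above:

$$\frac{\$84{,}000 - \$900}{\$900} \times 100\% \approx 9{,}200\%$$

which is indeed on the order of the 10,000 percent tariff he describes relative to the price in markets without the patent monopoly.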

We can also see monopoly power at work firsthand in the present lawsuits between Apple and Samsung over the smartphone market. For many years (until very recently) the U.S. patent office took a permissive stance, allowing technology firms to essentially patent the look and feel of a technology, as well as features that could be developed and delivered by any number of means. The legal system, probably more technologically challenged than other areas of society, has been inconsistent in determining how to deal with these claims. The fact finders in many cases have been juries, who are not familiar with the nuances of the technology. One need not stretch to find practical analogies for these decisions. If applied to automobiles, for example, the many cases that have enforced these patent monopolies would have restricted windshield wipers to the first company that delivered the feature. Oil filters, fuel filters, fuel injection, and so on would each have been restricted to one maker.

The stakes are high not only for these two giant technology companies but also for consumers. They have already used their respective monopoly power, established by their sovereign governments, to pretty effectively ensure that the barriers to entry in the smartphone market are quite high. Now they are unleashing these same forces on one another. In the end, the manufacturing costs of the iPhone 6, which is produced by slave labor under the capitalist variant of Leninist China, are certainly much lower than the $500 and more that they demand (along with the anti-competitive practice of requiring a cellular agreement with one of their approved partners). The tariff that consumers pay over the actual cost of producing and maintaining smartphones is significant. This is not remedied by the oft-heard response to "simply not buy a smartphone," since that response shifts responsibility for the public policy that allows this practice to flourish onto individuals, who are comparatively powerless against the organized power of the lobbyists who influenced public representatives to make these laws and institute the policy.

The fights over IP and patents (as well as net neutrality) are important for the future of technological innovation. Given the monopsony power of companies that also exert monopoly power in particular industries, manufacturers are at risk of being squeezed where prices are artificially reduced through the asymmetrical relationship between large buyers and relatively small sellers. Central planning, regardless of whether it is exerted by a government or a large corporation, is dysfunctional. When those same corporations seek not only to exert monopoly and monopsony power, but also to control information and technology, they seek to control all aspects of an economic activity, not unlike the trusts of the era of the Robber Barons. Amazon and Walmart are but two of the poster children of this situation.

The saving grace of late has been technological "disruption," but the term has been misused to cover rent-seeking behavior as well. I am not referring only to the kind of public policy rent-seeking that Amazon achieves when it avoids paying the local taxes that apply to its competitors, or that Walmart achieves when it shifts the costs of its substandard pay and abusive employee policies onto local, state, and federal public assistance agencies. I am also referring to the latest controversies regarding AirBnB, Lyft, and Uber, which exploit loopholes opened by new technology to sidestep health and safety laws in order to gain entry into a market.

Technological disruption, instead, is a specific phenomenon, based on the principle that the organic barriers to entry in a market are significantly reduced by the introduction of technology. The struggle over the control of and access to information and innovation is specifically targeted at this phenomenon. Large companies aggressively work to keep out new entrants and to hinder all innovations except those they can control, conspiring against the public good.

The reason these battles are lining up resides in the modern phenomenon known as the disaccumulation of capital, first identified by social scientist Martin J. Sklar. What this means is that the accumulation of capital, which is the time it takes to reproduce the existing material conditions of civilization, began declining in the 1920s. As James Livingston points out in the same linked article in The Nation, "economic growth no longer required net additions either to the capital stock or the labor force….for the first time in history, human beings could increase the output of goods without increasing the essential inputs of capital and labor—they were released from the iron grip of economic necessity."

For most of the history of civilization, the initial struggle of economics has been the ability of social organization to provide sufficient food, clothing, shelter, and medical care to people. The conflict between competing systems has centered on their ability to achieve these purposes most efficiently without sacrificing individual liberty, autonomy, and dignity. The technical solution to these goals has largely been achieved, but the efficient distribution of these essential elements of human existence has not been solved. With the introduction of more efficient methods of information processing, as well as of production (digital printing is just the start), we are at the point where the trend of requiring less capital in the aggregate to produce the necessities and other artifacts of civilization is accelerating exponentially.

Concepts like full employment will increasingly become meaningless, because the relationship of labor input to production that we came to expect in the recent past has changed within our own lifetimes. Very small companies, particularly in technology, can have (and have had) a large impact. In more than one market, even technology companies are re-learning the lesson of the "mythical man-month." Thus, the challenge of our time is to rethink the choices we have made and are making about the incentives and distribution that maximize human flourishing. But I will leave that larger question to another blog post.

For the purposes of this post, focused on technology and project management, these developments call for a new microeconomics. The seminal paper that identified this need early on was written by Brad DeLong and Michael Froomkin in 1997 and entitled "The Next Economy." While some of the real-life examples they give provide, from our perspective today, a stroll down digital memory lane, their main conclusions about how information differs from physical goods remain relevant. These are:

a.  Information is non-rivalrous. That is, one person consuming information does not preclude someone else from consuming that information. In practical terms, information that is produced can be economically reproduced to operate in other environments at little to no marginal cost. What they are talking about here is application software and the labor involved in producing a version of it. (A short numerical illustration follows this list.)

b.  Information without exterior barriers is non-exclusive. That is, once information is known, it is almost impossible to prevent others from knowing it. For example, Einstein was the first to work out the mathematics of relativity, but now every undergraduate physics student is expected to fully understand the theory.

c.  Information is not transparent. That is, oftentimes in order to determine whether a piece of software will achieve its intended purpose, effort and resources must be invested to learn it and, often, to apply it, if initially only in a pilot program.
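To see the economics behind the first property, here is a simplified sketch of my own (not a formula from the DeLong and Froomkin paper): if producing a piece of software requires a fixed investment $F$ in labor, and the marginal cost of each additional copy is $c \approx 0$, then the average cost across $n$ users is

$$AC(n) = \frac{F}{n} + c \longrightarrow c \approx 0 \quad \text{as } n \to \infty.$$

A buyer with monopsony power can push the price toward that near-zero marginal cost, leaving the seller no way to recover $F$; that is the capitalization squeeze on small technology companies described below.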

The attack coming from monopsony power is directed at the first characteristic of information. The attack coming from monopoly power is currently directed at the second. Both undermine competition and innovation: the first by denying small technology companies the ability to capitalize sufficiently to develop the infrastructure necessary to become sustainable, oftentimes reducing a market to one dominant supplier; the second by restricting the application of new technologies and of lessons learned from the past. Information asymmetry is the problem for the third characteristic, since bad actors are oftentimes economically rewarded at the expense of high-quality performers, a dynamic first identified in the used-car market in George Akerlof's paper "The Market for Lemons" (paywall).

The strategy of some entrepreneurs in small companies in reaction to these pressures has been to either sell out and be absorbed by the giants, or to sell out to private equity firms that “add value” by combining companies in lieu of organic growth, loading them down with debt from non-sustainable structuring, and selling off the new entity or its parts.  The track record for the sustainability of the applications involved in these transactions (and the satisfaction of customers) is a poor one.

One of the few places where competition still survives is among small to medium-sized technology companies. For these companies (and the project managers in them) to survive independently requires not only an understanding of the principles elucidated by DeLong and Froomkin, but also an understanding of how information, in ways unique to it, shares the tendencies of other technological innovation: improving efficiency and productivity, and reducing the required inputs of labor and capital.

The key is in understanding how to articulate value, how to identify opportunities for disruption, and to understand the nature of the markets in which one operates.  One’s behavior will be different if the market is diverse and vibrant, with many prospective buyers and diverse needs, as opposed to one dominated by one or a few buyers.  In the end it comes down to understanding the pain of the customer and having the agility and flexibility to solve that pain in areas where larger companies are weak or complacent.

 

*Where is that Ebola vaccine–which mainly would have benefited the citizens of poor African countries and our own members of the health services and armed forces–that would have averted public panic today?