My Generation — Baby Boom Economics, Demographics, and Technological Stagnation

“You promised me Mars colonies, instead I got Facebook.” — MIT Technology Review cover over photo of Buzz Aldrin

“As a boy I was promised flying cars, instead I got 140 characters.”  — attributed to Marc Maron and others

I have been in a series of meetings over the last couple of weeks with colleagues describing the state of the technology industry and the markets it serves.  The generally held view is that both the industry and the markets for software and technology are experiencing a hardening of the arteries and a resistance to change not seen since the first waves of digitization in the 1980s.

It is not as if this observation has gone unnoted by others.  Tyler Cowen at George Mason University documented the trend of technological stagnation in the eBook The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better.  Cowen’s thesis is not only that innovation has slowed since the late 19th century, but that it has slowed a lot, now that the “low-hanging fruit” has been picked.  I have to say that I am not entirely convinced by some of the data, which is anything but reliable in demonstrating causation in long-term trends.  Still, his observations of technological stagnation seem to be on the mark.  His concern is also directed at technology’s effect on employment: he points out that, while making some individuals very rich, recent technological innovation doesn’t result in much employment.

Cowen published his work in 2011, when the country was still in the early grip of the slow recovery from the Great Recession, and many seized on Cowen’s thesis as an opportunity for excuse-mongering, looking for deeper causes than the most obvious ones: government shutdowns, wage freezes, reductions in the government R&D that is essential to private sector risk handling, and an austerian fiscal policy (with sequestration) in the face of weak demand created by the loss of $8 trillion in housing wealth, which translated into a consumption gap of $1.2 trillion in 2014 dollars.
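
For those who want to see where a number like that comes from, here is a back-of-the-envelope sketch in Python.  The marginal propensity to consume out of housing wealth and the construction falloff are my own assumed parameter values, chosen only to illustrate the arithmetic; the $8 trillion figure is the one discussed above.

```python
# Back-of-the-envelope demand-gap estimate (illustrative only).
# Parameter values below are assumptions for this sketch, not sourced figures.

housing_wealth_loss = 8.0e12    # ~$8 trillion in lost housing wealth (from the text)
mpc_housing = 0.06              # assumed marginal propensity to consume out of housing wealth
construction_falloff = 0.6e12   # assumed annual drop in residential construction spending

consumption_gap = housing_wealth_loss * mpc_housing
total_demand_gap = consumption_gap + construction_falloff

print(f"Consumption gap from the wealth effect: ${consumption_gap / 1e12:.2f} trillion per year")
print(f"Total annual demand gap (with construction): ${total_demand_gap / 1e12:.2f} trillion per year")
```

Under those assumed parameters the annual gap lands in the neighborhood of a trillion dollars, which is the point: a hole that size does not refill itself without a backstop.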

Among the excuses that were manufactured is the meme, still making the rounds, about a jobs mismatch due to a skills gap.  But, as economist Dean Baker has pointed out again and again, basic economics dictates that the scarcity of a skill manifests itself in higher wages and salaries–a reality not supported by the data for any major job category.  Unemployment stood at 4.4 percent in May 2007, prior to the Great Recession.  The previous low between recession and expansion was the 3.9 percent rate in December 2000.  Yet we are to believe that suddenly, in the four years since the start of one of the largest bubble crashes and the resulting economic and financial crisis, people no longer have the skills needed to be employed (or have suddenly become lazier or more shiftless).  The data do not cohere.

In my own industry and specialty there are niches for skills that are hard to come by, and these people are paid handsomely, but the pressure among government contracting officers across the board has been to drive salaries down–a general trend seen across the country and pushed by a small economic elite.  Therein, I think, lies the answer, more than in some long-term trend tying patents to “innovation.”  The effect of this downward push is to deny the federal government–the people’s government–access to the highly skilled personnel needed to make it both more effective and responsive.  Combined with austerity policies, the result is a race to the bottom in terms of both skills and compensation.

What lies behind our current technological stagnation, I think, is a reaction to the hits in housing wealth, in real wealth and savings, in employment, and in the downward pressure on compensation.  Absent active government fiscal policy as the backstop of last resort, there are no other places to make up for $1.2 trillion in lost consumption.  Combine this with the excesses of the patent and IP systems that create monopolies and stifle competition, particularly under the Copyright Term Extension Act and the recent Leahy-Smith America Invents Act.  These acts have combined to undermine the position of small inventors and companies, encouraging the need for large budgets to anticipate patent and IP infringement litigation and raising the barriers to entry for new technological improvements.

No doubt exacerbating this condition is the Baby Boom.  Since university economists don’t seem to mind horning in on my specialty (as noted in a recent post commenting on the unreliability of data mining by econometrics), I don’t mind commenting on theirs–and what has always surprised me is how Baby Boom Economics never seems to play a role in understanding trends, nor as a predictor of future developments in macroeconomic modeling.  Wages and salaries, even given Cowen’s low-hanging fruit, have not kept pace with productivity gains (which probably explains a lot of wealth concentration) since the late 1970s–a period that coincides with the Baby Boomers entering the workforce in droves.  A large part of this condition has been a direct consequence of government policies–through so-called “free trade” agreements–that have exposed U.S. workers in industrial and mid-level jobs to international competition from low-wage economies.

The Baby Boomers, given an underperforming economy, saw not only their wages and salaries lag, but also their wealth and savings disappear with the Great Recession–when corporate mergers and acquisitions weren’t stealing their negotiated defined benefit plans, which they had received in lieu of increases in compensation.  This has created a large contingent of surplus labor.  The number of long-term unemployed, though falling, is still large compared to historical averages and is indicative of this condition.

With attempts to privatize Social Security and Medicare, workers now find themselves squeezed and under a great deal of economic anxiety.  On the ground I see this anxiety even at the senior executive level.  The workforce is getting older as people hang on for a few more years, perpetuating older ways of doing things.  Even when there is a changeover, the substitute manager oftentimes has not received the amount of mentoring and professional development expected in more functional times.  In both cases people are risk-averse, feeling that there is less room for error than there was in the past.

This does not an innovative economic environment make.

People whom I had known as risk takers in their earlier years now favor the status quo and a quiet glide path to a secure post-employment life.  Politics and voting behavior also follow this culture of lowered expectations, which further perpetuates the race to the bottom.  In high tech this condition favors the perpetuation of older technologies, at least until economics dictates a change.

But it is in this last observation that there is hope for an answer, which confirms that this is but a temporary condition.  For under the radar there are economies upon economies in computing power and in the ability to handle larger amounts of data, with exponential improvements in handling complexity.  Collaboration among small inventors and companies, developing synergy between compatible technologies, can overcome the tyranny of the large monopolies, though the costs and risks are high.

The established technologies continue to support the status quo and postpone needed overhauls of code mostly written 10 to 20 years ago (the equivalent of 20 to 40 software generations).  When those overhauls finally come due, their task, despite the immense amounts of talent and money involved, will be comparable to a Great Leap Forward–and those of you who are historically literate know how those efforts turned out.  Some will survive but there will be monumental–and surprising–falls from grace.

Thus the technology industry, in many of its more sedentary niches, is due for a great deal of disruption.  The key for small entrepreneurial companies and thought leaders is to be there before the tipping point.  But keep working the politics too.

Why Can’t We Be Friends — The Causes of War

Paul Krugman published an interesting opinion piece in the Sunday New York Times entitled “Why We Fight Wars” in which he attempts to understand why developed, relatively affluent countries still decide to wage war, despite the questionable economic costs.  Many websites seconded his observations, particularly those that view social systems and people as primarily rational economic beings.  I think the problem with Mr. Krugman’s opinion–and there is no doubt that he is a brilliant economist and observer of our present state of affairs with a Nobel to his name no less–is that he doesn’t fully comprehend that while the economic balance sheet may argue against warfare, there are other societal issues that lead a nation to war.

Warfare, its causes, and the manner of conducting it were part of my profession for most of my life.  My education was dedicated not only to my academic development but also to its utility in understanding the nature of civilization’s second oldest profession–and to making what we do in waging war, at the tactical, operational, and strategic levels, that much more effective.  In the advanced countries we attempt to “civilize” warfare, though were it to be waged in its total state today, it would constitute post-industrial, technological mass murder and violence on a scale never seen before.  This knowledge, which is recognized even by peripheral “Third World” nations and paramilitary organizations, actually makes such a scenario both unthinkable and unlikely.  It is most likely this knowledge that keeps Russian ambitions limited to insurgents, proxies, Fifth Columnists, and rigged referendums in its current war of conquest against Ukraine.

The civilized view of war begins with Clausewitz’s famous dictum that war is the attainment of political ends through violent means.  Viewing war as such, we have established laws for its conduct.  The use of certain weapons–chemical and biological agents, for instance–is considered illegal and their use a war crime; a prohibition largely honored throughout World War II, Korea, Vietnam, and most other major conflicts.  Combatants are to identify themselves and, when they surrender, are to be accorded humane treatment–a rule that has held up fairly well, with notable exceptions recorded by Showa Japan, North Korea, and North Vietnam and–tragically and recently–by the United States in its conduct of the War on Terror.  War is not to be purposely waged on non-combatants, and collective punishment as reprisal for resistance is prohibited.  Other protections also apply, such as the immunity of Red Cross and medical personnel from attack.  In the U.S. military, the conduct of personnel at war is also restricted by the rules of engagement.  But in all cases the general law of warfare dictates that only the amount of force necessary to achieve the desired political ends is to be exercised–the concept of proportionality applied to a bloody business.

Such political ends typically reflect a society’s perception of its threats, needs, and grievances.  Japan’s perception that the United States and Western Europe were denying it resources and that it needed its own colonial possessions is often cited as the cause of its expansion and militarism under Showa rule.  Germany’s economic dislocations and humiliation under the Allies are often blamed for the rise of Hitler, rabid nationalism, and expansionism.  In both cases it seemed that, at the societal level, both nations possessed on the eve of war the characteristics typically seen in psychotic individuals.  At other times these characteristics seemed to behave like a disease, infecting other societies and countries in proximity with what can only be described as sociopathic memes–a type of mass hysteria.  How else to explain the scores of individuals with upraised hands in fealty to obviously cruel and inhumane political movements across the mess of human history–or the systematic collection and mass murder of Jews, Gypsies, intellectuals, and other “undesirables,” not just in Germany but wherever the influence of this meme spread across Europe, Africa, and Asia?

Nations can also fool themselves by learning the wrong lessons from history.  Our own self-image of righting the wrongs of the Old World goes back to our anti-colonial roots and the perceptions of our immigrant ancestors, who were either rejected by or rejected that world.  Along these lines, the example of Munich has been much misused in the 20th century as a pretext for wars that ended disastrously or created disastrous blowback out of the fog of war, simply because the individuals assessing the strategic situation told themselves convenient stories gleaned from an inapplicable past and ignored the reality on the ground.  We have seen this in my lifetime in Vietnam, Iraq (twice), and Afghanistan.

For all of the attempts to “civilize” warfare and lend it rationality, there comes a time when its successful prosecution requires the rejection of rationality.  This is why soldiers and combatant personnel use euphemisms to dehumanize their enemy: it is easier to obliterate a person who is no longer seen as human.  Correspondingly the public is inflamed, the press becomes a tool of the war party, and dissent is viewed with suspicion and penalized.  This is why warfare cannot be interpreted as an extension of neo-classical or–any–economics.  There are no rational actors; at least, not as it is presently conducted by modern nation-states no matter their level of economic development or the maturity of their political systems.  War is unhinged–part of the savagery found in all of us from our primate pasts.

One of my most effective professors when I was seeking my Masters in Military Arts and Sciences was the brilliant historian Dr. Roger J. Spiller–a protégé of T. Harry Williams.  “We are always learning,” he would say, repeating a familiar refrain in the military profession, “the lessons from the last war.”  For students at the Army Command and General Staff College the critique was that doctrine (and therefore the organization and construction of the force) was based on the 1967 Arab-Israeli War–probably the only analogue that could be used in Iraq and, unfortunately for Russia, should it decide to turn its armor on Ukraine or any Article V NATO country.

Aside from these few exceptions, however, the American way of total warfare, which we learned first in our own Civil War and then perfected on the battlefields of Europe and Asia, has been rendered largely obsolete by our very success in its use.  It has been obsolete for quite some time because warfare has changed; its practitioners have evolved.  It has changed because its present incarnation is being prosecuted by people and groups that have no significant power and so use violence to destroy power.  This is the purpose of the terrorist.  Even the strength of this new form of warfare–Low Intensity Conflict–is transient, evident only in tactical situations.  What it cannot do is establish legitimacy or power.  Thus, meeting violence with violence only exacerbates the situation in these cases because power is further eroded and–along with it–legitimacy.  We see the results of the vacuum caused by this inability to adjust to the new warfare in the political instability in both Iraq and Afghanistan–and the rise of ISIS.

While I would argue that economic balance sheets are not what we need in assessing the requirements of global stability and peace, we do require a new theory of war that infuses greater rationality into the equation.  Clausewitz–and his intellectual successor Antoine-Henri Jomini–looking at the guerrilla war in Spain against French rule, simply admonished war’s practitioners not to go there.  It is not until T. E. Lawrence and Mao Zedong that we have a modern theory addressing this new, societal form of “revolutionary” warfare, and then only from the perspective of the revolutionary who wishes to establish neo-colonial, authoritarian, or totalitarian regimes.

Thus, we possess the old templates and they no longer work.  With the specter of nuclear weapons still held over the head of humanity, we can ill afford to view every situation as a nail needing a hammer.  We must, I think, follow the lead of Hannah Arendt, who distinguished among power, strength, force, violence, and authority.  There is, as John Dewey observed, a connection in consequences between means and ends.  The modern form of violence through terrorism or paramilitary revolution has all too often, in the last quarter of the 20th century and the first decades of the 21st, led to new oppression and totalitarianism.  This has probably been inevitable given the indiscriminate brutality of the struggles.  Diplomacy backed by credible power and sufficient military force to counteract such violence is the new necessary face of achieving stability.  Contrary to the assertions of the neo-cons at the time, the very thing we needed in the wake of 9-11 was an effective police action in lieu of chaotic regional warfare.

Interestingly, the connection between means and ends in warfare was perceived early by George Washington when he won his struggle over the conduct of the war against the methods of Nathanael Greene.  Greene’s irregular methods of warfare were designed to win the war but would have unmade the nation, while Washington’s method–the existence of the disciplined Continental Army as the strategic center of the revolution–was designed to make a nation once the war was over.  Unfortunately for the French and the Russians, there was no George Washington to see this important distinction in their own revolutions.

So too in the 21st century is this connection between means and ends important in the handling of conflict–and terrorism.  The years since the fall of the Soviet Union seem to have turned the clock back to 1914 in terms of the pressures and conflicts once held in check by a bi-polar world: the Balkans, the Middle East, and Eastern Europe have all been engulfed in conflict.  The tragedy that can result, given the new technologies and approaches for inflicting violence and chaos on civilization, requires that we not apply 1914 methods in meeting them.

Take Me Out to the Ballgame — Tournaments and Games of Failure

“Baseball teaches us, or has taught most of us, how to deal with failure. We learn at a very young age that failure is the norm in baseball and, precisely because we have failed, we hold in high regard those who fail less often – those who hit safely in one out of three chances and become star players. I also find it fascinating that baseball, alone in sport, considers errors to be part of the game, part of its rigorous truth.” — Fay Vincent, former Commissioner of Baseball (1989-1992)

“Baseball is a game of inches.”  — Branch Rickey, Quote Magazine, July 31, 1966

I have been a baseball fan just about as long as I have been able to talk.  My father played the game and tried out for both the then-New York Giants and the Yankees–and was a pretty well known local hero in Weehawken back in the 1930s and 1940s.  I did not have the athletic talents of my father, a four-letter man in high school, but I was good at hitting a baseball from the time he put a bat in my hands, and so I played–and was sought after–into my college years.  Still, like many Americans who for one reason or another could not or did not pursue the game, I live vicariously through the players on the field.  We hold those who fail less in the game in high regard.  Some of them succeed for many years and are ensconced in the Hall of Fame.

Others experienced fleeting success.  Anyone who watches ESPN’s or the YES Network’s classic games, particularly those from the various World Series, can see this reality in play.  What if Bill Buckner hadn’t missed that ball in 1986?  What if Bobby Richardson had not been in perfect position to catch what would have been a game- and series-winning liner by Willie McCovey in 1962?  Would Brooklyn have ever won a Series if Amoros hadn’t caught Berra’s drive down the left field line in 1955?  The Texas Rangers might have their first World Series ring if not for a plethora of errors, both mental and physical, in the sixth game of the 2011 Series.  The list can go on, and it takes watching just a few of these games to realize that luck plays a big part in who is the victor.

There are other games of failure that we deal with in life, though oftentimes we don’t recognize them as such.  In economics these are called “tournaments,” and much like their early Medieval predecessors (as opposed to the stylized late Medieval and Renaissance games), the stakes are high.  In pondering the sorry state of my favorite team–the New York Yankees–as I watched seemingly minor errors and failures cascade into a humiliating loss, I came across a blog post by Brad DeLong, distinguished professor of economics at U.C. Berkeley, entitled “Over at Project Syndicate/Equitable Growth: What Do We Deserve Anyway?”  Dr. DeLong makes the very valid point, verified not only by anecdotal experience but by years of economic research, that most human efforts, particularly economic ones, fail, and that the failures do not seem, in most cases, to be due to a lack of talent, hard work, dedication, or any of the other attributes that successful people like to credit for their success.

Instead, much of the economy, which in its present form is largely based on a tournament-like structure, allows only a small percentage of entrants to extract their marginal product from society in the form of extremely high levels of compensation.  That such examples exist at all functions much like a lottery, as the following quote from Dr. DeLong illustrates.

“If you win the lottery–and if the big prize in the lottery that is given to you is there in order to induce others to overestimate their chances and purchase lottery tickets and so enrich the lottery runner–do you “deserve” your winnings? It is not a win-win-win transaction: you are happy being paid, the lottery promoter is happy paying you, but the others who purchase lottery tickets are not happy–or, perhaps, would not be happy in their best selves if they understood what their chances really were and how your winning is finely-tuned to mislead them, for they do voluntarily buy the lottery tickets and you do have a choice.”  — Brad DeLong, Professor of Economics, U.C. Berkeley
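
DeLong’s lottery analogy is easy to make concrete.  A minimal sketch in Python, with a ticket price, prize, and odds that are purely assumed for illustration (they are not DeLong’s figures):

```python
# Expected value of a lottery-style "tournament" ticket (illustrative only).
# The price, prize, and odds below are assumptions chosen for this sketch.

ticket_price = 2.00
jackpot = 100_000_000
odds_of_winning = 1 / 300_000_000   # assumed long odds

expected_value = jackpot * odds_of_winning
expected_loss = ticket_price - expected_value

print(f"Expected value per ticket: ${expected_value:.2f}")
print(f"Expected loss per ticket:  ${expected_loss:.2f}")
# The visible winner is precisely what induces everyone else to overestimate their chances.
```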

So even though participants have a “choice,” it is one made within an intricately constructed system that relies on self-delusion.  It was about this time that I came across the excellent HBO series “Silicon Valley.”  The tournament aspect of the software industry is apparent in the conferences and competitions for both customers and investors in which I have participated over the years.  In the end, luck and timing seem to play the biggest role in success (apart from having sufficient capital and reliable business partners).

I hope this parody ends my colleagues’ (and future techies’) claims to “revolutionize” and “make the world a better place” through software.

We Gotta Get Out of This Place — Are Our Contracting Systems Agile Enough?

The question in the title refers to agile in the “traditional” sense and not the big “A” appropriated sense.  But I’ll talk about big “A” Agile also.

It also refers to a number of discussions I have been engaged in recently among some of the leading practitioners in the program and project management community.  Here are a few data points:

a.  GAO and other oversight agencies have been critical of the requirements changes over the life cycle of a project–particularly in DoD and other federal agencies–that contribute to cost growth.  The defense of these changes has been that many of them were necessary in order to meet new circumstances.  Okay, sounds fair enough.

But to my way of thinking, if the change(s) were necessary to keep the project from being obsolete upon deployment of the system, or were to correct an emergent threat that would have undermined project success and its rationale, then by all means we need to course correct.  But if the changes were not to address either of those scenarios, but simply to improve the system at more than marginal cost, then it was unnecessary.

How can I make such a broad statement, and what is the alternative, you may ask?  My rationale is that a change, if it represents new development involving significant funding, should stand on its own merits, since it is essentially a new project.

All of us who have been involved in complex projects have seen cases where, as a result of development (and quite often failure), we discover new methods and technologies within the present scope that garner an advantage not previously anticipated.  This doesn’t happen as often as we’d like, but it does happen.  In my own survey and project work developing a methodology for incorporating technical performance into project cost, schedule, and risk assessments, it was found that failing a test, for example, had value, since it allowed engineers to determine pathways for not only achieving the technical objective but, oftentimes, exceeding the parameter.  We find that for x% more in investment, as a result of the development, test, milestone review, etc., we can improve the performance of some aspect of the system.  In that case, if the cost or effort is marginal, then the improvement is part of the core development process within the original scope.  Limited internal replanning may be necessary to incorporate the change, but the remainder of the project can largely go along as planned.
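
To make the distinction concrete, here is a minimal sketch of the kind of screening test I am describing.  The threshold value and the function name are my own illustrative assumptions, not policy:

```python
# Rough screening test: does a proposed change belong inside the existing scope,
# or should it stand on its own merits as a separate project?
# The 5% threshold is an illustrative assumption, not policy.

def classify_change(cost_delta_pct, addresses_obsolescence, addresses_emergent_threat,
                    marginal_threshold_pct=5.0):
    """cost_delta_pct: estimated cost of the change as a percent of the remaining budget."""
    if addresses_obsolescence or addresses_emergent_threat:
        return "Course-correct within the project (necessary change)"
    if cost_delta_pct <= marginal_threshold_pct:
        return "Treat as an in-scope refinement (limited internal replanning)"
    return "Stand it up as its own project with its own cost-benefit case"

print(classify_change(cost_delta_pct=3.0, addresses_obsolescence=False, addresses_emergent_threat=False))
print(classify_change(cost_delta_pct=18.0, addresses_obsolescence=False, addresses_emergent_threat=False))
```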

Inserting new effort in the form of changes to major subsystems, however, involves major restructuring of the project.  This disrupts the business rhythm of the project, forcing a cultural shift within the project team to socialize the change and to incorporate the new work.  Change of this type not only causes what is essentially a reboot of the project, but also tends to add risk to the project and program.  This new risk will manifest itself as cost risk initially but, given risk handling, will also manifest itself as technical and schedule risk.

The result of this decision, driven solely by what may seem to be urgent operational considerations, is to undermine project and program timeliness since there is a financial impact to these decisions.  Thus, when you increase risk to a program the reaction of the budget holder is to provide an incentive to the program manager to manage risk more closely.  This oftentimes will invite what, in D.C. parlance, is called a budget mark, but to the rest of us is called a budget cut.  When socialized within the project, such cuts usually are then taken out of management reserve or non-mandatory activities that were put in place as contingencies to handle overall program risk at inception.  The mark is usually equal to the amount of internal risk caused by the requirements change.  Thus, adding risk is punished, not rewarded, because money is finite and must be applied to projects and programs that demonstrate that they can execute the scope against the plan and expend the funds provided to them.  So the total scope (and thus cost) of the project will increase, but the flexibility within the budget base will decrease since all of that money is now committed to handle risk.  Unanticipated risk, therefore, may not be effectively handled in the future.
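
A small illustration of the dynamic described above, with invented dollar values, showing how a mark taken against management reserve shrinks the cushion left for unanticipated risk:

```python
# Illustration of how a requirements change can erode budget flexibility.
# All dollar values are invented for this sketch.

budget_at_completion = 500.0     # $M, original budget base
management_reserve = 50.0        # $M, held as contingency for overall program risk
added_risk_from_change = 30.0    # $M, internal risk introduced by the new requirement

# Per the discussion above, the mark is assumed equal to the internal risk
# caused by the change and is taken out of management reserve.
budget_mark = added_risk_from_change
remaining_reserve = management_reserve - budget_mark

print(f"Management reserve before the change: ${management_reserve:.0f}M "
      f"({management_reserve / budget_at_completion:.1%} of budget)")
print(f"Management reserve after the mark:    ${remaining_reserve:.0f}M "
      f"({remaining_reserve / budget_at_completion:.1%} of budget)")
# Total scope (and cost) went up, but the cushion for unanticipated risk went down.
```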

At first the application of a budget mark in this case may seem counterintuitive, and when I first went through the budget hearing process it certainly did to me.  That is, until one realizes that at each level the budget holder must demonstrate that the funds are being used for their intended purpose.  There can be no “banking” of money, since each project and program must compete for the dollars available at any one time–it’s not the PM’s money; he or she has use of that money to provide the intended system.  Unfortunately, piggybacking significant changes (and constructive changes) onto the original scope is common in project management.  Customers want what they want and business wants that business.  (More on this below.)  As a result, the quid pro quo is: you want this new thing?  Okay, but you will now have to manage risk based on the introduction of new requirements.  Risk handling, then, will most often lead to increased duration.  This can and often does result in a non-virtuous spiral in which requirements changes lead to cost growth and project risk, which lead to budget marks that restrict overall project flexibility, which tend to lead to additional duration.  A project under these circumstances finds itself either pushed to the point of not being deployed, or deployed many years after the system needed to be in place, at much greater overall cost than originally anticipated.

As an alternative, by making improvements stand on their own merits, a proper cost-benefit analysis can be completed to determine whether the improvement is timely and how it measures up against the latest alternative technologies available.  It becomes its own project and not a parasite feeding off of the main effort.  This is known as the iterative approach, and those in software development know it very well: you determine the problem that needs to be solved, figure out the features and approach that provide the 80% solution, and work to get it done.  Improvements can come after version 1.0–coding is not a welfare program for developers, as the Agile Cult would have it.  The ramifications for project and program managers are apparent: they must not only be aware of the operational and technical aspects of their efforts, but also know the financial impacts of their decisions and take those into account.  Failure to do so is a recipe for self-inflicted disaster.
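
As a sketch of what “standing on its own merits” might look like quantitatively, here is a crude net-present-value screen; the dollar figures and discount rate are assumptions for illustration only:

```python
# Stand-alone cost-benefit screen for a proposed improvement (illustrative only).
# The values and the discount rate are assumptions for this sketch.

def net_present_value(annual_benefit, years, cost, discount_rate=0.07):
    """Crude NPV of a proposed improvement evaluated on its own merits."""
    pv_benefits = sum(annual_benefit / (1 + discount_rate) ** t for t in range(1, years + 1))
    return pv_benefits - cost

improvement_cost = 40.0   # $M to develop and field the improvement as its own effort
annual_benefit = 9.0      # $M/yr in performance or support-cost benefit
useful_years = 8

npv = net_present_value(annual_benefit, useful_years, improvement_cost)
print(f"NPV of the improvement as its own project: ${npv:.1f}M")
print("Fund it" if npv > 0 else "Defer it or look at alternative technologies")
```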

This leads us to the next point.

b.  In the last 20+ years major projects have found that the time from initial development to production has increased severalfold.  For example, the poster child for this phenomenon in the military services is the F-35 Lightning II jet fighter, also known as the Joint Strike Fighter (JSF), which will continue to be in development at least through 2019 and perhaps into 2021.  From program inception in 2001 to Initial Operational Capability (IOC), it will be at least 15 years before the program is ready to deploy and go to production.  This scenario is being played out across the board in both government and industry for large projects of all types, with few exceptions.  In particular, software projects tend either to fail outright or to fall short of their operational goals in the overwhelming majority of cases.  This would suggest that, aside from the typical issues of configuration control, project stability, and rubber baselining (and the self-reinforcing cost-growth culture of the Agile Cult), there are larger underlying causes involved than simply contracting systems, though those are probably a contributing factor.

From a hardware perspective, in terms of military strategy, there may be a very good reason why it doesn’t matter that certain systems are not deployed immediately.  That reason is that, once deployed, they are expensive to maintain logistically.  The logistics of deployed systems will compete for dollars that could be better spent in developing–but not deploying–new technologies.  The answer, of course, is somewhere in between: you can’t use that notional jet fighter when you needed it half a world away yesterday.

c.  Where we can see the effects on behavior from an acquisition systems perspective is in the comparison of independent estimates to what is eventually negotiated.  For example, one military service recently gave the example of a program in which the confidential independent estimate was $2.1 billion.  The successful commercial contractor team, let’s call them Team A, whose proposal was deemed technically acceptable, made an offer at $1.2 billion while the unsuccessful contractor team, Team B, offered near the independent estimate.  Months later, thanks to constructive changes, the eventual cost of the contract will be at or slightly above the independent estimate based on an apples-to-apples comparison of the scope.  Thus it is apparent that Team A bought into the contract.  Apparently, honesty in proposal pricing isn’t always the best policy.
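
The arithmetic that exposes the buy-in is simple.  Here is a sketch using the figures from the example above; Team B’s exact offer and the screening threshold are assumed for illustration:

```python
# How far below the independent estimate was each offer? (Team A figure from the example above;
# Team B's "near the estimate" offer and the 20% threshold are assumptions for this sketch.)

independent_estimate = 2.1e9
offers = {"Team A": 1.2e9, "Team B": 2.0e9}
buy_in_threshold_pct = 20.0   # assumed screening threshold, not a regulation

def pct_below_estimate(offer, estimate):
    return (estimate - offer) / estimate * 100

for team, offer in offers.items():
    delta = pct_below_estimate(offer, independent_estimate)
    flag = ("possible buy-in; question the competitive range"
            if delta > buy_in_threshold_pct else "within normal variance")
    print(f"{team}: {delta:.0f}% below the independent estimate -> {flag}")
```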

I have often been asked what the rationale could be for a contractor to “buy in,” particularly on such large programs involving so much money.  The answer, of course, is “it depends.”  Team A could have had the technological lead in the systems being procured and been defending its territory; thus buying in, even without constructive changes, was deemed worth the tradeoff.  Perhaps Team A was behind in the technologies involved and would use the contract as a means of financing its gap.  Team A could have had an excess of personnel with technical skills complementary to those needed for the effort but otherwise not employed within their core competency, so rather than lose them it was worth bidding at or near cost for the perceived effort.  These are, of course, the most charitable assumed rationales, though the ones I have most often encountered.

The real question in this case is how, even given the judgment of the technical assessment team, the contracting officer could keep a proposal so far below the independent estimate within the competitive range.  If the government’s requirements are so vague that two experienced contracting teams can fall so far apart, it should be apparent that either the solicitation is defective or the scope is not completely understood.

I think it is this question that leads us to the more interesting aspects of acquisition, program, and project management.  For one, I am certain that a large acquisition like the one described is highly visible and of import to the political system and elected officials.  In the face of such scrutiny it would have to be a procuring contracting officer (PCO) of great experience and internal fortitude, confident in their judgment, to reset the process after proposals had been received.

There is also pressure in contracting from influencers within the requiring organizations, which are themselves under pressure to deploy systems to meet their needs as expeditiously as possible–especially after the fairly lengthy set of activities that must occur prior to the issuance of a solicitation.  The development of a good set of requirements is a process that involves multiple stakeholders on highly technical issues and requires a great deal of coordination and development by a centralized authority.  Absent such guidance, the method of approaching requirements can be defective from the start.  For example, does the requiring organization write a Statement of Work, a Performance Work Statement, or a Statement of Objectives?  Which contract type is most appropriate for the work being performed and the risk involved?  Should there be one overriding approach or a combination of approaches based on the subsystems that make up the entire system?
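
As a rough rule of thumb–my own simplification, not acquisition policy–the choice of requirements document turns on how well the requiring organization can specify the “how” versus only measurable outcomes or top-level objectives:

```python
# Rough rule of thumb for choosing a requirements document (my simplification, not policy).

def requirements_document(can_specify_how, can_specify_measurable_outcomes):
    if can_specify_how:
        return "Statement of Work (SOW): government prescribes the tasks and methods"
    if can_specify_measurable_outcomes:
        return "Performance Work Statement (PWS): measurable outcomes, contractor chooses the approach"
    return "Statement of Objectives (SOO): high-level objectives, offerors propose the solution"

print(requirements_document(can_specify_how=True,  can_specify_measurable_outcomes=True))
print(requirements_document(can_specify_how=False, can_specify_measurable_outcomes=True))
print(requirements_document(can_specify_how=False, can_specify_measurable_outcomes=False))
```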

But even given all of these internal factors there are others that are unique to our own time.  I think it would be interesting to see how these factors have affected the conditions that everyone in our discipline deems to be problematic.  This includes the reduced diversity of the industrial and information verticals upon which the acquisition and logistics systems rely; the erosion of domestic sources of expertise, manufactured materials, and commodities; the underinvestment in training and personnel development and retention within government that undermines necessary expertise; specialization within the contracting profession that separates the stages of acquisition into stovepipes that undermines continuity and cohesiveness; the issuance of patent monopolies that stifle and restrict competition and innovation; and unproductive rent seeking behavior on the part of economic elites that undermine the effectiveness of R&D and production-centric companies.  Finally, this also includes those government policies instituted since the early 1980s that support these developments.

The importance of any of these cannot be overstated, but let’s take the issue of rent seeking that has caused the “financialization” of almost all aspects of economic life, as it relates to what a contracting officer faces when acquiring systems.  Private sector R&D, which in the past fell mostly in response to economic dislocations–but which has been in a downward trend since the late 1960s overall and especially since the mid 1980s–has fallen precipitously since the bursting of the housing bubble and the resulting financial crisis in 2007, with no signs of recovery.  Sequestration and other austerity measures in FY 2015 will at the same time negatively impact public R&D, continuing the overall trend with no offset.  This fall in R&D has a direct impact on productivity and undercuts the effectiveness of using all of the tools at hand to find existing technologies to offset the ones that require full R&D.  It appears to have caused a rise in intrinsic risk in the economy as a whole for efforts of this type, and it is this underlying risk that we see at the micro and project management level.