Alito and the Unraveling of Originalism

During the current period of internal debate on Dobbs v. Jackson Women’s Health Organization, the public was given insight into the thinking of the “conservative” justices of the Supreme Court on the issue of abortion and personal liberty through a leaked version of Justice Alito’s draft majority opinion. I use quotation marks around the word conservative because the decision is anything but conservative in its scope and effect.

Furthermore, despite much hand-wringing, the document provides much-needed insight into the thinking of the Court majority on an issue of great public weight: the issue of personal liberty. Despite much criticism and an announced investigation, there is virtually no defensible reason why the process of the Court should not be open, as it is for almost every other branch of government. As such, the leak of its working documents on so weighty a matter seems a small price to pay for transparency and accountability. Moreover, this is not the first time a Supreme Court decision has been scooped, and it won’t be the last.

The only difference is the amount of hot air spent on the leak because the decision itself is so odious. The pattern of the current court has been to make guerilla docket decisions, like a thief in the night. Forewarning of the denial of liberty to more than half of the country’s population has, rightly, caused a reaction that is only a preview of the storm to come.

It is true that, in theory, judicial responsibility is a contemplative and deliberate process that requires some measure of reflection and give-and-take. Under this ideal view, reconciling the perspectives of nine justices to produce a cohesive opinion is both a fraught and sensitive process. Would that we lived in such an ideal, theoretical world.

Were the theoretical view of the judicial process true, any revocation of a liberty through judicial fiat would be considered unusual and would require extraordinary circumstances and subtlety. There is none of that in the draft opinion or in the case before the Court. The reason for this condition is the extraordinary lengths to which the reactionary right in this country has gone to degrade previously trusted institutions.

The reality is that the Supreme Court is a political institution made up of a mix of respected and level-headed jurists, fanatics with a political agenda, and mediocre political tools. This argues for a more transparent process paired with a strong, enforceable code of judicial conduct that applies to the Court’s membership. Sunshine and democracy, bolstered by the balance and separation of powers, are a curative for most civic ailments. This is no exception.

The degradation of the Court didn’t begin with Donald Trump, but no one, except perhaps Richard Nixon, has been as effective as Trump at degrading the integrity and effectiveness of anything he touched. The man epitomizes the concept of the inverse Midas Touch. Everything he touches turns to excrement—and it is only excrement in the form of judicial opinion that seems to come from this Court.

This current state can only be expected given that he—given aid and comfort by the likes of Senators Lindsey Graham and Mitch McConnell—has, more than any previous president, appointed proponents of banal thoughtlessness to the Court and to the lower courts, selected clearly unqualified candidates for lifetime judicial appointments, and demanded personal loyalty and political fealty in the area of jurisprudence, like a mob boss or tin-horn caudillo.

If there is any defining judicial philosophy that the Court’s “conservatives” assert, it is that they are following the concept of originalism. Taking these assertions at face value, it is therefore fair to evaluate the efficacy and validity of this concept as it is applied.

What Art Thou, Originalism? Liberty and Other Rights in Dobbs

According to Justice Barrett, the latest of its adherents appointed to the Court, originalism is an approach holding that the “constitutional text means what it did at the time it was ratified and that this original public meaning is authoritative.” The late Justice Antonin Scalia stated, in a 1988 lecture explaining why he was an originalist, “The main danger in judicial interpretation of the Constitution is that the judges will mistake their own predilections for the law.”

While good in theory, originalism runs into the fact that the Constitution doesn’t always stand up to a clear-language or clear-meaning test suspended in time. Many sections and amendments of the document were written expansively with the intent that future generations and Congresses would find the balance in applying the broad principles enumerated. Archaic language also plays a role; in many cases a word or phrase has many meanings, not just one.

Barrett herself, before rising to the Bench, noted this conundrum as it arises when modern developments are viewed through an originalist’s lens: “Adherence to originalism arguably requires, for example, the dismantling of the administrative state, the invalidation of paper money, and the reversal of Brown v. Board of Education.” But, she asserts, there are past decisions that “no serious person would propose to undo even if they are wrong.”

Apparently, the vague “serious person” rule does not apply to the 49-year-old Roe decision. But the most appalling part of her assertions lies in a certainty, rising to the level of arrogance, that she possesses jurisprudential knowledge superior to that of the various justices who, over almost fifty years, ruled in Roe and Casey and in the cases that have upheld them.

Her background, while demonstrating competence in her various limited assignments and appointments, does not reveal a particularly brilliant or independent legal mind. Over the course of her private practice, her academic career, and her work at the Seventh Circuit Court of Appeals, nothing would mark her as eminently qualified to serve on the U.S. Supreme Court, apart from her fawning obsequiousness to the originalist philosophy of the late Justice Scalia, for whom she clerked. Overall, her writing shows that she is polemical, rather than analytical, in her approach to legal issues.

While she seems to be workmanlike and generally likeable, nothing distinguishes her except for an adherence to a questionable and rigid legal philosophy, a pro-large-business, anti-environmental, and anti-labor bias, and the occasional interjection of her personal religious beliefs into legal issues. Her opinions on the Seventh Circuit neither were consistent nor articulated a coherent legal approach to justice. On the contrary, they seemed to align with the recent trend among other originalist jurists of basing their arguments on a pre-determined outcome. This is known as “begging the question.”

But let’s give the devil his due for the sake of argument. What are the elements that make an originalist reverse a 49-year-old precedent? The Court discusses them at length; I have outlined them as follows:

  • Up until Roe, the various States addressed the issue of abortion and women’s reproductive liberty.
  • The Constitution “makes no mention of abortion” and “no such right is implicitly protected…”
  • The Court issued a set of rules regarding application of the decision based on pregnancy trimesters and related to fetal “viability,” which have the appearance of statutory language.
  • The Court’s decision ended the ability of States to regulate abortion and women’s reproductive liberty in this manner.
  • Planned Parenthood v. Casey revised Roe, which included negating the trimester standard and, in its place, substituting an “undue burden” standard on State action.
  • That the Due Process Clause of the Fourteenth Amendment does not protect abortion as one of the rights not enumerated in the Constitution and that the case of Washington v. Glucksberg (1997) determines what does.
  • That under Glucksberg, a right must be “deeply rooted in this Nation’s history and tradition” and “implicit in the concept of ordered liberty.”
  • That abortion doesn’t fall under the Glucksberg test since it was not decided until 1973.
  • That the Equal Protection provisions of the Fourteenth Amendment do not apply either, especially to women as it relates to abortion.
  • That the right to “privacy” is not absolute and doesn’t apply to abortion, since there are competing interests and that these should be resolved by the States.

In order to support the denial of a Fourteenth Amendment protection for abortion, Alito queries the state of abortion law in 1868—the year the amendment was ratified. In doing so, he cites tradition as processed through the selective legal writings of Bracton, Coke, Hale, Blackstone, and others to assert that abortion prior to “quickening” (the first perceptible fetal movement, the common law’s rough counterpart to the viability line later drawn in the original Roe decision) was not a consideration in common law.

In supporting his critique that abortion does not meet the Glucksberg test, Alito goes back to the 13th century, to a time when the rights accepted at this country’s founding in the 18th century would also not have been recognized. In criticizing both the legal precedent and the common law precedent used in Roe, Alito simply draws a line in favor of these archaic beliefs and says that the Court in 1973 was erroneous and, in a classic bit of hand-waving, that its reasoning “makes no sense.” He also asserts that the issue was extremely contentious in 1973, which runs against the reality that anyone who was alive in that year could describe. On the contrary, the finding in Roe seemed all but a foregone conclusion, with even most Catholics and the majority of Protestant religious organizations supporting the right to varying degrees.

The Conceit of Originalism

This opinion, as noted, is a mess of historical inaccuracies, tautology, and hand-waving. Good thing it is a draft; one cannot help but wonder whether this is just a strawman argument and whether the Court will uphold Roe in general but modify the rule, as was done in Casey. But let’s take the assertions in groupings:

First, the statement that the States regulated abortion until 1973 is both a statement of fact and of the problem that Roe sought to address. It adds no value to Alito’s argument and does not pass the “so-what” test. A legal opinion isn’t one until it is made. The Court decided in 1973 to take up the case because of the compelling and important underlying issues presented.

Also in this group are the general statements that Roe established guidelines for trimesters based on fetal viability, which addressed the issue of personhood (discussed in more detail below), and that the ruling ended the ability of the States to write varying laws on abortion. Again, a statement from Captain Obvious. After all, this last was the purpose of the decision. The Casey revisions are also a matter of record, expanding the discretion allowed in State regulation.

Second, the statement regarding abortion not being mentioned in the Constitution is an equivocation. This assertion was also made by Senator Graham and other “conservatives” during the confirmation hearing of Judge Ketanji Brown Jackson, and so one cannot help but wonder about the collusion among like-minded officials in Washington and their eagerness to repeat this canard.

The problem here is that anyone educated about our Constitutional government understands that our system is one of enumerated (that is, stated) governmental powers, and of both enumerated and implied individual rights. In addition, our Constitution was not frozen in time in 1789; the document has been amended 27 times. The amendments that expanded individual rights are generally understood to be the 13th through 15th, the 19th, and the 26th Amendments.

When the first ten amendments that became the Bill of Rights were being debated, there were generally two camps—the Federalists and the Anti-Federalists. One of the core differences between the factions was the fear, articulated by the Federalists, that enumerating rights would mislead future Americans into believing that those were the only rights reserved by the people. The Anti-Federalists sought a Bill of Rights so that at least some of the most cherished rights would be codified in the Constitution, the antecedent charter of the American people.

The latter argument won out, with an implicit agreement, articulated in newspapers across the country, that both factions believed there were rights that were not enumerated, but that those that were enumerated were the most cherished and a wall against tyranny. This is a core defining principle of this country. This Court is the very example of what they sought to guard against.

The concept of bodily autonomy has been recognized since the founding of the country and was first clearly articulated in the Declaration of Independence’s broad statement of the right to “life, liberty, and the pursuit of happiness.” This is our deeply rooted tradition of “ordered liberty,” a tradition that emphasizes liberty and does not mention order. No doubt an originalist like Alito would defer to English common law, which allowed the sovereign to quarter troops and suspend habeas corpus. But that is not *our* tradition.

Furthermore, the Fourth Amendment establishes the “right of the people to be secure in their persons…” The Fifth Amendment establishes that “No person shall…be deprived of life, liberty, or property, without due process of law…” The Fourteenth Amendment applied these rights to persons and prohibited the States from denying them.

It is here that Alito, applying originalism, stops. He relies on conceptions of liberty at the time of the amendment’s passage. What is insidiously ironic about this approach is that the conditions among the States in 1868 are the very conditions the amendment was meant to abolish. In this case, it was ostensibly the condition of the freedmen and of any other *person* who lived in the various States. But it needs to be noted that the amendment was also a broader recitation of the concept of individual rights, rights which the States could not undermine. Thus the concept of personhood was introduced; the prevailing conception in 1868 was admittedly a fairly narrow one, but that narrow view is not what the text enshrined when it was ratified.

Nor does the story end there. In 1870, the 15th Amendment established universal male suffrage regardless of race, color, or previous condition of servitude. In 1920 the 19th Amendment extended the franchise to women. Thus, by 1920, the rights of women were only beginning to be realized, but now they could begin to exercise the power that the vote provides, even though the courts were not yet ready to afford them the equal status that voting would seem to imply. In 1971, the right to the franchise was extended down to the age of 18.

When we establish or recognize rights, the originalist approach would be to go back to 1870, 1920, or 1971. This is not only faulty, it is a conceit, as if someone today could place themselves in the mind of someone in the past, like a clairvoyant. Furthermore, the Alito originalist (and his co-jurists) would approach each amendment in a vacuum, not as a continuation or extension (an amendment, and therefore a new interpretation) of the country’s charter. The intellectual dishonesty in this approach, at this level of judicial governance, is both breathtaking and troubling.

Prior to this Court, the pattern—which sits on firm ground—has been to note the purpose or intent of the ever-expanding recognition of previously ignored or unsettled rights. The perspective has been: “from this time forward.” The application of these rights must, by necessity, be judiciously determined under actual conditions, not under some mythical reification of an ideal or a notion in the mind of the judge. But, of course, this has not always been the case, particularly as it relates to the rights of previously disenfranchised groups, women among them.

Alito’s opinion makes much of not following precedent (stare decisis), and in a bit of self-congratulatory language, compares it to Brown overturning Plessy. The focus on Brown among a conservative movement that considered it and other Warren Court cases to be judicial overreach is ironic and troubling in a “doth protest too much” sort of way. This is doubly troubling given the writings of Justice Barrett and the draft Alito opinion that undermine and/or deny many of the same rights asserted as the basis for Brown.

Since just prior to the middle of the 20th century, our society has become somewhat complacent in believing that the arc of history always bends toward justice; that progress in democratic and human rights was inevitable. But the reality behind the milestones we celebrate is, instead, the work of untold others who sacrificed and fought to gain a measure of dignity, justice, and human rights, either for themselves or for the generations that followed.

Our nation, for all of its blessings, was born with the defects of 18th century thinking regarding race, gender, and caste. It was a pre-scientific time, which our legal and social institutions must keep in mind, adjusting our present thinking as new knowledge becomes known. We have had to, and will have to continue to, work hard to confront those historical defects, including confronting those who would suppress or prevent their acknowledgment.

Rather than the draft opinion in Dobbs being comparable to Brown overturning Plessy, the more appropriate analogy is the Supreme Court’s notorious nullification of the rights found in the 13th through 15th Amendments in the Civil Rights Cases of 1883, which caused the rise of Jim Crow and the Black Codes across the South and many border states. It was these that then made Plessy possible. This comparison is the more apropos one because there is no mention of women’s rights in the Alito draft opinion, nor an acknowledgment of their personhood, with all of the rights that attend to that finding. The unwritten effect of driving originalism’s Wayback Machine back to 1868 is the underlying and implicit doctrine that a woman’s body, and in particular the womb, belonged to her husband.

The concern here is that the Court has assumed that women do not have an interest in whether to remain pregnant and has inserted its own judgment—or that of the individual States—for that most personal of all decisions. During oral arguments in the case, Justice Barrett suggested that the existence of safe-haven laws and adoption in general rendered moot the pro-choice argument that abortion access protects women from “forced motherhood.” Instead, she asserted, “it doesn’t seem to me to follow that pregnancy and then parenthood are all part of the same burden. The choice, more focused, would be between, say, the ability to get an abortion at 23 weeks, or the state requiring the woman to go 15, 16 weeks more and then terminate parental rights at the conclusion.” This is a cavalier and casually extreme statement that nullifies the personal autonomy of women. It also ignores the reality of maternal mortality and complications in childbirth.

At heart, originalism isn’t about jurisprudence at all, but is part and parcel of the widespread trend toward fundamentalist and literal thinking and interpretation that first arose in the 1970s across various religious groups and has now spread to the political and legal spheres.

In her book, The Battle for God, Karen Armstrong noted a strong family resemblance between the various forms of religious fundamentalism, but these characteristics are also troublingly familiar in today’s politics. According to Armstrong, “They are embattled forms of spirituality, which have emerged as a response to a perceived crisis. They are engaged in a conflict with enemies whose secularist policies and beliefs seem inimical to religion itself. Fundamentalists do not regard this battle as a conventional political struggle, but experience it as a cosmic war between the forces of good and evil. They fear annihilation, and try to fortify their beleaguered identity by means of a selective retrieval of certain doctrines and practices of the past. To avoid contamination, they often withdraw from mainstream society to create a counterculture; yet fundamentalists are not impractical dreamers. They have absorbed the pragmatic rationalism of modernity, and, under the guidance of their charismatic leaders, they refine these ‘fundamentals’ so as to create an ideology that provides the faithful with a plan of action. Eventually they fight back and attempt to re-sacralize an increasingly skeptical world.”

The originalist argument against rights being asserted by previously disenfranchised people is an argument against modernity, denying everything that followed 1868. If the Constitution of 1789 was the Old Covenant, which replaced both English aristocracy and the unworkable Articles of Confederation, the post-Civil War Amendments are the New Covenant. They established individual rights and liberty, and the primacy of the national government in enforcing these rights over the States.

Originalism is a dialing back of the clock and a denial of this New Covenant, which has expanded democracy and human dignity in order to achieve the previously unfulfilled promises of the Declaration. A fantastical channeling of original intent is not unlike the séances of the charlatans who peddled Spiritualism in days gone by.

For example, Alito does not speak for Representative John Bingham or Senator Jacob Howard, the authors of the 14th Amendment, both of whom clearly stated that it applied the Bill of Rights to the States—a view finally adopted after years of denial by the Court in the face of contrary facts and clear language: a prior untenable and contradictory position that this Court seems to have revived.

The heart of the issue, which the Court’s draft opinion avoids at all cost, is whether a woman is a person and a citizen of the United States as contemplated under the Fifth and Fourteenth Amendments. Ruth Bader Ginsburg argued five cases before the Supreme Court, beginning in 1971, establishing that both men and women are such and are therefore entitled to protection under the Equal Protection Clause of the 14th Amendment. This is really the holding that the Alito draft opinion—signed by at least five so-called conservative justices—would overturn.

Where Do We Go From Here?

The core liberty being attacked under this ruling is the basic acknowledgment that American women are both persons and citizens of the United States, and that, as such, they—and all of us—possess rights to privacy and physical autonomy. This is a basic question that needs to be answered directly.

This is not simply a question of abortion, nor is it solely the concern of women or of people of color because of its immediate impact. This is of interest to all people who wish to ensure that the Constitutional order instituted in support of civil rights and civil liberties against oppressive State action at any level of government is supported by the rule of law. To subject these rights to the whims of the States, without due process or an unusual overarching state interest, is to allow such fundamental rights to be inconsistently enforced and justice to be denied. Undermining them leads to the abrogation of other rights essential to liberty. It leads to oppression, theocracy, and autocracy.

Thus, the issue must be engaged on several fronts.

The first of these, of course, is to assert that a woman is a person and that, if born or naturalized here, she is a citizen under the law. The most direct way to enshrine this into law is through passage of an Equal Rights Amendment.

For all of Ruth Bader Ginsburg’s brilliant legal approach under the existing Constitutional language, there was always a way for those who would reverse these cases to invent an argument or theory to come to an opposite conclusion. For almost a century before the Ginsburg cases, the Court had done just that.

Much has been said and written in the press and media about the increasing political divisiveness in the United States, as if it were new or permanent. Its form is new, but its causes, impetus, and application whistle a familiar tune. The fact of the matter is that things change and can be changed as they have in the past. The 2020 election, despite all of the propaganda by the losers, proved that 91 million Americans can come together to take back their democracy.

The latest strategy of the pro-equality movement has been to rely on the 1972 ERA and to challenge its sunset clause. While their cause is laudable, their strategy and tactics are self-defeating, because they fail to build the message that will form the coalition needed to deliver their goal. The unfounded belief that the Court would not overturn women’s rights and the resulting political complacency also undermined the impetus for the Amendment. But this will be a new battle.

To win a battle, the lines must be drawn, the goals made clear, the purpose articulated and the opposition engaged at all levels. Democracy and its animating ideas are the culmination of thousands of years of human struggle, hope, and aspiration. Its appeal is universal, and so it is a war where ideas are the most effective weapon. Still, it takes extraordinary effort and determination to bring it about.

I am both an historian and political scientist, among my other vocations. When we look back at human civilization before 1776 all we can see, with a few minor exceptions, is a world of theocracy, monarchy, feudalism, dictatorship, autocracy, and oppression. This is what made the idea that became the reality of the United States so exceptional.

Still, there were artifacts of society that the founders under the Declaration of Independence—and later the framers of the new Constitution that replaced the Articles of Confederation—could address, and others that, given their own political realities, they could not.

The first issue they did address was the separation of Church and State, given that, up to 1776, the issues that could best rip a nation apart, or could oppress others, were conflicts and disagreements between religious creeds, the merging of government and religion, and the banning of or interference with religion and belief. These provisions appear both in the core document and in the First Amendment.

But most significantly, the issues they decided not to address were those related to universal male suffrage, slavery, the position of women, and caste. These they consciously left to future generations to sort out. We know this because, following the series of letters in which Abigail Adams urged John Adams to “remember the ladies,” Adams did attempt to address the issue; and it is interesting that this issue, more than the others, touched on all four of those listed above.

For many of the founders, the issues that raised the most profound challenges were those that caused them to determine how to overcome the laws and practices that originated in English and European feudalism.

For example, many Americans have not learned through their primary school history that Thomas Jefferson had actually included in his draft of the Declaration a passage condemning slavery and the buying and selling of human beings. This passage was removed when those states that profited from the trade objected to it.

But most significantly, even the philosophy of John Locke was found to be too restrictive in its formulation of natural rights. Jefferson modified and democratized Locke’s phrase “life, liberty, and property” into a more expansive concept: “life, liberty, and the pursuit of happiness.” While Locke’s philosophical writings mention the right of property throughout, the Declaration does not.

This universalist definition of liberty was left to future generations to address, particularly with regard to property as a requirement for voting, to the position of women and others in society, and to the buying, selling, and enslavement of other human beings. These artifacts of feudalism and medieval law, as James Sullivan, whom Adams consulted, noted, would need to be addressed. I would note the abundance of references in the Alito draft opinion to feudal and medieval thought and law.

So, we must pick up the baton of liberty again and do the hard work of convincing our fellow citizens: campaigning and picking responsible representatives who share the spark of equality and liberty. This must be done in every state, in every district, at every level of government. The concept of liberty and equality must be taught in every school, and enforced and supported by every official of government.

This movement must be independent of political party. But it must be a coalition of like-minded groups concerned with individual liberty and autonomy. Our arguments must be factual and convincing. They must avoid polarizing language, but make clear the motivations and the consequences of the opposition’s position to both men and women.

Congress will need to write a new ERA, which it knows will fail in the short term, given the current state of politics. The movement for equality will need to take the responsibility to mobilize the people in a unified and concerted effort on this one issue alone. From this one issue will follow other rights to be enforced and secured—and this is why the opposition to it has been so forceful.

The second front that must be opened is to confront the current political power structure. We must challenge, in Court and in the political arena, *anyone* who would assume the power to abrogate the rights of women to personhood, citizenship, and equality. The strategy of the opposition is to employ scare tactics. Those that were employed in the 1970s are quaint from the perspective of today’s world: women in the workplace, women in the military, shared custody of children and child care, the option of an abortion pre-viability, gay marriage, and gender equality and fluidity.

We live with these realities of human nature now and the world has not fallen apart. This reveals the lie that undergirds the opposition to equality. For we are back to artifacts of medieval and feudalistic thought. The fear of empowering women is the fear of paternity and the presumption of male authority, but only certain males and certain authority. There are women all too willing to bolster this thinking at the expense of other women to gain advantage. Even our language is tilted toward demeaning terminology for women that has no equivalent when referring to men.

Here in Florida, we can see the new scary stories spread at the state governmental level regarding gay and transgender students and parents: the alleged horrors of gender-neutral bathrooms and women’s sports. One must ask oneself two questions when addressing these concerns, since they seem to resonate with the public: how likely is the worst case put forth to occur? And, even if true, would you give up your constitutional liberties in exchange for a sign on a public bathroom and the sexual designation of a competitive sport?

The third front to open against the opponents of equality is on the social and religious level. This will place abortion—which has been used as the main issue to undermine women’s rights—in its proper moral, ethical, and historical position. It removes these questions from the popular lexicon of mere “values issues” and places them where they belong: as a fundamental issue of equality and liberty.

The first step is to challenge the fundamentalist assertions about abortion. As the historian Garry Wills, who also happens to be a Catholic, has noted, it was not until 1930—in the face of modern medicine—that Pope Pius XI issued the encyclical Casti Connubii forbidding all means of preventing procreation. This undermines Alito’s entire argument from moral tradition.

No major Roman Catholic theologian prior to the 1930 encyclical found a biblical prohibition against abortion: not Dante; not Matthew, Mark, Luke, Paul, or even John, that old misogynist; not St. Augustine, nor St. Thomas Aquinas. The latest argument since the encyclical is that it is an issue of natural law. But no major proponent of natural law considers a fetus to be a person independent of the mother. Only fanatics claim that a human person begins at “conception,” ignoring that about half of such fertilizations fail, making God the Chief Abortionist.

Such assertions of fetal personhood also fail medical and scientific scrutiny: a fetus before viability outside of the womb is still only a grouping of cells fully dependent on, and part of, the mother, just like any other of her cells, a point underscored by the developing modern science of in vitro fertilization and cloning. A miscarried fetus is not prescribed by the Catholic Church to be baptized, given last rites, or buried in consecrated ground. Alito’s and Barrett’s opinions on this topic are among the most extreme within their own faith, and that extremism seems to have infected their dishonest foray into secular jurisprudence.

Armed with these facts, we must align with traditional and universalist faiths, and with secular ethicists, that do not agree with the fundamentalist view on abortion. This includes the majority of Catholics, Jews, Buddhists, Muslims, Hindus, Unitarians, African-American Protestant sects, and mainline Protestant faiths.

At the end of the day, the question comes down to the fact that a woman is an equal, autonomous person under the law. We do not have to have the answers to everything else that follows, just as the founders of the nation and the framers of the Constitution allowed that they could not. We simply need to enshrine this one core right, which is under attack, and do the hard work to make it impossible to deny.

Send in the Clowns – Putin, Trump, the Fifth Column, and the Importance of History

When Benito Mussolini was seen on newsreels in the United States it was as a “rotund, strutting clown, who struck pompous poses from his Roman balcony and tried to upstage Adolf Hitler when they first met, in Venice in 1934.” Much of this impression was established thanks to Charlie Chaplin’s 1940 film “The Great Dictator,” where the Il Duce character was effectively lampooned by the actor Jack Oakie. This reinforced the widely held public prejudice in England and among many Americans that “Mediterranean Peoples,” especially Italians, were inferior oafs, unintelligent and clownish hotheads. Intellectuals and world leaders would come to regret underestimating the Italian autocrat, who was just the first of fascist and quasi-fascist leaders to wreak havoc on the world.

As with Mussolini, Hitler and the Nazis were first seen as a clown show with their thuggish behavior, mediocre backgrounds, buffoonish uniform-like garb, their appropriation of mythic symbols, and their completely unintelligible and ignorant interpretations of history and society. It was noted that Hitler himself was ignorant, lazy, a self-absorbed narcissist, and overly theatrical in his public appearances. He also effectively tapped into the grudges and nationalistic lies that cleaved German society, finding convenient scapegoats, especially Jews and other vulnerable groups, lumping them in with Communists and external threats to “German-speakers.”

Yet, so effective was the disinformation and the propagandist jumble of ideas and assertions by which the Nazis defined themselves and their intentions, that even today intellectuals debate the definition of fascism, and whether clearly quasi-fascist governments like those under Franco in Spain and Pinochet in Chile were really fascist governments, despite not only the clear linkage to the methods and ideas of fascism, but also the evidence of the dead bodies.

One would think that this debate was closed with Umberto Eco’s definition of Ur-Fascism (Eco knew fascism first-hand), but we still get hand-wringing and pearl-clutching from intellectual Mugwumps who continue to want to obfuscate this issue. And for those uneducated on this subject: no, liberals and socialists aren’t fascists (or communists), or even close; especially given that they are usually among the first ones rounded up and murdered by fascist and quasi-fascist thugs, along with Jews and others outside the fascist tribe.

Thus, about two weeks ago, I was struck by the comments of Sergey Lavrov, the Foreign Minister of the Russian Federation, regarding Ukraine and the West. He posited an alternative and counter-factual interpretation of history that is astounding in its echo of fascist movements. As he cautioned the West against coming to Ukraine’s aid, he referenced Russian historical continuity and the importance of uniting “Russian speakers,” and warned the West of the so-called lessons of the latest version of Russian ultra-nationalist, self-delusional history, which puts forth the notion of a single Eurasian people under Russian rule. Here is what he is saying: the various countries and ethnic groups from the border of Sweden down to Germany, to Moldova, and, perhaps, to Serbia are within Russia’s orbit. He would sweep away these countries and peoples, murdering anyone in his way.

Vladimir Putin has shown himself to be an obviously corrupt kleptocrat, sociopath, narcissist, and mediocrity. What he lacks in intellectual honesty and character, he more than compensates for with ruthlessness, murderous intent, and links to the powerful oligarchs who keep him in near-totalitarian power. He is a fanatic. Fanatics, as Aldous Huxley noted, can be defeated because they are compensating for a secret doubt, but they are also dangerous, since any opposition or disagreement is interpreted as a hostile act. When such an individual has nuclear weapons, which this tyrant has been all too vocal in referencing, the situation is dangerous.

But the United States and its allies have overcome existential threats in the past, and make no mistake, Putin and the current Russian government are an existential threat. I served my country in the U.S. Navy for almost 23 years during the Cold War. I deployed to the theaters of contention close to the Soviet Union and Communist China. I also served, at times, as a special weapons courier. My military colleagues, and anyone paying attention, knew and developed the steel to oppose the imperial ambitions of those nations, knowing that we were under nuclear threat from hostile powers. We also accepted the fact that, in case of an attack on us, retaliation against these countries would be swift and overwhelming.

Last night Mr. Putin delivered an unhinged and self-deluded speech. He is a clown. The clowns of this world have caused a tremendous amount of human suffering and destruction. As Hannah Arendt noted, evil is banal. The clowns of this world, augmented by our so-called post-truth era, create an environment for banality to thrive. They spread the sickness, the pandemic, of their deranged minds onto the world.

Thus, almost in a caricature of the comedy of life, Donald Trump, the dictator’s sycophant and a man of personal cowardice unknown in any previous U.S. president, praises the evisceration of Ukraine. The man is a Quisling, and compromised. He and his followers, who act as a cult and who, cult-like, change and contort their minds on a dime to align with his statements, represent a dangerous anti-democratic and, yes, anti-American Fifth Column.

The election of Joe Biden and the corrective of American democracy was the trigger for this crisis, because Putin thought he had neutralized the United States. His Quisling is in exile, though he has more than a few mini-quislings waiting in line. Like all extreme narcissists, he actually believed his own delusion. This is an attack borne of desperation. Unfortunately, Ukraine is the pawn he has chosen. It is timed to undermine American democracy and sow division, so that he can realize his other imperial ambitions by threatening our allies and defeating democratic self-government—government of the people, by the people, and for the people—in Europe and elsewhere. He hates what we are. Anyone who supports him shares in that hate.

Wake up. History is calling.

Shake it Out – Embracing the Future of Program Management – Part Two: Private Industry Program and Project Management in Aerospace, Space, and Defense

In my previous post, I focused on Program and Project Management in the Public Interest, and on the characteristics of its environment, especially from the perspective of the government program and acquisition disciplines. The purpose of this exploration is to lay the groundwork for understanding the future of program management—and the technological and organizational changes that will be required to support it.

The next part of this exploration is to define the motivations, characteristics, and disciplines of the private industry equivalents. Here there are commonalities, but also significant differences, that relate to the relationship and interplay between public investment, policy and acquisition, and private business interests.

Consistent with our initial focus on public interest project and program management (PPM), the vertical with the greatest relationship to it is found in the very specialized fields of aerospace, space, and defense. I will therefore begin with this industry vertical.

Private Industry Program and Project Management

Aerospace, Space & Defense (ASD). It is here that we find commercial practice that comes closest to the types of structure, rules, and disciplines found in public interest PPM. As a result, it is also here that we find the most interesting areas of conflict and conciliation between private motivations and public needs and duties, particularly since most of the business activity in this vertical is generated by and dependent on federal government acquisition strategy and policy.

On the defense side, the antecedent policy documents guiding acquisition and other measures are the National Security Strategy (NSS), which is produced by the President’s staff; the National Defense Strategy (NDS), which further translates and refines the NSS; and the National Military Strategy (NMS), which the Joint Chiefs of Staff of the various military services deliver to the Secretary of Defense and which is designed to provide unfettered military advice.

Note that the U.S. Department of Defense (DoD) and the related agencies, including the intelligence agencies, operate under a strict chain of command that ensures civilian control under the National Military Establishment. Aside from these structures, the documents and resulting legislation from DoD actions also impact such civilian agencies as the Department of Energy (DOE), Department of Homeland Security (DHS), the National Aeronautics and Space Administration (NASA), and the Federal Aviation Administration (FAA), among others.

The countervailing power and the checks and balances on this Executive Branch power lie with the appropriation and oversight powers of the Congress. Until the various policies are funded and authorized by Congress, the general tenor of military, intelligence, and other operations has tangential, though not insignificant, effects on the private economy. Still, in terms of affecting how programs and projects are monitored, it is within the appropriation and authorization bills that we find the locus of power. As one of my program managers reminded me during my first round through the budget hearing process, “everyone talks, but money walks.”

On the aerospace side, there are two main markets. One is related to commercial aircraft, parts, and engines sold to the various world airlines. The other is related to government’s role in non-defense research and development, as well as to activities related to private-public partnerships, such as those involving space exploration. The individual civilian departments of government also publish their own strategic plans based on their roles, from which acquisition strategy follows. These long-term strategic plans, usually revised at least every five years, are then further refined into strategic implementation plans by the various labs and directorates.

The suppliers and developers of products and services for government, which represent the bulk of ASD, face many of the same challenges delineated in the survey of their government counterparts. The difference, of course, is that these are private entities whose obligations and resulting mores are derived from business practice and contractual obligations and specifications.

This is not to imply a lack of commitment or dedication on the part of private entities. But it is an important distinction, particularly since financial incentives and self-interest are paramount considerations. A contract negotiator, for example, in order to be effective, must understand the underlying pressures and relative position of each of the competitors in the market being addressed. This individual should also be familiar with the particular core technical competencies of the competitors as well as their own strategic plans, the financial positions and goals that they share with their shareholders in the case of publicly traded corporations, and whether actual competition exists.

The Structure of the Market. Given the mergers and acquisitions of the last 30 years, along with the consolidation promoted by the Department of Defense as unofficial policy after the fall of the Berlin Wall and the lapse of antitrust enforcement, the portion of ASD and Space that relies on direct government funding—even those firms that participate in public-private ventures where risk sharing is involved—operates in a monopsony: the condition in which a single buyer, the U.S. government, substantially controls the market as the main purchaser of supplies and services. This monopsony market is then served by a supplier market that is largely an oligopoly—one in which there are few suppliers and limited competition—and in which, in some technical domains, some suppliers exert monopoly power.

Acknowledging this condition informs us regarding the operational motivators of this market segment in relation to culture, practice, and the disciplines and professions employed.

In the first case, given the position of the U.S. government, the normal pressures of market competition and market incentives do not apply to the few competitors participating in the market. As a result, only the main buyer has the power to recreate, in an artificial manner, an environment that replicates the market incentives and penalties normally present in a normal, highly diverse, and competitive market.

Along these lines, for market incentives, the government can, and often does, act as the angel investor, given the rigorous need for R&D in such efforts. It can also lower the barriers to participation in order to encourage more competition and innovation. This can be deployed across the entire range of limited competitors, or it can be expansive in its approach to invite new participants.

Market penalties that are recreated in this environment usually target what economists call “rent-seeking behavior.” This is a situation in which incumbents seek to increase their own wealth without creating new benefits, innovation, or additional wealth for society. Lobbying, glad-handing, cronyism, and other such methods are employed and, oftentimes, rampant under monopsonistic systems. Revolving-door practices, in which the former government official responsible for oversight obtains employment in the same industry and, oftentimes, with the same company, are too often seen in these cases.

Where there are few competitors, market participants will often play follow-the-leader and align themselves to dominate particular segments of the market in appealing to the government or elected representatives for business. This may mean that, in many cases, they team with their ostensible competitors to provide a diverse set of expertise from the various areas of specialty. As with any business, profitability is of paramount importance, for without profit there can be no business operations. It is here, in the maximization of profit and shareholder value, that we find the locus of power for understanding the motivation of these and most businesses.

This is not a value judgment. As faulty and risky as this system may be, no better business structure has been found to provide value to the public through incentives for productive work, innovation, the satisfaction of demand, and efficiency. The challenge, apart from what political leadership decides to do regarding the rules of the market, is to make those rules that do exist work in the public interest through fair, ethical, and open contracting practices.

To do this successfully requires contracting and negotiating expertise. To many executives and non-contracting personnel, negotiations appear to be a zero-sum game. No doubt, popular culture, mass media and movies, and self-promoting business people help mold this perception. Those from the legal profession, in particular, deal with a negotiation as an extension of the adversarial processes through which they usually operate. This is understandable given their education, and usually disastrous.

As an attorney friend of mine once observed: “My job, if I have done it right, is to ensure that everyone walking out of the room is in some way unhappy. Your job, in contrast, is to ensure that everyone walking out of it is happy.” While a generalization—and told tongue-in-cheek—it highlights the core difference in approach between these competing perspectives.

A good negotiator has learned that, given two motivated sides coming together to form a contract, there is an area of intersection where both parties will view the deal being struck as meeting their goals, and as such, as fair and reasonable. It is the job of the negotiator to find that area of mutual fairness, while also ensuring that the contract is clear and free of ambiguity, and that the structure of the instrument—price and/or cost, delivery, technical specification, statement of work or performance specification, key performance parameters, measures of performance, measures of effectiveness, management, sufficiency of capability (responsibility), and expertise—sets up the parties involved for success. A bad contract can no more be made good than can the poorly prepared and compacted soil and foundation of a house after the building goes up.
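For illustration only, here is a minimal sketch in Python that treats the instrument elements listed above as a simple checklist data structure. The class and field names are hypothetical (they are not drawn from the FAR or any DoD schema); the point is only that any element left unresolved before award is a negotiation item, not an afterthought.

```python
# Hypothetical checklist of the contract-instrument elements named above.
from dataclasses import dataclass, fields
from typing import List, Optional

@dataclass
class ContractInstrument:
    price_or_cost: Optional[str] = None            # price and/or cost basis
    delivery: Optional[str] = None                 # delivery schedule and terms
    technical_spec: Optional[str] = None           # technical or performance specification
    statement_of_work: Optional[str] = None        # SOW or performance work statement
    key_performance_parameters: Optional[str] = None
    measures_of_performance: Optional[str] = None
    measures_of_effectiveness: Optional[str] = None
    management_approach: Optional[str] = None
    responsibility: Optional[str] = None           # sufficiency of capability
    expertise: Optional[str] = None

    def unresolved(self) -> List[str]:
        """Return the names of elements still undefined before award."""
        return [f.name for f in fields(self) if getattr(self, f.name) is None]

# Example: a draft with only two elements settled; everything printed is still open.
draft = ContractInstrument(price_or_cost="cost plus fixed fee, estimated",
                           delivery="36-month period of performance")
print(draft.unresolved())
```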

The purpose of a good contract is to avoid litigation, not to increase the likelihood of it happening. Furthermore, it serves the interests of neither side to obtain a product or service at a price, or under conditions so onerous, that the enterprise fails to survive. Alternatively, it does a supplier little good to obtain a contract that leaves the customer with little financial flexibility, that causes the supplier to fail to fully deliver on its commitments, that adversely affects its reputation, or that is perceived in a negative light by the public.

Effective negotiators on both sides of the table are aware of these risks and hazards, and so each is responsible for the final result, though often the power dynamic between the parties may be asymmetrical, depending on the specific situation. It is one of the few cases in which parties having both mutual and competing interests are brought together where each side is responsible for ensuring that the other does not hazard their organization. It is in this way that a contract—specifically one that consists of a long-term R&D cost-plus contract—is much like a partnership. Both parties must act in good faith to ensure the success of the project—all other considerations aside—once the contract is signed.

In this way, the manner of negotiating and executing contracts is very much a microcosm of civil society as a whole, for good or for bad, depending on the practices employed.

Given that the structure of aerospace, space, and defense consists of one dominant buyer with few major suppliers, the disciplines required relate to the details of the contract and its resulting requirements that establish the rules of governance.

As I outlined in my previous post, the characteristics of program and project management in the public interest, which are the products of contract management, are focused on successfully developing and obtaining a product to meet particular goals of the public under law, practice, and other delineated specific characteristics.

As a result, the skill-sets that are of paramount importance to business in this market prior to contract award are cost estimating, applied engineering expertise including systems engineering, financial management, contract negotiation, and law. The remaining disciplines of project and program management expertise follow, based on what has been established in the contract and on the amount of leeway the contracting instrument provides in terms of risk management, cost recovery, and profit maximization; the main difference is that this approach to the project leans more toward contract management.

Another consideration in determining which domains are brought to bear relates to the position of the business in terms of market share and level of dominance in a particular segment of the market. For example, a company may decide to accept a lower-than-desired target profit. In the most extreme cases, the company may allow the contract to become a loss leader in order to continue to dominate a core competency or to prevent new entries into that portion of the market.

On the other side of the table, government negotiators are prohibited by the Federal Acquisition Regulation (the FAR) from allowing companies to “buy-in” by proposing an obviously lowball offer, but some do in any event, whether it is due to lack of expertise or bowing to the exigencies of price or cost. This last condition, combined with rent-seeking behavior mentioned earlier, where they occur, will distort and undermine the practices and indicators needed for effective project and program management. In these cases, the dysfunctional result is to create incentives to maximize revenue and scope through change orders, contracting language ambiguity, and price inelasticity. This also creates an environment that is resistant to innovation and rewards inefficiency.

But apart from these exceptions, the contract and its provisions, requirements, and type are what determine the structure of the eventual project or program management team. Unlike in commercial markets in which there are many competitors, the government through negotiation will determine the structure of burdened rates and the allowable profit or margin. This last figure is determined by the contract type and the perceived risk of the contract goals to the contractor. The higher the risk, the higher the allowed margin or profit. The reverse applies as well.
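To make that relationship concrete, here is a minimal illustrative sketch in Python. The contract types are standard categories, but the risk rankings and percentage ranges are assumptions made up purely for illustration; they are not FAR-prescribed or DoD-published figures.

```python
# Illustrative only: contract type -> (relative cost risk borne by the contractor,
# assumed target fee/profit range). Numbers are hypothetical, not official guidance.
NEGOTIATED_MARGIN_GUIDE = {
    "CPFF (cost-plus-fixed-fee)":     ("low",         (0.05, 0.08)),
    "CPIF (cost-plus-incentive-fee)": ("medium",      (0.07, 0.11)),
    "FPIF (fixed-price-incentive)":   ("medium-high", (0.09, 0.13)),
    "FFP (firm-fixed-price)":         ("high",        (0.10, 0.15)),
}

def illustrate(contract_type: str) -> str:
    """Show the direction of the relationship: more contractor risk, higher margin."""
    risk, (low, high) = NEGOTIATED_MARGIN_GUIDE[contract_type]
    return f"{contract_type}: contractor risk {risk}, target margin {low:.0%}-{high:.0%}"

for ct in NEGOTIATED_MARGIN_GUIDE:
    print(illustrate(ct))
```

The direction of the mapping is the point: as cost risk shifts from the government toward the contractor (cost-reimbursement toward firm-fixed-price), the negotiated margin allowance rises, and the reverse applies as well.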

Given this basis, the interplay between private entities and the public acquisition organizations, including the policy-setting staffs, is also of primary concern. Decision-makers, influencers, and subject-matter experts from these entities participate together in what are ostensibly professional organizations, such as the National Defense Industrial Association (NDIA), the Project Management Institute (PMI), the College of Scheduling (CoS), the College of Performance Management (CPM), the International Council on Systems Engineering (INCOSE), the National Contract Management Association (NCMA), and the International Cost Estimating and Analysis Association (ICEAA), these being among the most frequently attended by these groups. Corresponding and associated private and professional groups are the Project Control Academy and the Association for Computing Machinery (ACM).

This list is by no means exhaustive, but from the perspective of suppliers to public agencies, NDIA, PMI, CoS, and CPM are of particular interest because much of the business of influencing policy and the details of its application are accomplished here. In this manner, the interests of the participants from the corporate side of the equation relate to those areas always of concern: business certainty, minimization of oversight, market and government influence. The market for several years now has been reactive, not proactive.

There is no doubt that business organizations, from local Chambers of Commerce to specialized trade groups, bring with them the advantages of finding mutual interests and synergy. All also come with the ills and dysfunction, to varying degrees, borne of self-promotion, glad-handing, back-scratching, and ossification.

In groups where there is little appetite to upend the status quo, innovation and change are viewed with suspicion and as being risky. In such cases the standard reaction is cognitive dissonance, at least until measures can be taken to subsume or control the pace and nature of the change. This is particularly true in the area of project and program management in general and integrated project, program, and portfolio management (IPPM) in particular.

Absent the appetite on the part of DoD to replicate market forces that drive the acceptance of innovative IPPM approaches, one large event and various evolutionary aviation and space technology trends have upended the ecosystem of rent-seeking, reaction, and incumbents bent on maintaining the status quo.

The one large event, of course, came about from the changes wrought by the Covid pandemic. The other, evolutionary changes are a result of the acceleration of software technology in capturing and transforming big(ger) datasets, combined with open business intelligence systems that can be flexibly delivered locally and via the Cloud.

I also predict that these changes will make hard-coded, purpose-driven niche applications obsolete within the next five years, along with the companies that have built their businesses around delivering custom niche applications and MS Excel spreadsheets, and those companies that are comfortable suboptimizing and delivering the letter, if not the spirit, of the good business practice expected under their contracts.

Walking hand-in-hand with these technological and business developments, the aerospace, space, and defense market in general is facing an opening window for new entrants and greater competition, born of emergent engineering and technological exigencies that demand innovation and new approaches to old, persistent problems.

The coronavirus pandemic and new challenges from the realities of global competition, global warming, geopolitical rivalries; aviation, space and atmospheric science; and the revolution in data capture, transformation, and optimization are upending a period of quiescence and retrenchment in the market. These factors are moving the urgency of innovation and change to the left both rapidly and in a disruptive manner that will only accelerate after the immediate pandemic crisis passes.

In my studies of Toynbee and other historians (outside of my day job, I am also credentialed in political science and history, among other disciplines, through both undergraduate and graduate education), I have observed that societies and cultures that do not embrace the future and confront their challenges effectively and constructively find themselves overrun by both. History is the chronicle of human frailty, tragedy, and failure, interspersed with amazing periods of resilience, human flourishing, advancement, and hope.

As it relates to our more prosaic concerns, Deloitte has published an insightful paper on the 2021 industry outlook. Among the identified short-term developments are:

  1. A slow recovery in passenger travel may impact aircraft deliveries and industry revenues in commercial aviation;
  2. The defense sector will remain stable as countries plan to sustain their military capabilities;
  3. Satellite broadband, space exploration, and militarization will drive growth;
  4. Industry will shift to transforming supply chains into more resilient and dynamic networks;
  5. Mergers and acquisitions are likely to recover in 2021 as a hedge toward ensuring long-term growth and market share.

More importantly, the longer-term changes to the industry are being driven by the following technological and market changes:

  • Advanced aerial mobility (AAM). Both FAA and NASA are making investments in this area, and so the opening exists for new entrants into the market, including new entrants in the supply chain, that will disrupt the giants (absent a permissive M&A stance under the new Administration in Washington). AAM is the new paradigm for introducing safe, short-distance, daily-commute flying technologies using vertical lift.
  • Hypersonics. Given the touted investment of Russia and China into this technology as a means of leveraging against the power projection of U.S. forces, particularly its Navy and carrier battle groups (aside from the apparent fact that Vladimir Putin, the president of Upper Volta with Missiles and Hackers, really hates Disney World), the DoD is projected to fast-track hypersonic capabilities and countermeasures.
  • Electric propulsion. NASA is investing in cost-sharing capabilities to leverage electric propulsion technologies, looking to benefit from the start-up growth in this sector. This is an exciting development which has the potential to transform the entire industry over the next decade and after.
  • Hydrogen-powered aircraft. OEMs are continuing to pour private investment money into start-ups looking to introduce more fuel-efficient and clean energy alternatives. As with electric propulsion, prototypes of these aircraft are being produced, and as public investments in cost-sharing and market-investment strategies take hold, the U.S., Europe, and Asia are looking at a more diverse and innovative aerospace, space, and defense market.

Given the present condition of the industry, and the emerging technological developments and resulting transformation of flight, propulsion, and fuel sources, the concepts and definitions used in project and program management require revision to meet the exigencies of the new market.

For both industry and government, in order to address these new developments, I believe that a new language is necessary, as well as a complete revision to what is considered to be the acceptable baseline of best business practice and the art of the possible. Only then will organizations and companies be positioned to address the challenges these new forms of investment and partnering systems will raise.

The New Language of Integrated Program, Project, and Portfolio Management (IPPM).

First a digression to the past: while I was on active duty in the Navy, near the end of my career, I was assigned to the staff of the Office of the Undersecretary of Defense for Acquisition and Technology (OUSD(A&T)). Ostensibly, my assignment was to give me a place to transition from the Service. Thus, I followed the senior executive, who was PEO(A) at NAVAIR, to the Pentagon, simultaneously with the transition of NAVAIR to Patuxent River, Maryland. In reality, I had been tasked by the senior executive, Mr. Dan Czelusniak, to explore and achieve three goals:

  1. To develop a common schema, by supporting an existing contract for the collection of data from DoD suppliers on cost-plus R&D contracts, with the goal of creating a master historical database of contract performance and technological development risk. This schema would first be directed at cost performance, or EVM;
  2. To continue to develop a language, methodology, and standard, first started and funded by NAVAIR, for the integration of systems engineering and technical performance management into the program management business rhythm;
  3. To create a working definition of Integrated Program Management.

I largely achieved the first two during my relatively brief period there.

The first became known as the Integrated Digital Environment (IDE), which was refined and fully implemented after my departure from the Service. Much of this work is the basis for data capture, transformation, and load (ETL) today. There had already been a good deal of work by private individuals, organizations, and other governments in establishing common schemas, which were first applied to the transportation and shipping industries. But the team of individuals I worked with was able to set the bar for what followed across datasets.

The second was completed and turned over to the Services and federal agencies, many of whom adopted the initial approach and refined it as well to inform, through the identification of technical risk, cost performance and technical achievement. Much of this knowledge already existed in the systems engineering community, but working with INCOSE, a group of like-minded individuals was able to take the work from the proof-of-concept, which was awarded the Acker Skill in Communication Award at the DAU Acquisition Research Symposium, and turn it into the TPM and KPP standard used by organizations today.

The third began with establishing my position, which hadn’t existed until my arrival: Lead Action Officer, Integrated Program Management. Gary Christle, who was the senior executive in charge of the staff, asked me “What is Integrated Program Management?” I responded: “I don’t know, sir, but I intend to find out.” Unfortunately, this is the initiative that has still eluded both industry and government, but not without some advancement.

Note that this position with its charter to define IPM was created over 24 years ago—about the same time it takes, apparently, to produce an operational fighter jet. I note this with no flippancy, for I believe that the connection is more than just coincidental.

When spoken of, IPM and IPPM are oftentimes restricted to the concept of cost (read: cost performance, or EVM) and schedule integration, with, in the latter case, aggregated portfolio organization across a selected number of projects thrown in. That was considered an advancement in 1997. But today, we seem to be stuck in time. In light of present technology and capabilities, this is a self-limiting concept.

This concept is technologically supported by a neutral schema that is authored and managed by DoD. While essential to data capture and transformation—and because of this fact—it is currently targeted by incumbents as a means of further limiting, in practice, even this self-limited definition. It is ironic that a technological advance that supports data-driven, in lieu of report-driven, information integration is being influenced to support the old paradigm.

The motivations are varied: industry suppliers who aim to restrict access to performance data under project and program management, incumbent technology providers who wish to keep changes in data capture and transformation restricted to their limited capabilities, consulting companies aligned with technology incumbents, and staff augmentation firms whose business depends on keeping their customers reliant on custom application development and Excel workbooks. All of these forces act through the various professional organizations that work to influence government policy, hoping to establish themselves as the arbiters of the possible and the acceptable.

Note that the requirements under project management are often critiqued under the rubric of government regulation. But that is a misnomer: they are an extension of government contract management. Another critique is made from the perspective of overhead costs. But management costs money, and one would not (or at least should not) drive a car or own a house without insurance and a budget for maintenance, much less run a multi-year, high-cost project involving the public’s money. In addition, as I have written previously, and as is supported by the literature, data-driven systems actually reduce costs and overhead.

All of these factors contribute to ossification and impose artificial blinders that, absent reform, will undermine meeting the new paradigms of 21st-century project management, given that the limited concept of IPM was obviously insufficient to address even the challenges of the transitional decade at the close of the last century.

Embracing the Future in Aerospace, Space, and Defense

As indicated, the aerospace and space science and technology verticals are entering a new and exciting phase of technological innovation resulting from investments in start-ups and R&D, including public-private cost-sharing arrangements.

  1. IPM to Project Life-Cycle Management. Given the baggage that attends the acronym IPM, and the worldwide trend to data-driven decision-making, it is time to adjust the language of project and program management to align to it. In lieu of IPM, I suggest Project Life-Cycle Management to define the approach to project and program data and information management.
  2. Functionality-Driven to Data-Driven Applications. Our software, systems and procedures must be able to support that infrastructure and be similarly in alignment with that manner of thinking. This evolution includes the following attributes:
    • Data Agnosticism. As our decision-making methods expand to include a wider, deeper, and more comprehensive interdisciplinary approach, our underlying systems must be able to access data in this same manner. As such, these systems must be data agnostic.
    • Data neutrality. In order to optimize access to data, the overhead and effort needed to access data must be greatly reduced. Using data science and analysis to restructure pre-conditioned data in order to overcome proprietary lexicons—an approach used for business intelligence systems since the 1980s—provides no added value to either the data or the organization. If data access is ad hoc and customized in every implementation, the value of the effort cannot persist, nor can the return on investment be fully realized; it backs the customer into a corner in terms of flexibility and innovation. Thus, pre-configured data capture, extract, transformation, and load (ETL) into a non-proprietary and objective format, which applies to all data types used in project and program management systems, is essential to providing the basis for a knowledge-based environment that encourages discovery from data. This approach to ETL is enhanced by the utilization of neutral data schemas.
    • Data in Lieu of Reporting and Visualization. There is no doubt that data must be visualized at some point—preferably after its transformation and load into the database with other, interrelated data elements that illuminate information to enhance the knowledge of the decision-maker. This implies that systems that rely on physical report formats, charts, and graphs as the goal are not in alignment with the new paradigm. Where Excel spreadsheets and PowerPoint are used as a management system, it is the preparer who provides the interpretation, in a manner that predisposes the possible alternatives of interpretation. The goal, instead, is to have the data speak for itself. It is the data, transformed into information, interrelated and contextualized to create intelligence, that is the goal.
    • All of the Data, All of the Time. The cost of 1TB of data compared to 1MB of data is the marginal cost of the additional electrons to produce it. Our systems must be able to capture all of the data essential to effective decision-making at the periodicity determined by the nature of the data. Thus, our software systems must be able to relate data at all levels and to scale from simplistic datasets to extremely large ones. They should do so in such a way that the option for determining what, among the full menu of data options available, is relevant rests with the consumer of that data.
    • Open Systems. Since the introduction of widespread CPU capability, software solution providers have manufactured software to perform particular functions based on particular disciplines and very specific capabilities. As noted earlier, these software applications are functionality-focused and proprietary in structure, method, and data. For data-driven project and program requirements, software systems must be flexible enough to accommodate a wide range of analytical and visualization demands, allowing the data to determine the rules of engagement. This implies systems that are open in two ways: data agnosticism, as already noted, but also openness in terms of the user environment.
    • Flexible Application Configuration. Our systems must be able to address the needs of the various disciplines in their details, while also allowing for integration and contextualization of interrelated data across domains. As with Open Systems to data and the user environment, openness through the ability to roll out multiple specialized applications from a common platform places the subject matter expert and program manager in the driver’s seat in terms of data analysis and visualization. An effective open platform also reduces the overhead associated with limited purpose-driven, disconnected and proprietary niche applications.
    • No-Code/Low-Code. Given that data and the consumer will determine both the source and method of delivery, our open systems should provide an environment that supports Agile development and deployment of customization and new requirements.
    • Knowledge-Based Content. Given the extensive amount of experience and education recorded and documented in the literature, our systems must, at the very least, provide a baseline of predictive analytics and visualization methods usually found in the more limited, purpose-built hardcoded applications, if not more expansive. This knowledge-based content, however, must be easily expandable and refinable, given the other attributes of openness, flexibility, and application configuration. In this manner, our 21st century project and program management systems must possess the attributes of a hybrid system: providing the functionality of the traditional niche systems with the flexibility and power of a business intelligence system enhanced by COTS data capture and transformation.
    • Ease of Use. The flexibility and power of these systems must be such that implementation and deployment are rapid, and that new user environment applications can be quickly deployed. Furthermore, the end user should be able to determine the level of complexity or simplicity of the environment to support ease of use.
  3. Focus on the Earliest Indicator. A good deal of effort since the late 1990s has been expended on defining the highest level of summary data that is sufficient to inform earned value, with schedule integration derived from the WBS, oftentimes summarized on a one-to-many basis as well. This perspective is biased toward believing that cost performance is the basis for determining project control and performance. But even when related to cost, the focus is backwards. The project lifecycle in its optimized form consists of the following progression:

    Project Goals and Contract (framing assumptions) –> Systems Engineering, CDRLs, KPPs, MoEs, MoPs, TPMs –> Project Estimate –> Project Plan –> IMS –> Risk and Uncertainty Analysis –> Financial Planning and Execution –> PMB –> EVM

    As I’ve documented in this blog over the years, DoD studies have shown that, while greater detail within the EVM data may not garner greater early warning, proper integration with the schedule at the work package level does. Program variances first appear in the IMS. A good IMS, thus, is key to collecting early indications of variance and acts as the main execution document. This is why many program managers, who have been largely absent from the professional organizations listed over the last decade or so, tend to assert that EVM is like “looking in the rearview mirror.” It isn’t that EVM is not essential, but it is true that it is not the earliest indicator of variances from expected baseline project performance.

    Thus, the emphasis going forward under this new paradigm is not a continued central role for EVM, but a shift to the earliest indicator for each aspect of the program that defines its framing assumptions.
  4. Systems Engineering: It’s not Space Science, it’s Space Engineering, which is harder.
    The focus on start-up financing and developmental cost-sharing shifts the focus to systems engineering configuration control and technical performance indicators. The emphasis on meeting expectations, program goals, and milestones within the cost share makes it essential to be able to identify fatal variances long before conventional cost performance indicators show variances. The concern of the program manager in these cases isn’t so much the estimate at completion, but whether the industry partner will be able to deploy the technology within the acceptable range of the MoEs, MoPs, TPMs, and KPPs without exceeding the government’s portion of the cost share. Thus, the incentive is not only to identify variances and unacceptable risk at the earliest indicator, but to do so in terms of whether the end-item technology will be successfully deployed, or whether the government should cut its losses.
  5. Risk and Uncertainty is more than SRA. The late 20th-century approach to risk management is to run a Monte Carlo simulation against the schedule and to identify alternative critical paths and any unacceptable risks within the critical path. This is known as the schedule risk analysis, or SRA (a minimal sketch of the method appears after this list). While valuable, the ratio of personnel engaged in risk management is much smaller than the staffs devoted to schedule and cost analysis.

    This is no doubt due to the specialized language and techniques devoted to risk and uncertainty. This segregation of risk from mainstream project and program analysis has severely restricted both the utility and the real-world impact of risk analysis on program management decision-making.

    But risk and uncertainty extend beyond the schedule risk analysis, and their utility in an environment of aggressive investment in new technology, innovation, and new entrants to the market will place these assessments at center stage. In reality, our ability to apply risk analysis techniques extends to the project plan, to technical performance indicators, to estimating, to the integrated master schedule (IMS), and to cost, both financial and from an earned value perspective. Combined with the need to identify risk and major variances using the earliest indicator, risk analysis becomes pivotal to mainstream program analysis and decision-making.
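
To make the SRA mechanics concrete, here is a minimal sketch, in Python, of a Monte Carlo schedule risk analysis run against a small, invented activity network. The activities, precedence logic, triangular duration ranges, and target date are hypothetical, not drawn from any particular program or tool.

```python
import random
import statistics

# Hypothetical activity network (listed in topological order):
# name -> (predecessors, (min, most likely, max) duration in days). Illustrative only.
ACTIVITIES = {
    "design":    ([], (20, 30, 50)),
    "fabricate": (["design"], (40, 55, 90)),
    "software":  (["design"], (30, 45, 80)),
    "integrate": (["fabricate", "software"], (15, 20, 35)),
    "test":      (["integrate"], (10, 15, 30)),
}

DETERMINISTIC_TARGET = 120  # the single-point schedule being stress-tested, in days

def simulate_project_duration() -> float:
    """One trial: sample each duration and run a simple forward pass."""
    finish = {}
    for name, (preds, (low, likely, high)) in ACTIVITIES.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + random.triangular(low, high, likely)
    return max(finish.values())

def schedule_risk_analysis(trials: int = 10_000) -> None:
    durations = sorted(simulate_project_duration() for _ in range(trials))
    p50, p80 = durations[int(0.50 * trials)], durations[int(0.80 * trials)]
    prob_overrun = sum(d > DETERMINISTIC_TARGET for d in durations) / trials
    print(f"Mean duration: {statistics.mean(durations):.1f} days")
    print(f"P50 / P80    : {p50:.1f} / {p80:.1f} days")
    print(f"P(> {DETERMINISTIC_TARGET} days): {prob_overrun:.1%}")

schedule_risk_analysis()
```

A fuller SRA would also record, in each trial, which activities fall on the longest path (a criticality index) and would layer discrete risk events on top of the duration distributions, which is where the distinction between aleatory and epistemic uncertainty becomes operationally useful.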

Conclusions from Part Two

The ASD industry is the one most closely aligned with PPM in the public interest. Two overarching trends are transforming this market and overcoming the inertia and ossification of PPM thought: the communications and information systems employed in response to the coronavirus pandemic, which opened pathways to new ways of thinking about the status quo, and the start-ups and new entrants into the ASD market, born of investments in new technologies arising from external market, geopolitical, space science, global warming, and propulsion trends, as well as new technologies and methods being employed in data and information technology that drive greater efficiency and productivity. These changes have forced a new language and new expectations as to the art of the necessary, as well as the art of the possible, for PPM. This new language includes a transition to the concept of the optimal capture and use of all data across the program management life cycle, with greater emphasis on systems engineering, technical performance, and risk.

Having summarized the new program paradigm in Aerospace, Space, and Defense, my next post will assess the characteristics of program management in various commercial industries, the rising trends in these verticals, and what that means for the project and program management discipline.

Shake it Out – Embracing the Future in Program Management – Part One: Program and Project Management in the Public Interest

I heard the song from which I derived the title to this post sung by Florence and the Machine and was inspired to sit down and write about what I see as the future in program management.

Thus, my blogging radio silence has ended as I begin to process and share my observations and essential achievements over the last couple of years.

Some of my reticence in writing has been due to the continual drumbeat of both outrageous and polarizing speech that had dominated our lives for four years. Combined with the resulting societal polarization, I was overwhelmed by the hyper-politicized environment which has fostered disinformation and dysfunction. Those who wish to seek my first and current word on this subject need only visit my blog post, “In Defense of Empiricism” at the AITS Blogging Alliance here.

It is hard to believe that I published that post four years ago. I stand by it today and believe that it remains as valid, if not more so, than it did when I wrote and shared it.

Finally, the last and most important reason for my relative silence has been that I have been hard at work putting my money and reputation where my blogging fingers have been—in the face of a pandemic that has transformed and transfigured our social and economic lives.

My company—the conduit that provides the insights I share here—is SNA Software LLC. We are a small, veteran-owned company and we specialize in data capture, transformation, contextualization, and visualization. We do this in a way that removes significant effort from these processes, ensures reliability and trust, incorporates off-the-shelf functionality that provides insight, and empowers the user by leveraging the power of open systems, especially in program and project management.

Program and Project Management in the Public Interest

There are two aspects to the business world that we inhabit: commercial and government; both, however, usually relate to some aspect of the public interest, which is our forte.

There are also two concepts about this subject to unpack.

The first is distinguishing between program and project management. In this concept, a program is an overarching effort that may consist of individual efforts that, together, will result in the production or completion of a system, whether that is a weapons system, a satellite, a spacecraft, or an engine. It could even be a dam or some other aspect of public works.

A project under this concept is a self-contained effort separated organizationally from the larger entity, which possesses a clearly defined start and finish, a defined and allocated budget, and a set of plans, a performance management feedback system, and overarching goals or “framing assumptions” that define what constitutes the state of being “done.”

Oftentimes the terms “program” and “project” are used interchangeably, but the difference for these types of efforts is important and goes beyond a shallow understanding of the semantics. A program will also consider the lifecycle of the program: the follow-on logistics, the interrelationship of the end item to other components that will constitute the deployed system or systems, and any iterative efforts relating to improvement, revision, and modernization.

The term “portfolio” is also worth a mention in the context of our theme. A portfolio is simply a summary of the projects or programs under an organizational entity that has both reporting and oversight responsibility for them. They may be interrelated or independent in their efforts, but all must report in some way, whether due to fiduciary, resource, or oversight concerns, to that overarching entity.

The second concept relates to the term “public interest.” Programs and projects under this concept are those that must address the following characteristics: legality, governance, complexity, integrity, leadership, oversight, and subject matter expertise. I placed these in no particular order.

What we call in modern times “public interest” was originally called “public virtue” by the founders of the United States; it embodies the ideals of the American Revolution, upon which our experiment in democratic republicanism is built. It consists of conducting oneself in a manner in which the good of the whole—the public—outweighs personal interests and pursuits. Self-dealing need not apply.

This is no idealistic form of self-delusion: I understand, as do my colleagues, that we are, at heart, a commercial profit-making enterprise. But the manner in which we engage with government requires a different set of rules, and many of these rules are codified in law and ethical practice. While others do not always feel obliged to live by these rules, we govern ourselves and so choose to apply these virtues—and to seek to support and change our system to encourage such behavior so as to be the norm—even in direct interactions with government personnel where we feel these virtues have been violated.

Characteristics of Public Interest Programs

Thus, the characteristics outlined above apply to program and project management in the public interest in the following manner:

Legality: That Public Interest Programs are an artifact of law and statute and are specifically designed to benefit the public as a whole.

At heart, program and project management are based on contractual obligations, whether those instruments apply internally or externally. As a result, everyone involved in the program and project management discipline is, by default, part of the acquisition community and the acquisition process. The law that applies to all government acquisition systems is based on the Federal Acquisition Regulation (FAR). There are also oversight and fiduciary responsibilities that apply as a result of the need for accountability under the Congressional appropriations process, as well as ethical standards, such as those under the Truth in Negotiations Act (TINA). While these statutes allow broad management flexibility, violations come with serious consequences; thus, they serve as a basis for establishing hard-and-fast guardrails in the management of programs and projects. Individual government agencies and military services also publish additional standards that supplement the legal requirements. An example is the Department of Defense FAR Supplement (DFARS). Commercial entities that hold government contracts in relation to Program Management Offices (PMOs) must sign on to both FAR and agency contractual clauses, which will then flow down to their subcontractors. Thus, the enforcement of these norms is both structured and consistent.

Governance: That the Organizational Structure and Disciplines deriving from Public Interest Programs are a result of both Contract and Regulatory Practice under the concept of Government Sovereignty.

The government and supplier PMOs are formed as a result of a contractual obligation for a particular purpose. Government contracting is unique since government entities are the sovereign. In the case of the United States, the sovereign is the elected government of the United States, which derives its legitimacy from the people of the United States as a whole. Constitutionally, the Executive Branch is tasked with the acquisition responsibility, but the manner and method of this responsibility is defined by statute.

Thus, during negotiations and unlike in commercial practice, the commercial entity is always the offeror and the United States always the party that either accepts or rejects the offer (the acceptor). This relationship has ramifications in contract enforcement and governance of the effort after award. It also allows the government to dictate the terms of the award through its solicitations. Furthermore, provisions from law establish cases where the burden for performance is on the entity (the supplier) providing the supplies and services.

Thus, the establishment of the PMO and oversight organizations has a legal basis, aside from considerations of best business practice. The details of governance within the bounds of legal guidance are those that apply through agency administrative law and regulation, oftentimes based on best business practice. These detailed practices of governance are usually established as a result of hard-learned experience: the establishment of disciplines (systems engineering and technical performance, planning, performance management, cost control, financial execution, schedule, and progress assessment), the periodicity of reporting, the manner of oversight, the manner of liaison between the supplier and government PMOs, and alignment to the organization’s goals.

Complexity: That Public Interest Programs possess a level of both technical and organizational complexity unequaled in the private sector.

Program and project management in government involves a level of complexity rarely found in similar non-governmental commercial efforts. Aligning the contractual requirements, as an example, to an assessment of the future characteristics of a fighter aircraft needed to support the U.S. National Defense Strategy, built on the assessments by the intelligence agencies regarding future threats, is a unique aspect of government acquisition.

Furthermore, while the government relies on the expertise of private industry for systems that support national defense, as well as those that support space exploration, energy, and a host of other needs, the items being acquired (those requiring cost-type R&D contracts that involve program management) are by definition those for which the necessary solutions are not readily available as commercial end items.

Oftentimes these requirements are built onto, and extend, existing off-the-shelf capabilities. But given that government investment in R&D represents the majority of this type of spending in the economy, absent it, technology and other efforts directed at meeting the defense, economic, societal, climate, and space exploration challenges of the future would most likely not be met, or would benefit only a portion of the populace. The federal government uniquely possesses the legal legitimacy, resources, and expertise to undertake such R&D that, in pushing the envelope on capabilities, involves both epistemic and aleatory risk that can be managed through the processes of program management.

Integrity: The conduct of Public Interest Programs demands the highest level of commitment to a culture of accountability, impartiality, ethical conduct, fiduciary responsibility, democratic virtues, and honesty.

The first level of accountability resides in the conduct of the program manager, who is the locus of integrity within the program management office. This requires a focus on the duties the position demands as a representative of the Government of the United States. Furthermore, the program manager must ensure that the program team operates within the constraints established by the program’s or project’s contractual commitments, and that it continues to work toward meeting the program goals that align with the stated interests and goals of the organization. That these duties are exercised regardless of self-interest is the basis of integrity.

This is not an easy discipline, and individuals oftentimes cannot separate their own interests from those of their duties. Yet, without this level of commitment, the legitimacy of the program office and the governmental enterprise itself is threatened.

In prior years, as an active-duty Supply Corps officer, I came across cases where individuals in civil service or among the commissioned officer community confused their own interests—for promotion, for self-aggrandizement, for ego—with those duties demanded of their rank or position. Such confusions of interests are serious transgressions. With contracted-out positions within program offices adding consulting and staffing firms into the mix, with their oftentimes diversified interests and portfolios, an additional layer of challenges is presented. Self-promotion, competition, and self-dealing have all too often become blatant, and program managers would do well to enforce strict rules regarding such behavior.

The pressures of exigency are oftentimes the main cause of the loss of integrity of the program or project. Personal interrelationships and human resource management issues can also undermine good order and discipline necessary for the program or project to organize itself into a cohesive, working team that is focused on a common vision.

Key elements mentioned in our opening thesis include ethical conduct and adherence to democratic virtues, among them the acceptance of all members of the team regardless of color, ethnicity, race, sexual identity, religion, or place of national origin. People deserve the respect and decency deriving from their basic human right to enjoy human dignity, as well as from their position. Added to these elements are honesty and the willingness to accept and report bad news, which are essential to integrity.

An organization committed to the principle of accountability will seek to measure and ensure that the goals of the program or project are being met, and that ameliorative measures are taken to correct any deficiencies. Since these efforts oftentimes involve years of effort involving significant sums of public monies, fiduciary integrity is essential to this characteristic.

All of these elements can and should exist in private, commercial practices. The difference that makes this a unique characteristic of program management in the public interest is the level of scrutiny, reporting, and review that is conducted: from oversight agencies within the Executive Branch, to the Congressional oversight, hearing, and review processes, to agency review, auditing, and reporting, to inquiries and critiques by the press and the public. Public interest program management is life in a fishbowl, except in the most secret efforts, and even those will eventually be subject to scrutiny.

As with a U.S. Navy ship that makes a port of call in a foreign country, the conduct of the crew will reflect not only on themselves and their ship, but on the United States; so it is also with our program offices. Thus, systems of programmatic governance and business management must anticipate in their structure the level of adherence required. Given the inherent level of risk involved in these efforts, and given the normal amount of error that human systems create even with good intentions and expertise, establishing a system committed to the elements of integrity creates a self-correcting one, better prepared to meet the program’s or project’s challenges.

Leadership: Programs in the Public Interest differ from equivalent commercial efforts in that management systems and incentives based on profit- and shareholder-orientations do not exist. Instead, a special kind of skillset is required that includes good business management principles and skills combined with highly developed leadership traits.

Management skills tend to be a subset of leadership, though in business schools and professional courses they tend to be addressed as co-equal. This is understandable in commercial enterprises that focus on the capitalistic pressures regarding profit and market share.

Given the unique pressures imposed by the elements of integrity, the program manager and the program team are thrown into a situation that requires a focus on the achievement of organizational goals. In the case of program and project management, this will be expressed in the form of a set of “framing assumptions” that roll into an overarching vision.

A program office, of course, is more than a set of systems, practices, and processes. It is, first and foremost, a collection of individuals consisting of subject matter experts and professionals who must be developed into a team committed to the vision. The effort to achieve this team commitment is one of the more emotional and compelling elements that comprise leadership.

Human systems are complex, adaptive ones that react to, and are created by, both incentives and sanctions. Every group, especially one involving creative and talented people, starts out as a collection of individuals with the interrelations among the members in an immature state. Underlying the expression of various forms of ambition and self-identification among mature individuals is the basic human need for social acceptance, born from the individual’s personal need for love. This motivation exists psychologically in all individuals except for sociopaths. It is also the basis for empathy and the acceptance of the autonomy of others, which form the foundation for team building.

The goal of the leader is to encourage maturity among the members of the group. The result is to create that overused term “synergy.” This is accomplished by doing, as a leader, those things necessary to develop the members of the group in a way that fosters trust, acceptance, and mutual respect. Admiral James L. Holloway, Jr., in his missive on Naval Leadership, instructed his young officers to eschew any concept of perfectionism in people. People make mistakes. We know this if we are to be brutally honest about our own experiences and actions.

Thus, intellectual honesty and an understanding of what motivates people within their cultural mores are, above all else, essential to good leadership. Americans, by nature, tend to be skeptical and independently minded. They require the level of explanation and due diligence necessary to win over their commitment to a goal or vision. When it comes to professionals operating within public service in government—who take an oath to the Constitution and our system of laws—the ability to lead tends to be more essential than just good management skills, though the latter are by no means unimportant. Management in private enterprise assumes a contentious workplace of competing values and interests, and oftentimes fosters it.

Program and project management in the public interest cannot succeed in such an environment. It requires a level of commitment to the goals of the effort regardless of personal values or interests among the individual members of the team. That they must be convinced to this level of commitment ensures that the values of leadership not only operate at the top of the management chain, but also at each of the levels and lateral relationships that comprise the team.

The shorthand for leadership in this culture is that the leader is “working their way out of their job,” and that “in order to be a good leader one must be a good follower,” meaning that all members of the team are well informed; that their contributions, expertise, and knowledge are acknowledged and respected; that individual points of failure through the irreplaceable-person syndrome are minimized; and that each member of a team or sub-team can step in or step up to keep the operation functioning. The motivating concept in these situations is the interests of the United States, in lieu of a set of stockholders or some fiduciary reward.

Finally, there is the concept of the burden of leadership. Responsibility can be delegated, but accountability cannot. Leadership in this context entails an obligation to take responsibility for both the mission of the organization and the ethical atmosphere established in its governance.

Oversight: While the necessity for integrity anticipates the level of accountability, scrutiny, oversight, and reporting for Programs in the Public Interest, the environment this encompasses is unique compared to commercial entities.

The basis for acquisition at the federal level resides in the Article Two powers of the president as the nation’s Chief Executive. Congress, however, under its Article One powers, controls appropriations and passes laws related to the processes, procedures and management of the Executive Branch.

Flowing from these authorities, the agencies within the federal government have created offices for the oversight of the public’s money, the methods of acquisition of supplies and services, and the management of contracts. Contracting Officers are given authority through a warrant to exercise their acquisition authority under the guidance and management of a senior acquisition authority.

Unlike in private business, the government operates under the concept of Actual Authority. That is, no one may commit the government except those possessing a warrant. Program Managers are appointed to provide control and administration of cost type efforts, especially those containing R&D, to shepherd these efforts over the course of what usually constitutes a multi-year effort. The Contracting Officer and/or the senior acquisition authority in these cases will delegate contract administration authority to the Program Manager. As such, it is a very powerful position.

The inherent powers of the Executive and Legislative Branches of government create a tension that is resolved through the separation of powers and the ability of one branch, at least in most cases, to check the excesses and abuses of the other: the concept of checks and balances, especially through the operation of oversight.

When these tensions cannot be resolved within the processes established for separation of powers, the third branch of government becomes involved: this is the Judicial Branch. The federal judiciary has the ability to review all laws of the United States, their constitutionality, and their adherence to the letter of the law in the case of statute.

Wherever power exists within the federal government there exist systems of checks and balances. The reason for this is clear, and Lord Acton’s warning about power corrupting and absolute power corrupting absolutely is the operational concept.

Congress passes statutes and the Judiciary interprets the law, but it is up to the Executive Branch through the appointed heads of the various departments of government down through the civil service and, in the case of the Department of Defense, the military chain of command under civilian authority, to carry out the day-to-day activities in executing the laws and business of the government. This creates a large base of administrative law and procedure.

Administrative Law and the resulting procedures in their implementation come about due to the complexities in the statutes themselves, the tests of certain provisions of the statutes in the interplay between the various branches of government, and the practicalities of execution. This body of law and procedure is oftentimes confused with “regulation” in political discussions, but it is actually the means of ensuring that the laws are faithfully executed without undue political influence. It is usually supplemented by ethical codes and regulations as well.

As a part of this ecosystem, the Program in the Public Interest must establish a discipline related to self-regulation, due diligence, good business practice, fiduciary control, ethical and professional conduct, responsibility, and accountability. Just as the branches of the federal government are constructed to ensure oversight and checks-and-balances, this also exists with normative public administration within the Executive Branch agencies.

This is often referred to both positively and, mostly among political polemicists in the negative, as the bureaucracy. The development of bureaucracies in government is noted by historians and political scientists as an indication of political stability, maturity, and expertise. Without bureaucracies, governments tend to be capricious and their policies uncertain. The practice of stare decisis—the importance of precedent in legal decisions—is also part and parcel of stability. Government power can be beneficial or coercive. Resting action on laws and not the whims or desires of the individual person is essential to the good order and discipline of the federal government.

As such, program and project managers, given the extensive latitude and inherent powers of their position, are subject to rigorous reporting, oversight, and accountability regimes in the performance of their duties. In R&D cost-type program and project management efforts, the risk is shared between the supplier and the government. And the government flows down this same regime to the contractor to ensure the integrity of the effort in the expenditure of public monies and in the performance and delivery of public contracts.

This leads us to the last important aspect of oversight: public scrutiny, which also includes the press as the Fourth Estate. When I was a young Lieutenant in the Navy working in contracts, the senior officer to whom I was assigned often remarked: “Never do anything that would cause you to be ashamed were it to end up being read by your grandmother in the Washington Post.”

Unlike private business, where law, contractual obligation, and fiduciary responsibility are the main pressures on tolerated behavior, the government and its actions are—and must be—under constant public scrutiny. It is expected. Senior managers who chafe against this check on official conduct misunderstand their role. Even the appearance of malfeasance or abuse can cause one to steer into the rocks and shoals.

Subject Matter Expertise: Given the interrelated characteristics of legality, governance, complexity, integrity, leadership, and oversight—linked to the development of a professional, permanent bureaucracy acting through a non-partisan civil service—the practices necessary to successfully shepherd such efforts have produced areas of expertise and specialization. These areas provide a basis for leveraging technology in gaining insight into meeting all of the requirements necessary to the good administration and control of Program Management in the Public Interest.

The structures and practices of program and project management are reflected in the private economy. Some of this is contractually prescribed and some of it is based on best business practice learned through hard experience. In the interplay of government and industry, most often an innovation in one has been refined and improved in the other, only to find its way back to practice on the originating “side” of the transaction.

Initially in our history this cross-fertilization occurred through extraordinary wartime measures: standardization of rifled weaponry passed down by Thomas Jefferson and Eli Whitney, and for railroad track gauge standards issued by the Union government during the Civil War, are just two examples that turned out to provide a decisive advantage against laissez faire and libertarian approaches.

As the complexity of private business concerns, particularly in the international sphere, began to mimic—and in many cases surpass—the size and technical complexity of many individual government efforts, partnerships with civil authorities and private businesses saw the need for industry standardization for both electrical and non-electrical components and processes. The former was particularly important in the “Current Wars” between Edison and Westinghouse.

These simple, earlier examples highlight the great conundrum of standardization of supply, practice, and procedure in acquisition: the need for economy through competition among many sources for any particular commodity or item, weighed against the efficiency and interoperability needed to continue operations. Buying multiple individual items with the same function but produced using differing standards creates a nightmare of suboptimization. Overly restrictive standards can and have had the effect of reducing competition and stifling innovation, especially if the standard is proprietary.

In standards setting there are several interests that must be taken into account: the expertise (technical, qualitative, etc.) that underlies the standard; the public interest in ensuring a healthy marketplace that rewards innovation, diversity, and price competitiveness; the need for business-to-business cooperation and synergy in the marketplace; and the preponderance of practice, among others. In the defense industry this also includes national security concerns.

This last consideration provides an additional level of tension between private industry and government interests. In the competition for market share and market niches, businesses are playing a zero-sum game that shifts between allies and competitors. Still, the interest of individual actors is focused on making a proprietary product or service dominant in the target market.

Government, on the other hand, particularly one that operates as a republic based on democratic processes and virtues and a commitment to equal rights, has a different set of interests that are, in many cases, diametrically opposed to those of individual players in the marketplace. Government needs and desires a broad choice of sources for what it needs, while ensuring that qualitative standards are met under a fair and reasonable price. When it does find innovation, it seeks to reward it, but only for the limited terms, conditions, and period of the contractual instrument.

The greater the risk in these cases—especially when cost risk is shared—the greater the need for standards, especially qualitative ones. The longer the term of the effort, the greater the need for checks and balances through evaluation, review, and oversight. The greater the dollar value, the greater importance for fiduciary and contractual accountability.

Thus, subject matter expertise has evolved over time, aligned with the functions and end items being developed and delivered. These areas include:

Estimating – A critical part of program and project management, this is a discipline with highly specialized quantitative methods for estimating and projecting project costs, resources, and duration. It is part of the planning phase prior to program or project inception. It can be used to support budget planning prior to program approval, during negotiations and, after award, to inform the project plan.
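
As a simple illustration of one common quantitative method in this discipline, the sketch below applies a classic three-point (PERT) estimate to a handful of work elements. The element names, labor-hour figures, and the independence assumption in the roll-up are hypothetical, for illustration only.

```python
import math

# Hypothetical work elements with three-point labor-hour estimates:
# (optimistic, most likely, pessimistic)
ELEMENTS = {
    "requirements": (300, 400, 700),
    "design":       (800, 1_100, 1_900),
    "fabrication":  (2_000, 2_600, 4_500),
    "test":         (600, 900, 1_800),
}

def pert(optimistic: float, most_likely: float, pessimistic: float):
    """Classic PERT (beta approximation) expected value and standard deviation."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    sigma = (pessimistic - optimistic) / 6
    return expected, sigma

total_expected, total_variance = 0.0, 0.0
for name, points in ELEMENTS.items():
    expected, sigma = pert(*points)
    total_expected += expected
    total_variance += sigma ** 2  # assumes the elements are statistically independent
    print(f"{name:13s} expected {expected:7.0f} hrs (sigma {sigma:5.0f})")

print(f"{'TOTAL':13s} expected {total_expected:7.0f} hrs "
      f"(sigma {math.sqrt(total_variance):5.0f}, independence assumed)")
```

In practice, parametric models, analogies, and learning-curve adjustments supplement this kind of roll-up, and the resulting estimate feeds the budget planning and negotiation uses noted above.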

Systems Engineering – as described by the International Council of Systems Engineering, “a transdisciplinary and integrative approach to enable the successful realization, use, and retirement of engineered systems, using systems principles and concepts, and scientific, technological, and management methods.”

As it relates to program and project management, systems engineering produces the technical documents that provide the basis and structure of the lifecycle management of the end-item application, including the application of technical standards, measures of effectiveness (MoEs), measures of performance (MoPs), key performance parameters (KPPs), and technical performance measures (TPMs). In simplistic terms, systems engineering defines when the item under R&D reaches the state of “done.”
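
To show how a technical performance measure can serve as an early indicator in this sense, here is a minimal sketch that compares an achieved-to-date value against a hypothetical planned burn-down profile. The measure (vehicle dry mass), the milestones, and all values are invented for illustration.

```python
# Hypothetical TPM: vehicle dry mass (kg), which must burn down to its allocation by CDR.
# Planned profile and per-milestone allowable limits are illustrative only.
PLANNED_PROFILE = {            # milestone: (planned value, allowable upper limit)
    "SRR": (1_450, 1_520),
    "PDR": (1_380, 1_430),
    "CDR": (1_320, 1_350),     # the CDR limit equals the not-to-exceed allocation
}

def assess_tpm(milestone: str, achieved: float) -> str:
    planned, upper_limit = PLANNED_PROFILE[milestone]
    variance = achieved - planned
    if achieved > upper_limit:
        status = "outside allowable limit: treat as an early, program-level variance"
    elif variance > 0:
        status = "above plan but within limit: watch item"
    else:
        status = "on or below plan"
    return (f"{milestone}: achieved {achieved:.0f} kg vs planned {planned:.0f} kg "
            f"(variance {variance:+.0f}) -> {status}")

print(assess_tpm("PDR", 1_410))   # above plan, still inside the PDR limit
print(assess_tpm("CDR", 1_365))   # exceeds the allocation: fatal unless corrected
```

The point of the sketch is the structure, not the numbers: a TPM tied to a planned profile surfaces a variance at PDR or CDR, long before the same problem appears in the cost or schedule data.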

Financial Management – at the program and project management level, the planning, organizing, directing, and controlling of financial activities, such as the procurement and utilization of funds, so as to adhere to the limitations of law and remain consistent with the terms and conditions of the contract and its ancillary planning and execution documents.

At its core, financial management within this discipline includes the planning, programming, budgeting, and execution process for the financial requirements of successful program execution. As with any individual enterprise, securing cashflow for required activities, with the right type of money as determined by Congressional appropriation, is a unique and specialized skillset under program management in the public interest. Oftentimes the lack of funds necessary to address a particular programmatic risk or challenge can be just as decisive to program execution and success as any technical challenge.

Risk and Uncertainty – the concepts of risk and uncertainty have evolved over time. Under classical economics (both Keynes and Knight), risk exists where all of the future events and consequences of an action are known, but where the specific outcome is unknown. As such, probability calculus is applied to determine risk management: mitigation and handling. Uncertainty, under this definition, consists of unknowable events that will result from our actions and is implicit in human action. There is no probability calculus or risk buy-down that can address areas of uncertainty. These definitions are also accepted under the concept of complexity economics.

My good colleague Glen Alleman (2013) at his blog, Herding Cats, casts risk as a product of uncertainty. This is a reordering of definitions, but not unuseful. Under Glen’s approach, uncertainty is broken into aleatory and epistemic uncertainty. The first—aleatory—comes from a random process, what Keynes, Knight, et al. would define as classical uncertainty. The second—epistemic—comes from lack of knowledge. The first is irreducible, which is consistent with classical economics and complexity economics; the second is subject to probability analysis and risk handling methodologies.

Both risk and uncertainty—aleatory and epistemic—occur within all phases and under each discipline within the project management environment. Any human action involves these forces of cause-and-effect and uncertainty, and they limit our actions under the concept of “free will.”

Planning and Scheduling – usually these have been viewed as separate entities, but they are, in fact, part of a continuum, as are all of the disciplines mentioned, but more on that later in these blogs.

Planning involves the ability to derive the products of both the contract terms and conditions, and the systems engineering process. The purpose is to develop a high-level, time-phased plan that captures program events, deliverables, requirements, significant accomplishment criteria, and basic technical performance management achievement that will be the basis for a more detailed integrated master schedule.

The scheduling discipline is tasked with further delineating the summary tasks into schedule activities based on critical path methodology. A common refrain when I worked on the government side of program management was that you cannot eat an elephant in one gulp: you have to eat it one piece at a time.

As it relates to this portion of project methodology, I have, over the years, heard people say that planning and scheduling is more of an art than a science. Yet, the artifacts upon which our planning documents rest exist as part of the acquisition process, and our systems and procedures are mature and largely standardized. The methods of systems engineering are precise and consistent.

The lexicon of planning and scheduling, regardless of the software applications or manual methods used, describes the same phenomena and concepts, despite slightly different—and oftentimes proprietary—terminology. The concept of critical path analysis is well documented in the literature, with slight, though largely insignificant, differences in application.

What appears as art is, in reality, a process that involves a great deal of complexity, because these are the documents in which all of the moving parts of the program are captured. Rather than art, it is a discipline that requires attention to detail and collaboration, in addition to the power of computing.

Resource Management – as with planning and scheduling, resource management consists of a detailed accounting of the people, equipment, monies, and suppliers that are required to achieve the activities detailed in the program schedule.

In the detailed and specialized planning of projects and programs in the public interest, these efforts are cross-referenced and further delineated to the actual work that needs to be completed. A Work Breakdown Structure (WBS) is the method of time-phasing the work using detailed tasks that integrate scope, cost, and schedule at the lowest level of achievement.
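
As a purely illustrative sketch, and not any program's actual structure, the fragment below models how a work package at the lowest level of the WBS can carry scope, schedule dates, and a time-phased budget in one place; the element names and figures are invented.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class WorkPackage:
    """Lowest level of achievement: integrates scope, cost, and schedule."""
    wbs_id: str
    scope: str
    start: date
    finish: date
    budget_by_period: dict = field(default_factory=dict)  # time-phased budget by month

    @property
    def budget_at_completion(self) -> float:
        return sum(self.budget_by_period.values())

# Hypothetical WBS fragment for illustration only
wp = WorkPackage(
    wbs_id="1.2.3.1",
    scope="Design avionics cooling subassembly",
    start=date(2024, 1, 1),
    finish=date(2024, 3, 31),
    budget_by_period={"2024-01": 120_000, "2024-02": 150_000, "2024-03": 90_000},
)
print(wp.wbs_id, wp.budget_at_completion)  # 1.2.3.1 360000
```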

Baselining and Performance Management – essential for project control in this environment. In this case, project and program schedule, cost, and resources are (ideally) risk adjusted and a performance management baseline is established: the basis for the assessment and control of the project.

This leads us to the methodology that is always on the cusp of being the Ozymandias of program management: earned value management or EVM. The discipline of EVM arose out of the Space Age era of the 1960s. The premise is simple: when undertaking any complex effort there is a finite amount of money and resources, and a target date for the needed end item. We need a method to determine whether the actual work performed in terms of budgeted resources and time is tracking to the plan to produce the desired end item application.

When looking at the utility of EVM, one must ask: while each of the disciplines noted above also tracks achievement over the lifecycle of the project or program, do any combine an analysis against budgeted time and resources? The answer is no, and so EVM is essential to the management of these efforts.
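
For readers who want the premise made concrete, here is a minimal sketch of the core earned value arithmetic, using invented figures purely for illustration:

```python
# Hypothetical status-date figures for a single control account (illustrative only)
bac = 1_000_000.0  # budget at completion
pv = 400_000.0     # planned value: budgeted cost of work scheduled to date
ev = 360_000.0     # earned value: budgeted cost of work actually performed
ac = 420_000.0     # actual cost of work performed

sv, cv = ev - pv, ev - ac    # schedule variance and cost variance
spi, cpi = ev / pv, ev / ac  # schedule and cost performance indices
eac = bac / cpi              # one simple independent estimate at completion

print(f"SV={sv:,.0f}  CV={cv:,.0f}  SPI={spi:.2f}  CPI={cpi:.2f}  EAC={eac:,.0f}")
```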

Still, our other disciplines also track important information that is not captured by EVM. Thus, the entire corpus of our disciplines represents the project and program ecosystem. These processes, procedures, and the measures derived from them are interconnected. It is this salient fact that points us in the direction regarding the future of program management.

Conclusions from Part One

Given that we have outlined the unique and distinctive characteristics of public interest program management, the environment and basis upon which such program management rests, and the highly developed disciplines that have evolved as a result of the experience in system development, deployment, and lifecycle management, our inquiry must next explore the evolutionary nature of the program organization itself. Once identified and delineated, we must then determine the place of program organization within the context of developments in systems and information theory which will give us insight into the future of program management.

Open: Strategic Planning, Open Data Systems, and the Section 809 Panel

Sundays are usually days reserved for music and the group Rhye was playing in the background when this topic came to mind.

I have been preparing for my presentation in collaboration with my Navy colleague John Collins for the upcoming Integrated Program Management Workshop in Baltimore. This presentation will be a non-proprietary/non-commercial talk about understanding the issue of unlocking data to support national defense systems, but the topic has broader interest.

Thus, in advance of that formal presentation in Baltimore, there are issues and principles that are useful to cover, given that data capture and its processing, delivery, and use are at the heart of all systems in government, private industry, and other organizations.

Top Data Trends in Industry and Their Relationship to Open Data Systems

According to Shohreh Gorbhani, Director of Project Control Academy, the following are the top five data trends being pursued by private industry and technology companies. My own comments follow as they relate to open data systems.

  1. Open Technologies that transition from 2D Program Management to 3D and 4D PM. This point is consistent with the College of Performance Management’s emphasis on IPM, but note that the stipulation is the use of open technologies. This is an important distinction technologically, and one that I will explore further in this post.
  2. Real-time Data Capture. This means capturing data in the moment so that the status of our systems is up-to-date without the present delays associated with manual data management and conditioning. This does not preclude the collection of structured, periodic data, but does include the capture of transactions from real-time integrated systems where appropriate.
  3. Seamless Data Flow Integration. From the perspective of companies in manufacturing and consumer products, technologies such as IoT and Cloud are just now coming into play. But, given the underlying premises of items 1 and 2, this also means the proper automated contextualization of data using an open technology approach that flows in such a way as to be traceable.
  4. The use of Big Data. The term has lost a good deal of its meaning because of its transformation into a buzz-phrase and marketing term. But Big Data refers to the expansion in the depth and breadth of available data enabled by the economic forces behind Moore’s Law. What this means is that we are entering a new frontier of data processing and analysis that will, no doubt, break down assumptions regarding the validity and strength of certain predictive analytics. The old restrictions on access to data, due to limitations of technology and higher cost, no longer apply. We are now in the age of Knowledge Discovery in Data (KDD). The old approach of reporting assumed that we already know what we need to know. The use of data challenges old assumptions and allows us to follow the data where it will lead us.
  5. AI Forecasting and Analysis. No doubt predictive AI will be important as we move forward with machine learning and other similar technologies. But this infant is not yet a rug rat. The initial experience with AI is that these systems tend to reflect the biases of their creators. The danger here is that this defeats KDD, resulting in stagnation and fugue. But there are other areas where AI can be taught to automate mundane, value-neutral tasks relating to raw data interpretation.

The 809 Panel Recommendation

Given that industry is the driving force behind these trends, which will transform the way that we view information in our day-to-day work, it is not surprising that the 809 Panel had this to say about existing defense business systems:

“Use existing defense business system open-data requirements to improve strategic decision making on acquisition and workforce issues…. DoD has spent billions of dollars building the necessary software and institutional infrastructure to collect enterprise wide acquisition and financial data. In many cases, however, DoD lacks the expertise to effectively use that data for strategic planning and to improve decision making. Recommendation 88 would mitigate this problem by implementing congressional open-data mandates and using existing hiring authorities to bolster DoD’s pool of data science professionals.”

Section 809 Volume 3, Section 9, p. 477

At one point in my military career, I was assigned as the Materiel, Fuels, and Transportation Officer of Naval Air Station, Norfolk. As a major naval air base, transportation hub, and home to a Naval Aviation Depot, we shipped and received materiel and supplies across the world. In doing so, our transportation personnel would use what at the time was new digital technology to complete an electronic bill of lading that specified what and when items were being shipped, the common or military carrier, the intended recipient, and the estimated date of arrival, among other essential information.

The customer and receiving end of this workflow received an open systems data file that contained these particulars. The file was an early version of open data known as an X12 file, for which the commercial transportation industry was an early adopter. Shipping and receiving activities and businesses used their own local software, and there were a number of customized and commercial choices out there, as well as those used by common carriers such as various trucking and shipping firms, the USPS, FedEx, DHL, UPS, and others. The X12 file was the DMZ that made the information open. Software manufacturers, if they wanted to stay relevant in the market, could not impose a proprietary data solution.

Furthermore, standardization of terminology and concepts ensured that the information was readable and comprehensible wherever the items landed–whether across receiving offices in the United States, Japan, Europe, or even Istanbul. While DoD needs the skillsets to be able to optimize its data, achieving this end-state did not require an army of data scientists. It required the right data science expertise in the right places, and the dictates of transportation consumers to move the technology market to provide the solution.

Over the years both industry and government have developed a number of schema standards focused on specific types of data, progressing from X12 to XML and now projected to use JSON-based schemas. Each of them, in their initial iterations, automated the submission of physical reports that had been required either by contract or by operations. These focused on a small subset of the full dataset relating to program management and project controls.
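
As a sketch of where that progression points, the same kind of current-period record can be expressed as a JSON document and validated against a published schema; the field names below are invented for illustration and are not the actual UN/CEFACT or FlexFile definitions.

```python
import json

# Invented, illustrative performance record; not an actual standard's field names
report = {
    "contractNumber": "HYPOTHETICAL-0001",
    "reportingPeriod": {"start": "2024-01-01", "end": "2024-01-31"},
    "wbsElementId": "1.2.3",
    "currentPeriod": {"plannedValue": 120000, "earnedValue": 110000, "actualCost": 130000},
}

# Serialized as JSON, the record is both human-readable and machine-validatable
print(json.dumps(report, indent=2))
```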

This progression made sense.

When digitized technology is first introduced into an intensive direct-labor environment, the initial focus is to automate the production of artifacts and their underlying processes in order to phase in the technology’s acceptance. This also allows the organization to realize immediate returns on investment and improvements in productivity. But this is the first step, not the final one.

For project controls, the current state is the UN/CEFACT XML for program performance management data, and the contract cost and labor data collection file known as the FlexFile. Clearly the latter file, given that the recipient is the Office of the Secretary of Defense Cost Assessment and Program Evaluation (OSD CAPE), is one of many feedback loops that support that office’s role in coordinating the planning, programming, budgeting, and execution (PPBE) system related to military strategic investments and budgeting, but it is only one. The program performance information is also a vital part of the PPBE process in evaluation and in future planning.

For most of the U.S. economy, market forces and consumer requirements are the driving force in digital innovation. The trends noted by Ms. Gorbhani can be confirmed through a Google search of any one of the many technology magazines and websites that can be found. The 809 Panel, drawn as it was from specialists in industry and government, was tasked “to provide recommendations that would allow DoD to adapt and deliver capability at market speeds, while ensuring that DoD remains true to its commitment to promote competition, provide transparency in its actions, and maintain the integrity of the defense acquisition system.”

Given that the work of the DoD is unique, creating a type of monopsony, it is up to leadership within the Department to create the conditions and mandates necessary to recreate in microcosm the positive effects of market forces. The DoD also has a very special, vital mission in defending the nation.

When an individual business cobbles together its mission statement it is that mission that defines the necessary elements in data collection that are then essential in making decisions. In today’s world, best commercial sector practice is to establish a Master Data Management (MDM) approach in defining data requirements and practices. In the case of DoD, a similar approach would be beneficial. Concurrent with the period of the 809 Panel’s efforts, RAND Corporation delivered a paper in 2017 (link in the previous sentence) that made recommendations related to data governance that are consistent with the 809 Panel’s recommendations. We will be discussing these specific recommendations in our presentation.

Meeting the mission and readiness are the key components to data governance in DoD. Absent such guidance, specialized software solution providers, in particular, will engage in what is called “rent-seeking” behavior. This is an economic term that means that an “entity (that) seeks to gain added wealth without any reciprocal contribution of productivity.”

No doubt, given the marketing of software solution providers, it is hard for decision-makers to tell what constitutes an open data system. The motivation of a software solution provider is to make its product as “sticky” as possible, and it does that by enticing a customer to commit to proprietary definitions, structures, and database schemas. Usually there are “black-boxed” portions of the software that make traceability impossible and that complicate the issue of who exactly owns the data and the ability of the customer to optimize it and utilize it as the mission dictates.

Furthermore, data visualization components like dashboards are ubiquitous in the market. A cursory stroll through a tradeshow looks like a dashboard smorgasbord combined with different practical concepts of what constitutes “open” and “integration”.

As one DoD professional recently told me, it is hard to tell the software systems apart. To do this it is necessary to understand what underlies the software. Thus, a proposed honest-broker definition of an open data system is useful and the place to start, given that this is not a notional concept: such systems have been successfully established.

The Definition of Open Data Systems

Practical experience in implementing open data systems toward the goal of optimizing essential information from our planning, acquisition, financial, and systems engineering systems informs the following proposed definition, which is based on commercial best practice. This proposal is also based on the principle that the customer owns the data.

  1. An open data system is one based on non-proprietary neutral schemas that allow for the effective capture of all essential elements from third-party proprietary and customized software for reporting and integration necessary to support both internal and external stakeholders.
  2. An open data system allows for complete traceability and transparency from the underlying database structure of the third-party software data, through the process of data capture, transformation, and delivery of data in the neutral schema.
  3. An open data system targets the loading of the underlying source data for analysis and use into a neutral database structure that replicates the structure of the neutral schema. This allows for 100% traceability and audit of data elements received through the neutral schema, and ensures that the receiving organization owns the data.

Under this definition, data from its origination to its destination is more easily validated and traced, ensuring quality and fidelity, and establishing confidence in its value. Given these characteristics, integration of data from disparate domains becomes possible. The tracking of conflicting indicators is mitigated, since open system data allows for its effective integration without the bias of proprietary coding or restrictions on data use. Finally, both government and industry will not only establish ownership of their data–a routine principle in commercial business–but also be free to utilize new technologies that optimize the use of that data.
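
A minimal sketch of principles 1 through 3, assuming a hypothetical proprietary export and an invented neutral schema (neither reflects any actual product or standard), might look like this; the point is that every neutral record stays traceable back to its source system and key.

```python
# Hypothetical proprietary export; column names are invented for illustration
proprietary_rows = [
    {"TASK_UID": "A-17", "BCWS_CUM": 120000.0, "BCWP_CUM": 110000.0, "ACWP_CUM": 130000.0},
]

# Invented neutral schema: the required field names are assumptions, not a standard
NEUTRAL_FIELDS = {"element_id", "planned_value", "earned_value",
                  "actual_cost", "source_system", "source_key"}

def to_neutral(row, source_system="VendorX"):
    """Map one proprietary row to the neutral schema, preserving source traceability."""
    record = {
        "element_id": row["TASK_UID"],
        "planned_value": row["BCWS_CUM"],
        "earned_value": row["BCWP_CUM"],
        "actual_cost": row["ACWP_CUM"],
        "source_system": source_system,  # which third-party system the row came from
        "source_key": row["TASK_UID"],   # key enabling audit back to the source record
    }
    assert set(record) == NEUTRAL_FIELDS  # simple conformance check against the schema
    return record

# "Load" into a neutral structure that mirrors the schema, so the receiver owns the data
neutral_store = [to_neutral(r) for r in proprietary_rows]
print(neutral_store[0])
```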

In closing, Gahan Wilson, a cartoonist whose work appeared in National Lampoon, The New Yorker, Playboy, and other magazines, recently passed away.

When thinking of the barriers to the effective use of data, I came across this cartoon in The New Yorker:

Open Data is the key to effective integration and reporting–to the optimal use of information. Once mandated and achieved, our defense and business systems will be better informed and be able to test and verify assumed knowledge, address risk, and eliminate dogmatic and erroneous conclusions. Open Data is the driver of organizational transformation keyed to the effective understanding and use of information, and all that entails. Finally, Open Data is necessary to the mission and planning systems of both industry and the U.S. Department of Defense.

Sledgehammer: Pisano Talks!

My blogging hiatus is coming to an end as I take a sledgehammer to the writer’s block wall.

I’ve traveled far and wide over the last six months to various venues across the country and have collected a number of new and interesting perspectives on the issues of data transformation, integrated project management, and business analytics and visualization. As a result, I have developed some very strong opinions regarding the trends that work and those that don’t regarding these topics and will be sharing these perspectives (with the appropriate supporting documentation per usual) in following posts.

To get things started this post will be relatively brief.

First, I will be speaking along with co-presenter John Collins, who is a Senior Acquisition Specialist at the Navy Engineering & Logistics Office, at the Integrated Program Management Workshop at the Hyatt Regency in beautiful downtown Baltimore’s Inner Harbor 10-12 December. So come on down! (or over) and give us a listen.

The topic is “Unlocking Data to Improve National Defense Systems”. Today anyone can put together pretty visualizations of data from Excel spreadsheets and other sources–and some have made quite a bit of money doing so. But accessing the right data at the right level of detail, transforming it so that its information content can be exploited, and contextualizing it properly through integration will provide the most value to organizations.

Furthermore, our presentation will make a linkage to what data is necessary to national defense systems in constructing the necessary artifacts to support the Department of Defense’s Planning, Programming, Budgeting and Execution (PPBE) process and what eventually becomes the Future Years Defense Program (FYDP).

Traditionally information capture and reporting has been framed as a question of oversight, reporting, and regulation related to contract management, capital investment cost control, and DoD R&D and acquisition program management. But organizations that fail to leverage the new powerful technologies that double processing and data storage capability every 18 months, allowing for both the depth and breadth of data to expand exponentially, are setting themselves up to fail. In national defense, this is a condition that cannot be allowed to occur.

If DoD doesn’t collect this information, which we know from the reports of cybersecurity agencies that other state actors are collecting, we will be at a serious strategic disadvantage. We are in a new frontier of knowledge discovery in data. Our analysts and program managers think they know what they need to be viewing, but integration adds new perspectives and, as a result, will yield new indicators and predictive analytics that will, no doubt, overtake current practice. Furthermore, that information can now be processed to contribute more timely and better intelligence to the process of strategic and operational planning.

The presentation will be somewhat wonky and directed at policymakers and decisionmakers in both government and industry. But anyone can play, and that is the cool aspect of our community. The presentation will be non-commercial, despite my day job–a line I haven’t crossed up to this point in this blog, but in this latter case will be changing to some extent.

Back in early 2018 I became the sole proprietor of SNA Software LLC–an industry technology leader in data transformation–particularly in capturing datasets that traditionally have been referred to as “Big Data”–and a hybrid point solution that is built on an open business intelligence framework. Our approach leverages the advantages of COTS (delivering the 80% solution out of the box) with open business intelligence that allows for rapid configuration to adapt the solution to an organization’s needs and culture. Combined with COTS data capture and transformation software–the key to transforming data into information and then combining it to provide intelligence at the right time and to the right place–the latency in access to trusted intelligence is reduced significantly.

Along these lines, I have developed some very specific opinions about how to achieve this transformation–and have put those concepts into practice through SNA and delivered those solutions to our customers. The result has been to reduce both the effort and time to capture large datasets that originate as pre-processed data, and to cut the direct labor and the duration to information delivery by more than 99%. The path to get there is not to apply an army of data scientists and data analysts who treat all data as if it were flat and to reinvent the wheel–only to deliver a suboptimized solution sometime in the future after unnecessarily expending time and resources. This is a devolution to the same labor-intensive business intelligence approaches that we used back in the 1980s and 1990s. The answer is not to throw labor at data that already has its meaning embedded into its information content. The answer is to apply smarts through technology, and that’s what we do.

Further along these lines, if you are using hard-coded point solutions (also called purpose-built software) and knitted best-of-breed, chances are that you will find that you are poorly positioned to exploit new technology and will be obsolete within the next five years, if not sooner. The model of selling COTS solutions and walking away except for traditional maintenance and support is dying. The new paradigm will be to be part of the solution and that requires domain knowledge that translates into technology delivery.

More on these points in future posts, but I’ve placed these stakes in the ground and we’ll see how they hold up to critique and comment.

Finally, I recently became aware of an extremely informative and cutting-edge website that includes podcasts from thought leaders in the area of integrated program management. It is entitled InnovateIPM and is operated and moderated by a gentleman named Rob Williams. He is a domain expert in project cost development, with over 20 years of experience in the oil, gas, and petrochemical industries. Rob has served in a variety of roles throughout his career and now focuses on cost estimating and Front-End Loading quality assurance. His current role is advanced project cost estimator at Marathon Petroleum’s Galveston Bay Refinery in Texas City.

Rob was also nice enough to continue a discussion we started at a project controls symposium and to interview me for a podcast. I’ll share additional information once the episode is posted.

Take Me To The River, Part 3, Technical Performance and Risk Management Digital Elements of Integrated Program Management

Part three of this series of articles on the elements of Integrated Program and Project Management will focus on two additional areas of IPM: technical performance and risk management. Prior to jumping in, however–and given the timeframe over which I’ve written this series–a summary to date is in order.

The first part of our exploration into the IPM digital inventory concerned cost elements. Cost in this sense was broadly defined as any cost element that needs to be of interest to project or program managers and their teams. I first clarified our terms by defining the differences between project and program management–and how those differences will influence our focus. Then I outlined the term cost as falling into the following categories:

  1. Contract costs and the cost categories within the organizational hierarchy;
  2. Cost estimates, “colors” of money where such distinctions exist, and cashflow;
  3. Additional costs that relate to the program or project effort that are not always directly attributed to the effort, such as PMA, furnished materials or labor, corollary and supporting efforts on the part of the customer, and other overhead and G&A type costs;
  4. Contract cost performance under earned value management (EVM); and
  5. Portfolio management considerations and total cost of ownership.

The second part of this exposition concerned schedule elements, that is, time-phased planning and performance that is essential to any project or program effort. The article first discussed the primacy of the schedule in project and program planning and execution, given its ties in defining the basis for the cost elements addressed in the first part of the series. I then discussed the need for integrated planning as the basis for a valid executable schedule and PMB, the detailed elements and citations of the sources of that information in the literature and formal guidance, the role of framing assumptions in the construction of schedule and cost plans with its holistic approach to go/no-go decision-making, and, finally, the role of the schedule in establishing the project and program battle rhythm.

Now, in this final section, we will determine the other practical elements of IPM beyond even my expansive view of cost and schedule integration.

Technical Performance Management

Given this paper that resulted from a programmatic effort in the Navy regarding Technical Performance Management (TPM), it is probably not surprising that I will start here. My core paper in the link above represents what I viewed as an initial effort at integration of TPM to determine impacts of that performance within program cost performance (EVM) projections. But this approach was based on the following foundations:

a. That the solution needed to tie technical achievement to EVM so that it represented greater fidelity to performance than what I viewed as indirect and imprecise methods, such as WBS elements that contained only partial or tangential relationships to technical performance measures, or more subjective and arbitrary methods, such as percent complete.

b. That the approach needed to be tied to established systems engineering methods of technical risk management.

c. That the solution should be simple to implement and be statistically valid in its results, tested by retrospective analyses that performed forensic what-if analysis against the ultimate results.

One need only look at the extensive bibliography that accompanied my paper to understand that there were clear foundations for TPM, but it remained–and in some quarters remains–a controversial concept that provoked resistance, even though programs clearly note achievement of technical requirements. For example, the foundations of technical risk management and tracking that the paper cited were in use at what was then Martin Marietta for many years. Thus, why the resistance to change?

First, I think, is that the domain of project performance has rested too long in the hands of the EVM community with its historical foundations in cost and financial management, with a risk averse approach to new innovations. Second, given this history, the natural differences between program management, systems engineering, and earned value SMEs created a situation where there just wasn’t the foundation necessary for any one group to take ownership of this development in systems and business intelligence improvement. Even in industry, such cross-domain initiatives tend to initially garner both skepticism, if not outright cynicism, and resistance by personnel unsure of how the new measures will affect assessment of their work.

But keep in mind that, dating myself a bit, this is the same type of reaction that organizations experienced during the first wave of digitization of work. Each initiative that I witnessed, from the introduction of desktop computers connected to a central server, to the introduction of the first PCs, to the digitization of work products, was met with the common refrain at the time that it was too experimental, or too transient, or too unstable, or too unproven, until it wasn’t any of those things.

I also overstate this resistance a bit. Over the last 20 years organizations within the military services adopted this method–or a variation–of TPM integration, as have some commercial companies. Furthermore, thinking and contributions on TPM have advanced in the intervening years.

The elements of technical performance management can be found in the language of the scope being planned. The brilliant paper authored by Glen B. Alleman, Thomas J. Coonce, and Rick A. Price entitled “Building a Credible Performance Measurement Baseline”, establishes the basis for tying project and program performance to technical achievement. These elements are measures of effectiveness (MoEs), measures of performance (MoPs), technical performance measures (TPMs), and key performance parameters and indicators (KPPs and KPIs). Taken together these define the framing assumptions for the project or program.

When the systems, procedures, and artifacts are properly constructed from the decomposition of planning documents and performance language, the assignment of these elements to the WBS and specific work packages establishes a strong foundation for tying project and program success to both overall technical performance and the framing assumptions implicit in the effort.

What this means is that there also may be a technical performance baseline, which acts in parallel to the cost-focused performance management baseline. This technical performance baseline derives from the same work that is planned at the work package level. The assessment of progress is further decomposed to look at the timeframe at that point of progress within the context of the integrated master schedule (the IMS). We ask ourselves as a function of risk: what is the chance of achieving the next threshold in our technical performance plan?
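
One way to make that question concrete is a small calculation like the one below: given the current estimate of a technical parameter and an assumed spread on that estimate, compute the probability of meeting the next planned threshold. The weight figures and the normality assumption are mine, for illustration only.

```python
from statistics import NormalDist

# Hypothetical TPM: aircraft empty weight (lbs) at the next design milestone
threshold = 32_000.0         # planned not-to-exceed value at that milestone
current_estimate = 31_600.0  # current engineering estimate of weight
estimate_sigma = 400.0       # assumed standard deviation of that estimate

# Probability that the realized weight comes in at or under the threshold,
# treating the estimate error as normally distributed (an assumption)
p_meet = NormalDist(mu=current_estimate, sigma=estimate_sigma).cdf(threshold)
print(f"Probability of meeting the next weight threshold: {p_meet:.0%}")  # about 84%
```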

As with all elements of work, our MoEs, MoPs, TPMs, KPPs, and KPIs do not reside at the same level of overall performance management and tracking within the WBS hierarchy. Some can be tracked to the lowest level, usually at work package, some will have contributions from lower levels and be summarized at the control account level, and others are at the total project or program level, with contributors from specific lower levels of the WBS structure.

A common example of what is claimed to be a difficult technical performance measure is the factor of weight in aircraft design and production. Weight is an essential factor and must be in alignment with the mission of the aircraft. For example, if an aircraft is being built for the Navy, chances are high that the expectation is for it to be able to take off and land on a moving carrier deck. Takeoff requires coming up to airspeed very quickly. Landings are especially hard, since they are essentially controlled crashes augmented by an arresting gear. Airframes, avionics, and engines must operate in a salt water environment that involves a metal ship. The electro-magnetic effects alone, if they are not mitigated in the design and systems on both aircraft and ship, will significantly degrade the ability of the aircraft to operate as intended. Controlling weight in this case is essential, especially when one considers the need for fuel, ordnance, and avoiding being detected and shot down.

In current practice, the process of tracking weight over the life of aircraft design and development is tightly controlled. It is a function of tradeoff analysis and decision-making with contributors from many sub-elements of the WBS hierarchy. Thus, the use of the factor of weight as an argument to defeat the need to tightly integrate technical measures to the performance measurement baseline is a canard. On the contrary, it is an argument for tighter and broader integration of IPM data and, in particular, for tying our systems to risk management, thus making the projections and the basis of our decision-making a function of it, which is the next topic.

Risk Management Elements and Integration

There is a good deal of literature on risk, so I will confine this section to how risk is treated in terms of integrated project and program management.

For many subdomains within project and program management, when one mentions the term “risk management” the view often encountered is that the topic at hand is applying Monte Carlo analysis, using pseudo-random numbers, to the integrated master schedule (IMS) to determine the probabilities of a range of task durations and completions. This is known as a Schedule Risk Analysis or SRA.

Most of the correlations today are based on the landmark paper by Philip M. Lurie and Matthew S. Goldberg with the sexy title, “An approximate method for sampling correlated random variables from partially specified distributions”. With Monte Carlo informed by Lurie-Goldberg (for short) we can then make inferences as to alternative critical paths and near-critical paths for time-phasing our work. Also, the contribution of each task in terms of its criticality and contribution to the critical path can be measured. Sensitivity analysis identifies the most critical risk elements.

If the integrated master schedule is truly integrated with resources and cost, Lurie-Goldberg-informed simulation allows us to move beyond the single-point estimates that dominate EVM projections and calculate a range of cost outcomes by probability distribution. This same type of analysis can be done against the time-phased PMB.
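
For illustration only, here is a stripped-down Monte Carlo schedule risk sketch over a tiny two-path network. It samples task durations independently from triangular distributions, so it does not implement the Lurie-Goldberg correlated-sampling method itself; it simply shows the kind of percentile and criticality outputs an SRA produces.

```python
import random

def simulate_once():
    """One trial of a tiny network: task A feeds parallel tasks B and C; finish = A + max(B, C).

    Durations (in days) are drawn from triangular distributions; the three
    arguments to random.triangular are (low, high, mode).
    """
    a = random.triangular(8, 15, 10)
    b = random.triangular(18, 30, 20)
    c = random.triangular(14, 28, 15)
    return a + max(b, c), b >= c  # project duration, and whether the path through B drove the finish

trials = [simulate_once() for _ in range(20_000)]
durations = sorted(t[0] for t in trials)
p80 = durations[int(0.8 * len(durations)) - 1]           # 80th percentile finish
criticality_b = sum(t[1] for t in trials) / len(trials)  # how often B's path is critical
print(f"P80 duration: {p80:.1f} days; path through B critical in {criticality_b:.0%} of trials")
```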

But that is just one area of risk management, which is known as quantitative risk. Another area of risk which should be familiar to project and program managers is qualitative risk. The project and programmatic risk analysis of qualitative risk involves the following steps:

1. Risk identification

2. Risk evaluation

3. Risk handling, and

4. Continual risk management

This is a closed loop system, which generates a risk register, risk ranking, a risk matrix, risk handling and mitigation plans, and a risk handling waterfall chart. These artifacts of risk analysis will also require the monitoring of risk triggers, and cross-referencing to risk ownership.
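
As a sketch of the qualitative side, the register below uses a simple probability-by-impact score to rank risks; the entries, the scoring scale, and the handling text are all invented for illustration.

```python
# Invented example entries; a real register would also track triggers, dates, and status
risk_register = [
    {"id": "R-01", "risk": "Supplier slips composite delivery", "probability": 0.4, "impact": 5,
     "owner": "Materials IPT", "handling": "Qualify a second source"},
    {"id": "R-02", "risk": "Thermal margin shortfall in avionics bay", "probability": 0.2, "impact": 4,
     "owner": "Systems Engineering", "handling": "Add design margin; retest at CDR"},
]

for r in risk_register:
    r["score"] = r["probability"] * r["impact"]  # simple probability-by-impact ranking

# Print the register ranked highest score first, as a risk matrix would order it
for r in sorted(risk_register, key=lambda item: item["score"], reverse=True):
    print(f'{r["id"]}  score={r["score"]:.1f}  owner={r["owner"]}  handling={r["handling"]}')
```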

Once again, though cost impacts are also calculated, with their probability of manifesting, the strongest tie of risk management begins with the integrated master schedule. Thus, conditional and probabilistic branching give the project and program team a step-by-step what-if analysis that yields alternative schedules and the associated ranges of cost impact.

Mainstreaming Risk Management and TPM into IPM

In reality, without technical performance and risk management, project and program management is simply monitoring and forecasting. Yet, these sub-domains are oftentimes confined to a few specialists or viewed as dichotomous and independent processes under the general duties of the team.

The economic urgency and essentiality of integrated project and program management is the realization that technical achievement of the product, and the assessment and handling of risks along the course of that achievement, are at the core of project and program management.

Ch- Ch- Changes–What I Learned at the NDIA IPMD Meeting and Last Thoughts on POGO DCMA

Hot Topics at the National Defense Industrial Association’s Integrated Program Management Division (NDIA-IPMD)

For those of you who did not attend, or who have a passing interest in what is happening in the public sphere of DoD acquisition, the NDIA IPMD meeting held last week was of great importance. Here are the highlights.

Electronic Submission of Program Management Information under the New Proposed DoD Schema

Those who have attended meetings in the past, and who read this blog, know where I stand on this issue, which is to capture all of the necessary information that provides a full picture of program and project performance among all of its systems and subsystems, but to do so in an economically feasible manner that reduces redundancy, reduces data streams, and improves timeliness of submission. Basic information economics states that a TB of data is only incrementally more expensive than a MB of data, as defined by the difference in the electricity consumed. Basic experience in IT management demonstrates that automating a process that eliminates touch labor in data production and validation improves productivity and speed.

Furthermore, if a supplier in complex program and project management is properly managing–and has sufficient systems in place–then providing the data necessary for the DoD to establish accountability and good stewardship, to ensure that adequate progress is being made under the terms of the contract, to ensure that contractually required systems that establish competency are reliable and accurate, and to utilize in future defense acquisition planning, should not be a problem. We live in a world of 0s and 1s. What we expect of our information systems is to do the grunt work of handling ever larger datasets in providing information. In this scenario the machine is the dumb one and the person assessing the significance and context of what is processed into intelligence is the smart one.

The most recent discussions and controversies surrounded the old canard regarding submission at Control Account as opposed to the Work Package level of the WBS. Yes, let’s party like it’s 1997. The other issue was whether cumulative or current data should be submitted. I have issues with both of these items, which continue to arise like bad zombie ideas. You put a stake in them, but they just won’t die.

To frame the first issue, there are some organizations/project teams that link budget to the control account, and others to the work package. So practice is not the determinant, but it speaks to earned value management (EVM). The receiving organization is going to want the lowest level of reporting where there is foot-and-tie not only to budget, but to other systems. This is the rub.

I participated in a still-unpublished study for DoD which indicated that, if one uses earned value management (EVM) exclusively to manage, it doesn’t matter. You get a bit more fidelity and early warning at the work package level, but not much.

But note my conditional.

No one exclusively uses EVM to manage projects and programs. That would be foolish and seems to be the basis of the specious attack on the methodology when I come upon it, especially by baby PMs. The discriminator is the schedule, and the early warning is found there. The place where you foot-and-tie schedule to the WBS is at the work package level. If you are restricted to the control account for reporting you have a guessing game–and gaming of the system–given that there will be many schedule activities to one control account.

Furthermore, the individual reviewing EVM and schedule will want to ensure that the Performance Measurement Baseline (PMB) and the Integrated Master Schedule (IMS) were not constructed in isolation from one another. There needs to be evidence that the work planned under the cost plan matches the work in time.

Regarding cumulative against current dollar submission, the issue is one of accuracy. First, deriving current-period values from consecutive cumulative submissions requires that the prior figure be subtracted from the latest, which introduces rounding errors; those errors are exacerbated if reporting is restricted to the control account level. NDIA IPMD had a long discussion on the intrinsic cumulative-to-cumulative error at a meeting last year, which was raised by Gary Humphreys of Humphreys & Associates. Second, cumulative submissions often hide retroactive changes. Third, to catch items in my second point, one must execute cross-checks for different types of data, rather than getting a dump from the system of record and rolling up. The more operations and manipulations made to data, the harder it becomes to ensure fidelity and get everyone to agree on one trusted source, that is, to read off of the same page.
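
A trivial example of the first point, with invented figures: when cumulative values are reported rounded (here, to the nearest thousand), the current-period values derived by subtraction silently shift dollars between periods.

```python
# Invented cumulative actual-cost figures, reported rounded to the nearest thousand
true_cum = [1_234_499.0, 2_468_998.0, 3_703_497.0]
reported_cum = [round(x, -3) for x in true_cum]  # what actually gets submitted

# Derive current-period values by subtracting each prior cumulative from the latest
true_current = [true_cum[0]] + [b - a for a, b in zip(true_cum, true_cum[1:])]
derived_current = [reported_cum[0]] + [b - a for a, b in zip(reported_cum, reported_cum[1:])]

for period, (t, d) in enumerate(zip(true_current, derived_current), start=1):
    print(f"Period {period}: true={t:,.0f}  derived={d:,.0f}  error={d - t:+,.0f}")
```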

When I was asked about my opinion on these issues, my response was twofold. First, as the head of a technology company it doesn’t matter to me. I can handle data in accordance with the standard DoD schema in any way specified. Second, as a former program management type and as an IT professional with an abiding dislike of inefficient systems, the restrictions proposed are based on the limitations of proprietary systems in use by suppliers that, in my opinion, need to be retired. The DoD and A&D market is somewhat isolated from other market pressures, by design. So the DoD must artificially construct incentives and an ecosystem that pushes businesses (and its own organizations) to greater efficiency and innovation. We don’t fly F-4s anymore, so why continue to use IT business systems designed in 1997 that are solely supported by sunk-cost arguments and rent seeking behavior?

Thus, my recommendation was that it was up to the DoD to determine the information required to achieve their statutory and management responsibilities, and it is up to the software solution providers to provide the, you know, solutions that meet them.

I was also asked if I agreed with another solution provider that the software companies should have another go at the schema prior to publication. My position was consistent in that regard: we don’t work the refs. My recommendation to OSD, also given that I have been in a similar position regarding an earlier initiative along the same lines back when I wore a uniform, is to explore the art of the possible with suppliers. The goals are to reduce data streams, eliminate redundancy, and improve speed. Let the commercial software guys figure out how to make it work.

Current projection is three to four weeks before a final schema is published. We will see if the corresponding documentation will also be provided simultaneously.

DCMA EVAS – Data-driven Assessment and Surveillance

This is a topic for which I cannot write without a conflict of interest since the company that is my day job is the solution provider, so I will make this short and sweet.

First, it was refreshing to see three Hub leads at the NDIA IPMD meeting. These are the individuals in the field who understand the important connection between government acquisition needs and private industry capabilities in the logistics supply chain.

Second, despite a great deal of behind-the-scenes speculation and drama among competitors in the solution provider market, DCMA affirmed that it had selected its COTS solution and that it was working with that provider to work out any minor issues now that Milestone B has been certified and they are into full implementation.

Third, DCMA announced that the Hubs would be collecting information and that the plan for a central database for EVAS that would combine other DoD data has been put on hold until management can determine the best course for that solution.

Fourth, the Agency announced that the first round of using the automated metrics would begin later this month and that the effort would continue into October.

Fifth, the Agency tamped down some of the fear related to this new process, noting that tripping metrics may simply indicate that additional attention was needed in that area, including those cases where it simply needed to be documented that the supplier’s System Description deviated from the standard indicator. I think this will be a process of familiarization as the Hubs move out with implementation.

DCMA EVAS, in my opinion, is a significant reform of the way the agency does business. It not only drives process and organizational improvement within the agency by eliminating uneven and arbitrary determinations of contract non-compliance (as well as improvements in data management), but opens a dialogue regarding systems improvement, driving similar changes to the supplier base.

NDAA Section 804

There were a couple of public discussions on NDAA Section 804; if you are not certain what it is, you can go to this link. Having kept track of developments in the NDAA for this coming fiscal year, what I can say is that the final language of Section 804 doesn’t say what many thought it said when it was in draft.

What it doesn’t authorize is a broad authority to overrule other statutory requirements for government accountability, oversight, and reporting, including the requirement for earned value management on large programs. This statement is supported by both OSD speakers that addressed the issue in the meeting.

The purpose of Section 804 was to provide the ability to quickly prototype and field new technologies in the wake of 9/11, particularly as it related to identifying, tracking, and preventing terrorist acts. But the rhetoric behind this section, which was widely touted by elected representatives long before the final version of the current NDAA had been approved, implied a broader mandate for more prosaic acquisitions. My opinion, having seen programs like this before (think of the Navy A-12 program), is that if people use this authority too broadly, we will be discussing more significant issues than the minor DCMA program that ends this blog post.

Thus, the message coming from OSD is that there is no carte blanche get-out-of-jail card for covering yourself under Section 804 and deciding that lack of management is a substitute for management, and that failure to obtain timely and necessary program performance information does not mean that it cannot be forensically traced in an audit or investigation, especially if things go south. A word to the wise, while birds of a feather catch cold.

DoD Reorganization

The Department of Defense has been undergoing reorganization and the old Office of the Undersecretary of Defense for Acquisition, Technology, and Logistics (OUSD(AT&L)) has been broken up and reassigned largely to a new Undersecretary of Defense for Acquisition and Sustainment (USD(A&S)).

As a result of this reorganization there were other points indicated:

a. Day-to-day program management will be pushed to the military services. No one really seems to understand what this means. The services already have PMOs in place that do day-to-day management. The policy part of old AT&L will be going intact to A&S, as will program analysis. The personnel cuts that are earmarked for some DoD departments were largely avoided in the reorganization, except at the SES level, which I will address below.

b. Other Transaction Authority (OTA) and Section 804 procurements are getting a lot of attention, but they seem ripe for abuse. I was actually a member of a panel regarding Acquisition Reform at the NDIA Training and Simulation Industry Symposium held this past June in Orlando. I thought the focus would be on the recommendations from the 809 Panel but, instead, it turned out to be on OTA and Section 804 acquisitions. What impressed me the most was that even companies that had participated in these types of contracting actions felt that they were unnecessarily loosely composed, which would eventually impede progress upon review and audit of the programs. The consensus in discussions with the audience and other panel members was that the FAR and DFARS already possessed sufficient flexibility if Contracting Officers were properly trained to know how to construct such a requirement and still stay between the lines, absent a serious operational need that cannot be met through normal acquisition methods. Furthermore, OTA SME knowledge is virtually non-existent. Needless to say, things like Nunn-McCurdy and new Congressional reporting requirements in the latest NDAA still need to be met.

c. The emphasis in the department, it was announced, would also shift to a focus on portfolio analysis, but–again–no one could speak to exactly what that means. PARCA and the program analysis personnel on the OSD staffs provide SecDef with information on the entire portfolio of major programs. That is why there is a DoD Central Repository for submission of program data. If the Department is looking to apply some of the principles in DoD that provide flexibility in identifying risks and tradeoffs across the portfolio, then that would be most useful and a powerful tool in managing resources. We’ve seen efforts like Cost as an Independent Variable (CAIV) and other tradeoff methods come and go; it would be nice if the department would reward the identification of programmatic risk early and often in program go/no-go/tradeoff/early production decisions.

d. To manage over $7 trillion of programs, PARCA’s expense is $4.5M. The OSD personnel made this point, I think, to emphasize the return on investment in their role regarding oversight, risk identification, and root cause analysis with an eye to efficiency in the management of DoD programs. This is like an insurance policy and a built-in DoD change agent. But from my outside reading, there was a move by Representative Mac Thornberry, who is Chairman of House Armed Services, to hollow out OSD by eliminating PARCA and much of the AT&L staffs. I had discussions with staffs for other Congressional members of the Armed Services Committee when this was going on, and the cause seemed to be a lack of understanding of the extent to which DoD has streamlined its acquisition business systems, of how key PARCA, DCMA, and the analysis and assessment staffs are to the acquisition ecosystem, and of how they foot and tie to the service PEOs and PMOs. Luckily for the taxpayer, it seems that Senate Armed Services members were aware of this and took the language out during markup.

Other OSD Business — Reconciling FAR/DFARS, and Agile

a. DoD is reconciling differences between overlapping FAR and DFARS clauses. Given that DoD is more detailed and specific in identifying reporting and specifying oversight of contracts by dollar threshold, complexity, risk, and contract type, it will be interesting how this plays out over time. The example given by Mr. John McGregor of OSD was the difference between the FAR and DFARS clauses regarding the application of earned value management (EVM). The FAR clause is more expansive and cut-and-dried. The DFARS clause distinguishes the level of EVM reporting and oversight (and surveillance) that should apply based on more specific criteria regarding the nature of the program and the contract characteristics.

b. The issue of Agile, and how it somehow excuses practitioners from using estimating, earned value management, risk management, and other proven program management controls, was addressed. This contention is, of course, poppycock, and Glen Alleman on his blog has written extensively about this zombie idea. The 809 Panel seems to have been bitten by it, though, where its members were convinced that Agile is a program or project management method, and that there is a dichotomy between Agile and the use of EVM. The prescient point in critiquing this assertion was effectively made by the OSD speakers. They noted that they attend many forums and speak to various groups about Agile, and that there is virtually no consensus about what exactly it is and what characteristics define it, but everyone pretty much recognizes it as an approach to software development. Furthermore, EVM is used today on programs that at least partially use Agile software development methodology and do so very effectively. It’s not like crossing the streams.

Gary Bliss, PARCA – Fair Winds and Following Seas

The blockbuster announcement at the meeting was the planned retirement of Gary Bliss, who has been and is the Director of PARCA, on 30 September 2018. This was due to the cut in billets at the Senior Executive Service (SES) level. He will be missed.

Mr. Bliss has transformed the way that DoD does business and he has done so by building bridges. I have been attending NDIA IPMD meetings (and under its old PMSC name) for more than 20 years. Over that time, from when I was near the end of my uniformed career in attending the government/joint session, and, later, when I attended full sessions after joining private industry, I have witnessed a change for the better. Mr. Bliss leaves behind an industry that has established collaboration with DoD and federal program management personnel as its legacy for now and into the future.

Before the formation of PARCA all too often there were two camps in the organization, which translated to a similar condition in the field in relation to PMOs and oversight agencies, despite the fact that everyone was on the same team in terms of serving the national defense. The issue, of course, as it always is, was money.

These two camps would sometimes break out in open disagreement and expressed disparagement of the other. Mr. Bliss brought in a gentleman by the name of Gordon Kranz and together they opened a dialogue in meeting PARCA’s mission. This dialogue has continued with Mr. Kranz’s replacement, John McGregor.

The dialogue has revolved around finding root causes for long delays between development and production in program management and recommending ways of streamlining processes and eliminating impediments; rooting out redundancy, inefficiency, and waste throughout the program and project management supply chain; and communicating with industry so that they understand the reasons for particular DoD policies and procedures, obtaining feedback on the effects of those decisions and how they can be implemented to avoid arbitrariness, and providing certainty to those who would seek to provide supplies and services to the national defense–especially innovative ones–in defining the rules of engagement. The focus was on collaborative process improvement, and it has worked. Petty disputes occasionally still arise, but they are the exception to the rule.

Under his watch Mr. Bliss established a common trusted data stream for program management data, and forged policies that drove process improvement from the industrial base through the DoD program office. This was not an easy job. His background as an economist and his long distinguished career in the public service armed him well in this regard. We owe him a debt of gratitude.

We can all hope that the next OSD leadership that assumes that role will be as effective and forward leaning.

Final Thoughts on DCMA report revelations

The interest I received on my last post on the DCMA internal report regarding the IWMS project was very broad, but the comments that I received expressed some confusion on what I took away as the lessons learned in my closing paragraphs. The reason for this was the leaked nature of the reports, which alleged breaches of federal statute and other administrative and professional breaches, some of a reputational nature. They are not the final word and for anyone to draw final conclusions from leaked material of that sort would be premature. But here are some initial lessons learned:

Lesson #1: Do not split requirements and game the system to fall below financial thresholds to avoid oversight and management approval. This is a Contracts 101 issue and everyone should be aware of it.

Lesson #2: Ensure checks and balances in the procurement process are established and maintained. Too much power, under the moniker of acquisition reform and “flexibility”, has given CIOs and PMs the authority to make decisions that require collaboration, checks, and internal oversight. In normative public sector acquisition environments the end-user does not get to select the contractor, the contract type, the funding sources, or the acquisition method involving fair and open competition–or a deviation from it. Nor, having directed the procurement, should the same individual(s) be allowed to certify receipt and acceptance. Establishing checks and balances without undermining operational effectiveness requires a subtle hand, in which different specialists working within a matrix organization, with differing chains of command and responsibility, ensure that there is integrity in the process. All members of this team can participate in planning and collaboration for the organization’s needs. It appears, though it is not completely proven, that some of these checks and balances did not exist. We do know from the inspections that Contracting Officer’s Representatives (CORs) and Contracting Officer’s Technical Representatives (COTRs) were not appointed for long-term contracts in many cases.

Lesson #3: Don’t pre-select a solution from a particular supplier. Avoiding this is done by understanding the organization’s current and future needs and expressing them in a set of salient characteristics, a performance work statement, or a statement of work. This document is then shared with the marketplace through a formalized and documented process of discovery, such as a request for information (RFI).

Lesson #4: I am not certain whether the reports indicate that a legal finding on the appropriate color of money is or is not a sufficient defense, but they seem to. This can be a controversial topic within an organization and oftentimes yields differing opinions. Sometimes the situation can be corrected with the substitution of proper money for that fiscal year by higher authority. Some other examples of Anti-Deficiency Act (ADA) violations can be found via this link, published by the Defense Comptroller. I’ve indicated from my own experience how, going from one activity to another as a uniformed Navy Officer, I ran into Comptrollers with different opinions on the appropriate color of money for particular types of supplies and services at the same financial thresholds. They can’t all have been correct. I guess I am fortunate that over 23 years–18 of them as a commissioned Supply Corps Officer* and five before that as an enlisted man–I never ran into an ADA violation in any transaction in which I was involved. The organizations I was assigned to had checks and balances to ensure there was no statutory violation which, I may add, is a federal crime. Thus, no one should be cavalierly making this assertion as if it were simply an administrative issue. But not everyone in the chain is responsible, unless misconduct or criminal behavior across that chain contributed to the violation. I don’t see it in these reports. Systemic causes require systemic solutions and education.

Note that all of these lessons learned are taught as basic required knowledge in acquisition classes and in regulation. I also note that, in the reports, there are mitigating facts. It will be interesting to see what eventually comes out of this.
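Returning to Lesson #1 for a moment: requirement splitting is also one of the easier abuses to screen for after the fact, because it leaves a pattern in the transaction data. What follows is a minimal, hypothetical sketch in Python–the threshold, look-back window, and record fields are my own illustrative assumptions, not values drawn from the reports or from any DCMA system–of the kind of automated check an acquisition data system could run to flag candidate split requirements for human review.

from collections import defaultdict
from datetime import date, timedelta

# Illustrative values only: the threshold and window are assumptions, not policy figures.
THRESHOLD = 250_000          # review/approval threshold in dollars (hypothetical)
WINDOW = timedelta(days=90)  # look-back window for related actions (hypothetical)

def flag_possible_splits(actions):
    """actions: list of dicts with 'vendor', 'awarded' (date), and 'amount' keys.
    Returns groups of same-vendor actions that each fall below the threshold
    but together exceed it within the look-back window."""
    flagged = []
    by_vendor = defaultdict(list)
    for action in actions:
        by_vendor[action["vendor"]].append(action)
    for vendor, acts in by_vendor.items():
        acts.sort(key=lambda a: a["awarded"])
        for i, first in enumerate(acts):
            group = [a for a in acts[i:] if a["awarded"] - first["awarded"] <= WINDOW]
            total = sum(a["amount"] for a in group)
            if len(group) > 1 and total > THRESHOLD and all(a["amount"] < THRESHOLD for a in group):
                flagged.append((vendor, group, total))
                break  # one flag per vendor is enough to trigger a manual review
    return flagged

if __name__ == "__main__":
    sample = [
        {"vendor": "Acme IT", "awarded": date(2018, 1, 10), "amount": 180_000},
        {"vendor": "Acme IT", "awarded": date(2018, 2, 20), "amount": 150_000},
        {"vendor": "Other Co", "awarded": date(2018, 3, 1), "amount": 40_000},
    ]
    for vendor, group, total in flag_possible_splits(sample):
        print(f"Review: {vendor} – {len(group)} actions totaling ${total:,.0f}")

No automated flag is a substitute for the checks and balances discussed in Lesson #2, but a screen of this kind at least makes the pattern visible to someone outside the chain that directed the procurement.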

Back to School Daze Blogging–DCMA Investigation on POGO, DDSTOP, $600 Ashtrays, and Epistemic Sunk Costs

Family summer visits and trips are in the rear view–as well as the simultaneous demands of balancing the responsibilities of a, you know, day job–and so it is time to take up blogging once again.

I will return to my running topic of Integrated Program and Project Management in short order, but a topic of more immediate interest concerns the article that appeared on the website for pogo.org last week entitled “Pentagon’s Contracting Gurus Mismanaged Their Own Contracts.” Such provocative headlines are part and parcel of organizations like POGO, which have an agenda that seems to cross the line between reasonable concern and unhinged outrage with a tinge of conspiracy mongering. But the content of the article itself is accurate and well written, if also somewhat rife with overstatement, so I think it useful to unpack what it says and what it means.

POGO and Its Sources

The article draws on three sources regarding an internal Defense Contract Management Agency (DCMA) IT project known as the Integrated Workflow Management System (IWMS): a September 2017 preliminary investigative report, an April 2018 internal memo, and a draft of the final report.

POGO begins the article by stating that DCMA administers over $5 trillion in contracts for the Department of Defense. The article erroneously asserts that it also negotiates these contracts, apparently not understanding the process of contract oversight and administration. The cost of IWMS was apparently $46.6M and the investigation into the management and administration of the program was initiated by the then-Commander of DCMA, Lieutenant General Wendy Masiello, shortly before she retired from the government in May 2017.

The implication here, given the headline, seems to be that if there is a problem in internal management within the agency, then that would translate into questioning its administration of the $5 trillion in contract value. I view it differently, given that I understand that there are separate lines of responsibility in the agency that do not overlap, particularly in IT. Of the $46.6M there is a question of whether $17M in value was properly funded. More on this below, but note that, to put things in perspective, $46.6M is 0.000932% of DCMA’s oversight responsibility. This is aside from the fact that the comparison is not quite correct, given that the CIO had his own budget, which was somewhat smaller and unrelated to the $5 trillion figure. But I think it important to note that POGO’s headline and the introduction of figures, while sounding authoritative, are irrelevant to the findings of the internal investigation and draft report. This is a scare story using scare numbers, particularly given the lack of context. I had some direct experience in my military career with issues inspired by POGO’s founders’ agenda that I will cover below.

In addition to the internal investigation on IWMS, there was also an inspector general (IG) investigation of thirteen IT services contracts that resulted in what can only be described as pedestrian procedural discrepancies that are easily correctable, despite the typically overblown language found in most IG reports. Thus, I will concentrate in this post on the more serious findings of the internal investigation.

My Own Experience with DCMA

A note at this point on full disclosure: I have done business with, and continue to do business with, DCMA as a paid supplier of software solutions, and I have interacted with DCMA personnel at publicly attended professional forums and workshops. I have no direct connection, as far as I am aware, to the IWMS program, though given that the assessment pertains to the IT organization, it is possible that there was an indirect relationship. I have met Lieutenant General Masiello and dealt with some of her subordinates, not only during her time at DCMA but also in some of her previous assignments in the Air Force. I always found her to be an honest and diligent officer and respect her judgment. Her distinguished career speaks for itself. I have talked on the telephone to some of the individuals mentioned in the article on unrelated matters, and was aware of their oversight of some of my own efforts. My familiarity with all of them was both businesslike and brief.

As a supplier to DCMA my own contracts and the personnel that administer them were, from time-to-time, affected by the fallout from what I now know to have occurred. Rumors have swirled in our industry regarding the alleged mismanagement of an IT program in DCMA, but until the POGO article, the reasons for things such as a temporary freeze and review of existing IT programs and other actions were viewed as part and parcel of managing a large organization. I guess the explanation is now clear.

The Findings of the Investigation

The issue at hand largely surrounds the method of source selection, which may have constituted a conflict of interest, and the type of money that was used to fund the program. In reading the report I was reminded of what Glen Alleman recently wrote in his blog post entitled “DDSTOP: The Saga Continues.” The acronym DDSTOP means: Don’t Do Stupid Things On Purpose.

There is actually an economic behavioral principle for DDSTOP that explains why people make and double down on bad decisions and irrational beliefs. It is called epistemic sunk cost. It is what causes people to double down in gambling (to the great benefit of the house), to persist in mistaken beliefs, and, as stated in the link above, to “persist with the option which they have already invested in and resist changing to another option that might be more suitable regarding the future requirements of the situation.” The findings seem to document a situation that fits this last description.

In going over the findings of the report, it appears that the IWMS program involved the following:

a. Contractual efforts in the program that were appropriate for the use of Research, Development, Test and Evaluation (RDT&E) funds, as opposed to those appropriate for O&M (Operations and Maintenance) funds–what the U.S. Department of Defense calls “color of money.”

b. Amounts expended on contract that exceeded the authorized funding documents, largely based on the findings regarding the appropriate color of money. This would constitute a serious violation of the Anti-Deficiency Act which, in layman’s terms, exists to punish public employees for the misappropriation of government funds.

c. Expended amounts of O&M that exceeded the authorized levels.

d. Poor or non-existent program management and cost performance management.

e. Inappropriate contracting vehicles that, taken together, sidestepped more stringent oversight, aside from the award of a software solutions contract to the same company that defined the agency’s requirements.

Some of these findings are procedural, and some, particularly the Anti-Deficiency Act (ADA) violations, are serious. In the Contracting Officer’s rulebook, you can withstand pedestrian procedural and administrative findings that are part and parcel of running an intensive contracting organization that acquires a multitude of supplies and services under deadline. But an ADA violation is the deadly one, since it is a violation of statute.

As a result of these findings, the recommendation is for DCMA to lose acquisition authority above the DoD micro-contracting level ($10,000). Organizationally and procedurally, this is a significant and mission-disruptive recommendation.

The Role and Importance of DCMA

DCMA performs an important role in contract compliance and oversight to ensure that public monies are spent properly and for the intended purpose. They perform this role mostly on contracts that are negotiated and entered into by other agencies and the military services within the Department of Defense, where they are assigned contract administration duties. Thus, the fact that DCMA’s internal IT acquisition systems and procedures were problematic is embarrassing.

But some perspective is necessary because there is a drive by some more extreme elements in Congress and elsewhere that would like to see the elimination of the agency. I believe that this would be a grave mistake. As John F. Kennedy is quoted as having said: “You don’t tear your fences down unless you know why they were put up.”

For those of you who were not around prior to the formation of DCMA or its predecessor organization, the Defense Contract Management Command (DCMC), it is important to note that the formation of the agency is a result of acquisition reform. Prior to 1989 the contract administration services (CAS) capabilities of the military services and various DoD offices varied greatly in capability, experience, and oversight effectiveness. Some of these duties had been assigned to what is now the Defense Logistics Agency (DLA), but major acquisition contracts remained with the Services.

For example, when I was on active duty as a young Navy Supply Corps Officer, part of the first class that was to be the Navy Acquisition Corps, I was taught cradle-to-grave contracting. That is, I learned to perform customer requirements development, economic analysis, contract planning, development of a negotiating position, contract negotiation, and contract administration–soup to nuts. The expense involved in developing and maintaining the skill set required of personnel to maintain such a broad-based expertise is unsustainable. By analogy, it is as if every member of a baseball club had to be able to play all nine positions at the same level of expertise; it is impossible.

Furthermore, under contract administration a defense contractor would have contractual obligations for oversight in San Diego, where I was stationed, that differed from contracts awarded in Long Beach or Norfolk or any of the other locations where a contracting office was located. The military services, each having its own organizational culture, provided additional variations that created a plethora of unique requirements, adding cost, duplication, inconsistency, and inter-organizational conflict.

This assertion is more than anecdotal. A series of studies was commissioned in the 1980s (the findings of which were subsequently affirmed) to eliminate duplication and inconsistency in the administration of contracts, particularly major acquisition programs. Thus, DCMC was first established under DLA and subsequently became its own agency. Having inherited many of the contracting field offices, the agency has struggled to consolidate operations so that CAS is administered in a consistent manner across contracts. Because contract negotiation and program management still reside in the military services, there is a natural point of conflict between the services and the agency.

In my view, this conflict is a healthy one, as all power in the hands of a single individual, such as a program manager, would lead to more fraud, waste, and abuse, not less. Internal checks and balances are necessary in proper public administration, where some efficiency is sacrificed to accountability. It is not just the goal of government to “make the trains run on time”, but to perform oversight of the public’s money so that there is accountability in its expenditure, and integrity in systems and procedures. In the case of CAS, it is to ensure that what is being procured actually gets delivered in conformance to the contract terms and conditions designed to reduce the inherent risk in complex acquisition programs.

In order to do its job effectively, DCMA requires innovative digital systems to allow it to perform its CAS function. As a result, the agency must also possess an acquisition capability. Given the size of the task at hand in performing CAS on over $5 trillion of contract effort, the data involved is quite large and the personnel are geographically distributed. The inevitable comparisons to private industry will arise, but few companies in the world have to perform this level of oversight on such a large economic scale, which includes contracts comprising every major supplier to the U.S. Department of Defense, involving detailed knowledge of the management control systems of those companies that receive the taxpayer’s money. Thus, this is a uniquely difficult job. When one understands that in private industry the standard failure rate of IT projects is more than 70 percent, then one cannot help but be unimpressed by these findings, given the challenge.

Assessing the Findings and Recommendations

There is a reason why internal oversight documents of this sort stay confidential–these are preliminary and draft findings, and there are two sides to every story that may lead to revisions. In addition, reading these findings without the appropriate supporting documentation can lead one to the wrong impressions and conclusions. But it is important to note that this was an internally generated investigation. The checks and balances of management oversight that should occur did occur. But let’s take a close look at what the reports indicate so that we can draw some lessons. I also need to mention here that POGO’s presentation of the specific issues in this program as a “poster child” for cost overruns and schedule slippage displays a vast ignorance of DoD procurement systems on the part of the article’s author.

Money, Money, Money

The core issue in the findings revolves around the proper color of money, which seems to hinge on the definition of Commercial-Off-The-Shelf (COTS) software and the effort that was expended using the two main types of money that apply to the core contract: RDT&E and O&M.

Let’s take the last point first. It appears that the IWMS effort consisted of a combination of COTS and custom software. This would require acquisition, software familiarization, and development work. It appears that the CIO was essentially running a proof-of-concept to see what would work, and then incrementally transitioned to developing the solution.

What is interesting is that there is currently an initiative in the Department of Defense to do exactly what the DCMA CIO did as part of his own initiative in introducing a new technological approach to create IWMS. It is called Other Transaction Authority (OTA). This authority for prototype projects was not codified until the 2016 NDAA, which gave it specific statutory grounding under 10 U.S.C. 2371b. This doesn’t excuse the actions that led to the findings, but it is interesting that the CIO, in taking an incremental approach to finding a solution, also did exactly what was recommended in the 2016 GAO report that POGO references in their article.

Furthermore, as a career Navy Supply Corps Officer, I have often gotten into esoteric discussions in contracts regarding the proper color of money. Despite the assertion of the investigation, there is a lot of room for interpretation in the DoD guidance, not to mention a stark contrast in interpreting the proper role of RDT&E and O&M in the procurement of business software solutions.

When I was on the NAVAIR staff and at OSD I ran into differences in military service culture: what Air Force financial managers often specified for RDT&E would never be approved by Navy financial managers, who specified that only O&M dollars applied, regardless of whether development took place. Given that there was an Air Force flavor to the internal investigation, I would be interested to know whether the opinion of the investigators in making an ADA determination would withstand objective scrutiny among a panel of government comptrollers.

I am certain that, given the differing mix of military and civil service cultures at DCMA–and the mixed colors of money that applied to the effort–a legal review was sought to resolve the issue. One of the principles of law is that when you rely upon legal advice to take an action, you have a defense, unless your state of mind and the corollary actions that you took indicate that you manipulated the system to obtain a result in a way that shows you intended to violate the law. I just do not see that here, based on what has been presented in the materials.

It is entirely possible that an inadvertent ADA violation occurred by default because of an improper interpretation of the use of the monies involved. This does not rise to the level of a scandal. But going back to the confusion that I faced in my own experiences on active duty, I certainly hope that this investigation is not used as a precedent for reviewing all contracts under an approach that accepts a post-hoc alternative interpretation by another individual–who just happens to be an inspector–long after a reasonable legal determination was made, regardless of how erroneous the new expert finds the opinion. This is not an argument against accountability, but absent corruption or criminal intent, a legal finding is a valid defense and should stand as the final determination for that case.

In addition, this interpretation of RDT&E vs. O&M relies upon an interpretation of COTS. I daresay that not even those who throw that term around, and who are familiar with the FAR, fully understand what constitutes COTS when the line between adaptability and point solutions is being blurred by new technology.

Where the criticism is very much warranted are those areas where the budget authority would have been exceeded in any event–and it is here that the ADA determination is most damning. It is one thing to disagree on the color of money that applies to different contract line items, but it is another to completely lack financial control.

Part of the reason for the lack of financial control was the absence of good contracting practices and the failure to impose program management.

Contracts 101

While I note that the CIO took an incremental approach to IWMS–what a prudent manager would seem to do–what was lacking was a cohesive vision and a well-informed culture of compliance with acquisition policy that would avoid even the appearance of impropriety and favoritism. Under the OTA authority that I reference above as a new aspect of acquisition reform, the successful implementation of a proof-of-concept does not guarantee the incumbent provider continued business–salient characteristics for the solution are publicized and the opportunity advertised under free and open competition.

After all, everyone has their favorite applications and, even inadvertently, an individual can act improperly because of selection bias. The procurement procedures are established to prevent abuse and favoritism. As a solution provider I have fumed quite often when a selection was made without competition based on market surveys or the use of a non-mandatory GSA contract, which usually turn out to be a smokescreen for pre-selection.

There are two areas of fault on IWMS from the perspective of acquisition practice, and another in relation to program management.

These are the initial selection of Apprio, which had laid out the initial requirements and subsequently failed to have the required integration functionality, and then, the selection of Discover Technologies under a non-mandatory GSA Blanket Purchase Agreement (BPA) contract under a sole source action. Furthermore, the contract type was not appropriate to the task at hand, and the arbitrary selection of Discover precluded the agency finding a better solution more fit to its needs.

The use of the GSA BPA, however, allowed managers to essentially split the requirements to stay below more stringent management guidelines–an obvious violation of acquisition regulation that will get you removed from your position. This leads us to what I think is the root cause of all of these clearly avoidable errors in judgment.

Program Management 101

Personnel in the agency familiar with the requirements to replace the aging procurement management system understood from the outset that the total cost would probably fall somewhere between $20M and $40M. Yet every effort seems to have been made to sidestep that reality by splitting requirements and failing to apply a programmatic approach to a clearly complex undertaking.

This would have required the agency to take the steps to establish an acquisition strategy, open the requirement based on a clear performance work statement to free and open competition, and then to establish a program management office to manage the effort and to allow oversight of progress and assessment of risks in a formalized environment.

The establishment of a program management organization would have prevented the lack of financial control, and would have put in place sufficient oversight by senior management to ensure progress and achievement of organizational goals. In a word, a good deal of the decision-making was based on doing stupid things on purpose.

The Recommendations

In reviewing the recommendations of the internal investigation, I think my own personal involvement in a very similar issue from 1985 will establish a baseline for comparison.

As I indicated earlier, in the early 1980s, as a young Navy commissioned officer, I was part of the first class of what was to be the Navy Acquisition Corps, stationed at the Supply Center in San Diego, California. I had served as a contracting intern and, after extensive education through the University of Virginia Darden School of Business, the extended Federal Acquisition Regulation (FAR) courses that were given at the time at Fort Lee, Virginia, and coursework provided by other federal acquisition organizations and colleges, I attained my warrant as a contracting officer. I also worked on acquisition reform issues, some of which were eventually adopted by the Navy and DoD.

During this time NAS Miramar was the home of Top Gun. In 1984 Congressman Duncan Hunter (the elder, not the currently indicted junior of the same name, though from the same San Diego district), inspired by news of a $7,600 coffee maker and a $435 hammer publicized by the founders of POGO, was given documents by a disgruntled employee at the base regarding the acquisition of replacement E-2C ashtrays that cost $300 each. He presented them to the Base Commander, and an investigation was launched.

I served on the JAG investigation under the authority of the Wing Commander regarding the acquisitions and then, upon the firing of virtually the entire chain of command at NAS Miramar, which included the Wing Commander himself, became the Officer-in-Charge of Supply Center San Diego Detachment NAS Miramar. Under Navy Secretary Lehman’s direction I was charged with determining the root cause of the acquisition abuses and given 60-90 days to take immediate corrective action and clear all possible discrepancies.

I am not certain who initiated the firings of the chain of command. From talking with senior personnel at the time, it appeared to have been instigated in a fit of pique by the sometimes volcanic Secretary of Defense Caspar Weinberger. While I am sure that Secretary Weinberger experienced some emotional release through that action, placed in perspective, his blanket firing of the chain of command was, in my opinion, poorly advised and counterproductive. It was also grossly unfair, given what my team and I found as the root cause.

First of all, the ashtray was misrepresented in the press as a $600 ashtray because, during the JAG investigation, I had sent a sample ashtray to the Navy industrial activity at North Island with a request to tell me what the fabrication of one ashtray would cost and to provide the industrial production curve that would reduce the unit price to a reasonable level. The figure of $600 was the cost to fabricate just one. A “whistleblower” at North Island took this slice of information out of context and leaked it to the press. So the $300 ashtray, which was bad enough, became the $600 ashtray.

Second, the disgruntled employee who gave the files to Congressman Hunter had been laterally assigned out of her position as a contracting officer by the Supply Officer for the very reason that the pricing of the ashtray was not reasonable, among other unsatisfactory performance measures indicating that she was not fit to perform those duties.

Third, there was a systemic issue in the acquisition of odd parts. For some reason there was an ashtray in the cockpit of the E-2C. These aircraft are able to stay in the air for an extended period of time. A pilot had actually decided to light up during a local mission and, his attention diverted, lost control of the aircraft and crashed. Secretary Lehman ordered corrective action. The corrective action taken by the squadron at NAS Miramar was to remove the ashtrays from the cockpits and store them in a hangar locker.

Fourth, there was an issue of fraud. During an inspection the spare ashtrays were removed and deposited in the scrap metal dumpster on base. The tech rep for the DoD supplier on base retrieved the ashtrays and sold them back to the government for the price of fabricating one, given that the supply system had not experienced enough demand to keep them in stock.

Fifth, back to the systemic issue. When an aircraft is to be readied for deployment there can be no holes representing missing items in the cockpit. A deploying aircraft in this condition is grounded and a high priority “casualty report” or CASREP is generated. The CASREP was referred to purchasing, which then paid $300 for each ashtray. The contracting officer, however, feeling pressured by the high priority requisition, did not do due diligence in questioning the supplier on the cost of the ashtray. In addition, given that several aircraft were deploying, there were a number of these requisitions that should have led the contracting officer to look into the matter more closely to determine price reasonableness.

Furthermore, I found that buying personnel were not properly trained, that systems and procedures were not established or enforced, that the knowledge of the FAR was spotty, and that procurements did not go through multiple stages of review to ensure compliance with acquisition law, proper documentation, and administrative procedure.

Note that in the end this “scandal” was born of a combination of systemic issues, poor decision-making, lack of training, employee discontent, and incompetence.

I successfully corrected the issues at NAS Miramar during the prescribed time set by the Secretary of the Navy, worked with the media to instill public confidence in the system, built up morale, established better customer service, reduced procurement acquisition lead times (PALT), recommended necessary disciplinary action where it seemed appropriate, particularly in relation to the problematic employee, recovered monies from the supplier, referred the fraud issues to Navy legal, and turned over duties to a new chain of command.

NAS Miramar procurement continued to do its necessary job and is still there.

What the higher chain of command did not do was to take away the procurement authority of NAS Miramar. It did not eliminate or reduce the organization. It did not close NAS Miramar.

It requires leadership and focus to take effective corrective action to not only fix a broken system, but to make it better while the corrective actions are being taken. As I outlined above, DCMA performs an essential mission. As it transitions to a data-driven approach and works to reduce redundancy and inefficiency in its systems, it will require more powerful technologies to support its CAS function, and the ability to acquire those technologies to support that function.

Take Me To The River, Part 2, Schedule Elements–A Digital Inventory of Integrated Program Management Elements

Recent speaking engagements at various forums have interrupted the flow of this series on IPM elements. At these venues I was engaged in discussions regarding this topic, as well as the effects of acquisition reform on the IT, program, and project management communities in the DoD and A&D marketplace.

For this post I will restrict the topic to what are often called schedule elements, though that is a nebulous term. Also, one should not draw the conclusion that, because I am dealing with this topic after cost elements, it is somehow inferior in importance. On the contrary, planning and scheduling are integral to applying resources and costs and to tracking cost performance, and in our systemic analysis its activities, artifacts, and elements are antecedent to cost element considerations.

The Relative Position of Schedule

But the takeaway here is this: under no circumstances should any program or project manager believe that cost and schedule systems represent a dichotomy, nor a hierarchy, of disciplines. They are interdependent and the behavior noted in one will be manifested in the other.

This is important to keep in mind, because the software industry, more than any other, has been responsible for reinforcing and solidifying this (erroneous) perspective. During the first generation of desktop application development, software solutions were built to automate the functions of traditional line and staff functions. This made a great deal of sense.

From a sales and revenue perspective, it is easier to sell a limited niche software “tool” to an established customer base that will ensure both quick acceptance and immediate realization of productivity and labor savings. The connection from the purchase to ROI was easily traceable in the time span and at the level of the person performing their workaday tasks.

Thus, solutions were built to satisfy the needs of cost analysts, schedule analysts, systems engineers, cost estimators, and others. Where specific solutions left gaps, spreadsheet solutions such as Microsoft Excel were employed to fill them. It was in no one’s interest to go beyond their core competency. Once a dominant incumbent or set of incumbents (in effect, an oligopoly) inhabited a niche, they employed the usual strategies for “stickiness” to defend territory and raise barriers to new entrants.

What was not anticipated by many organizations was that once you automate a function, the nature of the system–if one is to implement the most effective organizational structure–is transformed to conform to the most efficient flow and use of data, and its resulting transformation into information and intelligence. Oftentimes the skill set to use that intelligence does not exist, because the insights and synergy that come from larger, more comprehensive, more credible, and more accurate datasets were not anticipated when adjusting the organizational structure.

This is changing and must change, because the old way of using limited sets of data in the age of big(ger) data that provide a more comprehensive view of business conditions is not tenable. At least, not if a company or organization wants to stay relevant or profitable.

Characteristics and Basic Elements of the Project Schedule

If you were to perform a Google search on “project schedule” while reading this post, you would find a number of definitions, some of which overlap. For example, the PMBOK defines a schedule as, quite simply, “the planned dates for performing activities and the planned dates for meeting milestones.”

Thus our elements include planned dates, activities, and milestones. But is that all? Under this definition, any kind of plan, from a minor household renovation or upgrade to building an aircraft carrier would contain only these elements.

I don’t think so.

For complex projects and programs, which are the focus of this blog, our definition of a project schedule is a bit more comprehensive. If you go to the aforementioned A Guide for DoD Program Managers mentioned in my last post, you will find even less specificity.

The reason for this is that what we define as a project schedule is part and parcel of the planning phase of a project, which is then further specified in the specific time-phased planning elements for execution of the project through its lifespan into production. It is the schedule that ties together all of the disciplines in putting together a project–acquisition, systems engineering, cost estimating, and project performance management.

A refrain I have heard in attending schedule-focused conferences over the years, and in talking to program management colleagues, is that:

a. It is hard to find a good scheduler, and

b. Constructing a schedule is more of an art than a science.

I can only say that this cedes the field to a small cadre of personnel who perform an essential function, but who do so with few objective tests of effectiveness or accountability–until it is too late.

But the reality is quite different from the fuzzy perception of schedule that is often assumed. All critical path method (CPM) schedules describe the same phenomena, though the lexicon will vary based on the specific proprietary application employed.

In government-focused and large commercial projects, the schedule is the heart of planning and execution. In the DoD world it is known as the Integrated Master Schedule (IMS), which utilizes the inherent bottom-up relationships of elements to determine the critical path. The main sources regarding the IMS have a great deal of overlap, but tend to be either aspirational (and unfortunately not prescriptive in defining the basic characteristics of an IMS) or to reflect the “art over science” approach. For those following along, these are the DoD Integrated Master Plan and Integrated Master Schedule Preparation and Use Guide of 21 October 2005, the NAVAIR Integrated Master Schedule (IMS) Guidebook of February 2010, and the NDIA Planning and Scheduling Excellence Guide (PASEG) of 9 March 2016 (unfortunately no current direct link).

The key elements that comprise an IMS, in addition to what we identified under the PMBOK, are that it is a networked schedule consisting of specific durations assigned to specific work tasks that must be accomplished in discrete work packages. In most cases these durations will be derived either by some kind of fixed, manual method or through the inherent optimization algorithm applied by the CPM application. More on this below. These work packages are discrete, meaning that they represent the full scope of the work that must be accomplished during the specified duration for the creation of an end product. Discrete work is distinguished from level of effort (LOE) work, the latter being effort that is always expended, such as administrative and management tasks, that is not directly tied to the accomplishment of an end product.

These work packages are tied together to illustrate antecedent and progressive work through predecessor and successor relationships. Long-term planning activities, which cannot be fleshed out until more immediate work is completed, are set aside as placeholders called planning packages. Each of the elements tracked in the IMS is based on established criteria that define completion, events, and specific accomplishments.

The most comprehensive IMSs consist of detailed planning that includes resources and elements of cost.
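To make these characteristics concrete, the sketch below shows one way the elements just described–discrete work packages, LOE effort, planning packages, durations, predecessor links, completion criteria, and optional resources and elements of cost–might be represented as structured data. It is written in Python, and the field names are purely illustrative; they correspond to no particular scheduling tool and are not the DiD data elements discussed in the next section.

from dataclasses import dataclass, field
from enum import Enum

class TaskType(Enum):
    DISCRETE = "discrete"          # produces a defined end product
    LOE = "level_of_effort"        # ongoing effort such as management or administration
    PLANNING_PACKAGE = "planning"  # far-term placeholder not yet fleshed out

@dataclass
class WorkPackage:
    wp_id: str
    name: str
    duration_days: int                                          # assigned or engine-derived duration
    task_type: TaskType = TaskType.DISCRETE
    predecessors: list[str] = field(default_factory=list)       # finish-to-start links by wp_id
    completion_criteria: str = ""                               # the accomplishment that defines "done"
    resources: dict[str, float] = field(default_factory=dict)   # labor hours by resource, if planned
    budgeted_cost: float = 0.0                                   # element of cost, if planned

# A tiny networked schedule: B depends on A, D closes out the network, and C is LOE.
schedule = [
    WorkPackage("A", "Define requirements", 10, completion_criteria="PWS approved"),
    WorkPackage("B", "Build prototype", 20, predecessors=["A"],
                resources={"developer": 320.0}, budgeted_cost=48_000.0),
    WorkPackage("C", "Program management", 30, task_type=TaskType.LOE),
    WorkPackage("D", "Integration test", 5, predecessors=["B"],
                completion_criteria="Test report accepted"),
]

Nothing about this structure is standard; the point is simply that the IMS is, at bottom, structured data that can be validated and exchanged, not an artifact locked inside a proprietary tool.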

Detailed Elements of the IMS

Given these general elements, the best source of identifying the key elements of detailed schedules is also found in Department of Defense documents. The core document in this case is the Data Item Description for the IMS numbered as DI-MGMT-81650. The latest one is dated March 30, 2005. There are a minimum of 32 data elements, some of these already mentioned and which I will not repeat in this post since they are pretty well listed and identified in the source document.

For those not familiar with these documents, Data Item Descriptions (or DiDs–gotta love acronyms) represent the detailed technical documents for artifacts involved in the management of DoD-related operations. Thus, this provides us with a pretty good inventory of elements to source. But there are others that are implied.

For example, the 81650 DiD identifies an element known as “methodology.” What this means is that each scheduling application has an optimization engine, which is where the true differences in schedule construction and intellectual property reside. The elements that affect these calculations are time-based, duration-based, related to float and slack, or related to resources.

These time-based elements consist of early start, early finish, late start, late finish. Duration-based elements consist of shortest time, longest time, greatest rank weight. An additional element related to schedule float identifies minimum slack. Resources are further delineated by the greatest work content and the greatest cumulative resource content.

I would note that the NDIA PASEG adds some sub-elements to this list that are based on the algorithmic result of the schedule engines and, thus, tends to ignore the antecedent salient elements of validating the optimization engine found above. These additional sub-elements are total float, free float, soft constraints, hard constraints, and–also found in the aforementioned DiD–program, task, and resource calendars.
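Because the time-based and float elements above are exactly what any CPM engine computes, a bare-bones illustration may help take some of the mystery out of the “art over science” perception. The sketch below, in Python, is deliberately simplified: finish-to-start links only, with no calendars, lags, constraints, or resource leveling–precisely the areas where the proprietary optimization engines differ. The task network and durations are invented for illustration.

# A minimal critical path method (CPM) pass over a four-task network.
durations = {"A": 10, "B": 20, "C": 7, "D": 5}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

# Forward pass: early start (ES) and early finish (EF).
ES, EF = {}, {}
for t in ("A", "B", "C", "D"):                      # tasks listed in dependency order
    ES[t] = max((EF[p] for p in predecessors[t]), default=0)
    EF[t] = ES[t] + durations[t]

project_finish = max(EF.values())

# Backward pass: late finish (LF) and late start (LS).
successors = {t: [s for s in durations if t in predecessors[s]] for t in durations}
LF, LS = {}, {}
for t in reversed(("A", "B", "C", "D")):
    LF[t] = min((LS[s] for s in successors[t]), default=project_finish)
    LS[t] = LF[t] - durations[t]

# Total float (slack) and free float; zero-total-float tasks form the critical path.
for t in durations:
    total_float = LS[t] - ES[t]
    free_float = min((ES[s] for s in successors[t]), default=project_finish) - EF[t]
    status = "critical" if total_float == 0 else f"total float={total_float}"
    print(f"{t}: ES={ES[t]} EF={EF[t]} LS={LS[t]} LF={LF[t]} free float={free_float} ({status})")

Everything a scheduling tool adds beyond this–constraint types, multiple calendars, lags and leads, resource leveling, and the weighting methods listed above–changes these numbers, which is why validating the engine and its settings matters as much as validating the data fed into it.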

Normally, this is where a survey would end–with schedule-specific data elements focused on the details of the schedule. But we’re going to challenge our assumptions a bit more.

Framing Assumptions of Schedules and Programs

The essential document that provides a definition of the term “framing assumption” was published by the RAND Corporation in 2014, entitled Identifying Acquisition Framing Assumptions Through Structured Deliberation, by Mark V. Arena and Lauren A. Mayer. The definition of a framing assumption is “any explicit or implicit assumption that is central in shaping cost, schedule, or performance expectations.”

As I have explored in my prior post, the use of the term “cost” is a fuzzy one. To some it means earned value management, which measures a small part of the costs of development and ownership of a system. To others it means total cost of ownership. Schedule is an implicit part of this definition, and then we have performance expectations, which I will deal with in a separate post.

But we can apply the concept of framing assumptions in two ways.

The first applies to the assumed purpose of the schedule. Why do we construct one? This goes back to my earlier statement that “…the schedule…ties together all of the disciplines in putting together a project–acquisition, systems engineering, cost estimating, and project performance management.”

For the NDIA PASEG the IMS is a “tool, not just a report” that “provides an ever-changing window into the progress (or lack of it) of current work effort. The strategic mission of the schedule is to point out future risks and opportunities.”

For the NAVAIR IMS Guide the IMS “At a top level…contain(ing) the networked, detailed tasks necessary to ensure successful program execution…” that “capture(s) project tasks and task relationships”, “show(s) the magnitude and how long each task will take”, “show(s) resources, durations, and constraints for each task” and “show(s) the critical path.”

For the DiD 81650 “The Integrated Master Schedule (IMS) is an integrated schedule containing the networked, detailed tasks necessary to ensure successful program execution.”

But the most comprehensive definition, one that goes to the core of the purpose of an IMS, can be found in paragraph 1.2 of the DoD Integrated Master Plan and Integrated Master Schedule Preparation and Use Guide (IMP/IMS Guide). The elements of this purpose are worth transcribing, because if we have a requirement and cannot answer the “So What?” question–that is, if we cannot effectively determine why something must be done–then it probably does not need to be done (or we need to apply rigor in the development of our expertise).

For what the IMP/IMS Guide does is clearly tie the schedule to the programmatic framing assumptions (used in the context in which RAND meant it) from initial acquisition through planning. Thus, the Integrated Master Plan (IMP) is firmly established as an antecedent and intermediate planning process (not merely an artifact or tool), that results in the program R&D execution process.

Taken as a whole, these processes and their resulting artifacts:

a. Provide offerors and acquiring activities with detailed execution planning, organization, and scheduling information that sets realistic expectations for the resulting contract action.

b. Serve as the execution plan for how the supplier will meet the contract’s performance requirements within cost and schedule constraints.

c. Provide a basis for integrating all of the functions involved in development and deployment of the system being acquired and, after award, set the framing assumptions of the program.

d. Provide the basis for determining and assessing progress, identifying risks, determining the basis for contractual award fees and penalties, assessing progress on Key Performance Parameters (KPPs) and Technical Performance Measures (TPMs), determining alternative paths to project completion, and identifying opportunities for innovation and new acquisitions not apparent at the time of the award.

What all of this means is that the Integrated Master Schedule is too important to be left to the master scheduler. Yes, the schedule is a “tool” to those at the most basic tactical level in work execution. Yes, it is also an artifact and record.

But, more importantly, it is the comprehensive notional representation of the project’s or program’s scope, effort, progress, and assessment.

Private and Government-focused Industry Practice

A word has to be mentioned here about the difference between purely private industry practice in managing large projects and programs, and the skew in these posts toward industries focused on public sector acquisition.

Among the schedule elements listed earlier there is reference to resources and elements of cost, yet this is an area where standard practice diverges. In private industry the application of resource assignments to specific work is standard practice and found in the IMS.

In companies focused on the public sector and DoD, the practice is to establish a different set of data outside of the schedule to manage resources. Needless to say this creates problems of validation of data across disparate systems related to the lowest level of planning and execution of a project or program. The basis for it, I think, relates to viewing the schedule as a “tool” and not the basis for project execution. This “tool” mindset also allows for separate “earned value engines” that oftentimes do not synchronize with the execution of the schedule, not only undermining the practical value of both, but also creating systems complexity and inefficiency where none need exist.

Another gap found in many areas of public acquisition concerns the development of an integrated master plan antecedent to the integrated master schedule. The cause here, once again, I believe, is viewing the discipline of systems engineering as separate–somehow walled off from the continuing assessment of program execution–though that assumption is not supported by program phasing and milestone planning and achievement.

From the perspective of Integrated Program/Project Management, these considerations cannot be ignored, and so our inventory of essential data elements must include elements from these practices.

But Wait! There’s More!

Most discussions at conferences and professional meetings will usually stop at this point–viewing cost and schedule integration as the essence of IPM–with “cost” limited to EVM. Some will add some “oh by the ways” such as technical performance and risk. I will address these in the next post as well.

But there are also other systems and processes that are relevant to our inventory. But what I have covered thus far in this series should challenge you if you have been paying attention.

I tackled cost first because of the assumptions implicit in equating it with EVM, and then went on to demonstrate that there are other elements of cost that provide a more comprehensive view. This is not to denigrate the value of EVM, since it is an essential process in project management, but to demonstrate that its analytics are not comprehensive and, as with any complex system, require the contribution of additional information, depending on the level and type of work performance and progress being recorded and assessed.

In this post I have tackled the IMS, and have demonstrated that it is not a supplementary process, but central to all other processes and actions being taken in the execution of the project or program. Many times people enter the schedule from an assessment of cost performance–tracing cost drivers to specific schedule activities and then to tasks. But this has it backwards, an approach based on the best technology available sometime in the late 1990s.

It is the schedule that brings together all relevant information from our execution and control processes and systems. It seems to me that perhaps the first place one goes is the schedule, that the first element to trace are those related to schedule slippage and unexpected resource consumption, and then to trace these to contract cost impact.

But, of course, there is more–and these other elements may turn out to be of greater consequence than just cost and schedule considerations. More on these in my next post.

In Closing: Battle Rhythm and the Plans of the Day and Week

When I was on active duty in the Navy we planned our days and weeks around a Plan of the Day or Plan of the Week. This is a posted agenda so that the entire ship or command understands the major events that affect its operations. It establishes focus on the main events at hand and fosters communication both laterally and vertically within the chain of command.

As one rises in rank and responsibility it is important to understand the operational tempo of the unit or ship, its systems, and subsystems. This is important in avoiding crisis management. This is known as Battle Rhythm.

Baked into the schedule (assuming proper construction and effective integrated product teaming) are the major events, milestones, and expected achievements of the program or project. Thus, there are events that should be planned around, and anticipated, on a daily, weekly, biweekly, monthly, quarterly, and major-milestone basis.

Given an effective battle rhythm, a PM should never complain about performance and progress indicators “looking into the rear view mirror.” If that is the case, then perhaps the PM should look at the effectiveness and timeliness of the underlying project and program systems. Thus, when a PMO complains of information and intelligence being too late to be actionable, it is actually describing a condition of ineffective, latent, and disjointed information and intelligence systems.

Thus, our next step in our next post is to identify more salient IPM elements that cut to the heart of the matter.