Alito and the Unraveling of Originalism

During the current period of internal debate on Dobbs v. Jackson Women’s Health Organization, the public was given a view into the thinking of the “conservative” justices of the Supreme Court on abortion and personal liberty through a leaked version of Justice Alito’s draft majority opinion. I place quotation marks around the word conservative because the decision is anything but conservative in its scope and effect.

Despite much hand-wringing, the document provides much-needed insight into the thinking of the Court majority on an issue of great public weight: personal liberty. Despite the criticism and an announced investigation, there is virtually no defensible reason why the process of the Court should not be open, as it is for almost every other branch of government. A leak of its working documents on so weighty a matter seems a small price to pay for transparency and accountability. Moreover, this is not the first time a Supreme Court decision has been scooped, and it won’t be the last.

The only difference is the amount of hot air spent on the leak because the decision itself is so odious. The pattern of the current Court has been to issue guerrilla docket decisions, like a thief in the night. Forewarning of the denial of liberty to more than half of the country’s population has, rightly, caused a reaction that is only a preview of the storm to come.

It is true that, in theory, judicial responsibility is a contemplative and deliberate process that requires some measure of reflection and give-and-take. Under this ideal view, reconciling the perspectives of nine justices to produce a cohesive opinion is both a fraught and sensitive process. Would that we lived in such an ideal, theoretical world.

Were the theoretical view of the judicial process true, any revocation of a liberty through judicial fiat would be considered unusual and would require extraordinary circumstances and subtlety. There is none of that in the draft opinion or in the case before the Court. The reason for this condition is the extraordinary lengths to which the reactionary right in this country has gone to degrade previously trusted institutions.

The reality is that the Supreme Court is a political institution made up of a mix of respected and level-headed jurists, fanatics with a political agenda, and mediocre political tools. This argues for a more transparent process paired with a strong, binding code of judicial ethics that applies to the Court’s membership. Sunshine and democracy, bolstered by the balance and separation of powers, are a curative for most civic ailments. This is no exception.

The degradation of the Court didn’t begin with Donald Trump, but no one, except perhaps Richard Nixon, has been as effective at degrading the integrity and effectiveness of everything he touched as Trump. The man epitomizes the inverse Midas touch: everything he touches turns to excrement, and it is only excrement in the form of judicial opinion that seems to come from this Court.

This current state can only be expected given that he, aided and abetted by the likes of Senators Lindsey Graham and Mitch McConnell, has, more than any previous president, appointed proponents of banal thoughtlessness to the Court and to the lower courts, selected clearly unqualified candidates for lifetime judicial appointments, and demanded personal loyalty and political fealty in the area of jurisprudence, like a mob boss or tin-horn caudillo.

If there is any defining judicial philosophy that the Court’s “conservatives” assert, it is that they are following the concept of originalism. Taking these assertions at face value, it is therefore fair to evaluate the efficacy and validity of this concept as it is applied.

What Art Thou, Originalism? Liberty and Other Rights in Dobbs

According to Justice Barrett, the latest of its adherents appointed to the Court, originalism is the view that the “constitutional text means what it did at the time it was ratified and that this original public meaning is authoritative.” The late Justice Antonin Scalia stated, in a 1988 lecture explaining why he was an originalist, “The main danger in judicial interpretation of the Constitution is that the judges will mistake their own predilections for the law.”

While good in theory, the Constitution doesn’t always stand up to a clear-language or clear-meaning test suspended in time. Many sections and amendments were written expansively, with the intent that future generations and Congresses would find the balance in applying the broad principles enumerated. Archaic language, which in many cases carries several meanings rather than one, also plays a role.

Before rising to the bench, Barrett herself noted this conundrum in approaching modern developments through an originalist’s lens: “Adherence to originalism arguably requires, for example, the dismantling of the administrative state, the invalidation of paper money, and the reversal of Brown v. Board of Education.” But, she asserts, there are past decisions that “no serious person would propose to undo even if they are wrong.”

Apparently, the vague “serious person” rule does not apply to the 49-year-old Roe decision. But the most appalling part of her assertions lies in a certainty, rising to the level of arrogance, that she possesses jurisprudential knowledge superior to that of the many justices who, over almost fifty years, decided Roe and Casey and the cases that have upheld them.

Her background, while demonstrating competence in her various limited assignments and appointments, does not reveal a particularly brilliant or independent legal mind. Over the course of her private practice, her academic career, and her work at the Seventh Circuit Court of Appeals, nothing would mark her as eminently qualified to serve on the U.S. Supreme Court, apart from her fawning obsequiousness to the originalist philosophy of the late Justice Scalia, for whom she clerked. Overall, her writing shows that she is polemical, as opposed to analytical, in her approach to legal issues.

While she seems workmanlike and generally likeable, nothing distinguishes her except an adherence to a questionable and rigid legal philosophy; a pro-big-business, anti-environmental, and anti-labor bias; and the occasional interjection of her personal religious beliefs into legal issues. Her opinions on the Seventh Circuit neither were consistent nor articulated a coherent legal approach to justice. On the contrary, they align with the recent trend among other originalist jurists of basing their arguments on a pre-determined outcome. This is known as “begging the question.”

But for the sake of argument, let’s give the devil his due. What are the elements that lead an originalist to reverse a 49-year-old precedent? The draft opinion discusses them at length; I have outlined them as follows:

  • Until Roe, the various States addressed the issue of abortion and women’s reproductive liberty.
  • The Constitution “makes no mention of abortion,” and “no such right is implicitly protected…”
  • The Roe Court issued a set of rules for applying the decision, based on pregnancy trimesters and fetal “viability,” that have the appearance of statutory language.
  • The decision ended the ability of the States to regulate abortion and women’s reproductive liberty in this manner.
  • Planned Parenthood v. Casey revised Roe, negating the trimester standard and substituting in its place an “undue burden” standard on State action.
  • The Due Process Clause of the Fourteenth Amendment does not protect abortion as one of the rights not enumerated in the Constitution; the case of Washington v. Glucksberg (1997) determines what it does protect.
  • Under Glucksberg, a right must be “deeply rooted in this Nation’s history and tradition” and “implicit in the concept of ordered liberty.”
  • Abortion does not pass the Glucksberg test, since the right was not recognized until 1973.
  • The Equal Protection provisions of the Fourteenth Amendment do not apply either, especially to women as it relates to abortion.
  • The right to “privacy” is not absolute and does not apply to abortion, since there are competing interests, and these should be resolved by the States.

In order to support the denial of Fourteenth Amendment protection for abortion, Alito queries the state of abortion law in 1868, the year the amendment was ratified. In doing so, he cites tradition as processed through the selective legal writings of Bracton, Coke, Hale, Blackstone, and others to assert that abortion prior to “quickening” (the point at which fetal movement is first perceived, a concept distinct from the viability standard of the original Roe decision) was not a consideration in common law.

In supporting his critique that abortion does not meet the Glucksberg test, Alito goes back to the 13th century, to a time when even the rights accepted at this country’s founding in the 18th century would not be recognized. In criticizing both the legal precedent and the common law precedent used in Roe, Alito simply draws a line in favor of these archaic beliefs, says that the Court in 1973 was erroneous and, in a classic bit of hand-waving, that its reasoning “makes no sense.” He also asserts that the issue was extremely contentious in 1973, an assertion contradicted by the recollection of anyone who was alive in that year. On the contrary, the finding in Roe seemed all but a foregone conclusion, with even most Catholics and the majority of Protestant religious organizations supporting the right to varying degrees.

The Conceit of Originalism

This opinion, as noted, is a mess of historical inaccuracies, tautology, and hand-waving. It is a good thing it is a draft; one cannot help but wonder whether it is just a strawman and whether the Court will uphold Roe in general while modifying the rule, as was done in Casey. But let’s take the assertions in groupings:

First, the statement that the States regulated abortion until 1973 is both a statement of fact and a statement of the problem that Roe sought to address. It adds no value to Alito’s argument and does not pass the “so-what” test. A legal opinion isn’t one until it is made; the Court took up the case in 1973 because of the compelling and important underlying issues presented.

Also in this group are the general statements that Roe established trimester guidelines based on fetal viability, which addressed the issue of personhood (discussed in more detail below), and that the ruling ended the ability of the States to write their own laws on abortion. Again, a statement from Captain Obvious: this last was the very purpose of the decision. The Casey revisions, which expanded State discretion in regulation, are likewise a matter of record.

Second, the statement regarding abortion not being mentioned in the Constitution is an equivocation. This assertion was also made by Senator Graham and other “conservatives” during the confirmation hearing of Judge Ketanji Brown Jackson, and so one cannot help but wonder about the collusion among like-minded officials in Washington and their eagerness to repeat this canard.

The problem here is that anyone educated about our Constitutional government understands that our system is one of enumerated (that is, stated) governmental powers, and of both enumerated and implied individual rights. In addition, our Constitution was not frozen in time in 1789: the document has been amended 27 times. The amendments that expanded individual rights are generally understood to be the 13th through 15th, the 19th, and the 26th.

When the first ten amendments that became the Bill of Rights were being debated, there were generally two camps: the Federalists and the Anti-Federalists. One of the core differences between the factions was the fear, articulated by the Federalists, that enumerating rights would mislead future Americans into believing that those were the only rights reserved by the people. The Anti-Federalists sought a Bill of Rights so that at least the most cherished rights would be codified in the Constitution, the foundational charter of the American people.

The latter argument won out, with an implicit agreement, articulated in newspapers across the country, that both factions believed there were rights beyond those enumerated, but that the enumerated ones were the most cherished and a wall against tyranny. This is a core defining principle of this country. This Court is an example of what they sought to guard against.

The concept of bodily autonomy has been recognized since the founding of the country and was first clearly articulated in the Declaration of Independence’s broad claim to “life, liberty, and the pursuit of happiness.” This is our deeply rooted tradition of “ordered liberty,” which emphasizes liberty and doesn’t mention order. No doubt an originalist like Alito would defer to the English common law that allowed the sovereign to quarter troops and suspend habeas corpus. But that is not *our* tradition.

Furthermore, the Fourth Amendment establishes the “right of the people to be secure in their persons…” The Fifth Amendment establishes that “No person shall…be deprived of life, liberty, or property, without due process of law…” The Fourteenth Amendment applied these rights to persons and prohibited the States from denying them.

It is here that Alito, applying originalism, stops. He relies on conceptions of liberty at the time of the amendment’s passage. What is insidiously ironic about this approach is that the conditions among the States in 1868 are the very conditions the amendment abolishes. In this case, it was ostensibly the condition of the freedmen and of any other *person* who lived in the various States. But it should be noted that the amendment was also a broader recitation of the concept of individual rights that the States could not undermine. Thus the concept of personhood was introduced; the conception of 1868 was admittedly a fairly narrow one, but the amendment’s language was not so limited when it was ratified.

Nor does the story end there. In 1870, the 15th Amendment established universal male suffrage regardless of race, color, or previous condition of servitude. In 1920, the 19th Amendment extended the franchise to women. Thus, by 1920, the rights of women were only beginning to be realized, but women could now begin to exercise the power that the vote provides, even though the courts were not yet ready to afford them the equal status that voting would seem to imply. In 1971, the franchise was extended down to the age of 18.

When we establish or recognize rights, the originalist approach would be to go back to 1870, 1920, or 1971. This is not only faulty, it is a conceit, as if someone today could place themselves in the minds of someone in the past, like a clairvoyant. Furthermore, Alito and his fellow originalists would approach each amendment in a vacuum, not as a continuation or extension, an amendment and therefore a new interpretation, of the country’s charter. The intellectual dishonesty in this approach, at this level of judicial governance, is both breathtaking and troubling.

Prior to this Court, the pattern, which sits on firm ground, has been to note the purpose or intent of the ever-expanding recognition of previously ignored or unsettled rights. The perspective has been: “from this time forward.” The application of these rights must, by necessity, be judiciously determined under actual conditions, not under some mythical reification of an ideal or in the mind of the judge. But, of course, this has not always been the case, particularly as it relates to the rights of previously disenfranchised groups, women among them.

Alito’s opinion makes much of not following precedent (stare decisis) and, in a bit of self-congratulatory language, compares the decision to Brown overturning Plessy. The focus on Brown among a conservative movement that considered it and other Warren Court cases to be judicial overreach is ironic and troubling in a “thou dost protest too much” sort of way. It is doubly troubling given that the writings of Justice Barrett and the draft Alito opinion undermine or deny many of the same rights asserted as the basis for Brown.

Since just before the middle of the 20th century, our society has grown somewhat complacent in the belief that the arc of history always bends toward justice, that progress ennobling democratic and human rights was inevitable. But the reality behind the milestones we celebrate is, instead, the work of untold others who sacrificed and fought to gain a measure of dignity, justice, and human rights, either for themselves or for the generations that followed.

Our nation, for all of its blessings, was born with the defects of 18th century thinking regarding race, gender, and caste. It was a pre-scientific time, which our legal and social institutions must keep in mind, adjusting our present thinking as new knowledge becomes known. We have had to work hard to confront those historical defects, and will have to continue to do so, including confronting those who would suppress or prevent their acknowledgment.

Rather than the draft opinion in Dobbs being comparable to Brown overturning Plessy, the more appropriate analogy is the Supreme Court’s notorious nullification of the rights found in the 13th through 15th amendments in the Civil Rights Cases of 1883, which opened the way for Jim Crow laws and the Black Codes across the South and many border states. It is these decisions that made Plessy possible. The comparison is the more apropos one because there is no mention of women’s rights in the Alito draft opinion, nor an acknowledgment of their personhood, with all of the rights that attend that finding. The unwritten effect of driving originalism’s Wayback Machine back to 1868 is the underlying and implicit doctrine that a woman’s body, and in particular the womb, belonged to her husband.

The concern here is that the Court has assumed that women have no interest in whether to remain pregnant, and has substituted its own judgment, or that of the individual States, for that most personal of all decisions. During oral arguments in the case, Justice Barrett suggested that the existence of safe-haven laws and of adoption in general rendered moot the pro-choice argument that abortion access protects women from “forced motherhood.” “It doesn’t seem to me to follow that pregnancy and then parenthood are all part of the same burden,” she asserted. “The choice, more focused, would be between, say, the ability to get an abortion at 23 weeks, or the state requiring the woman to go 15, 16 weeks more and then terminate parental rights at the conclusion.” This is a cavalier and casually extreme statement that nullifies the personal autonomy of women. It also ignores the reality of maternal mortality and the complications of childbirth.

At heart, originalism isn’t about jurisprudence at all; it is part and parcel of the widespread trend toward fundamentalist, literal thinking and interpretation that first arose in the 1970s across various religious groups and has now been carried over into the political and legal spheres.

In her book, The Battle for God, Karen Armstrong noted a strong family resemblance among the various forms of religious fundamentalism; the characteristics she identified are also troublingly familiar in today’s politics. According to Armstrong, “They are embattled forms of spirituality, which have emerged as a response to a perceived crisis. They are engaged in a conflict with enemies whose secularist policies and beliefs seem inimical to religion itself. Fundamentalists do not regard this battle as a conventional political struggle, but experience it as a cosmic war between the forces of good and evil. They fear annihilation, and try to fortify their beleaguered identity by means of a selective retrieval of certain doctrines and practices of the past. To avoid contamination, they often withdraw from mainstream society to create a counterculture; yet fundamentalists are not impractical dreamers. They have absorbed the pragmatic rationalism of modernity, and, under the guidance of their charismatic leaders, they refine these ‘fundamentals’ so as to create an ideology that provides the faithful with a plan of action. Eventually they fight back and attempt to re-sacralize an increasingly skeptical world.”

The originalist argument against rights asserted by previously disenfranchised people is an argument against modernity, denying everything that followed 1868. If the Constitution of 1789 was the Old Covenant, replacing both English aristocracy and the unworkable Articles of Confederation, the post-Civil War amendments are the New Covenant. They established individual rights and liberty, and the primacy of the national government in enforcing those rights over the States.

Originalism is a dialing back of the clock and a denial of this New Covenant, which has expanded democracy and human dignity in order to achieve the previously unfulfilled promises of the Declaration. A fantastical channeling of original intent is not unlike the séances of the charlatans who peddled Spiritualism in days gone by.

For example, Alito does not speak for Representative John Bingham or Senator Jacob Howard, the framers of the 14th Amendment, both of whom clearly stated that it applied the Bill of Rights to the States, a view finally adopted after years of denial by the Court in the face of contrary facts and clear language: a prior untenable and contradictory position that this Court seems to have revived.

The heart of the issue, which the Court’s draft opinion avoids at all cost, is whether a woman is a person and a citizen of the United States as defined under the Fifth and Fourteenth Amendments. Beginning in 1971, Ruth Bader Ginsburg argued five cases before the Supreme Court establishing that both men and women are such, and are therefore entitled to protection under the Equal Protection Clause of the 14th Amendment. This is really the holding that the Alito draft opinion—signed by at least five so-called conservative justices—would overturn.

Where Do We Go From Here?

The core liberty being attacked under this ruling is the basic acknowledgment that American women are both persons and citizens of the United States, and that, as such, they—and all of us—possess rights to privacy and physical autonomy. This is a basic question that needs to be answered directly.

This is not simply a question of abortion, nor is it solely the concern of women or of people of color because of its immediate impact. It is of interest to all who wish to ensure that the Constitutional order instituted in support of civil rights and civil liberties, against oppressive State action at any level of government, is supported by the rule of law. To subject these rights to the whims of the States, without due process or an unusual overarching state interest, is to allow such fundamental rights to be inconsistently enforced and justice thereby denied. Undermining them leads to the abrogation of other rights essential to liberty. It leads to oppression, theocracy, and autocracy.

Thus, the issue must be engaged on several fronts.

The first of these, of course, is to assert that a woman is a person, and that any woman born or naturalized here is a citizen under the law. The most direct way to enshrine this in law is through passage of an Equal Rights Amendment.

For all of Ruth Bader Ginsburg’s brilliant legal approach under the existing Constitutional language, there was always a way for those who would reverse these cases to invent an argument or theory to reach the opposite conclusion. For almost a century before the Ginsburg cases, the Court had done just that.

Much has been said and written in the press and media about the increasing political divisiveness in the United States, as if it were new or permanent. Its form is new, but its causes, impetus, and application whistle a familiar tune. The fact of the matter is that things change and can be changed, as they have in the past. The 2020 election, despite all of the propaganda by the losers, proved that more than 81 million Americans can come together to take back their democracy.

The latest strategy of the pro-equality movement has been to rely on the 1972 ERA and to challenge its sunset clause. While the cause is laudable, the strategy and tactics are self-defeating, because they fail to build the message that will form the coalition needed to achieve the goal. The unfounded belief that the Court would not overturn women’s rights, and the resulting political complacency, also undermined the impetus for the Amendment. But this will be a new battle.

To win a battle, the lines must be drawn, the goals made clear, the purpose articulated and the opposition engaged at all levels. Democracy and its animating ideas are the culmination of thousands of years of human struggle, hope, and aspiration. Its appeal is universal, and so it is a war where ideas are the most effective weapon. Still, it takes extraordinary effort and determination to bring it about.

I am both an historian and political scientist, among my other vocations. When we look back at human civilization before 1776 all we can see, with a few minor exceptions, is a world of theocracy, monarchy, feudalism, dictatorship, autocracy, and oppression. This is what made the idea that became the reality of the United States so exceptional.

Still, there were artifacts of society that the founders under the Declaration of Independence—and later the framers of the new Constitution that replaced the Articles of Confederation—could and could not address, given their own political realities.

The first issue they did address was the separation of Church and State, given that up to 1776, the issues that could best rip a nation apart, or could oppress others, were conflicts and disagreements between religious creeds, the merging of government and religion, and the banning or interference with religion and belief. These provisions are both in the core document and the First Amendment.

But most significantly, the issues they decided not to address were those related to universal male suffrage, slavery, the position of women, and caste. These they consciously left to future generations to sort out. We know this because, following the series of letters in which Abigail Adams urged John Adams to “remember the ladies,” Adams did attempt to address the issue; it is notable that this issue, more than the others, touched on all four of those listed above.

For many of the founders, the issues that raised the most profound challenges were those that caused them to determine how to overcome the laws and practices that originated in English and European feudalism.

For example, many Americans have not learned through their primary school history that Thomas Jefferson had actually included in his draft of the Declaration a passage condemning slavery and the buying and selling of human beings. This passage was removed when those states that profited from the trade objected to it.

But most significantly, even the philosophy of John Locke was found to be too restrictive in its formulation of natural rights. Jefferson modified and democratized Locke’s phrase “life, liberty, and property” into a more expansive concept: “life, liberty, and the pursuit of happiness.” While Locke’s philosophical writings mention the right of property throughout, the Declaration does not.

This universalist definition of liberty was left to future generations to address, particularly in regard to property as a requirement of voting, to the position of women and others in society, and to the buying, selling, and enslavement of other human beings. These artifacts of feudalism and medieval law, as James Sullivan, whom Adams consulted, noted, would need to be addressed. I would note the abundance of references to feudal and medieval thought and law in the Alito draft opinion.

So, we must pick up the baton of liberty again and do the hard work of convincing our fellow citizens through campaigning and through choosing responsible representatives who share the spark of equality and liberty. This must be done in every state, in every district, at every level of government. The concept of liberty and equality must be taught in every school, and enforced and supported by every official of government.

This movement must be independent of political party. But it must be a coalition of like-minded groups concerned with individual liberty and autonomy. Our arguments must be factual and convincing. They must avoid polarizing language, but make clear the motivations and the consequences of the opposition’s position to both men and women.

Congress will need to write a new ERA which they know will fail in the short term, given the current state of politics. The movement for equality will need to take the responsibility to mobilize the people in a unified and concerted effort on this one issue alone. Through this one issue will follow other rights to be enforced and secured—and this is why the opposition to it has been so forceful.

The second front that must be opened is to confront the current political power structure. We must challenge, in court and in the political arena, *anyone* who would assume the power to abrogate the rights of women to personhood, citizenship, and equality. The strategy of the opposition is to employ scare tactics. Those employed in the 1970s seem quaint from the perspective of today’s world: women in the workplace, women in the military, shared custody of children and child care, the option of an abortion pre-viability, gay marriage, and gender equality and fluidity.

We live with these realities of human nature now, and the world has not fallen apart. This reveals the lie that undergirds the opposition to equality. For we are back to artifacts of medieval and feudalistic thought. The fear of empowering women is the fear over paternity and the presumption of male authority, but only certain males and certain authority. There are women all too willing to bolster this thinking at the expense of other women to gain advantage. Even our language is tilted toward demeaning terminology for women that has no equivalent when referring to men.

Here in Florida, we can see the new scary stories spread at the state governmental level regarding gay and transgender students and parents: the alleged horrors of gender-neutral bathrooms and women’s sports. One must ask oneself two questions when addressing these concerns, since they seem to resonate with the public: how likely is the worst case put forth to occur? And, even if true, would you give up your constitutional liberties in exchange for a sign on a public bathroom and the sexual designation of a competitive sport?

The third front to open against the opponents of equality is on the social and religious level. This will place abortion, which has been used as the main issue to undermine women’s rights, in its proper moral, ethical, and historical position. It counters the popular framing of these as merely “values issues” rather than fundamental issues of equality and liberty.

The first step is to challenge the fundamentalist assertions about abortion. As the historian Garry Wills, who also happens to be a Catholic, has noted, it was not until 1930, in the face of modern medicine, that Pope Pius XI issued the encyclical Casti Connubii forbidding all means of preventing procreation. This undermines the entire moral-historical premise of Alito’s argument.

No major Roman Catholic authority prior to the 1930 encyclical found a biblical prohibition against abortion: not Dante; not Matthew, Mark, Luke, Paul, or even John, that old misogynist; not St. Augustine nor St. Thomas Aquinas. The latest argument, since the encyclical, is that it is an issue of natural law. But no major proponent of natural law considers a fetus to be a person independent of the mother. Only fanatics claim that a human person begins at “conception,” ignoring that about half of such fertilizations fail, making God the Chief Abortionist.

Such assertions of fetal personhood also fail medical and scientific scrutiny: before viability outside the womb, a fetus is still a grouping of cells fully dependent on, and part of, the mother, just like any other of her cells, a point underscored by the developing modern science of in vitro fertilization and cloning. The Catholic Church does not prescribe that a miscarried fetus be baptized, given last rites, or buried in consecrated ground. Alito’s and Barrett’s opinions on this topic are among the most extreme even within their own faith, and that extremism seems to have infected their dishonest foray into secular jurisprudence.

Armed with these facts, we must align with traditional and universalist faiths, and with secular ethicists, that do not agree with the fundamentalist view on abortion. This includes the majority of Catholics, Jews, Buddhists, Muslims, Hindus, Unitarians, African-American Protestant sects, and mainline Protestant faiths.

At the end of the day, the question comes down to the fact that a woman is an equal, autonomous person under the law. We do not have to have the answers to everything else that follows, just as the founders of the nation and the framers of the Constitution allowed that they could not. We simply need to enshrine this one core right, which is under attack, and do the hard work to make it impossible to deny.

Send in the Clowns: Putin, Trump, the Fifth Column, and the Importance of History

When Benito Mussolini was seen on newsreels in the United States it was as a “rotund, strutting clown, who struck pompous poses from his Roman balcony and tried to upstage Adolf Hitler when they first met, in Venice in 1934.” Much of this impression was established thanks to Charlie Chaplin’s 1940 film “The Great Dictator,” where the Il Duce character was effectively lampooned by the actor Jack Oakie. This reinforced the widely held public prejudice in England and among many Americans that “Mediterranean Peoples,” especially Italians, were inferior oafs, unintelligent and clownish hotheads. Intellectuals and world leaders would come to regret underestimating the Italian autocrat, who was just the first of fascist and quasi-fascist leaders to wreak havoc on the world.

As with Mussolini, Hitler and the Nazis were first seen as a clown show with their thuggish behavior, mediocre backgrounds, buffoonish uniform-like garb, their appropriation of mythic symbols, and their completely unintelligible and ignorant interpretations of history and society. It was noted that Hitler himself was ignorant, lazy, a self-absorbed narcissist, and overly theatrical in his public appearances. He also effectively tapped into the grudges and nationalistic lies that cleaved German society, finding convenient scapegoats, especially Jews and other vulnerable groups, lumping them in with Communists and external threats to “German-speakers.”

Yet so effective was the Nazis’ disinformation and propagandist jumble of ideas and assertions in defining themselves and their intentions that even today intellectuals debate the definition of fascism, and whether clearly quasi-fascist governments like those under Franco in Spain and Pinochet in Chile were really fascist, despite not only their clear links to the methods and ideas of fascism but also the evidence of the dead bodies.

One would think that this debate was closed with Umberto Eco’s definition of Ur-Fascism (Eco knew fascism first-hand), but we still get hand-wringing and pearl-clutching from intellectual Mugwumps who continue to obfuscate the issue. And for the uneducated on this subject: no, liberals and socialists aren’t fascists (or communists), or even close; they are usually among the first rounded up and murdered by fascist and quasi-fascist thugs, along with Jews and others outside the fascist tribe.

Thus, about two weeks ago, I was struck by the comments of Sergey Lavrov, the Foreign Minister of the Russian Federation, regarding Ukraine and the West. He posited an alternative, counter-factual interpretation of history astounding in its echo of fascist movements. In cautioning the West against coming to Ukraine’s aid, he invoked Russian historical continuity, the importance of uniting “Russian speakers,” and the so-called lessons of the latest version of Russian ultra-nationalist, self-delusional history, which puts forth the notion of a single Eurasian people under Russian rule. Here is what he is saying: the various countries and ethnic groups from the border of Sweden down to Germany, to Moldova, and perhaps to Serbia are within Russia’s orbit. He would sweep away these countries and peoples, murdering anyone in his way.

Vladimir Putin has shown himself to be an obviously corrupt kleptocrat, sociopath, narcissist, and mediocrity. What he lacks in intellectual honesty and character, he more than compensates for with ruthlessness, murderous intent, and links to the powerful oligarchs who keep him in near-totalitarian power. He is a fanatic. Fanatics, as Aldous Huxley noted, compensate for a secret doubt, and so can be defeated; but they are also dangerous, since any opposition or disagreement is interpreted as a hostile act. When such an individual has nuclear weapons, which this tyrant has been all too vocal in referencing, the situation is perilous.

But the United States and its allies have overcome existential threats in the past, and make no mistake, Putin and the current Russian government are an existential threat. I served my country in the U.S. Navy for almost 23 years during the Cold War. I deployed to the theaters of contention close to the Soviet Union and Communist China. I also served, at times, as a special weapons courier. My military colleagues, and anyone paying attention, knew and developed the steel to oppose the imperial ambitions of those nations, knowing that we were under nuclear threat by hostile powers. We also accepted the fact that, in case of an attack on us, retaliation against those countries would be swift and overwhelming.

Last night Mr. Putin delivered an unhinged and self-deluded speech. He is a clown. The clowns in this world have caused a tremendous amount of human suffering and destruction. As Hannah Arendt noted, evil is banal. The clowns of this world, augmented by our so-called post-truth era, create an environment for banality to thrive. They spread the sickness—the pandemic—of their deranged minds onto the world.

Thus, almost in a caricature of the comedy of life, Donald Trump, the dictator’s sycophant and a man of personal cowardice unknown in any previous U.S. president, praises the evisceration of Ukraine. The man is a Quisling, and compromised. He and his followers, who act as a cult and contort their positions on a dime to align with his statements, represent a dangerous anti-democratic and, yes, anti-American Fifth Column.

The election of Joe Biden and the corrective of American democracy is the trigger that caused this crisis, because Putin thought he had neutralized the United States. His Quisling is in exile, though more than a few mini-quislings wait in line. Like all extreme narcissists, Putin actually believed his own delusion. This is an attack borne of desperation, and unfortunately Ukraine is the pawn he has chosen. It is timed to undermine American democracy and sow division, so that he can realize his other imperial ambitions: threatening our allies and defeating democratic self-government, government of the people, by the people, and for the people, in Europe and elsewhere. He hates what we are. Anyone who supports him shares in that hate.

Wake up. History is calling.

Shake it Out – Embracing the Future of Program Management – Part Two: Private Industry Program and Project Management in Aerospace, Space, and Defense

In my previous post, I focused on Program and Project Management in the Public Interest, and the characteristics of its environment, especially from the perspective of the government program and acquisition disciplines. The purpose of this exploration is to lay the groundwork for understanding the future of program management—and the resulting technological and organizational challenges that are required to support that change.

The next part of this exploration is to define the motivations, characteristics, and disciplines of private industry equivalencies. Here there are commonalities, but also significant differences, that relate to the relationship and interplay between public investment, policy and acquisition, and private business interests.

Consistent with our initial focus on public interest project and program management (PPM), the vertical with the greatest relationship to it is found in the very specialized fields of aerospace, space, and defense. I will therefore begin with this industry vertical.

Private Industry Program and Project Management

Aerospace, Space & Defense (ASD). It is here that we find the commercial practice that comes closest to the types of structure, rules, and disciplines found in public interest PPM. As a result, it is also here that we find the most interesting areas of conflict and conciliation between private motivations and public needs and duties, particularly since most of the business activity in this vertical is generated by, and dependent on, federal government acquisition strategy and policy.

On the defense side, the antecedent policy documents guiding acquisition and other measures are the National Security Strategy (NSS), which is produced by the President’s staff; the National Defense Strategy (NDS), which further translates and refines the NSS; and the National Military Strategy (NMS), which is delivered by the Joint Chiefs of Staff of the various military services and is designed to provide unfettered military advice to the Secretary of Defense.

Note that the U.S. Department of Defense (DoD) and the related agencies, including the intelligence agencies, operate under a strict chain of command that ensures civilian control under the National Military Establishment. Aside from these structures, the documents and resulting legislation from DoD actions also impact such civilian agencies as the Department of Energy (DOE), Department of Homeland Security (DHS), the National Aeronautics and Space Administration (NASA), and the Federal Aviation Administration (FAA), among others.

The countervailing power and checks-and-balances on this Executive Branch power lie with the appropriation and oversight powers of the Congress. Until the various policies are funded and authorized by Congress, the general tenor of military, intelligence, and other operations has tangential, though not insignificant, effects on the private economy. Still, in terms of affecting how programs and projects are monitored, it is within the appropriation and authorization bills that we find the locus of power. As one of my program managers reminded me during my first round through the budget hearing process, “everyone talks, but money walks.”

On the aerospace side, there are two main markets. One is related to commercial aircraft, parts, and engines sold to the various world airlines. The other is related to the government’s role in non-defense research and development, as well as activities related to public-private partnerships, such as those in space exploration. The individual civilian departments of government also publish their own strategic plans based on their roles, from which acquisition strategy follows. These long-term strategic plans, usually revised at least every five years, are then further refined into strategic implementation plans by the various labs and directorates.

The suppliers and developers of the products and services for government, which represents the bulk of ASD, face many of the same challenges delineated in surveying their government counterparts. The difference, of course, is that these are private entities where the obligations and resulting mores are derived from business practice and contractual obligations and specifications.

This is not to imply a lack of commitment or dedication on the part of private entities. But it is an important distinction, particularly since financial incentives and self-interest are paramount considerations. A contract negotiator, for example, in order to be effective, must understand the underlying pressures and relative position of each of the competitors in the market being addressed. This individual should also be familiar with the particular core technical competencies of the competitors as well as their own strategic plans, the financial positions and goals that they share with their shareholders in the case of publicly traded corporations, and whether actual competition exists.

The Structure of the Market. Given the mergers and acquisitions of the last 30 years, along with the consolidation promoted by the Department of Defense as unofficial policy after the fall of the Berlin Wall, and the lapse of antitrust enforcement, the portion of ASD that relies on direct government funding, even where public-private ventures involve risk sharing, operates in a monopsony: the condition in which a single buyer, here the U.S. government, substantially controls the market as the main purchaser of supplies and services. This monopsony is served by a supplier market that is largely an oligopoly, with few suppliers and limited competition, and in which, in some technical domains, individual suppliers exert monopoly power.
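To put a number on the concentration described above, one can borrow the Herfindahl-Hirschman Index (HHI), the standard antitrust measure of market concentration. The index is not discussed in this post, and the market shares below are hypothetical rather than actual ASD figures; this is a minimal sketch of the calculation, not a market study.

```python
# Illustrative only: the Herfindahl-Hirschman Index (HHI). The market
# shares below are hypothetical, not actual ASD market data.

def hhi(shares_pct):
    """Sum of squared market shares expressed in percent (0-10,000 scale)."""
    return sum(s ** 2 for s in shares_pct)

# A hypothetical post-consolidation market in which five primes hold everything:
shares = [30, 25, 20, 15, 10]
print(hhi(shares))  # 2250
# Under the DOJ/FTC merger guidelines of this period, an HHI above 2,500
# marks a market as "highly concentrated"; 2,250 already sits at the top
# of the moderately concentrated range.
```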

Acknowledging this condition informs us regarding the operational motivators of this market segment in relation to culture, practice, and the disciplines and professions employed.

In the first case, given the position of the U.S. government, the normal pressures of market competition and market incentives do not apply to the few competitors participating in the market. As a result, only the main buyer has the power to recreate, in an artificial manner, an environment that replicates the market incentives and penalties normally found in a diverse and competitive market.

Along these lines, for market incentives, the government can, and often does, act as the angel investor, given the rigorous need for R&D in such efforts. It can also lower the barriers to participation in order to encourage more competition and innovation. This can be deployed across the entire range of limited competitors, or it can be expansive in its approach to invite new participants.

Market penalties recreated in this environment usually target what economists call “rent-seeking behavior”: a situation in which incumbents seek to increase their own wealth without creating new benefits, innovation, or additional wealth for society. Lobbying, glad-handing, cronyism, and other such methods are employed and, oftentimes, rampant under monopsonistic systems. Revolving-door practices, in which a former government official responsible for oversight obtains employment in the same industry and, oftentimes, with the same company, are too often seen in these cases.

Where there are few competitors, market participants will often play follow-the-leader and align themselves to dominate particular segments of the market in appealing to the government or elected representatives for business. This may mean that, in many cases, they team with their ostensible competitors to provide a diverse set of expertise from the various areas of specialty. As with any business, profitability is of paramount importance, for without profit there can be no business operations. It is here, in the maximization of profit and shareholder value, that we find the locus of power for understanding the motivation of these and most businesses.

This is not a value judgment. As faulty and risky as this system may be, no better business structure has been found to provide value to the public through incentives for productive work, innovation, the satisfaction of demand, and efficiency. The challenge, apart from what political leadership decides to do regarding the rules of the market, is to make those rules that do exist work in the public interest through fair, ethical, and open contracting practices.

To do this successfully requires contracting and negotiating expertise. To many executives and non-contracting personnel, negotiations appear to be a zero-sum game. No doubt, popular culture, mass media and movies, and self-promoting business people help mold this perception. Those from the legal profession, in particular, deal with a negotiation as an extension of the adversarial processes through which they usually operate. This is understandable given their education, and usually disastrous.

As an attorney friend of mine once observed: “My job, if I have done it right, is to ensure that everyone walking out of the room is in some way unhappy. Your job, in contrast, is to ensure that everyone walking out of it is happy.” While a generalization—and told tongue-in-cheek—it highlights the core difference in approach between these competing perspectives.

A good negotiator has learned that, given two motivated sides coming together to form a contract, there is an area of intersection where both parties will view the deal being struck as meeting their goals and, as such, fair and reasonable. It is the job of the negotiator to find that area of mutual fairness, while also ensuring that the contract is clear and free of ambiguity, and that the structure of the instrument—price and/or cost, delivery, technical specification, statement of work or performance specification, key performance parameters, measures of performance, measures of effectiveness, management, sufficiency of capability (responsibility), and expertise—sets up the parties involved for success. A bad contract can no more be made good than the poorly prepared and compacted soil and foundation of a house can be made good after the building goes up.
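For readers who want a concrete picture of that area of intersection: the negotiation literature calls it the zone of possible agreement (ZOPA). The sketch below uses entirely hypothetical figures to show the idea; it reduces the negotiation to price alone, which real contract negotiations, as described above, never do.

```python
# Illustrative only: the "area of intersection" described above is what
# negotiation texts call the zone of possible agreement (ZOPA).
# All figures are hypothetical.
from typing import Optional, Tuple

def zopa(buyer_max: float, seller_min: float) -> Optional[Tuple[float, float]]:
    """Return the (low, high) price range acceptable to both parties,
    or None when the reservation prices do not overlap."""
    if seller_min > buyer_max:
        return None  # no deal is possible on price alone
    return (seller_min, buyer_max)

# The buyer will pay at most $12.5M; the supplier needs at least $10.8M
# to cover estimated cost plus a minimally acceptable margin.
deal_range = zopa(buyer_max=12.5e6, seller_min=10.8e6)
print(deal_range)  # (10800000.0, 12500000.0) -- the negotiator's working space
```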

The purpose of a good contract is to avoid litigation, not to increase the likelihood of it happening. Furthermore, it serves the interests of neither side to obtain a product or service at a price, or under conditions so onerous, that the other enterprise fails to survive. Alternatively, it does a supplier little good to obtain a contract that leaves it little financial flexibility, that it fails to fully deliver on, that adversely affects its reputation, or that is perceived in a negative light by the public.

Effective negotiators on both sides of the table are aware of these risks and hazards, and so each is responsible for the final result, though often the power dynamic between the parties may be asymmetrical, depending on the specific situation. It is one of the few cases in which parties having both mutual and competing interests are brought together where each side is responsible for ensuring that the other does not hazard their organization. It is in this way that a contract—specifically one that consists of a long-term R&D cost-plus contract—is much like a partnership. Both parties must act in good faith to ensure the success of the project—all other considerations aside—once the contract is signed.

In this way, the manner of negotiating and executing contracts is very much a microcosm of civil society as a whole, for good or for bad, depending on the practices employed.

Given that the structure of aerospace, space, and defense consists of one dominant buyer with few major suppliers, the disciplines required relate to the details of the contract and its resulting requirements that establish the rules of governance.

As I outlined in my previous post, the characteristics of program and project management in the public interest, which are the products of contract management, are focused on successfully developing and obtaining a product to meet particular goals of the public under law, practice, and other delineated specific characteristics.

As a result, the skill-sets that are of paramount importance to business in this market prior to contract award are cost estimating, applied engineering expertise including systems engineering, financial management, contract negotiation, and law. The remainder of disciplines regarding project and program management expertise follow based on what has been established in the contract and the amount of leeway the contracting instrument provides in terms of risk management, cost recovery, and profit maximization, but the main difference is that this approach to the project leans more toward contract management.

Another consideration in which domains are brought to bear relates to the position of the business in terms of market share and level of dominance in a particular segment of the market. For example, a company may decide to accept a lower-than-desired target profit. In the most extreme cases, the company may allow the contract to become a loss leader in order to continue to dominate a core competency or to prevent new entries into that portion of the market.

On the other side of the table, government negotiators are prohibited by the Federal Acquisition Regulation (the FAR) from allowing companies to “buy in” by proposing an obviously lowball offer, but some allow it in any event, whether due to lack of expertise or to bowing to the exigencies of price or cost. This last condition, combined with the rent-seeking behavior mentioned earlier, will, where it occurs, distort and undermine the practices and indicators needed for effective project and program management. In these cases, the dysfunctional result is to create incentives to maximize revenue and scope through change orders, contracting-language ambiguity, and price inelasticity. This also creates an environment that is resistant to innovation and rewards inefficiency.

But apart from these exceptions, the contract and its provisions, requirements, and type are what determine the structure of the eventual project or program management team. Unlike in commercial markets with many competitors, the government, through negotiation, will determine the allowable burdened rate structures and the allowable profit or margin. This last figure is determined by the contract type and the perceived risk of the contract goals to the contractor: the higher the risk, the higher the allowed margin or profit, and the reverse applies as well.
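To make that risk-to-margin relationship concrete, here is a toy sketch. It is not the DoD’s actual weighted guidelines method for developing profit objectives; the base rate, risk premium, and risk score are invented solely for illustration.

```python
# Illustrative only: a toy linear model of the risk-to-fee relationship
# described above. This is NOT the DoD "weighted guidelines" method for
# developing profit objectives; all rates here are invented.

def negotiated_fee(estimated_cost: float, risk_score: float,
                   base_rate: float = 0.05, risk_premium: float = 0.10) -> float:
    """Return a fee (profit) that grows with the contractor's perceived risk.

    risk_score: 0.0 (firm requirements, mature technology) through
                1.0 (unproven technology, development risk borne by contractor).
    """
    if not 0.0 <= risk_score <= 1.0:
        raise ValueError("risk_score must be between 0.0 and 1.0")
    return estimated_cost * (base_rate + risk_premium * risk_score)

# A low-risk production contract vs. a high-risk R&D effort, each on a
# $100M estimated cost base:
print(negotiated_fee(100e6, risk_score=0.1))  # ~6,000,000  (6% fee)
print(negotiated_fee(100e6, risk_score=0.9))  # ~14,000,000 (14% fee)
```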

Given this basis, the interplay between private entities and the public acquisition organizations, including the policy-setting staffs, is also of primary concern. Decision-makers, influencers, and subject-matter experts from these entities participate together in what are ostensibly professional organizations, such as the National Defense Industrial Association (NDIA), the Project Management Institute (PMI), the College of Scheduling (CoS), the College of Performance Management (CPM), the International Council on Systems Engineering (INCOSE), the National Contract Management Association (NCMA), and the International Cost Estimating and Analysis Association (ICEAA), among those most frequently attended by these groups. Corresponding and associated private and professional groups are the Project Control Academy and the Association for Computing Machinery (ACM).

This list is by no means exhaustive, but from the perspective of suppliers to public agencies, NDIA, PMI, CoS, and CPM are of particular interest, because much of the business of influencing policy and the details of its application is accomplished there. In this manner, the interests of the participants from the corporate side of the equation relate to those areas always of concern: business certainty, minimization of oversight, and market and government influence. The market for several years now has been reactive, not proactive.

There is no doubt that business organizations, from local Chambers of Commerce to specialized trade groups, bring with them the advantages of finding mutual interests and synergy. All also come with the ills and dysfunction, to varying degrees, borne of self-promotion, glad-handing, back-scratching, and ossification.

In groups where there is little appetite to upend the status quo, innovation and change are viewed with suspicion and as being risky. In such cases the standard reaction is cognitive dissonance, at least until measures can be taken to subsume or control the pace and nature of the change. This is particularly true in the area of project and program management in general, and integrated project, program and portfolio management (IPPM) in particular.

Absent the appetite on the part of DoD to replicate market forces that drive the acceptance of innovative IPPM approaches, one large event and various evolutionary aviation and space technology trends have upended the ecosystem of rent-seeking, reaction, and incumbents bent on maintaining the status quo.

The one large event, of course, came about from the changes wrought by the Covid pandemic. The other, evolutionary changes are the result of the acceleration of software technology in capturing and transforming big(ger) datasets, combined with open business intelligence systems that can be flexibly delivered locally and via the Cloud.

I also predict that these changes will make hard-coded, purpose-driven niche applications obsolete within the next five years. The same fate awaits the companies that have built their businesses around delivering custom, niche applications and MS Excel spreadsheets, as well as those companies that are comfortable suboptimizing and delivering the letter, if not the spirit, of the good business practice expected under their contracts.

Walking hand-in-hand with these technological and business developments, the aerospace, space, and defense market in general is facing an opening window for new entries and greater competition, borne of emergent engineering and technological exigencies that demand innovation and new approaches to old, persistent problems.

The coronavirus pandemic and new challenges from the realities of global competition, global warming, geopolitical rivalries; aviation, space and atmospheric science; and the revolution in data capture, transformation, and optimization are upending a period of quiescence and retrenchment in the market. These factors are moving the urgency of innovation and change to the left both rapidly and in a disruptive manner that will only accelerate after the immediate pandemic crisis passes.

In my studies of Toynbee and other historians (outside of my day job, I am also credentialed in political science and history, among other disciplines, through both undergraduate and graduate education), I have observed that societies and cultures that do not embrace the future and do not confront their challenges constructively find themselves overrun by both. History is the chronicle of human frailty, tragedy, and failure interspersed with amazing periods of resilience, human flourishing, advancement, and hope.

As it relates to our more prosaic concerns, Deloitte has published an insightful paper on the 2021 industry outlook. Among the identified short-term developments are:

  1. A slow recovery in passenger travel may impact aircraft deliveries and industry revenues in commercial aviation;
  2. The defense sector will remain stable as countries plan to sustain their military capabilities;
  3. Satellite broadband, space exploration, and militarization will drive growth;
  4. Industry will shift to transforming supply chains into more resilient and dynamic networks;
  5. Mergers and acquisitions are likely to recover in 2021 as a hedge toward ensuring long-term growth and market share.

More importantly, the longer-term changes to the industry are being driven by the following technological and market changes:

  • Advanced aerial mobility (AAM). Both FAA and NASA are making investments in this area, and so the opening exists for new entries into the market, including into the supply chain, that will disrupt the giants (absent a permissive M&A stance under the new Administration in Washington). AAM is the new paradigm to introduce safe, short-distance, daily-commute flying technologies using vertical lift.
  • Hypersonics. Given the touted investment of Russia and China into this technology as a means of leveraging against the power projection of U.S. forces, particularly its Navy and carrier battle groups (aside from the apparent fact that Vladimir Putin, the president of Upper Volta with Missiles and Hackers, really hates Disney World), the DoD is projected to fast-track hypersonic capabilities and countermeasures.
  • Electric propulsion. NASA is investing in cost-sharing capabilities to leverage electric propulsion technologies, looking to benefit from the start-up growth in this sector. This is an exciting development which has the potential to transform the entire industry over the next decade and after.
  • Hydrogen-powered aircraft. OEMs are continuing to pour private investment money into start-ups looking to introduce more fuel-efficient and clean energy alternatives. As with electric propulsion, prototypes of these aircraft are being produced, and as public investments in cost-sharing and market-investment strategies take hold, the U.S., Europe, and Asia are looking at a more diverse and innovative aerospace, space, and defense market.

Given the present condition of the industry, and the emerging technological developments and resulting transformation of flight, propulsion, and fuel sources, the concepts and definitions used in project and program management require revision to meet the exigencies of the new market.

For both industry and government, in order to address these new developments, I believe that a new language is necessary, as well as a complete revision to what is considered to be the acceptable baseline of best business practice and the art of the possible. Only then will organizations and companies be positioned to address the challenges these new forms of investment and partnering systems will raise.

The New Language of Integrated Program, Project, and Portfolio Management (IPPM).

First a digression to the past: while I was on active duty in the Navy, near the end of my career, I was assigned to the staff of the Office of the Undersecretary of Defense for Acquisition and Technology (OUSD(A&T)). Ostensibly, my assignment was to give me a place to transition from the Service. Thus, I followed the senior executive, who was PEO(A) at NAVAIR, to the Pentagon, simultaneously with the transition of NAVAIR to Patuxent River, Maryland. In reality, I had been tasked by the senior executive, Mr. Dan Czelusniak, to explore and achieve three goals:

  1. To develop a common schema, supporting an existing contract for the collection of data from DoD suppliers on cost-plus R&D contracts, with the goal of creating a master historical database of contract performance and technological development risk. This schema would first be directed at cost performance, or EVM;
  2. To continue to develop a language, methodology, and standard, first started and funded by NAVAIR, for the integration of systems engineering and technical performance management into the program management business rhythm;
  3. To create and establish a definition of Integrated Program Management.

I largely achieved the first two during my relatively brief period there.

The first became known as the Integrated Digital Environment (IDE), which was refined and fully implemented after my departure from the Service. Much of this work is the basis for data extract, transformation, and load (ETL) today. There had already been a good deal of work by private individuals, organizations, and other governments in establishing common schemas, which were first applied to the transportation and shipping industries. But the team of individuals I worked with was able to set the bar for what followed across datasets.

The second was completed and turned over to the Services and federal agencies, many of which adopted the initial approach and refined it to inform, through the identification of technical risk, both cost performance and technical achievement. Much of this knowledge already existed in the systems engineering community, but working with INCOSE, a group of like-minded individuals was able to take the work from the proof-of-concept, which was awarded the Acker Skill in Communication Award at the DAU Acquisition Research Symposium, and turn it into the TPM and KPP standard used by organizations today.

The third began with establishing my position, which hadn’t existed until my arrival: Lead Action Officer, Integrated Program Management. Gary Christle, who was the senior executive in charge of the staff, asked me “What is Integrated Program Management?” I responded: “I don’t know, sir, but I intend to find out.” Unfortunately, this is the initiative that has still eluded both industry and government, but not without some advancement.

Note that this position with its charter to define IPM was created over 24 years ago—about the same time it takes, apparently, to produce an operational fighter jet. I note this with no flippancy, for I believe that the connection is more than just coincidental.

When spoken of, IPM and IPPM are oftentimes restricted to the concept of cost (read cost performance or EVM) and schedule integration, with aggregated portfolio organization across a selected number of projects thrown in, in the latter case. That was considered advancement in 1997. But today, we seem to be stuck in time. In light of present technology and capabilities, this is a self-limiting concept.

This concept is technologically supported by a neutral schema that is authored and managed by DoD. While essential to data capture and transformation—and because of this fact—it is currently the target of incumbents as a means of further limiting even this self-limited definition in practice. It is ironic that a technological advance that supports data-driven, in lieu of report-driven, information integration is being influenced to support the old paradigm.

The motivations are varied: industry suppliers who aim to restrict access to performance data under project and program management, incumbent technology providers who wish to keep the changes in data capture and transformation restricted to their limited capabilities, consulting companies aligned with technology incumbents, and staff augmentation firms dependent on keeping their customers reliant on custom application development and Excel workbooks. All of these forces act through the various professional organizations that seek to influence government policy, hoping to establish themselves as the arbiters of the possible and the acceptable.

Note that the requirements under project management are often critiqued under the rubric of government regulation. But that is a misnomer: they are an extension of government contract management. Another critique is made from the perspective of overhead costs. But management costs money, and one would not (or at least should not) drive a car or own a house without insurance and a budget for maintenance, much less run a multi-year, high-cost project involving the public’s money. In addition, as I have written previously, and as the literature supports, data-driven systems actually reduce costs and overhead.

All of these factors contribute to ossification and impose artificial blinders that, absent reform, will undermine meeting the new paradigms of 21st Century project management, given that the limited concept of IPM was obviously insufficient to address the challenges of the transitional decade that closed the last century.

Embracing the Future in Aerospace, Space, and Defense

As indicated, the aerospace and space science and technology verticals are entering a new and exciting phase of technological innovation resulting from investments in start-ups and R&D, including public-private cost-sharing arrangements.

  1. IPM to Project Life-Cycle Management. Given the baggage that attends the acronym IPM, and the worldwide trend to data-driven decision-making, it is time to adjust the language of project and program management to align with it. In lieu of IPM, I suggest Project Life-Cycle Management to define the approach to project and program data and information management.
  2. Functionality-Driven to Data-Driven Applications. Our software, systems, and procedures must be able to support a data-driven infrastructure and be in alignment with that manner of thinking. This evolution includes the following attributes:
    • Data Agnosticism. As our decision-making methods expand to include a wider, deeper, and more comprehensive interdisciplinary approach, our underlying systems must be able to access data in this same manner. As such, these systems must be data agnostic.
    • Data neutrality. In order to optimize access to data, the overhead and effort needed to access data must be greatly reduced. Using data science and analysis to restructure pre-conditioned data in order to overcome proprietary lexicons—an approach used for business intelligence systems since the 1980s—provides no added value to either the data or the organization. If data access is ad hoc and customized in every implementation, the value of the effort cannot persist, nor is the return on investment fully realized. It backs the customer into a corner in terms of flexibility and innovation. Thus, pre-configured data capture, extract, transformation, and load (ETL) into a non-proprietary and objective format, which applies to all data types used in project and program management systems, is essential to providing the basis for a knowledge-based environment that encourages discovery from data. This approach to ETL is enhanced by the utilization of neutral data schemas.
    • Data in Lieu of Reporting and Visualization. No doubt data must be visualized at some point—preferably after its transformation and load into the database with other, interrelated data elements that illuminate information to enhance the knowledge of the decisionmaker. This implies that systems that rely on physical report formats, charts, and graphs as the goal are not in alignment with the new paradigm. Where Excel spreadsheets and PowerPoint are used as a management system, it is the preparer who provides the interpretation, in a manner that predisposes the possible alternatives of interpretation. The goal, instead, is to have the data speak for itself: data, transformed into information, interrelated and contextualized to create intelligence.
    • All of the Data, All of the Time. The cost of 1TB of data compared to 1MB of data is the marginal cost of the additional electrons to produce it. Our systems must be able to capture all of the data essential to effective decision-making in the periodicity determined by the nature of the data. Thus, our software systems must be able to relate data at all levels and to scale from simplistic datasets to extremely large ones. They should do so in such a way that the option for determining what, among the full menu of data options available, is relevant rests with the consumer of that data.
    • Open Systems. Since the introduction of widespread CPU capability, software solution providers have manufactured software to perform particular functions based on particular disciplines and very specific capabilities. As noted earlier, these software applications are functionality-focused and proprietary in structure, method, and data. For data-driven project and program requirements, software systems must be flexible enough to accommodate a wide range of analytical and visualization demands, allowing the data to determine the rules of engagement. This implies systems that are open in two ways: data agnosticism, as already noted, but also openness in terms of the user environment.
    • Flexible Application Configuration. Our systems must be able to address the needs of the various disciplines in their details, while also allowing for integration and contextualization of interrelated data across domains. As with Open Systems to data and the user environment, openness through the ability to roll out multiple specialized applications from a common platform places the subject matter expert and program manager in the driver’s seat in terms of data analysis and visualization. An effective open platform also reduces the overhead associated with limited purpose-driven, disconnected and proprietary niche applications.
    • No-Code/Low-Code. Given that data and the consumer will determine both the source and method of delivery, our open systems should provide an environment that supports Agile development and deployment of customization and new requirements.
    • Knowledge-Based Content. Given the extensive amount of experience and education recorded and documented in the literature, our systems must, at the very least, provide a baseline of predictive analytics and visualization methods usually found in the more limited, purpose-built hardcoded applications, if not more expansive. This knowledge-based content, however, must be easily expandable and refinable, given the other attributes of openness, flexibility, and application configuration. In this manner, our 21st century project and program management systems must possess the attributes of a hybrid system: providing the functionality of the traditional niche systems with the flexibility and power of a business intelligence system enhanced by COTS data capture and transformation.
    • Ease of Use. The flexibility and power of these systems must be such that implementation and deployment are rapid, and that new user environment applications can be quickly deployed. Furthermore, the end user should be able to determine the level of complexity or simplicity of the environment to support ease of use.
  3. Focus on the Earliest Indicator. A good deal of effort since the late 1990s has been expended on defining the highest level of summary data that is sufficient to inform earned value, with schedule integration derived from the WBS, oftentimes summarized on a one-to-many basis as well. This perspective is biased toward believing that cost performance is the basis for determining project control and performance. But even when related to cost, the focus is backwards. The project lifecycle in its optimized form consists of the following progression:

    Project Goals and Contract (framing assumptions) –> Systems Engineering, CDRLs, KPPs, MoEs, MoPs, TPMs –> Project Estimate –> Project Plan –> IMS –> Risk and Uncertainty Analysis –> Financial Planning and Execution –> PMB –> EVM

    As I’ve documented in this blog over the years, DoD studies have shown that, while greater detail within the EVM data may not garner greater early warning, proper integration with the schedule at the work package level does. Program variances first appear in the IMS. A good IMS, thus, is both the key collection point for variances and the main execution document. This is why many program managers, who have been largely absent over the last decade or so from the professional organizations listed, tend to assert that EVM is like “looking in the rearview mirror.” It isn’t that EVM is not essential, but it is true that it is not the earliest indicator of variances from expected baseline project performance.

    Thus, the emphasis going forward under this new paradigm is not to continue EVM in its central role, but to shift to the earliest indicator for each aspect of the program that defines its framing assumptions.
  4. Systems Engineering: It’s not Space Science, it’s Space Engineering, which is harder.
    The focus on start-up financing and developmental cost-sharing shifts attention to systems engineering configuration control and technical performance indicators. The emphasis on meeting expectations, program goals, and milestones within the cost share makes it essential to identify fatal variances long before conventional cost performance indicators show them. The concern of the program manager in these cases isn’t so much the estimate at complete, but whether the industry partner will be able to deploy the technology within the acceptable range of the MoEs, MoPs, TPMs, and KPPs, and without exceeding the government’s portion of the cost share. Thus, the incentive is not only to identify variances and unacceptable risk at the earliest indicator, but to do so in terms of whether the end-item technology will be successfully deployed, or whether the government should cut its losses.
  5. Risk and Uncertainty is more than SRA. The late 20th century approach to risk management is to run a Monte Carlo simulation against the schedule, and to identify alternative critical paths and any unacceptable risks within the critical path. This is known as the schedule risk analysis, or SRA; a minimal sketch of the technique follows this list. While valuable, the staff devoted to risk management is much smaller than the staffs devoted to schedule and cost analysis.

    This is no doubt due to the specialized language and techniques devoted to risk and uncertainty. This segregation of risk from mainstream project and program analysis has severely restricted both the utility and the real-world impact of risk analysis on program management decision-making.

    But risk and uncertainty extend beyond the schedule risk analysis, and their utility in an environment of aggressive investment in new technology, innovation, and new entries to the market will place these assessments at center stage. In reality, our ability to apply risk analysis techniques extends to the project plan, to technical performance indicators, to estimating, to the integrated master schedule (IMS), and to cost, both financial and from an earned value perspective. Combined with the need to identify risk and major variances using the earliest indicator, risk analysis becomes pivotal to mainstream program analysis and decision-making.
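To make the SRA mechanics concrete, here is a minimal sketch in Python: a Monte Carlo pass over a toy two-path task network with triangular duration estimates. The task names, durations, and deadline are hypothetical illustrations, not any real program’s model.

```python
# A minimal schedule risk analysis (SRA) sketch: Monte Carlo over a small
# task network. Tasks, durations, and the deadline are hypothetical.
import random

# Each task: (optimistic, most likely, pessimistic) duration in days.
tasks = {
    "design":    (20, 30, 50),
    "build":     (40, 60, 100),
    "integrate": (15, 25, 45),
    "test":      (10, 20, 40),
}

# Two candidate paths through the network.
paths = [
    ["design", "build", "test"],
    ["design", "integrate", "test"],
]

def sample_duration(name):
    """Draw one duration from a triangular distribution."""
    low, mode, high = tasks[name]
    return random.triangular(low, high, mode)

trials, deadline = 10_000, 130  # deadline: contractual finish in days
breaches, critical_counts = 0, [0] * len(paths)

for _ in range(trials):
    draws = {t: sample_duration(t) for t in tasks}
    lengths = [sum(draws[t] for t in p) for p in paths]
    finish = max(lengths)                      # project ends on the longest path
    critical_counts[lengths.index(finish)] += 1
    if finish > deadline:
        breaches += 1

print(f"P(finish > {deadline} days) = {breaches / trials:.1%}")
for p, c in zip(paths, critical_counts):
    print(f"Path {p} critical in {c / trials:.1%} of trials")
```

The same machinery extends, as argued above, beyond the schedule to the project plan, technical performance indicators, estimating, and cost, simply by swapping in the relevant distributions and thresholds.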

Conclusions from Part Two

The ASD industry is most closely aligned with PPM in the public interest. Two overarching trends are transforming this market and overcoming the inertia and ossification of PPM thought. The first is the set of communications and information systems employed in response to the coronavirus pandemic, which opened pathways to new ways of thinking about the status quo. The second is the wave of start-ups and new entries into the ASD market, borne of investments in new technologies arising from external market, geo-political, space science, global warming, and propulsion trends, as well as the new technologies and methods being employed in data and information technology that drive greater efficiency and productivity. These changes have forced a new language and new expectations as to the art of the necessary, as well as the art of the possible, for PPM. This new language includes a transition to the concept of the optimal capture and use of all data across the program management life cycle, with greater emphasis on systems engineering, technical performance, and risk.

Having summarized the new program paradigm in Aerospace, Space, and Defense, my next post will assess the characteristics of program management in various commercial industries, the rising trends in these verticals, and what that means for the project and program management discipline.

Open: Strategic Planning, Open Data Systems, and the Section 809 Panel

Sundays are usually days reserved for music and the group Rhye was playing in the background when this topic came to mind.

I have been preparing for my presentation in collaboration with my Navy colleague John Collins for the upcoming Integrated Program Management Workshop in Baltimore. This presentation will be a non-proprietary/non-commercial talk about understanding the issue of unlocking data to support national defense systems, but the topic has broader interest.

Thus, in advance of that formal presentation in Baltimore, there are issues and principles that are useful to cover, given that data capture and its processing, delivery, and use are at the heart of all systems in government and in private industry and organizations.

Top Data Trends in Industry and Their Relationship to Open Data Systems

According to Shohreh Gorbhani, Director, Project Control Academy, the top five data trends being pursued by private industry and technology companies are listed below. My own comments follow as they relate to open data systems.

  1. Open Technologies that transition from 2D Program Management to 3D and 4D PM. This point is consistent with the College of Performance Management’s emphasis on IPM, but note that the stipulation is the use of open technologies. This is an important distinction technologically, and one that I will explore further in this post.
  2. Real-time Data Capture. This means capturing data in the moment so that the status of our systems is up-to-date without the present delays associated with manual data management and conditioning. This does not preclude the collection of structured, periodic data, but it does include the capture of transactions from real-time integrated systems where appropriate.
  3. Seamless Data Flow Integration. From the perspective of companies in manufacturing and consumer products, technologies such as IoT and Cloud are just now coming into play. But, given the underlying premises of items 1 and 2, this also means the proper automated contextualization of data using an open technology approach that flows in such a way as to be traceable.
  4. The use of Big Data. The term has lost a good deal of its meaning because of its transformation into a buzz-phrase and marketing term. But Big Data refers to the expansion in the depth and breadth of available data, driven by the economic forces behind Moore’s Law. What this means is that we are entering a new frontier of data processing and analysis that will, no doubt, break down assumptions regarding the validity and strength of certain predictive analytics. The old assumptions that restrict access to data due to limitations of technology and higher cost no longer apply. We are now in the age of Knowledge Discovery in Data (KDD). The old approach of reporting assumed that we already know what we need to know. The use of data challenges old assumptions and allows us to follow the data where it will lead us.
  5. AI Forecasting and Analysis. No doubt predictive AI will be important as we move forward with machine learning and other similar technologies. But this infant is not yet a rug rat. The initial experiences with AI are that they tend to reflect the biases of the creators. The danger here is that this defeats KDD, which results in stagnation and fugue. But there are other areas where AI can be taught to automate mundane, value-neutral tasks relating to raw data interpretation.

The 809 Panel Recommendation

Given that industry is the driving force behind these trends, which will transform the way we view information in our day-to-day work, it is not surprising that the 809 Panel had this to say about existing defense business systems:

“Use existing defense business system open-data requirements to improve strategic decision making on acquisition and workforce issues…. DoD has spent billions of dollars building the necessary software and institutional infrastructure to collect enterprise wide acquisition and financial data. In many cases, however, DoD lacks the expertise to effectively use that data for strategic planning and to improve decision making. Recommendation 88 would mitigate this problem by implementing congressional open-data mandates and using existing hiring authorities to bolster DoD’s pool of data science professionals.”

Section 809 Volume 3, Section 9, p. 477

At one point in my military career, I was assigned as the Materiel, Fuels, and Transportation Officer of Naval Air Station, Norfolk. As a major naval air base, transportation hub, and home to a Naval Aviation Depot, we shipped and received materiel and supplies across the world. In doing so, our transportation personnel would use what at the time was new digital technology to complete an electronic bill of lading that specified what and when items were being shipped, the common or military carrier, the intended recipient, and the estimated date of arrival, among other essential information.

The customer and receiving end of this workflow received an open systems data file that contained these particulars. The file was an early version of open data known as an X12 file, for which the commercial transportation industry was an early adopter. Shipping and receiving activities and businesses used their own local software, and there were a number of customized and commercial choices out there, as well as those used by common carriers such as various trucking and shipping firms, the USPS, FedEx, DHL, UPS, and others. The X12 file was the DMZ that made the information open. Software manufacturers, if they wanted to stay relevant in the market, could not impose a proprietary data solution.

Furthermore, standardization of terminology and concepts ensured that the information was readable and comprehensible wherever the items landed–whether across receiving offices in the United States, Japan, Europe, or even Istanbul. While DoD needs the skillsets to be able to optimize data, achieving this end-state didn’t require an army of data scientists. It required the right data science expertise in the right places, and the dictates of transportation consumers to move the technology market to provide the solution.
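To illustrate what made the format open, here is a toy sketch of reading an X12-style interchange: segments split on the commonly used default terminator and elements on the separator. The segment content below is invented for illustration and is not a valid transaction set.

```python
# A toy illustration of why a delimited, standardized interchange format
# is "open": any receiver can parse it without the sender's software.
# Segment content is invented; real delimiters are declared in the
# interchange header and may differ.
raw = "ST*856*0001~TD5**2*USPS~DTM*017*19941215~SE*4*0001~"

segments = [s for s in raw.split("~") if s]   # "~" ends each segment
for seg in segments:
    elements = seg.split("*")                  # "*" separates elements
    seg_id, fields = elements[0], elements[1:]
    print(seg_id, fields)
```

Because the delimiters and segment vocabulary were standardized, any receiver could process the file with its own software, which is the essence of an open interchange.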

Over the years, both industry and government have developed a number of schema standards focused on specific types of data, progressing from X12 to XML and now projected to move to JSON-based schemas. Each of them, in its initial iteration, automated the submission of physical reports that had been required either by contract or by operations. These focused on a small subset of the full dataset relating to program management and project controls.
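As a sketch of where this progression points, the example below validates a record against a small JSON-based neutral schema using the Python jsonschema package. The field names are hypothetical placeholders for illustration, not the actual DoD or UN/CEFACT schema.

```python
# A minimal sketch of schema-validated, JSON-based submission. The field
# names are hypothetical placeholders, not any official schema.
from jsonschema import validate  # pip install jsonschema

neutral_schema = {
    "type": "object",
    "required": ["wbs_element", "period_end", "bcws", "bcwp", "acwp"],
    "properties": {
        "wbs_element": {"type": "string"},
        "period_end":  {"type": "string"},
        "bcws": {"type": "number"},
        "bcwp": {"type": "number"},
        "acwp": {"type": "number"},
    },
}

record = {
    "wbs_element": "1.2.3",
    "period_end": "2021-01-31",
    "bcws": 1200.0,
    "bcwp": 1100.0,
    "acwp": 1350.0,
}

validate(instance=record, schema=neutral_schema)  # raises ValidationError on bad data
print("record conforms to the neutral schema")
```

The point is that the schema itself, not any vendor’s application, defines what a conforming submission looks like.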

This progression made sense.

When digitized technology is first introduced into an intensive direct-labor environment, the initial focus is to automate the production of artifacts and their underlying processes in order to phase in the technology’s acceptance. This also allows the organization to realize immediate returns on investment and improvements in productivity. But this is the first step, not the final one.

For project controls, the current state is the UN/CEFACT XML for program performance management data, and the contract cost and labor data collection file known as the FlexFile. Clearly the latter file, given that its recipient is the Office of the Secretary of Defense Cost Assessment and Program Evaluation (OSD CAPE), serves as one of many feedback loops that support that office’s role in coordinating the planning, programming, budgeting, and evaluation (PPBE) system related to military strategic investments and budgeting, but it is only one. The program performance information is also a vital part of the PPBE process in evaluation and in future planning.

For most of the U.S. economy, market forces and consumer requirements are the driving force behind digital innovation. The trends noted by Ms. Gorbhani can be confirmed through a search of any one of the many technology magazines and websites. The 809 Panel, drawn as it was from specialists in industry and government, was tasked “to provide recommendations that would allow DoD to adapt and deliver capability at market speeds, while ensuring that DoD remains true to its commitment to promote competition, provide transparency in its actions, and maintain the integrity of the defense acquisition system.”

Given that the work of the DoD is unique, creating a type of monopsony, it is up to leadership within the Department to create the conditions and mandates necessary to recreate in microcosm the positive effects of market forces. The DoD also has a very special, vital mission in defending the nation.

When an individual business cobbles together its mission statement, it is that mission that defines the necessary elements of data collection, which are then essential in making decisions. In today’s world, best commercial-sector practice is to establish a Master Data Management (MDM) approach in defining data requirements and practices. In the case of DoD, a similar approach would be beneficial. Concurrent with the period of the 809 Panel’s efforts, the RAND Corporation delivered a paper in 2017 (link in the previous sentence) that made recommendations related to data governance that are consistent with the 809 Panel’s recommendations. We will be discussing these specific recommendations in our presentation.

Meeting the mission and readiness are the key components to data governance in DoD. Absent such guidance, specialized software solution providers, in particular, will engage in what is called “rent-seeking” behavior. This is an economic term that means that an “entity (that) seeks to gain added wealth without any reciprocal contribution of productivity.”

No doubt, given the marketing of software solution providers, it is hard for decision-makers to tell what constitutes an open data system. The motivation of a software solution provider is to make its product as “sticky” as possible, and it does that by enticing a customer to commit to proprietary definitions, structures, and database schemas. Usually there are “black-boxed” portions of the software that make traceability impossible, complicating the issue of who exactly owns the data and the ability of the customer to optimize it and utilize it as the mission dictates.

Furthermore, data visualization components like dashboards are ubiquitous in the market. A cursory stroll through a tradeshow looks like a dashboard smorgasbord combined with different practical concepts of what constitutes “open” and “integration”.

As one DoD professional recently told me, it is hard to tell the software systems apart. To do so, it is necessary to understand what underlies the software. Thus, a proposed honest-broker definition of an open data system is useful and the place to start, given that this is not a notional concept: such systems have been successfully established.

The Definition of Open Data Systems

Practical experience in implementing open data systems toward the goal of optimizing essential information from our planning, acquisition, financial, and systems engineering systems informs the following proposed definition, which is based on commercial best practice. This proposal is also based on the principle that the customer owns the data.

  1. An open data system is one based on non-proprietary neutral schemas that allow for the effective capture of all essential elements from third-party proprietary and customized software for reporting and integration necessary to support both internal and external stakeholders.
  2. An open data system allows for complete traceability and transparency from the underlying database structure of the third-party software data, through the process of data capture, transformation, and delivery of data in the neutral schema.
  3. An open data system targets the loading of the underlying source data for analysis and use into a neutral database structure that replicates the structure of the neutral schema. This allows for 100% traceability and audit of data elements received through the neutral schema, and ensures that the receiving organization owns the data.

Under this definition, data from its origination to its destination is more easily validated and traced, ensuring quality and fidelity, and establishing confidence in its value. Given these characteristics, integration of data from disparate domains becomes possible. The risk of conflicting indicators is mitigated, since open system data allows for effective integration without the bias of proprietary coding or restrictions on data use. Finally, both government and industry will not only establish ownership of their data–a routine principle in commercial business–but also be free to utilize new technologies that optimize the use of that data.
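As a minimal sketch of the three principles above, consider the pipeline below: capture from a hypothetical third-party export, transformation into a neutral schema, and load into a database whose structure mirrors that schema while preserving an audit trail. All field names and the source format are invented for illustration.

```python
# A minimal sketch of the open data definition: capture from a
# "proprietary" source, transform to a neutral schema, and load into a
# database that replicates that schema, preserving traceability.
# All names and the source format are hypothetical.
import csv, io, sqlite3

# 1. Source extract, as a third-party tool might export it.
proprietary_csv = io.StringIO(
    "TaskCode,PlannedVal,EarnedVal,Actuals\n"
    "1.2.3,1200,1100,1350\n"
)

# 2. Transform: map proprietary column names to the neutral schema.
field_map = {"TaskCode": "wbs_element", "PlannedVal": "bcws",
             "EarnedVal": "bcwp", "Actuals": "acwp"}

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE neutral_cost (
    wbs_element TEXT, bcws REAL, bcwp REAL, acwp REAL,
    source_fields TEXT)""")  # audit trail: original column names

# 3. Load into a structure that replicates the neutral schema.
for row in csv.DictReader(proprietary_csv):
    neutral = {field_map[k]: v for k, v in row.items()}
    db.execute(
        "INSERT INTO neutral_cost VALUES (?, ?, ?, ?, ?)",
        (neutral["wbs_element"], float(neutral["bcws"]),
         float(neutral["bcwp"]), float(neutral["acwp"]),
         ",".join(row.keys())),
    )

for rec in db.execute("SELECT * FROM neutral_cost"):
    print(rec)
```

Because the destination table replicates the neutral schema and records the original source fields, every element remains traceable from origin to destination, which is what establishes the receiving organization’s ownership of the data.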

In closing, Gahan Wilson, a cartoonist whose work appeared in National Lampoon, The New Yorker, Playboy, and other magazines, recently passed away.

When thinking of the barriers to the effective use of data, I came across one of his cartoons in The New Yorker.

Open Data is the key to effective integration and reporting–to the optimal use of information. Once mandated and achieved, our defense and business systems will be better informed and be able to test and verify assumed knowledge, address risk, and eliminate dogmatic and erroneous conclusions. Open Data is the driver of organizational transformation keyed to the effective understanding and use of information, and all that entails. Finally, Open Data is necessary to the mission and planning systems of both industry and the U.S. Department of Defense.

Sledgehammer: Pisano Talks!

My blogging hiatus is coming to an end as I take a sledgehammer to the writer’s block wall.

I’ve traveled far and wide over the last six months to various venues across the country and have collected a number of new and interesting perspectives on the issues of data transformation, integrated project management, and business analytics and visualization. As a result, I have developed some very strong opinions regarding the trends that work and those that don’t regarding these topics and will be sharing these perspectives (with the appropriate supporting documentation per usual) in following posts.

To get things started this post will be relatively brief.

First, I will be speaking along with co-presenter John Collins, who is a Senior Acquisition Specialist at the Navy Engineering & Logistics Office, at the Integrated Program Management Workshop at the Hyatt Regency in beautiful downtown Baltimore’s Inner Harbor 10-12 December. So come on down! (or over) and give us a listen.

The topic is “Unlocking Data to Improve National Defense Systems”. Today anyone can put together pretty visualizations of data from Excel spreadsheets and other sources–and some have made quite a bit of money doing so. But accessing the right data at the right level of detail, transforming it so that its information content can be exploited, and contextualizing it properly through integration will provide the most value to organizations.

Furthermore, our presentation will make a linkage to what data is necessary to national defense systems in constructing the necessary artifacts to support the Department of Defense’s Planning, Programming, Budgeting and Execution (PPBE) process and what eventually becomes the Future Years Defense Program (FYDP).

Traditionally, information capture and reporting have been framed as a question of oversight, reporting, and regulation related to contract management, capital investment cost control, and DoD R&D and acquisition program management. But organizations that fail to leverage the powerful new technologies that double processing and data storage capability every 18 months, allowing both the depth and breadth of data to expand exponentially, are setting themselves up to fail. In national defense, this is a condition that cannot be allowed to occur.

If DoD doesn’t collect this information, which we know from the reports of cybersecurity agencies that other state actors are collecting, we will be at a serious strategic disadvantage. We are in a new frontier of knowledge discovery in data. Our analysts and program managers think they know what they need to be viewing, but integration adds new perspectives and, as a result, will yield new indicators and predictive analytics that will, no doubt, overtake current practice. Furthermore, that information can now be processed to contribute more timely and better intelligence to the process of strategic and operational planning.

The presentation will be somewhat wonky and directed at policymakers and decisionmakers in both government and industry. But anyone can play, and that is the cool aspect of our community. The presentation will be non-commercial, despite my day job–a line I haven’t crossed up to this point in this blog, though that will be changing to some extent.

Back in early 2018 I became the sole proprietor of SNA Software LLC–an industry technology leader in data transformation–particularly in capturing datasets that traditionally have been referred to as “Big Data”–and a hybrid point solution that is built on an open business intelligence framework. Our approach leverages the advantages of COTS (delivering the 80% solution out of the box) with open business intelligence that allows for rapid configuration to adapt the solution to an organization’s needs and culture. Combined with COTS data capture and transformation software–the key to transforming data into information and then combining it to provide intelligence at the right time and to the right place–the latency in access to trusted intelligence is reduced significantly.

Along these lines, I have developed some very specific opinions about how to achieve this transformation–and have put those concepts into practice through SNA and delivered those solutions to our customers. The result has been to reduce both the effort and time needed to capture large datasets that originate as pre-processed data, and to cut direct labor and the duration of information delivery by more than 99%. The path to get there is not to apply an army of data scientists and data analysts that treats all data as if it were flat and to reinvent the wheel–only to deliver a suboptimized solution sometime in the future after unnecessarily expending time and resources. That is a devolution to the same labor-intensive business intelligence approaches that we used back in the 1980s and 1990s. The answer is not to throw labor at data that already has its meaning embedded in its information content. The answer is to apply smarts through technology, and that’s what we do.

Further along these lines, if you are using hard-coded point solutions (also called purpose-built software) and knitted best-of-breed, chances are that you will find that you are poorly positioned to exploit new technology and will be obsolete within the next five years, if not sooner. The model of selling COTS solutions and walking away except for traditional maintenance and support is dying. The new paradigm will be to be part of the solution and that requires domain knowledge that translates into technology delivery.

More on these points in future posts, but I’ve placed the stake in the ground and we’ll see how these ideas hold up to critique and comment.

Finally, I recently became aware of an extremely informative and cutting-edge website that includes podcasts from thought leaders in the area of integrated program management. It is entitled InnovateIPM and is operated and moderated by a gentleman named Rob Williams. He is a domain expert in project cost development, with over 20 years of experience in the oil, gas, and petrochemical industries. Rob has served in a variety of roles throughout his career and now focuses on cost estimating and Front-End Loading quality assurance. His current role is advanced project cost estimator at Marathon Petroleum’s Galveston Bay Refinery in Texas City.

Rob was also nice enough to continue a discussion we started at a project controls symposium and interviewed me for a podcast. I’ll share the details once it is posted.

Sunday Contemplation — Finding Wisdom: The Epimenides Paradox

The liar’s paradox, as it is often called, is a fitting subject for our time. For those not familiar with the paradox, it was introduced to me by the historian Gordon Prange when I was a young Navy enlisted man attending the University of Maryland. He introduced the paradox to me as a comedic rejoinder to the charge of a certain bias in history that he considered to be without merit. He stated it this way: “I heard from a Cretan that all Cretans are liars.”

The origin of this form of the liar’s paradox has many roots. It is discussed as a philosophical conundrum by Aristotle in ancient Greece as well as by Cicero in Rome. A version of it appears in the Christian New Testament and it was a source of study in Europe during the Middle Ages.

When I have introduced the paradox in a social setting and asked the uninitiated to resolve it, usually a long conversation ensues. The usual approach is to treat it as a bi-polar proposition, accepting certain assumptions from the construction of the sentence: if the Cretan is lying, then all Cretans tell the truth, which cannot be the case; but if the Cretan is telling the truth, then he is lying, yet he could not be telling the truth, since all Cretans lie…and the circular contradiction goes on ad infinitum.

But there is a solution to the paradox and what it requires is thinking about the Cretan and breaking free of bi-polar thinking, which we often call, colloquially, “thinking in black and white.”

The solution.

The assumption in the paradox is that the Cretan in question can speak for all Cretans. This assumption could be false. Thus not all Cretans are liars and, thus, the Cretan in question is making a false statement. Furthermore, the Cretan making the assertion is not necessarily a liar–the individual could just be mistaken. We can test the “truthiness” of what the Cretan has said by testing other Cretans on a number of topics and seeing if they are simply ignorant, uninformed, or truly liars on all things.

Furthermore, there is a difference between something being a lie and a not-lie. Baked into our thinking by absolutist philosophies, ideologies, and religions is black and white thinking that clouds our judgement. A lie must have intent and be directed to misinform, misdirect, or to cloud a discussion. There are all kinds of lies and many forms of not-lies. Thus, the opposite of “all Cretans are liars” is not that “all Cretans are honest” but that “some Cretans are honest and some are not.”
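In formal terms (a sketch in first-order logic notation, with C(x) for “x is a Cretan” and L(x) for “x always lies”), the negation of a universal claim is an existential one:

```latex
\neg\,\forall x\,\bigl(C(x) \rightarrow L(x)\bigr) \;\equiv\; \exists x\,\bigl(C(x) \wedge \neg L(x)\bigr)
```

That is, denying “all Cretans are liars” commits us only to “at least one Cretan is not a liar,” never to “all Cretans are honest.”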

Only if we accept the original premise as true is this truly a paradox, and it is not. If we show that Cretans do not lie all of the time, then we are not required to reach the high bar that “all Cretans are honest”; it is enough that the Cretan making the assertion has made a false statement or is, instead, the liar.

In sum, our solution in avoiding falling into the thinking of the faulty or dishonest Cretan is not to accept the premises as they have been presented to us, but to use our ability to reason out the premises and to look at the world as it is as a “reality check.” The paradox is not truly a paradox, and the assertion is false.

(Note that I have explained this resolution without going into the philosophical details of the original syllogism, the mathematics, and an inquiry on the detailed assumptions. For a fuller discussion of liar’s paradoxes I recommend this link.)

Why Care About the Paradox?

We see versions of the paradox used all of the time. This includes the use of ad hominem attacks on people, that is, charges of guilt by association with an idea, a place, an ethnic group, or another person. “Person X is a liar (or his/her actions are suspect or cannot be trusted) because they adhere to Y idea, group, or place.” Oftentimes these attacks are joined with insulting or demeaning catchphrases and (especially racial or ethnic) slurs.

What we attribute to partisanship or prejudice or bias often uses this underlying type of thinking. It is a simplification born of ignorance and all simplifications are a form of evil in the world. This assertion was best articulated by Albert Camus in his book The Plague.

“The evil that is in the world always comes of ignorance, and good intentions may do as much harm as malevolence, if they lack understanding. On the whole, men are more good than bad; that, however, isn’t the real point. But they are more or less ignorant, and it is this that we call vice or virtue; the most incorrigible vice being that of an ignorance that fancies it knows everything and therefore claims for itself the right to kill. The soul of the murderer is blind; and there can be no true goodness nor true love without the utmost clear-sightedness.”

Our own times are not much different in their challenges than those Camus faced during the rise of fascism in Europe, for fascism’s offspring have given rise to a new generation that has insinuated itself into people’s minds.

Aside from my expertise in technology and the military arts and sciences, the bulk of my formal academic education is as an historian and political scientist. The world is currently in the grip of a plague that eschews education and Camus’ clear-sightedness in favor of materialism, ethnic hatred, nativism, anti-intellectualism, and ideological propaganda.

History is replete with similar examples, both large and small, of this type of thinking, which should teach us that this is an aspect of human character wired into our brains that requires eternal vigilance to guard against. Such examples as the Spanish Inquisition, the Reformation and Counter-Reformation, the French Revolution, the defense of slavery in the American Civil War and the subsequent terror of Jim Crow, 18th and 19th century imperialism, apartheid after the Boer War, the disaster of the First World War, the Russian Revolutions, the history of anti-Jewish pogroms and the Holocaust, the rise of Fascism and Nazism, Stalinism, McCarthyism in the United States, Mao and China’s Cultural Revolution, Castro’s Cuba, Pinochet’s Chile, the Pathet Lao, and the current violence and intolerance borne of religious fundamentalism–and the list can go on–teach us that our only salvation and survival as a species lie in our ability to overcome ignorance and self-delusion.

We come upon more pedestrian examples of this thinking all of the time. As Joseph Conrad wrote in Heart of Darkness, “The mind of man is capable of anything—because everything is in it, all the past as well as all the future.”

We must perform this vigilance first on ourselves–and it is a painful process, because it shatters the self-image that is necessary for us to continue from day to day: that narrative thread that connects the events of our existence and that guides our actions in the best and most limited ways that they can be guided, without falling into the abyss of nihilism. Only knowledge, and the attendant realization of the necessary components of human love, acceptance, empathy, sympathy, and community–that is, understanding, the essential connections that make us human–can overcome the darkness that constantly threatens to envelop us. But there is something more.

The United States was born on the premise that the practical experiences of history and its excesses could be guarded against, and that such “checks and balances” would be woven, first, into the thread of its structure, and then, into the thinking of its people. This is the ideal, and it need not be said that, given that it was a construction of flawed men, despite their best efforts at education and enlightenment compared to the broad ignorance of their time, these ideals for many continued to be only that. This ideal is known as the democratic ideal.

Semantics Matter

It is one that is under attack as well. We often hear the argument against it dressed up in academic clothing as being “only semantics” on the difference between a republic and a democracy. But as I have illustrated regarding the Epimenides Paradox, semantics matter.

For the democratic ideal is about self-government, which was a revolutionary concept in the 18th century and remains one today, which is why it has been and continues to be under attack by authoritarians, oligarchs, dictators, and factions pushing their version of the truth as they define it. But it goes further than a mechanical process of government.

The best articulation of democracy in its American incarnation probably was written by the philosopher and educator John Dewey in his essay On Democracy. Democracy, says Dewey, is more than a special political form: it is a way of life, social and individual, that allows for the participation of every mature human being in forming the values that regulate society toward the twin goals of ensuring the general social welfare and full development of human beings as individuals.

While what we call intelligence may be distributed in unequal amounts, it is the democratic faith that it is sufficiently general so that each individual has something to contribute, whose value can be assessed only as it enters into the final pooled intelligence constituted by the contributions of all. Every authoritarian scheme, on the contrary, assumes that its value may be assessed by some prior principle, if not of family and birth or race and color or possession of material wealth, then by the position and rank a person occupies in the existing social scheme. The democratic faith in equality is the faith that each individual shall have the chance and opportunity to contribute whatever he is capable of contributing and that the value of his contribution be decided by its place and function in the organized total of similar contributions, not on the basis of prior status of any kind whatever.

In such a society there is no place for “I heard from a Cretan that all Cretans lie.” For democracy to work, however, requires not only vigilance but a dedication to education that is further dedicated to finding knowledge, however inconvenient or unpopular that knowledge may turn out to be. The danger has always been in lying to ourselves, and allowing ourselves to be seduced by good liars.

Note: This post has been updated for grammar and for purposes of clarity from the original.

You Know I’m No Good: 2016 Election Polls and Predictive Analytics

While the excitement and emotions of this past election work themselves out in the populace at large, as a writer and contributor to the use of predictive analytics, I find the discussion about “where the polls went wrong” to be of most interest.  This is an important discussion, because the most reliable polling organizations–those that have proven themselves out by being right consistently on a whole host of issues since most of the world moved to digitization and the Internet of Things in their daily lives–seemed to be dead wrong in certain of their predictions.  I say certain because the polls were not completely wrong.

For partisans who point to Brexit and polling in the U.K., I hasten to add that this is comparing apples to oranges.  The major U.S. polling organizations that use aggregation and Bayesian modeling did not poll Brexit.  In fact, there was one reliable U.K. polling organization that did note two factors:  one was that the trend in the final days was toward Brexit, and the other is that the final result was based on turnout, where greater turnout favored the “stay” vote.

But aside from these general details, this issue is of interest in project management because, unlike national and state polling, where there are sufficient numbers to support significance, at the micro-microeconomic level of project management we deal with very small datasets that expand the range of probable results.  This is not an insignificant point that has been made time-and-time again over the years, particularly given single-point estimates using limited time-phased data absent a general model that provides insight into what are the likeliest results.  This last point is important.

So let’s look at the national polls on the eve of the election according to RealClear.  IBD/TIPP Tracking had it Trump +2 at +/-3.1% in a four-way race.  LA Times/USC had it Trump +3 at the 95% confidence interval, which essentially means tied.  Bloomberg had Clinton +3, CBS had Clinton +4, Fox had Clinton +4, Reuters/Ipsos had Clinton +3, and ABC/WashPost, Monmouth, Economist/YouGov, Rasmussen, and NBC/SM had Clinton +2 to +6.  The margin of error for almost all of these polls varies from +/-3% to +/-4%.

As of this writing, Clinton sits at about +1.8% nationally; the votes are still coming in and continue to confirm her popular vote lead, currently standing at about 300,000 votes.  Of the polls cited, Rasmussen was the closest to the final result.  Virtually every other poll, however, except IBD/TIPP, was within the margin of error.
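For reference, the margin of error reported with these polls follows from the normal approximation for a simple random sample, with the worst case at p = 0.5.  A quick sketch of the calculation (the sample sizes here are illustrative, not the polls’ actual ones):

```python
# Margin of error at 95% confidence for a simple random sample: the
# half-width of the normal approximation interval, worst case p = 0.5.
import math

def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (600, 1000, 2500):
    print(f"n={n}: +/-{margin_of_error(n):.1%}")
# n=1000 gives roughly +/-3.1%, in line with the ranges reported above.
```

This is also why the small datasets typical of project management, as noted above, expand the range of probable results: the interval widens rapidly as the sample shrinks.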

The polling that was off in predicting the election came from those organizations that aggregated national polls along with state polls, adjusted polling based on non-direct polling indicators, and/or then projected the chances of winning based on the probable electoral vote totals.  This is where things were off.

Among the most popular of these sites is Nate Silver’s FiveThirtyEight blog.  Silver established his bona fides in 2008 by picking winners with incredible accuracy, particularly at the state level, and subsequently in his work at the New York Times, which continued to prove the efficacy of data in predictive analytics in everything from elections to sports.  Since that time his significant reputation has only grown.

What Silver does is determine the probability of an electoral outcome by using poll results that are transparent in their methodologies and that have a high level of confidence.  Silver’s was the most conservative of these types of polling organizations.  On the eve of the election Silver gave Clinton a 71% chance of winning the presidency. The other organizations that use poll aggregation, poll normalization, or other adjusting indicators (such as betting odds, financial market indicators, and political science indicators) include the New York Times Upshot (Clinton 85%), HuffPost (Clinton 98%), PredictWise (Clinton 89%), Princeton (Clinton >99%), DailyKos (Clinton 92%), Cook (Lean Clinton), Roth (Lean Clinton), and Sabato (Lean Clinton).

To understand what probability means in this context: these models combined bottom-up state polling, to track the electoral college, with national popular vote polling.  But keep in mind, as Nate Silver wrote over the course of the election, that even a 17% chance of winning “is the same as your chances of losing a ‘game’ of Russian roulette.”  Few of us would take that bet, particularly since the result of losing the game is finality.
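To make the analogy concrete, here is a minimal simulation of what a 71%/29% split means operationally: the “underdog” outcome still occurs in roughly three of every ten repetitions, not far from the one-in-six odds of the Russian roulette analogy.  The trial count and seed are arbitrary.

```python
# What "Clinton 71%" means operationally: rerun the same election
# many times and the 29% underdog still wins about 2 in 7 runs.
# Illustrative only; trial count and seed are arbitrary.
import random

random.seed(538)
TRIALS = 100_000
upsets = sum(random.random() < 0.29 for _ in range(TRIALS))
print(f"underdog wins {upsets / TRIALS:.1%} of {TRIALS:,} simulated elections")
```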

Still, of the methods using probability, only FiveThirtyEight left enough room for drawing the wrong chamber.  In fairness, the Cook, Rothenberg, and Sabato projections also left enough room to see a Trump win if the state dominoes fell right.

The places where the models failed were the states of Florida, North Carolina, Pennsylvania, Michigan, and Wisconsin.  In particular, even with Florida (result Trump +1.3%) and North Carolina (result Trump +3.8%), Trump would not have won had the supposed Clinton firewall states of Pennsylvania (result Trump +1.2%), Michigan (result Trump +0.3%), and Wisconsin (result Trump +1.0%) not been breached.  So what happened?

Among the possible factors are: the effect of FBI Director Comey’s public intervention, which came too close to the election to register fully in the polling; ineffective polling methods in rural areas (garbage in, garbage out); poor state polling quality; voter suppression, purging, and restrictions (among the battleground states this includes Florida, North Carolina, Wisconsin, Ohio, and Iowa); voter turnout and enthusiasm (apart from the effects of voter suppression); and the inability to peg which way the high number of undecided voters would break at the last minute.

In hindsight, the national polls were good predictors.  The sufficiency of their data in drawing significance, and the high level of confidence in their predictive power, are borne out by the final national vote totals.

I think the projections of the electoral college failed because of the inability to take into account non-statistical factors and selection bias, and because the state poll models probably did not accurately reflect the electorate in those states, given the lessons from the primaries.  Along these lines, I believe that if pollsters look at the demographics in the respective primaries they will find that both voter enthusiasm and composition provide the corrective to their projections.  Given these factors, the aggregators and probabilistic models should all have called the race too close to call.  I think both Monte Carlo and Bayesian methods in simulations will bear this out; a sketch follows below.
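Here is a minimal Monte Carlo sketch of that claim, under stated assumptions: the state leads below are rough eve-of-election poll averages, and the error magnitudes are invented for illustration.  Treating state polling errors as independent makes a sweep of the three firewall states look vanishingly unlikely; letting the errors share a common component, as a uniformly mis-modeled electorate would produce, makes the same polls consistent with a race too close to call.

```python
# Monte Carlo sketch: independent vs. correlated state polling errors.
# Leads are approximate eve-of-election poll averages (Clinton minus
# Trump, points); error standard deviations are assumptions chosen
# for illustration, not estimates of actual 2016 polling error.
import random

LEADS = {"PA": 1.9, "MI": 3.4, "WI": 6.5}

def firewall_breach_rate(shared_sd, state_sd, trials=100_000):
    """Share of trials in which Trump carries PA, MI, and WI together."""
    breaches = 0
    for _ in range(trials):
        shared = random.gauss(0.0, shared_sd)  # error common to all three states
        if all(lead + shared + random.gauss(0.0, state_sd) < 0
               for lead in LEADS.values()):
            breaches += 1
    return breaches / trials

random.seed(2016)
print(f"independent state errors: {firewall_breach_rate(0.0, 3.0):.2%}")
print(f"correlated state errors:  {firewall_breach_rate(3.0, 2.0):.2%}")
```

The point is not the specific percentages but the sensitivity of the aggregate probability to the correlation assumption: a shared error component raises the simulated chance of a firewall breach by well over an order of magnitude.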

For example, I also hold a political science degree, so I will put on that hat.  It is a basic tenet that negative campaigns depress voter participation, forcing voters to select the lesser of two evils (or the lesser of two weevils).  Voter participation was down significantly due to an unprecedentedly negative campaign.  When this occurs, the most motivated base will usually determine the winner of an election.  This is why midterm elections are so volatile, particularly after a presidential win, which causes a rebound of the opposition party.  Whether this trend continues with the reintroduction of gerrymandering is yet to be seen.

What all this points to, from a data analytics perspective, is that one must have a model to explain what is happening.  Statistics by themselves, while correct a good bit of the time, will make one overconfident in a result based solely on numbers and simulations, which give a false impression of solidity, particularly in a volatile environment.  This is known as reification, and it is a fallacious way of thinking.  Combined with selection bias and the absence of a reasonable narrative model–one that introduces the social interactions necessary to understand the behavior of complex adaptive systems–it will often produce invalid results.

The Revolution Will Not Be Televised — The Sustainability Manifesto for Projects

While I was doing stuff and living life (which seems to take me away from writing), a good many interesting things were written on project management.  The very insightful Dave Gordon, at his blog The Practicing IT Project Manager, provides a useful weekly list of the latest contributions to the literature that are of note.  If you haven’t checked it out, please do so–I recommend it highly.

While I was away, Dave posted an interesting link on the concept of sustainability in project management.  Along those lines, three PM professionals have proposed a Sustainability Manifesto for Projects.  As Dave points out in his own post on the topic, it rests on three basic principles:

  • Benefits realization over metrics limited to time, scope, and cost
  • Value for many over value of money
  • The long-term impact of our projects over their immediate results

These are worthy goals, and no one needs me to rain on their parade.  I would like to see these ethical principles, which is what they really are, incorporated into how we all conduct ourselves in business.  But then there is reality–the “is” over the “ought.”

For example, Dave and I have had some correspondence through this blog regarding the nature of the marketplace in which we operate.  Some time ago I wrote a series of posts here, here, and here providing an analysis of the markets in which we operate, in both macroeconomic and microeconomic terms.

This came in response to one of my colleagues making the counterfactual assertion that we operate in a “free market” based on the concept of “private enterprise.”  Apparently, such just-so stories are lies we have to tell ourselves to make the hypocrisy of daily life bearable.  But, to bring the point home: in talking about the concept of sustainability, what concrete measures will the authors of the manifesto bring to the table to counter the financialization of American business that has occurred over the past 35 years?

For example, the news lately has been replete with stories of companies moving plants from the United States to Mexico–this despite rising and record corporate profits during a period of stagnating median working-class incomes.  Free trade and globalization have been cited as causes, but this involves more hand-waving and invocation of mantras than analysis.  There have also been the predictable invocations of the Ayn Randian cult and the pseudoscience* of Social Darwinism.  Those on the opposite side of the debate characterize things as a morality play, with the public good versus greed as the main issue.  All of these explanations miss their mark, some more than others.

An article setting aside a few myths was recently published by Jonathan Rothwell at Brookings, which came to me via Mark Thoma’s blog: “Make elites compete: Why the 1% earn so much and what to do about it.”  Rothwell looks at the relative gains of the market over the last 40 years and finds that corporate profits, while doing well, have not been the driver of inequality that Robert Reich and other economists would have them be.  Examining another myth, promulgated by Greg Mankiw, he finds that the reward of one’s labors is not related to any special intelligence or skill.  On the contrary, one’s entry into the 1% is actually related to the industry one chooses to enter, regardless of all other factors.  This disparity is known as a “pay premium.”  As expected, petroleum and coal products, financial instruments, financial institutions, and lawyers are at the top of the pay premium.  What is not, against all expectations of popular culture and popular economic writing, is the IT industry–hardware, software, etc.  Though they are the poster children of new technology, Bill Gates, Mark Zuckerberg, and others are the exception to the rule in an industry marked by a 90% failure rate.  Our most educated and talented people–those in science, engineering, the arts, and academia–are poorly paid, with negative pay premiums associated with their vocations.

The financialization of the economy is not a new or unnoticed phenomenon.  Kevin Phillips, in Wealth and Democracy, which was written in 2003, noted this trend.  There have been others.  What has not happened as a result is a national discussion on what to do about it, particularly in defining the term “sustainability”.

For those of us who have worked in the acquisition community, the practical impact of financialization and de-industrialization has made logistics challenging, to say the least.  As a young contract negotiator and Navy Contracting Officer, I was challenged to support the fleet when any kind of fabrication or production was involved, especially in non-stocked machined spares of any significant complexity or size.  Oftentimes my search would find that the company that manufactured the items was out of business, its pieces sold off during Chapter 11, and what production work remained for those items done seasonally out of country.  My “out” at the time–during the height of the Cold War–was to take the technical specs, which were paid for and therefore owned by the government, to one of the Navy industrial activities for fabrication and production.  The skillset for such work was still fairly widespread, supported by the quality control provided by a fairly well-unionized and trade-based workforce–especially among machinists and other skilled workers.

Given the new and unique ways judges and lawyers have applied privatized IP law to items financed by the public, such opportunities to support our public institutions and infrastructure, as I once did, have largely been closed off.  Furthermore, the places to send such work, where it is possible at all, have dwindled.  Perhaps digital printing will be the savior of manufacturing that it is touted to be.  What it will not do is stitch back the social fabric that has been ripped apart in communities hollowed out by the loss of their economic base, which, when replaced, comes with lowered expectations and quality of life–and often shortened lives.

In the end, though, such “fixes” benefit a shrinking few individuals at the expense of the democratic enterprise.  Capitalism did not exist when the country was formed, despite the assertions of polemicists who link the economic system to our democratic government.  Smith did not write his pre-modern scientific tract until 1776; much of what it meant lay years off in the future, and its relevance, given what we’ve learned over the last 240 years about human nature and our world, is up for debate.  What was not part of such a discussion back then–and would not have been understood–was the concept of sustainability.  Sustainability in the study of healthy ecosystems usually involves the maintenance of great diversity and the flourishing of life that denotes health.  This is science.  Economics, despite Keynes and others, is still largely rooted in 18th and 19th century pseudoscience.

I know of no fix or commitment to a sustainability manifesto–one that includes global, environmental, and social sustainability–that makes this possible, short of a major intellectual, social, or political movement willing to make a long-term commitment to incremental, achievable goals toward that ultimate end.  Otherwise it’s just the mental equivalent of camping out in Zuccotti Park.  The anger we note around us during this election year of 2016 (our year of discontent) is a natural human reaction to the end of an idea that has outlived its explanatory power and, therefore, its usefulness.  Which way shall we lurch?

The Sustainability Manifesto for Projects, then, is a modest proposal.  It may also simply be a sign of the times, albeit a rational one.  As such, it leaves open a lot of questions, and most of them cannot be addressed or determined by the people at whom it is targeted: project managers, who are usually simply employees of a larger enterprise.  People behave as they are treated–they respond to the incentives and disincentives presented to them, oftentimes not completely apparent at the conscious level.  Thus, I’m not sure this manifesto hits its mark, or even the right one.

*This term is often misunderstood by non-scientists.  Pseudoscience means non-science, just as alternative medicine means non-medicine.  If any of the various hypotheses of pseudoscience were found to be true, given proper vetting and methodology, it would simply be called science.  Just as alternative methods of treatment, if found effective and consistent under proper controls, would simply be called medicine.

Sunday Contemplation — Race Matters — Scalia’s Shameful Invocation of Racial Inferiority in 2015

To start the year 2016 I’ve decided to write about something that has stuck in my craw since the issue first arose.  I find it galling, really, to have to write about something of this sort in the new year, but it is there nonetheless, and I cannot in good conscience not write about it.

The topic at hand is the questioning by Supreme Court Justice Antonin Scalia during oral arguments in the affirmative action case, Fisher v. University of Texas at Austin.  His comments are well documented, but they are worth recounting, if only because this line of thinking is shared by a significant proportion of the population.  Below is the full exchange, begun by Gregory Garre, the attorney for UT.

Garre:  “If this Court rules that the University of Texas can’t consider race, or if it rules that universities that consider race have to die a death of a thousand cuts for doing so, we know exactly what’s going to happen…Experience tells us that.” (When the use of race has been dropped elsewhere) “diversity plummeted.”

Scalia:  “There are those who contend that it does not benefit African­-Americans to — ­ to get them into the University of Texas where they do not do well, as opposed to having them go to a less­-advanced school, a less ­—­ a slower­-track school where they do well. One of ­­— one of the briefs pointed out that ­­— that most of the — ­­most of the black scientists in this country don’t come from schools like the University of Texas. They come from lesser schools where they do not feel that they’re ­­— that they’re being pushed ahead in ­—­ in classes that are too ­­— too fast for them. I’m just not impressed by the fact that —­­ that the University of Texas may have fewer (black students). Maybe it ought to have fewer.”

Garre:  “This court heard and rejected that argument, with respect, Justice Scalia….frankly, I don’t think the solution to the problems with student body diversity can be to set up a system in which not only are minorities going to separate schools, they’re going to inferior schools.”

I want to get back to Scalia’s comments, but first it is useful to go over the facts of the case, which seem barely to warrant review by the Supreme Court.  UT admits the overwhelming majority of its students based on the Top Ten Program: if you graduated from a Texas high school within the top 10% of your class, you were admitted if you applied.  In the year that Fisher applied, 92% of the entering class gained admission on that basis.  For the other 8% of seats, as Vox explained, other factors were taken into consideration as part of a “holistic” process.  Two scores were given from this process: one for essays, leadership activities, and background, which included race; and the other based on grades and test scores.  The overwhelming majority of students accepted for admission under this process were white.  Given that the inclusion of race as a factor was not a discriminatory quota, there is little here except to assert, in general, that any consideration of race is unconstitutional under the Equal Protection Clause of the 14th Amendment.

The majority of legal analysis of the Fisher case has centered on Grutter v. Bollinger, mostly because it is the Supreme Court’s latest statement on the issue of affirmative action.  In that case, the Court ruled that the University of Michigan Law School did not discriminate when taking race into account, among a number of other factors, in order to ensure a diverse student body–especially in including previously disenfranchised and excluded minorities–as long as there was a compelling interest in doing so and it passed the test of “strict scrutiny.”

Given that the Court attempts to maintain continuity and precedent (known by the Latin term stare decisis), the wellspring for this decision was the case of University of California Regents v. Bakke from 1978.  According to the majority opinion written by Justice Lewis Powell, there are two competing constitutional interests at play.  One is to ensure that the Equal Protection Clause of the 14th Amendment applies not only to protect the interests of African-Americans–without “dialing back the clock to 1868” in a United States that no longer resembles the one in which the amendment was passed–but to protect all persons.  The other is the academic freedom afforded schools and colleges under the First Amendment, known as the “four essential freedoms.”

Setting aside that Justice Powell was a good constitutional judge but a poor historian, this second interest may come as a surprise to those not familiar with these competing interests.  That is not surprising, given the partisan–and many racist–arguments against affirmative action.  Powell invokes two previous cases in outlining the four essential freedoms.  He writes:

“Mr. Justice Frankfurter summarized the “four essential freedoms” that constitute academic freedom:

“`It is the business of a university to provide that atmosphere which is most conducive to speculation, experiment and creation. It is an atmosphere in which there prevail “the four essential freedoms” of a university—to determine for itself on academic grounds who may teach, what may be taught, how it shall be taught, and who may be admitted to study.'”  Sweezy v. New Hampshire, 354 U. S. 234, 263 (1957) (concurring in result).

Our national commitment to the safeguarding of these freedoms within university communities was emphasized in Keyishian v. Board of Regents, 385 U.S. 589, 603 (1967):

“Our Nation is deeply committed to safeguarding academic freedom which is of transcendent value to all of us and not merely to the teachers concerned. That freedom is therefore a special concern of the First Amendment . . . . The Nation’s future depends upon leaders trained through wide exposure to that robust exchange of ideas which discovers truth `out of a multitude of tongues, [rather] than through any kind of authoritative selection.’  United States v. Associated Press, 52 F. Supp. 362, 372.”

What Powell indicated was that, given these conflicting rights (no right being absolute), when a university takes racial factors into account in admissions there must be both a substantial state interest in ensuring diversity and strict scrutiny applied to such racial or ethnic factoring.  The first time around, when the Supreme Court remanded the Fisher case back to the appellate court in 2013, the majority indicated that while the UT approach was not a quota–and hence not an outright violation of the Equal Protection Clause–the lower court had not applied strict scrutiny in determining whether UT had established a substantial state interest.  Or at least so it seemed, given that the logic that comes down from the Roberts Court is oftentimes sophomoric.

It is important to note that the UT Top Ten Program has increased diversity.  The reason is that the top ten percent, regardless of school, qualify for the program.  This effect is rooted in discrimination in housing patterns that extends back to the late 19th century, when, first, Jim Crow laws were passed in the southern states (such as Texas) to essentially re-enslave and disenfranchise the freedmen.  Many people will be surprised to learn that these laws continued in force well into the 1960s, the last case to overturn the final vestiges of the race laws being brought in the mid-1970s.  Then, in the north, beginning in the 1920s, first local ordinances and then, when those were struck down, restrictive covenants were applied to keep African-Americans and other minority and ethnic groups out of white, Anglo-Saxon protestant neighborhoods.  When restrictive covenants were eventually overturned, real estate brokers and bankers applied the practice of “redlining”: home loans and mortgages were made harder to qualify for, or denied outright, to certain racial and ethnic groups.  The map was lined in red to keep people in their “place.”  Ostensibly this practice was outlawed with the passage of the Fair Housing Act in 1968, but it has continued to this day.  Furthermore, for most of our history African-Americans have had to pay a premium for better housing that otherwise would have gone for a lower market price.  It was racial fear and manipulation–giving the impression of falling real estate values when African-Americans were allowed to move into a predominately white community–that caused white flight.  The sordid history behind this phenomenon is amply documented in the National Book Award-winning history Arc of Justice, by historian Kevin Boyle.

When one hears political and opinion leaders assert that the housing crisis was caused by diversity targets in sub-prime loans, they are not only stating a counterfactual and providing bad economic analysis but also engaging in race baiting.  It is redlining that caused minorities to be most vulnerable when the bubble burst, because they tended to pay usurious interest rates–or were funneled into subprime balloon mortgages–in order to derive the same benefits of home ownership as other groups, who were afforded more reasonable financing.  Given that most of these are working people living paycheck to paycheck, it takes no great insight to know that they will be the first to default during an unusually severe economic downturn.

Thus, when one considers that most public school funding is derived from real estate property taxes, and that the average homeowner (based on a 2013 survey) stays in their home 13 years (with the historical average varying between 10 and 16 years since 1997), the effects of previously discriminatory practices on school funding, as well as on socio-economic and racial composition, can last a generation or more, depending on the rate of turnover in any particular neighborhood.  Despite political arguments to the contrary, money spent on schools plays a significant role in student achievement.  It would have been appropriate for Justice Powell in Bakke to have at least acknowledged this history, along with the history of the new immigrants and minorities he invoked in expressing his concern about turning back the clock.

It is important to state clearly that things have no doubt improved since Bakke, and that it was probably a largely well-conceived adjudication.  Despite claims to the contrary, the Great Society and civil rights reforms of the 1960s eliminated the worst de jure and de facto day-to-day indignities, fear of violence, discrimination, and denial of human rights that African-Americans lived under well into the late 20th century.  Opportunities have opened up, and it is remarkable that over the last 50 years we can find young African-Americans who have never suffered the indignity of bias or discrimination due to the color of their skin.  But as with the recent problems in policing, criminal justice, and the subtle racism that exists in job selection and opportunity, among other issues, it is apparent that we still have work to do to fully overcome our history of slavery, Jim Crow, racism, discrimination, and racial terrorism.  For when one looks back, the fact of the matter is that much of this country–and the basis of its wealth–was built on the backs of African-American slavery and oppression.  Without the African-American experience, American culture is indecipherable.  New immigrants, in taking advantage of the inherited advantages of being American, also unknowingly inherit the history that made those advantages possible.

But now back to Justice Scalia’s remarks.  Had Scalia restricted himself to the constitutional issues addressed by Powell in Bakke, there would be no concern.  But this is not what the members of this SCOTUS are about.  Scalia’s remarks would have been at home in the post-Reconstruction south of the late 19th and early 20th centuries, alongside Spencer’s Social Darwinism and eugenics.  That was the period that endorsed separate but “equal” facilities for African-Americans, and Scalia seems to be suggesting a modern version of it in higher education.  We have seen these ideas invoked fairly recently elsewhere, particularly in the discredited work of Herrnstein and Murray, The Bell Curve.  His comments are an instance of “begging the question”: Scalia assumes in his remarks that African-Americans are not qualified generally for UT, and that they do not possess the mental or educational skills to succeed there.  His remarks also reveal someone who thinks in terms of hierarchy and aristocracy–that there are levels of human fitness and superiority–which also underlies such concepts as “makers” and “takers.”

Apologists in academia and elsewhere have attempted to temper the justice’s words by invoking the concept of mismatch in college admissions.  It is often referred to as a “theory,” but that would grant it an authority it does not possess.  It is Cargo Cult social science, based on loosely correlated statistics, that provides a veneer of respectability to those who still seek to explain inequality in a society that claims fancifully to be meritocratic, or egalitarian, or a land of opportunity, but which is not really any of these.  But mismatch is not the underlying assumption in Scalia’s remarks: he begins by singling out African-Americans (and restricts himself to African-Americans among minorities) and goes from there.  Further studies on mismatch (link below) show that it is a common phenomenon affecting all racial and ethnic groups.  No system is perfect, especially not one conceived by society or academia.

But even putting aside the racist assumptions of Justice Scalia, does the mismatch concept even pass the “so what?” test?  What if one is thrown into a situation for which one is poorly prepared?  In real life we call this “sink or swim.”  Does it really do harm?  All kinds of casuistry have been put forth but, despite assertions to the contrary, the facts are not conclusive.

To give but one famous historical example that undermines this bit of sophistry: the fact that General Lee graduated second in his West Point class and U.S. Grant graduated in the bottom half no doubt influenced them later in life.  Lee was able to defeat, with great skill, generals who were unused to defeat and disappointment, and routed them from the field; but his supreme confidence in his abilities caused his utter failure at Gettysburg.  Grant, on the other hand, who experienced failure both as a civilian and on the battlefield, grew unafraid of it and succeeded in the end.  That someone experiences a setback or must work hard in order to succeed is not such a bad thing.  It is how the individual reacts in the face of disappointment or long odds that we call character.  This is the standard means of training Naval officers at sea, and why mentoring is so important.  (Only puffed-up college professors don’t feel that their job is teaching.)  Yes, the world is a big place; yes, there are things you don’t know; but absent a severe learning or emotional disability, you can learn them.

But seeing this self-evident insight assumes rational thought and attention to evidence.  For example, many of the deficiencies attributed to African-Americans in The Bell Curve have since been overcome, such as the significant rise in math and reading scores on standardized tests, which is closing the gap with white achievement.  If these were innate or unsolvable deficiencies, how is it that public policy is alleviating the gap?  Does it harm African-Americans to be challenged to do so?  Given the disreputable history of race in America, what is more likely: that African-Americans are innately less likely to succeed at UT (even as increasing numbers enter under the Top Ten Program), or that the history of unequal educational opportunity deserves to be addressed in the most equitable and constitutional manner?  Or that unequal treatment and the socio-economic effects of economic discrimination, which still exist, have a great effect on minorities, requiring a reasoned assessment of the individual–taking into account mitigating circumstances, including racial or ethnic background–in college admissions for those on the fence?

That a Supreme Court justice can voice such stupidity and bias in the year 2015 is evidence enough that there is something wrong with our judicial system.  I beg to differ with the proposition voiced by the late Senator Roman Hruska–in defending Nixon’s rejected appointment of G. Harrold Carswell to the Supreme Court–that mediocrity deserves representation on the court.  While we can’t always find a Brandeis, Cardozo, or Frankfurter (or a Holmes, Brennan, Black, Story, or Warren), we can at least attempt to do so.  Unfortunately we are stuck with Scalia and his ilk.

Legitimacy and the EU Democratic Deficit

Turning to political science again, Kevin O’Rourke has an important article in Critical Quarterly regarding the democratic deficit and types of legitimacy, particularly in light of the events surrounding the Greek crisis.  He cites the late political scientist Peter Mair’s book, Ruling the Void, as providing a solid framework for understanding what is happening in Europe–and, to some extent, within all democracies–as a result of the concentration of wealth and power among an increasingly transnational economic elite.

The issue that O’Rourke tackles, based on Mair’s insights, is one of democratic legitimacy.  For the economists and financiers who seem to have taken (I would argue) an illegitimately outsized role in determining what is good for Greece, even when Greece disagrees, the dichotomy is between what have been called input and output legitimacy.  I understand what he is saying here, but in political science “legitimacy” is not the same as “democratic legitimacy”–and, in the end, I think this is his point.

O’Rourke, an economist himself, tackles how this argument, particularly in regard to output legitimacy, has been hijacked, so that concerns about distribution have been stripped out of the definition by the application of technocrat-speak.  I have a backlog of items for the Devil’s Phraseology, from “Structural Reform” to other euphemisms for, essentially, screwing working people over–especially, right now, if they are Greek, Italian, Spanish, or Irish.

His article is important in tracing the subtle transformation of legitimacy over time.  For those unfamiliar with this terminology: legitimacy in this sense–if you remember nothing else, remember your Lincoln or Jefferson–is, in democratic societies, properly derived from the people.  This concept, which can be measured on the input side, is reinforced by processes and institutions that support it: clean elections that seek to maximize participation of the adult population; freedoms that support human rights, particularly those concerning speech, free association, and free movement; institutions that are largely responsive to the democratic will but which possess limitations to prevent the denial of human rights; an independent judiciary that metes out justice based on due process; and the absence of corruption, undue influence, unequal treatment, or graft in these institutions.  These are all indicators of “legitimacy.”  In the context of the European debate this is known as “input” legitimacy.

Then there is “output” legitimacy.  This is the type of legitimacy on which the EU rests, since–especially after the last Greek referendum on the Troika’s terms–it obviously doesn’t rest on any kind of “input” legitimacy.  Here legitimacy is based on a utilitarian measure: the ability of the EU to raise the aggregate euro value at any one time.  This is the “rising tide lifts all boats” trope.  Nice imagery, what with the JFK connection and all, but the rules of the game and the economic environment have changed since 1963 to the extent that the analogy no longer applies.  A rising tide lifts all boats only if everyone has a stake in the tide rising.  Feel free to add any additional analogies now that we are beginning to understand the effect of rising tides on coastal cities as the earth warms.  An actual rising tide certainly didn’t do anyone in NOLA’s Lower Ninth and Lakeview neighborhoods any favors, but we do know that it impacted people residing in different economic strata differently.

Furthermore, output legitimacy as a utilitarian project sounds a lot like “we made the trains run on time”–and it wasn’t all that long ago that more than half of Europe suffered under authoritarian regimes.  Output legitimacy, I would argue, is by definition the opposite of democratic legitimacy, not one of two types of it.  As O’Rourke points out, one cannot take politics out of policy, so the way in which decisions are made is important in defining the type and level of legitimacy.

Post-1989 generations have not had to grapple with the fact that even oppressive regimes can possess political legitimacy sufficient for them to survive.  From a historical standpoint, all of those German people in the streets chanting “Heil Hitler” weren’t doing so at gunpoint.  The block captains and the others who denounced family members in the old Eastern Bloc countries largely acted independently and voluntarily.  Many Russians today pine for the days of the old Soviet Union and have a leader in Putin who channels that nostalgia.  Autocratic and authoritarian regimes simply derive legitimacy through institutions and processes that are more restrictive than those found in democratic societies–institutions that rest on elites, centers of power, and pluralities that allow them to function.

Thus, whether the EU will admit it publicly or not, one need only do a Google search to see that this is a generally recognized issue that the European countries seem unwilling or unable to address.  The recent charging of Yanis Varoufakis, the former Greek finance minister, with treason–at the instigation of Greek and European elites–raises the ante and strips away whatever veil remained to hide the anti-democratic roots of the Greek crisis.  Apparently the 60% of the Greek people who voted “No” to the Troika were also traitors.

That this is happening in Greece is also problematic, given its geographical location in the eastern Mediterranean and its fairly recent transition to democratic processes and institutions.  The de-legitimization of democracies is an all too familiar event in the history of the European continent, and it can only lead to radicalization, especially given the pain being inflicted on the Greek people.  Europe’s technocrats took an economic recession and market failure that could have been ameliorated and solved with the proper solutions–learned by hard experience in the 1930s and immediately following the Second World War–rejected those methods, and, through obstinacy, tyrannical actions, corruption, and greed, created a political and economic disaster that threatens the legitimacy of the EU.

Time to reform the reformers.