Ring Out the Old, Ring in the New: Data Transformation Podcasting

Rob Williams at InnovateIPM interviewed me a few weeks ago and has a new podcast up to cap off the year. Our discussion began as a wide-ranging one but, as it turned out, settled on digital transformation and the changes and developments that I have seen in this area over the last three decades.

I met Rob at a recent project controls conference. He is a professional, curious, and engaging individual who quickly puts one at ease. We found a lot in common in our perspectives on project management and project controls, and I agreed to the podcast interview. Our discussion was no different from many that I've had with other professionals in my own living room, and it comes off as a similarly engaging and informal conversation between like-minded individuals.

Before he posted the podcast, I managed to get a preview. Despite years of doing interviews, hosting symposiums, the occasional emcee or radio spot, home movies, and other recordings, I still cannot get over the strange feeling of hearing my own voice in a long conversation. I am constantly looking for faults, and I cringed at each "ah" or "um" while listening to myself–returning in my head to the admonitions of my supervisors when I was taught to be a Navy instructor–though, thankfully, they are few.

Still, thanks to the magic of editing, Rob managed to keep the focus on the main point of the conversation when I strayed into side discussions. He also caught me at a time when I was working on a paper for DoD professionals regarding digital transformation, so the interview found me still working out two main concepts I had picked up from the literature: establishing a Master Data Management (MDM) strategy, and establishing a knowledge management environment. While I do not mention these items in the interview, the discussion allowed me to subsequently sort out where these concepts apply.

In any event, the podcast can be found here: https://www.innovateipm.com/podcast/episode/206e7fbd/13-history-of-digital-transformation-with-nick-pisano. I hope you find it interesting and informative.

Open: Strategic Planning, Open Data Systems, and the Section 809 Panel

Sundays are usually days reserved for music and the group Rhye was playing in the background when this topic came to mind.

I have been preparing for my presentation in collaboration with my Navy colleague John Collins for the upcoming Integrated Program Management Workshop in Baltimore. This presentation will be a non-proprietary/non-commercial talk about understanding the issue of unlocking data to support national defense systems, but the topic has broader interest.

Thus, in advance of that formal presentation in Baltimore, there are issues and principles that are useful to cover, given that data capture and its processing, delivery, and use are at the heart of all systems in government, private industry, and other organizations.

Top Data Trends in Industry and Their Relationship to Open Data Systems

According to Shohreh Gorbhani, Director, Project Control Academy, the following are the top five data trends being pursued by private industry and technology companies. My own comments, as they relate to open data systems, follow each item.

  1. Open Technologies that transition from 2D Program Management to 3D and 4D PM. This point is consistent with the College of Performance Management’s emphasis on IPM, but note that the stipulation is the use of open technologies. This is an important distinction technologically, and one that I will explore further in this post.
  2. Real-time Data Capture. This means capturing data in the moment so that the status of our systems is up to date without the delays currently associated with manual data management and conditioning. This does not preclude the collection of structured, periodic data, but it does include the capture of transactions from real-time integrated systems where appropriate.
  3. Seamless Data Flow Integration. From the perspective of companies in manufacturing and consumer products, technologies such as IoT and Cloud are just now coming into play. But, given the underlying premises of items 1 and 2, this also means the proper automated contextualization of data using an open technology approach that flows in such a way as to be traceable.
  4. The use of Big Data. The term has lost a good deal of its meaning through its transformation into a buzz-phrase and marketing term. But Big Data refers to the expansion in the depth and breadth of available data, driven by the same economic forces that underlie Moore’s Law. What this means is that we are entering a new frontier of data processing and analysis that will, no doubt, break down assumptions regarding the validity and strength of certain predictive analytics. The old assumptions that restricted access to data due to limitations of technology and higher cost no longer apply. We are now in the age of Knowledge Discovery in Data (KDD). The old approach of reporting assumed that we already know what we need to know. The use of data challenges old assumptions and allows us to follow the data where it leads us.
  5. AI Forecasting and Analysis. No doubt predictive AI will be important as we move forward with machine learning and other similar technologies. But this infant is not yet a rug rat. The initial experience with AI is that it tends to reflect the biases of its creators. The danger here is that this defeats KDD, resulting in stagnation and fugue. But there are other areas where AI can be taught to automate mundane, value-neutral tasks relating to raw data interpretation.

The 809 Panel Recommendation

Given that industry is the driving force behind these trends, which will transform the way we view information in our day-to-day work, it is not surprising that the 809 Panel had this to say about existing defense business systems:

“Use existing defense business system open-data requirements to improve strategic decision making on acquisition and workforce issues…. DoD has spent billions of dollars building the necessary software and institutional infrastructure to collect enterprise wide acquisition and financial data. In many cases, however, DoD lacks the expertise to effectively use that data for strategic planning and to improve decision making. Recommendation 88 would mitigate this problem by implementing congressional open-data mandates and using existing hiring authorities to bolster DoD’s pool of data science professionals.”

Section 809 Volume 3, Section 9, p. 477

At one point in my military career, I was assigned as the Materiel, Fuels, and Transportation Officer of Naval Air Station, Norfolk. As a major naval air base, transportation hub, and home to a Naval Aviation Depot, we shipped and received materiel and supplies across the world. In doing so, our transportation personnel would use what at the time was new digital technology to complete an electronic bill of lading that specified what and when items were being shipped, the common or military carrier, the intended recipient, and the estimated date of arrival, among other essential information.

The customer and receiving end of this workflow received an open-systems data file that contained these particulars. The file was an early version of open data known as an X12 file, for which the commercial transportation industry was an early adopter. Shipping and receiving activities and businesses used their own local software: there were a number of customized and commercial choices out there, as well as those used by common carriers such as various trucking and shipping firms, the USPS, FedEx, DHL, UPS, and others. The X12 file was the DMZ that made the information open. Software manufacturers, if they wanted to stay relevant in the market, could not impose a proprietary data solution.

Furthermore, standardization of terminology and concepts ensured that the information was readable and comprehensible wherever the items landed–whether at receiving offices in the United States, Japan, Europe, or even Istanbul. While DoD needs the skillsets to be able to optimize its data, achieving this end-state did not require an army of data scientists. It required the right data science expertise in the right places, and the dictates of transportation consumers to move the technology market to provide the solution.

Over the years both industry and government have developed a number of schema standards focused on specific types of data, progressing from X12 to XML and now projected to move to JSON-based schemas. In their initial iterations, each of them automated the submission of physical reports that had been required either by contract or by operations. These focused on a small subset of the full dataset relating to program management and project controls.
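
To make the progression concrete, here is a minimal sketch in Python of what moving from a delimited, segment-based exchange format to a JSON-style neutral structure looks like. The segment layout and field names below are notional illustrations of the idea only–they are not an actual X12 transaction set, XML schema, or DoD-approved structure.

```python
import json

# Notional, simplified segment-based record (X12-style delimiters).
# This is NOT an actual X12 transaction set -- illustration only.
flat_record = (
    "BOL*2019-11-04*NAS-NORFOLK*SUPPLY-DEPOT-WEST~"
    "ITM*AVIONICS-MODULE*12*EA~"
    "ITM*HYDRAULIC-PUMP*3*EA~"
)

def segments(record, seg_term="~", elem_sep="*"):
    """Split a delimited record into (segment_id, [elements]) pairs."""
    for seg in filter(None, record.split(seg_term)):
        parts = seg.split(elem_sep)
        yield parts[0], parts[1:]

def to_neutral_schema(record):
    """Map the delimited record into a neutral, self-describing structure."""
    shipment = {"ship_date": None, "shipper": None, "consignee": None, "items": []}
    for seg_id, elements in segments(record):
        if seg_id == "BOL":   # notional bill-of-lading header segment
            shipment["ship_date"], shipment["shipper"], shipment["consignee"] = elements
        elif seg_id == "ITM": # notional line-item segment
            description, quantity, unit = elements
            shipment["items"].append(
                {"description": description, "quantity": int(quantity), "unit": unit}
            )
    return shipment

print(json.dumps(to_neutral_schema(flat_record), indent=2))
```

The point is not the syntax but the neutrality: any receiving system that knows the published structure can consume the record without depending on the sender's software.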

This progression made sense.

When digitized technology is first introduced into an intensive direct-labor environment, the initial focus is to automate the production of artifacts and their underlying processes in order to phase in the technology’s acceptance. This also allows the organization to realize immediate returns on investment and improvements in productivity. But this is the first step, not the final one.

For project controls, the current state is the UN/CEFACT XML for program performance management data, and the contract cost and labor data collection file known as the FlexFile. Given that the recipient of the latter file is the Office of the Secretary of Defense Cost Assessment and Program Evaluation (OSD CAPE), it clearly serves as one of many feedback loops–but only one–supporting that office’s role in coordinating the planning, programming, budgeting, and execution (PPBE) system related to military strategic investments and budgeting. The program performance information is also a vital part of the PPBE process in evaluation and in future planning.

For most of the U.S. economy, market forces and consumer requirements are the driving forces in digital innovation. The trends noted by Ms. Gorbhani can be confirmed through a search of any of the many technology magazines and websites out there. The 809 Panel, drawn as it was from specialists in industry and government, was tasked “to provide recommendations that would allow DoD to adapt and deliver capability at market speeds, while ensuring that DoD remains true to its commitment to promote competition, provide transparency in its actions, and maintain the integrity of the defense acquisition system.”

Given that the work of the DoD is unique, creating a type of monopsony, it is up to leadership within the Department to create the conditions and mandates necessary to recreate in microcosm the positive effects of market forces. The DoD also has a very special, vital mission in defending the nation.

When an individual business cobbles together its mission statement, it is that mission that defines the elements of data collection essential to decision-making. In today’s world, best commercial-sector practice is to establish a Master Data Management (MDM) approach to defining data requirements and practices. In the case of DoD, a similar approach would be beneficial. Concurrent with the 809 Panel’s efforts, the RAND Corporation delivered a paper in 2017 (link in the previous sentence) that made recommendations related to data governance that are consistent with the 809 Panel’s recommendations. We will be discussing these specific recommendations in our presentation.

Meeting the mission and readiness are the key components of data governance in DoD. Absent such guidance, specialized software solution providers, in particular, will engage in what is called “rent-seeking” behavior–an economic term for an “entity (that) seeks to gain added wealth without any reciprocal contribution of productivity.”

No doubt, given the marketing of software solution providers, it is hard for decision-makers to tell what constitutes an open data system. The motivation of a software solution provider is to make its product as “sticky” as possible, and it does that by enticing a customer to commit to proprietary definitions, structures, and database schemas. Usually there are “black-boxed” portions of the software that make traceability impossible, which complicates the question of who exactly owns the data and limits the ability of the customer to optimize and utilize it as the mission dictates.

Furthermore, data visualization components like dashboards are ubiquitous in the market. A cursory stroll through a tradeshow looks like a dashboard smorgasbord combined with different practical concepts of what constitutes “open” and “integration”.

As one DoD professional recently told me, it is hard to tell the software systems apart. To do so it is necessary to understand what underlies the software. Thus, a proposed honest-broker definition of an open data system is useful and is the place to start–particularly given that this is not a notional concept, since such systems have been successfully established.

The Definition of Open Data Systems

Practical experience in implementing open data systems toward the goal of optimizing essential information from our planning, acquisition, financial, and systems engineering systems informs the following proposed definition, which is based on commercial best practice. This proposal is also based on the principle that the customer owns the data.

  1. An open data system is one based on non-proprietary neutral schemas that allow for the effective capture of all essential elements from third-party proprietary and customized software for reporting and integration necessary to support both internal and external stakeholders.
  2. An open data system allows for complete traceability and transparency from the underlying database structure of the third-party software data, through the process of data capture, transformation, and delivery of data in the neutral schema.
  3. An open data system targets the loading of the underlying source data for analysis and use into a neutral database structure that replicates the structure of the neutral schema. This allows for 100% traceability and audit of data elements received through the neutral schema, and ensures that the receiving organization owns the data.

Under this definition, data from its origination to its destination is more easily validated and traced, ensuring quality and fidelity, and establishing confidence in its value. Given these characteristics, integration of data from disparate domains becomes possible. The tracking of conflicting indicators is mitigated, since open system data allows for its effective integration without the bias of proprietary coding or restrictions on data use. Finally, both government and industry will not only establish ownership of their data–a routine principle in commercial business–but also be free to utilize new technologies that optimize the use of that data.
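
As a thought experiment only, the following Python sketch shows the shape of that capture-transform-deliver path: a proprietary export is renamed and typed into a hypothetical neutral schema, validated against it, and loaded into a receiving table that replicates the schema, so every loaded element traces back to an element received in the neutral format. All field names and the vendor mapping here are invented for illustration.

```python
import sqlite3

# Hypothetical neutral schema: element name -> required type.
NEUTRAL_SCHEMA = {"wbs_id": str, "period_end": str, "bcws": float, "bcwp": float, "acwp": float}

# Hypothetical mapping from a vendor's proprietary export to the neutral elements.
VENDOR_MAP = {"TaskCode": "wbs_id", "StatusDate": "period_end",
              "PlannedValue": "bcws", "EarnedValue": "bcwp", "ActualCost": "acwp"}

def to_neutral(vendor_row):
    """Rename vendor fields to neutral elements and coerce them to the schema's types."""
    neutral = {VENDOR_MAP[k]: v for k, v in vendor_row.items() if k in VENDOR_MAP}
    missing = set(NEUTRAL_SCHEMA) - set(neutral)
    if missing:
        raise ValueError(f"schema validation failed, missing elements: {missing}")
    return {k: NEUTRAL_SCHEMA[k](v) for k, v in neutral.items()}

# The receiving database replicates the neutral schema, preserving element-level traceability.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE performance (wbs_id TEXT, period_end TEXT, bcws REAL, bcwp REAL, acwp REAL)")

vendor_export = [{"TaskCode": "1.2.3", "StatusDate": "2019-10-31",
                  "PlannedValue": "1500.0", "EarnedValue": "1350.0", "ActualCost": "1425.0"}]
for row in vendor_export:
    db.execute("INSERT INTO performance VALUES (:wbs_id, :period_end, :bcws, :bcwp, :acwp)",
               to_neutral(row))

print(db.execute("SELECT * FROM performance").fetchall())
```

In a real implementation the schema would be the published neutral standard and the mapping would be maintained by the data provider, but the traceability principle is the same.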

In closing, Gahan Wilson, a cartoonist whose work appeared in National Lampoon, The New Yorker, Playboy, and other magazines, recently passed away.

When thinking of the barriers to the effective use of data, I came across this cartoon in The New Yorker:

Open Data is the key to effective integration and reporting–to the optimal use of information. Once mandated and achieved, our defense and business systems will be better informed and be able to test and verify assumed knowledge, address risk, and eliminate dogmatic and erroneous conclusions. Open Data is the driver of organizational transformation keyed to the effective understanding and use of information, and all that entails. Finally, Open Data is necessary to the mission and planning systems of both industry and the U.S. Department of Defense.

Sledgehammer: Pisano Talks!

My blogging hiatus is coming to an end as I take a sledgehammer to the writer’s block wall.

I’ve traveled far and wide over the last six months to various venues across the country and have collected a number of new and interesting perspectives on the issues of data transformation, integrated project management, and business analytics and visualization. As a result, I have developed some very strong opinions regarding the trends that work and those that don’t regarding these topics and will be sharing these perspectives (with the appropriate supporting documentation per usual) in following posts.

To get things started this post will be relatively brief.

First, I will be speaking along with co-presenter John Collins, who is a Senior Acquisition Specialist at the Navy Engineering & Logistics Office, at the Integrated Program Management Workshop at the Hyatt Regency in beautiful downtown Baltimore’s Inner Harbor 10-12 December. So come on down! (or over) and give us a listen.

The topic is “Unlocking Data to Improve National Defense Systems”. Today anyone can put together pretty visualizations of data from Excel spreadsheets and other sources–and some have made quite a bit of money doing so. But accessing the right data at the right level of detail, transforming it so that its information content can be exploited, and contextualizing it properly through integration will provide the most value to organizations.

Furthermore, our presentation will make a linkage to what data is necessary to national defense systems in constructing the necessary artifacts to support the Department of Defense’s Planning, Programming, Budgeting and Execution (PPBE) process and what eventually becomes the Future Years Defense Program (FYDP).

Traditionally information capture and reporting has been framed as a question of oversight, reporting, and regulation related to contract management, capital investment cost control, and DoD R&D and acquisition program management. But organizations that fail to leverage the new powerful technologies that double processing and data storage capability every 18 months, allowing for both the depth and breadth of data to expand exponentially, are setting themselves up to fail. In national defense, this is a condition that cannot be allowed to occur.

If DoD doesn’t collect this information–which we know from the reports of cybersecurity agencies that other state actors are collecting–we will be at a serious strategic disadvantage. We are in a new frontier of knowledge discovery in data. Our analysts and program managers think they know what they need to be viewing, but integration adds new perspectives that will yield new indicators and predictive analytics which will, no doubt, overtake current practice. Furthermore, that information can now be processed to contribute more timely and better intelligence to strategic and operational planning.

The presentation will be somewhat wonky and directed at policymakers and decisionmakers in both government and industry. But anyone can play, and that is the cool aspect of our community. The presentation will be non-commercial, despite my day job–a line I haven’t crossed up to this point in this blog, but in this latter case will be changing to some extent.

Back in early 2018 I became the sole proprietor of SNA Software LLC–an industry technology leader in data transformation–particularly in capturing datasets that traditionally have been referred to as “Big Data”–and a hybrid point solution that is built on an open business intelligence framework. Our approach leverages the advantages of COTS (delivering the 80% solution out of the box) with open business intelligence that allows for rapid configuration to adapt the solution to an organization’s needs and culture. Combined with COTS data capture and transformation software–the key to transforming data into information and then combining it to provide intelligence at the right time and to the right place–the latency in access to trusted intelligence is reduced significantly.

Along these lines, I have developed some very specific opinions about how to achieve this transformation–and have put those concepts into practice through SNA and delivered those solutions to our customers. The result has been to reduce both the effort and the time to capture large datasets that originate as pre-processed data, and to cut the direct labor and the duration to information delivery by more than 99%. The path to get there is not to apply an army of data scientists and data analysts who treat all data as if it is flat and reinvent the wheel–only to deliver a suboptimized solution sometime in the future after unnecessarily expending time and resources. That is a devolution to the same labor-intensive business intelligence approaches that we used back in the 1980s and 1990s. The answer is not to throw labor at data that already has its meaning embedded in its information content. The answer is to apply smarts through technology, and that’s what we do.

Further along these lines, if you are using hard-coded point solutions (also called purpose-built software) and knitted-together best-of-breed suites, chances are you will find yourself poorly positioned to exploit new technology and obsolete within the next five years, if not sooner. The model of selling COTS solutions and walking away except for traditional maintenance and support is dying. The new paradigm is to be part of the solution, and that requires domain knowledge that translates into technology delivery.

More on these points in future posts, but I’ve placed the stake in the ground and we’ll see how they hold up to critique and comment.

Finally, I recently became aware of an extremely informative and cutting-edge website that includes podcasts from thought leaders in the area of integrated program management. It is entitled InnovateIPM and is operated and moderated by a gentleman named Rob Williams. He is a domain expert in project cost development, with over 20 years of experience in the oil, gas, and petrochemical industries. Rob has served in a variety of roles throughout his career and now focuses on cost estimating and Front-End Loading quality assurance. His current role is advanced project cost estimator at Marathon Petroleum’s Galveston Bay Refinery in Texas City.

Rob was also nice enough to continue a discussion we started at a project controls symposium and interviewed me for a podcast. I’ll post additional information once it is posted.

Both Sides Now — The Value of Data Exploration

Over the last several months I have authored a number of stillborn articles that just did not live up to the standards that I set for this blog site. After all, sometimes we just have nothing important to add to the conversation. In a world dominated by narcissism, it is not necessary to constantly have something to say. Some reflection and consideration are necessary, especially if one is to be as succinct as possible.

A quote ascribed to Woodrow Wilson–which may be apocryphal, though it does appear in two of his biographies–came in response to being lauded for making a number of short, succinct, and informative speeches. When asked how he was able to do this, President Wilson is supposed to have replied:

“It depends. If I am to speak ten minutes, I need a week for preparation; if fifteen minutes, three days; if half an hour, two days; if an hour, I am ready now.”

An undisciplined mind has a lot to say about nothing in particular, with varying degrees of fidelity to fact or truth. In normal conversation we most often free ourselves from the discipline expected of more rigorous thinking. This is not necessarily a bad thing if we are saying nothing of consequence, and there are gradations, of course. Even the most disciplined mind gets things wrong. We all need editors and fact checkers.

While I am pulling forth possibly apocryphal quotes, the one most applicable that comes to mind is the comment by Hemingway as told by his deckhand in Key West and Cuba, Arnold Samuelson. Hemingway was supposed to have given this advice to the aspiring writer:

“Don’t get discouraged because there’s a lot of mechanical work to writing. There is, and you can’t get out of it. I rewrote the first part of A Farewell to Arms at least fifty times. You’ve got to work it over. The first draft of anything is shit. When you first start to write you get all the kick and the reader gets none, but after you learn to work it’s your object to convey everything to the reader so that he remembers it not as a story he had read but something that happened to himself.”

Though it deals with fiction, Hemingway’s advice applies to any sort of writing and rhetoric. Dr. Roger Spiller, who more than anyone mentored me as a writer and historian, once told me, “Writing is one of those skills that, with greater knowledge, becomes harder rather than easier.”

As a result of some reflection over the last few months, I revisited the reason for the blog. Its purpose remains what it has always been: a way to validate ideas and hypotheses with other professionals and interested amateurs in my areas of interest. I try to keep uninformed opinion in check, as all too many blogs turn out to be rants. Thus, a great deal of research goes into each of these posts, most of it from primary sources and from interactions with practitioners in the field. Opinions and conclusions are my own, my reasoning for good or ill is exposed for all the world to see, and I take responsibility for it.

That being said, part of my recent silence has also been due to my workload–well–the effort involved in my day job of running a technology company, and in my recent role, since late last summer, as Managing Editor of the College of Performance Management’s publication, the Measurable News. Our emphasis in the latter case has been to find new contributions to the literature on business analytics and to define the concept of integrated project, program, and portfolio management. Stepping slightly over the line to make a pitch, I recommend that anyone interested in contributing to the publication submit an article. The submission guidelines can be found here.

Both Sides Now: New Perspectives

That out of the way, I recently saw, again on the small screen, the largely underrated movie about Neil Armstrong and the Apollo 11 moon landing, “First Man”, and was struck by this scene:

Unfortunately, the first part of the interview has been edited out of this clip and I cannot find the full scene. When asked “why space,” Armstrong prefaces his comments by stating that the atmosphere of the earth seems so large when viewed from the ground but that, having touched the edge of space in his experience as a test pilot of the X-15, he learned that it is actually very thin. He then goes on to posit that looking at the earth from space will give us a new perspective. His conclusion to this observation is provided in the clip.

Armstrong’s words were prophetic in that the space program provided a new perspective and a new way of looking at things that were in front of us the whole time. Our spaceship Earth is a blue dot in a sea of space and, at least for a time, the people of our planet came to understand both our loneliness in space and our interdependence.

Earth from Apollo 8. Photo courtesy of NASA.


The impact of the Apollo program resulted in great strides being made in environmental and planetary sciences, geology, cosmology, biology, meteorology, and in day-to-day technology. The immediate effect was to inspire the environmental and human rights movements, among others. All of these advances taken together represent a new revolution in thought equal to that during the initial Enlightenment, one that is not yet finished despite the headwinds of reaction and recidivism.

It’s Life’s Illusions I Recall: Epistemology–Looking at and Engaging with the World

In his book Darwin’s Dangerous Idea, Daniel Dennett posited that what is “dangerous” about Darwinism is that it acts as a “universal acid” that, when it touches other concepts and traditions, transforms them in ways that change our worldview. I have accepted Dennett’s position based on the convincing argument he makes and the evidence in front of us, and it is true that Darwinism–the insight that species evolve over time through natural selection–has transformed our perspective of the world and left the old ways of looking at things both reconstructed and unrecognizable.

In his work Time’s Arrow, Time’s Cycle, Stephen Jay Gould noted that Darwinism is one of the three great reconstructions of human thought through which, quoting Sigmund Freud, “Humanity…has had to endure from the hand of science…outrages upon its naive self-love.” These outrages include the Copernican revolution that removed the Earth from the center of the universe; Darwinism and the origin of species, including the descent of humanity; and what John McPhee coined as the concept of “deep time.”

But–and there is a “but”–I would propose that Darwinism and the other great reconstructions noted are but different ingredients of a larger and broader, though compatible, innovation in the way the world is viewed and approached–a more powerful universal acid. That innovation in thought is empiricism.

It is this approach to understanding that eats through the many ills of human existence that lead to self-delusion and folly. Though you may not know it, if you are in the field of information technology or any of the sciences, you are part of this way of viewing and interacting with the world. Married with rational thinking, this epistemology–rooted in Charles Sanders Peirce’s astronomical observations of planets and other heavenly bodies, and further refined by William James, John Dewey, and others–has come down to us as what is known as Pragmatism. (Note that the word pragmatism in this context is not the same as the colloquial form of the word; for this reason Peirce preferred the term “pragmaticism.”) For an interesting and popular account of the development of modern thought and of Pragmatism written for the general reader, I highly recommend the Pulitzer Prize-winning The Metaphysical Club by Louis Menand.

At the core of this form of empiricism is the idea that the collection of data–that is, recording, observing, and documenting the universe and nature as they are–will lead us to an understanding of things that we otherwise would not see. In our more mundane systems, such as business systems and organized efforts applying disciplined project and program management techniques and methods, we can likewise learn more about these complex adaptive systems through the enhanced collection and translation of data.

I Really Don’t Know Clouds At All: Data, Information, Intelligence, and Knowledge

The term “knowledge discovery in data,” or KDD for short, is an aspirational goal and so, in terms of understanding that goal, is a point of departure for the practice of information management and science. I take this stance because the technology industry uses terminology that, as with most language, was originally designed to accurately describe a specific phenomenon or set of methods in order to advance knowledge, only to have that terminology watered down to the point where it obfuscates the issues at hand.

As I traveled across the U.S. over the last three months, I found general agreement on this state of affairs among IT professionals who are dealing with the issues of “Big Data,” data integration, and the aforementioned KDD. In almost every case there is hesitation to use this terminology because it has been co-opted and abused by the mainstream literature, much as physicists rail against the misuse of the concept of relativity by non-scientific domains.

The impact of this confusion in terminology has caused organizations to make decisions where this terminology is employed to describe a nebulous end-state, without the initiators having an idea of the effort or scope. The danger here, of course, is that for every small innovative company out there, there is also a potential Theranos (probably several). For an in-depth understanding of the psychology and double-speak that has infiltrated our industry I highly recommend the HBO documentary, “The Inventor: Out for Blood in Silicon Valley.”

The reason why semantics are important (as they always have been despite the fact that you may have had an associate complain about “only semantics”) is that they describe the world in front of us. If we cloud the meanings of words and the use of language, it undermines the basis of common understanding and reveals the (poor) quality of our thinking. As Dr. Spiller noted, the paradox of writing and in gathering knowledge is that the more you know, the more you realize you do not know, and the harder writing and communicating knowledge becomes, though we must make the effort nonetheless.

Thus KDD is oftentimes not quite the discovery of knowledge in the sense that the term was intended to mean. It is, instead, a discovery of associations that may lead us to knowledge. Knowing this distinction is important because the corollary processes of data mining, machine learning, and the early application of AI in which we find ourselves is really the process of finding associations, correlations, trends, patterns, and probabilities in data that is approached in a manner as if all information is flat, thereby obliterating its context. This is not knowledge.

We can measure the information content of any set of data, but the real unlocked potential in that information content will come with the processing of it that leads to knowledge. To do that requires an underlying model of domain knowledge, an understanding of the different lexicons in any given set of domains, and a Rosetta Stone that provides a roadmap that identifies those elements of the lexicon that are describing the same things across them. It also requires capturing and preserving context.
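
A minimal sketch of that “Rosetta Stone” idea in Python, with entirely hypothetical lexicons: two domains describe overlapping concepts with different terms, and a small crosswalk resolves them to canonical concepts while recording where each value came from, so context and provenance are preserved rather than flattened away.

```python
# Hypothetical crosswalk: (domain, domain-specific field) -> canonical concept.
ROSETTA = {
    ("cost", "CA_ID"): "work_element",
    ("cost", "BCWP"): "earned_value",
    ("schedule", "activity_wbs"): "work_element",
    ("schedule", "physical_pct_complete"): "physical_progress",
}

def harmonize(domain, record):
    """Translate a domain record into canonical concepts, preserving provenance.
    Fields with no crosswalk entry are returned for review rather than silently
    discarded, so no context is obliterated."""
    translated, unmapped = [], {}
    for field, value in record.items():
        concept = ROSETTA.get((domain, field))
        if concept:
            translated.append({"concept": concept, "value": value,
                               "source_domain": domain, "source_field": field})
        else:
            unmapped[field] = value
    return translated, unmapped

cost_row = {"CA_ID": "1.2.3", "BCWP": 1350.0, "fiscal_month": "2019-10"}
sched_row = {"activity_wbs": "1.2.3", "physical_pct_complete": 0.45}

for domain, row in (("cost", cost_row), ("schedule", sched_row)):
    translated, unmapped = harmonize(domain, row)
    print(domain, translated, "unmapped:", unmapped)
```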

For example, when I use the messaging app on my iPhone it attempts to anticipate what I want to write. I am given three word choices if I want to use this shortcut. In most cases the iPhone guesses wrong, despite presenting three choices and having at its disposal (at least presumptively) a larger vocabulary than the writer. Oftentimes it seems to take control, assuming that I have misspelled or misidentified a word and choosing the wrong one for me, so that my message becomes nonsense.

If one were to believe the hype surrounding AI, one would think that there is magic there but, as Arthur C. Clarke noted (known as Clarke’s Third Law): “Any sufficiently advanced technology is indistinguishable from magic.” Familiar with the new technologies as we are, we know that there is no magic there, and also that it is consistently wrong a good deal of the time. But many individuals come to rely upon the technology nonetheless.

Despite the gloss of something new, the long-established methods of epistemology, code-breaking, statistics, and Calculus apply–as do standards of establishing fact and truth. Despite a large set of data, the iPhone is wrong because the iPhone does not understand–does not possess knowledge–to know why it is wrong. As an aside, its dictionary is also missing a good many words.

A Segue and a Conclusion–I Still Haven’t Found What I’m Looking For: Why Data Integration?…and a Proposed Definition of the Bigness of Data

As with the question to Neil Armstrong, so the question on data. And so the answer is the same. When we look at any set of data under a particular structure of a domain, the information we derive provides us with a manner of looking at the world. In economic systems, businesses, and projects that data provides us with a basis for interpretation, but oftentimes falls short of allowing us to effectively describe and understand what is happening.

Capturing interrelated data across domains allows us to look at the phenomena of these human systems from a different perspective, providing us with the opportunity to derive new knowledge. But in order to do this, we have to be open to this possibility. It also calls for us to, as I have hammered home in this blog, reset our definitions of what is being described.

For example, there are guides in project and program management that refer to statistical measures as “predictive analytics.” This further waters down the intent of the phrase. Measures of earned value are not predictive: they record trends and extrapolate a single-point outcome. Absent further analysis and processing, the statistical fallacy of extrapolation is baked into our analysis. The same applies to any index of performance.
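
The standard cumulative indices make the point. CPI and SPI are ratios of past performance, and the most common independent estimate at completion is an arithmetic extrapolation of them–a single-point outcome with no distribution around it. A minimal sketch with invented numbers:

```python
def evm_indices(bcws, bcwp, acwp, bac):
    """Standard cumulative EVM indices and the common single-point IEAC extrapolation."""
    cpi = bcwp / acwp                  # cost performance index
    spi = bcwp / bcws                  # schedule performance index
    ieac = acwp + (bac - bcwp) / cpi   # assumes future performance mirrors cumulative CPI
    return {"CPI": round(cpi, 3), "SPI": round(spi, 3), "IEAC": round(ieac, 1)}

# Invented numbers: the output is a trend ratio and a single-point outcome,
# with no probability distribution around it -- hence not "predictive" by itself.
print(evm_indices(bcws=1000.0, bcwp=900.0, acwp=1100.0, bac=10_000.0))
```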

Furthermore, these indices and indicators–for that is all they are–do not provide knowledge, which requires a means of not only distinguishing between correlation and causation but also applying contextualization. All systems operate in a vector space. When we measure an economic or social system we are really measuring its behavior in the vector space that it inhabits. This vector space includes the way it is manifested in space-time: the equivalent of length, width, depth (that is, its relative position, significance, and size within information space), and time.

This then provides us with a hint of a definition of what often goes by the name of “big data.” Originally, as noted in previous posts, the term “big data” was first used at NASA in 1997 by Cox and Ellsworth (and not, as Wikipedia credits it to John Mashey with the dishonest qualifier “popularized”) and simply meant “datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze.”

This is a relative term, given Moore’s Law. But we can begin to peel back a real definition of the “bigness” of data. It is important to do this because too many approaches to big data assume the data is flat and then apply probabilities and pattern recognition in ways that undermine both contextualization and knowledge. Thus…

The Bigness of Data (B) is a function (f) of the entropy expended (S) to transform data into information, or to extract its information content: B = f(S).
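
I offer the formula as a conceptual statement rather than a derivation, but Shannon’s measure gives one concrete, if rough, way to think about the information-content side of it. The Python sketch below computes the entropy of a column’s empirical value distribution as a proxy; it is an illustration of the concept, not a formal definition of B or S.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (in bits) of the empirical distribution of a column of
    values -- a rough proxy for how much information the column carries."""
    counts = Counter(values)
    n = len(values)
    # H = sum over values of p * log2(1/p)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A varied column carries more information per element than a constant one.
print(shannon_entropy(["A", "B", "C", "D"]))   # 2.0 bits
print(shannon_entropy(["A", "A", "A", "A"]))   # 0.0 bits -- no information
```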

Information evolves. It evolves toward greater complexity just as life evolves toward greater complexity. The universe is built on coded bits of information that, taken together and combined in almost unimaginable ways, provides different forms of life and matter. Our limited ability to decode and understand this information–and our interactions in it– are important to us both individually and collectively.

Much entropy is already expended in the creation of the data that describes the activity being performed. Its context is part of its information content. Obliterating the context inherent in that information content renders all of that previously expended entropy worthless. Thus, in approaching any set of data, the inherent information content must be taken into account in order to avoid unnecessary (and erroneous) reinterpretation of the data.

More to follow in future posts.

Sunday Music Interlude — Alison Moyet Performing “Alive”

She was born Geneviève Alison Jane Moyet in 1961 in Essex in the town of Billericay outside of London. Her singing career began in 1982 as the lead singer of the synth pop duo Yazoo, which had to change its name in the U.S. to Yaz due to trademark concerns (Yazoo Records was already in operation). During that time they had three U.S. hits and a slew of hits in the U.K.

When the group broke up in 1983 she began her solo career. Since that time the husky instrument of her voice, which she can stretch to an amazing degree to capture virtually any genre, has established her as a fixture in the U.K. musical world. Over the years she has been near the top of the U.K. and EU charts, with occasional crossovers to the States but, unfortunately, her U.S. following has never matched that of her musical contemporaries as one would have thought it would.

This is a common phenomenon among certain European and U.K. artists, largely, I think, because musical tastes overseas tend to be quite broad and experimental, and so the artists tend to stretch out to address such catholic tastes, which cuts against the U.S. tendency of marketing artists by musical genre. For example, today, though they are at the core of American music, such genres as jazz, blues, and folk still tend to appeal to a niche of the wider musical audience, the latter following a series of popular musical types that change with each passing youthful generation. Furthermore, in her early years Ms. Moyet had a weight issue which, unfortunately, presented a barrier to her solo marketability to a mass audience which, particularly in the U.S., tends to meld image with musical identity.

Despite these headwinds, she has attained a degree of financial success and musical respect that few artists have achieved. Her discography and other information can be found at her website.

While I prefer to show live performances of a song, unfortunately those that currently appear on YouTube are of questionable origination from a copyright and fair use perspective, and of poor quality. Thus, here is the recording “Alive” from her most recent album entitled Other, her voice calling like a loving siren across ethereal space and time. In my opinion, it is one of her most intelligent, mature, and mesmerizing songs, painting colors and emotions with a palette of evocative lyrics that float over a timeless electronic musical soundscape.


Take Me To The River, Part 3: Technical Performance and Risk Management – Digital Elements of Integrated Program Management

Part three of this series of articles on the elements of Integrated Program and Project Management will focus on two additional areas of IPM: technical performance and risk management. Prior to jumping in, however–and given the timeframe over which I’ve written this series–a summary to date is in order.

The first part of our exploration into the IPM digital inventory concerned cost elements. Cost in this sense was broadly defined as any cost element of interest to project or program managers and their teams. I first clarified our terms by defining the differences between project and program management–and how those differences influence our focus. Then I outlined the term cost as falling into the following categories:

  1. Contract costs and the cost categories within the organizational hierarchy;
  2. Cost estimates, “colors” of money where such distinctions exist, and cashflow;
  3. Additional costs that relate to the program or project effort that are not always directly attributed to the effort, such as PMA, furnished materials or labor, corollary and supporting efforts on the part of the customer, and other overhead and G&A type costs;
  4. Contract cost performance under earned value management (EVM); and
  5. Portfolio management considerations and total cost of ownership.

The second part of this exposition concerned schedule elements, that is, the time-phased planning and performance essential to any project or program effort. The article first discussed the primacy of the schedule in project and program planning and execution, given its role in defining the basis for the cost elements addressed in the first part of the series. I then discussed the need for integrated planning as the basis for a valid executable schedule and PMB; the detailed elements and the citations of their sources in the literature and formal guidance; the role of framing assumptions in the construction of schedule and cost plans, with their holistic approach to go/no-go decision-making; and, finally, the role of the schedule in establishing the project and program battle rhythm.

Now, in this final section, we will determine the other practical elements of IPM beyond even my expansive view of cost and schedule integration.

Technical Performance Management

Given this paper, which resulted from a programmatic effort in the Navy regarding Technical Performance Management (TPM), it is probably not surprising that I will start here. My core paper in the link above represents what I viewed as an initial effort at integrating TPM to determine the impacts of technical performance within program cost performance (EVM) projections. This approach was based on the following foundations:

a. That the solution needed to tie technical achievement to EVM so that it represented greater fidelity to performance than what I viewed as indirect and imprecise methods, such as WBS elements that contained only partial or tangential relationships to technical performance measures, and more subjective and arbitrary methods, such as percent complete.

b. That the approach needed to be tied to established systems engineering methods of technical risk management.

c. That the solution should be simple to implement and be statistically valid in its results, tested by retrospective analyses that performed forensic what-if analysis against the ultimate results.

One need only look at the extensive bibliography that accompanied my paper to understand that there were clear foundations for TPM, but it remained–and in some quarters remains–a controversial concept that provoked resistance, even though programs clearly must demonstrate achievement of technical requirements. For example, the foundations of technical risk management and tracking that the paper cited were in use at what was then Martin Marietta for many years. So why the resistance to change?

First, I think, is that the domain of project performance has rested too long in the hands of the EVM community with its historical foundations in cost and financial management, with a risk averse approach to new innovations. Second, given this history, the natural differences between program management, systems engineering, and earned value SMEs created a situation where there just wasn’t the foundation necessary for any one group to take ownership of this development in systems and business intelligence improvement. Even in industry, such cross-domain initiatives tend to initially garner both skepticism, if not outright cynicism, and resistance by personnel unsure of how the new measures will affect assessment of their work.

But keep in mind that, dating myself a bit, this is the same type of reaction that organizations experienced during the first wave of the digitization of work. Each initiative that I witnessed–from the introduction of desktop computers connected to a central server, to the introduction of the first PCs, to the digitization of work products–was met with the common refrain at the time that it was too experimental, or too transient, or too unstable, or too unproven, until it wasn’t any of those things.

I also overstate this resistance a bit. Over the last 20 years organizations within the military services adopted this method–or a variation–of TPM integration, as have some commercial companies. Furthermore, thinking and contributions on TPM have advanced in the intervening years.

The elements of technical performance management can be found in the language of the scope being planned. The brilliant paper by Glen B. Alleman, Thomas J. Coonce, and Rick A. Price, entitled “Building a Credible Performance Measurement Baseline,” establishes the basis for tying project and program performance to technical achievement. These elements are measures of effectiveness (MoEs), measures of performance (MoPs), technical performance measures (TPMs), and key performance parameters and indicators (KPPs and KPIs). Taken together, these define the framing assumptions for the project or program.

When the systems, procedures, and artifacts are properly constructed from the decomposition of planning documents and performance language, the assignment of these elements to the WBS and to specific work packages establishes a strong foundation for tying project and program success to both overall technical performance and the framing assumptions implicit in the effort.

What this means is that there may also be a technical performance baseline, which acts in parallel to the cost-focused performance measurement baseline. This technical performance baseline is tied to the same work that is planned at the work package level. The assessment of progress is further decomposed to look at the timeframe for that point of progress within the context of the integrated master schedule (IMS). We ask ourselves, as a function of risk: what is the chance of achieving the next threshold in our technical performance plan?
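
As a minimal sketch of that question–and assuming, purely for illustration, that the measured parameter can be treated as normally distributed around its current estimate–the probability of meeting the next threshold can be computed directly. The weight figures below are invented.

```python
from statistics import NormalDist

def prob_within_threshold(current_estimate, std_dev, threshold):
    """Probability that a not-to-exceed parameter (e.g., weight) comes in at or
    under its threshold, assuming a normal distribution around the current
    estimate -- an illustrative simplification, not a prescribed method."""
    return NormalDist(mu=current_estimate, sigma=std_dev).cdf(threshold)

# Invented figures: current weight estimate of 10,250 lb with about +/-150 lb of
# uncertainty, against a 10,400 lb not-to-exceed threshold at the next review.
p = prob_within_threshold(current_estimate=10_250, std_dev=150, threshold=10_400)
print(f"P(weight <= threshold) = {p:.2f}")   # roughly 0.84
```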

As with all elements of work, our MoEs, MoPs, TPMs, KPPs, and KPIs do not reside at the same level of overall performance management and tracking within the WBS hierarchy. Some can be tracked to the lowest level, usually at work package, some will have contributions from lower levels and be summarized at the control account level, and others are at the total project or program level, with contributors from specific lower levels of the WBS structure.

A common example of what is claimed to be a difficult technical performance measure is weight in aircraft design and production. Weight is an essential factor and must be in alignment with the mission of the aircraft. For example, if an aircraft is being built for the Navy, chances are high that the expectation is for it to be able to take off from and land on a moving carrier deck. Takeoff requires coming up to airspeed very quickly. Landings are especially hard, since they are essentially controlled crashes augmented by an arresting gear. Airframes, avionics, and engines must operate in a salt water environment that involves a metal ship. The electromagnetic effects alone, if they are not mitigated in the design and systems of both aircraft and ship, will significantly degrade the ability of the aircraft to operate as intended. Controlling weight in this case is essential, especially when one considers the need to carry fuel and ordnance, and to avoid being detected and shot down.

In current practice, the process of tracking weight over the life of aircraft design and development is tightly controlled. It is a function of tradeoff analysis and decision-making with contributors from many sub-elements of the WBS hierarchy. Thus, the use of weight as an argument against the need to tightly integrate technical measures into the performance measurement baseline is a canard. On the contrary, it is an argument for tighter and broader integration of IPM data and, in particular, for tying our systems to–and thus making our projections and the basis of our decision-making a function of–risk management, which is the next topic.

Risk Management Elements and Integration

There is a good deal of literature on risk, so I will confine this section to how risk is handled in terms of integrated project and program management.

For many subdomains within project and program management, when one mentions the term “risk management” the view often encountered is that the topic at hand is applying Monte Carlo analysis, using pseudo-random numbers, to the integrated master schedule (IMS) to determine the probabilities of a range of task durations and completions. This is known as a Schedule Risk Analysis, or SRA.

Most of the correlation handling today is based on the landmark paper by Philip M. Lurie and Matthew S. Goldberg with the sexy title “An approximate method for sampling correlated random variables from partially specified distributions.” With Monte Carlo informed by Lurie-Goldberg (for short), we can then make inferences as to alternative critical and near-critical paths for time-phasing our work. The contribution of each task, in terms of its criticality and contribution to the critical path, can also be measured, and sensitivity analysis identifies the most critical risk elements.
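
A bare-bones illustration of the Monte Carlo side of an SRA, using an invented four-task network with triangular duration distributions; the Lurie-Goldberg correlation treatment is deliberately omitted to keep the sketch short, so this understates the spread you would see with correlated tasks.

```python
import random

# Invented network: A, then B and C in parallel, then D.
# Durations are (minimum, most likely, maximum) in days.
TASKS = {"A": (8, 10, 15), "B": (18, 20, 30), "C": (15, 22, 28), "D": (4, 5, 9)}

def one_trial():
    """One Monte Carlo trial: sample each duration and roll up the network logic."""
    d = {name: random.triangular(lo, hi, ml) for name, (lo, ml, hi) in TASKS.items()}
    return d["A"] + max(d["B"], d["C"]) + d["D"]

def schedule_risk_analysis(trials=10_000):
    finishes = sorted(one_trial() for _ in range(trials))
    pct = lambda q: finishes[int(q * trials) - 1]
    # Deterministic duration using most-likely values, for comparison.
    deterministic = TASKS["A"][1] + max(TASKS["B"][1], TASKS["C"][1]) + TASKS["D"][1]
    print(f"deterministic: {deterministic} days, P50: {pct(0.50):.1f}, P80: {pct(0.80):.1f}")

random.seed(1)
schedule_risk_analysis()
```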

If the integrated master schedule is truly integrated with resources and cost, Lurie-Goldberg allows us to move beyond the single-point estimates that dominate EVM projections and calculate a range of cost outcomes as a probability distribution. The same type of analysis can be done against the time-phased PMB.

But that is just one area of risk management, known as quantitative risk. Another area, which should be familiar to project and program managers, is qualitative risk. Qualitative risk analysis for projects and programs involves the following steps:

1. Risk identification

2. Risk evaluation

3. Risk handling, and

4. Continual risk management

This is a closed-loop system that produces a risk register, risk rankings, a risk matrix, risk handling and mitigation plans, and a risk handling waterfall chart. These artifacts of risk analysis also require the monitoring of risk triggers and cross-referencing to risk ownership.
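
A minimal, hypothetical sketch of the artifacts that fall out of that loop: a register of qualitative risks scored on invented likelihood and consequence scales, ranked by score, and binned into matrix bands, with ownership and handling recorded for cross-reference.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    risk_id: str
    description: str
    likelihood: int    # 1 (remote) .. 5 (near certainty) -- invented 5x5 scale
    consequence: int   # 1 (negligible) .. 5 (severe)
    owner: str
    handling: str      # e.g., avoid, mitigate, transfer, accept

    @property
    def score(self):
        return self.likelihood * self.consequence

REGISTER = [
    Risk("R-001", "Engine qualification test slips", 4, 4, "Propulsion IPT", "mitigate"),
    Risk("R-002", "Single-source supplier failure", 2, 5, "Supply Chain", "transfer"),
    Risk("R-003", "Weight growth exceeds margin", 3, 4, "Systems Engineering", "mitigate"),
]

def band(score):
    """Bin a score into a matrix band (thresholds are invented)."""
    return "HIGH" if score >= 15 else "MODERATE" if score >= 8 else "LOW"

# Risk ranking: the register sorted by score, with its matrix band, owner, and handling.
for r in sorted(REGISTER, key=lambda x: x.score, reverse=True):
    print(f"{r.risk_id} {band(r.score):8} L={r.likelihood} C={r.consequence} "
          f"owner={r.owner} handling={r.handling}")
```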

Once again, though cost impacts are also calculated, along with their probability of manifesting, the strongest tie of risk management begins with the integrated master schedule. Thus, conditional and probabilistic branching provides the project and program team with a step-by-step what-if analysis that yields alternative schedules, which in turn provide ranges of cost impact.

Mainstreaming Risk Management and TPM into IPM

In reality, without technical performance and risk management, project and program management is simply monitoring and forecasting. Yet these sub-domains are oftentimes confined to a few specialists or viewed as dichotomous and independent processes under the general duties of the team.

The economic urgency and essentiality of integrated project and program management lie in the realization that technical achievement of the product, and the assessment and handling of risks along the course of that achievement, are at the core of the discipline.

Ch- Ch- Changes–What I Learned at the NDIA IPMD Meeting and Last Thoughts on POGO DCMA

Hot Topics at the National Defense Industrial Association’s Integrated Program Management Division (NDIA-IPMD)

For those of you who did not attend, or who have a passing interest in what is happening in the public sphere of DoD acquisition, the NDIA IPMD meeting held last week was of great importance. Here are the highlights.

Electronic Submission of Program Management Information under the New Proposed DoD Schema

Those who have attended meetings in the past, and who read this blog, know where I stand on this issue: capture all of the necessary information that provides a full picture of program and project performance across all of its systems and subsystems, but do so in an economically feasible manner that reduces redundancy, reduces data streams, and improves timeliness of submission. Basic information economics states that a TB of data is only incrementally more expensive–as measured by the difference in the electricity consumed–than an MB of data. Basic experience in IT management demonstrates that automating a process to eliminate touch labor in data production and validation improves productivity and speed.

Furthermore, if a supplier in complex program and project management is managing properly–and has sufficient systems in place–then providing the data necessary for the DoD to establish accountability and good stewardship, to ensure that adequate progress is being made under the terms of the contract, to ensure that contractually required systems that establish competency are reliable and accurate, and to use in future defense acquisition planning should not be a problem. We live in a world of 0s and 1s. What we expect of our information systems is to do the grunt work of handling ever-larger systems and providing information. In this scenario the machine is the dumb one, and the person assessing the significance and context of what is processed into intelligence is the smart one.

The most recent discussions and controversies surrounded the old canard regarding submission at Control Account as opposed to the Work Package level of the WBS. Yes, let’s party like it’s 1997. The other issue was whether cumulative or current data should be submitted. I have issues with both of these items, which continue to arise like bad zombie ideas. You put a stake in them, but they just won’t die.

To frame the first issue, some organizations and project teams link budget to the control account, and others to the work package. So practice is not the determinant, but it speaks to earned value management (EVM). The receiving organization is going to want reporting at the lowest level where there is foot-and-tie not only to budget, but to other systems. This is the rub.

I participated in a still-unpublished study for DoD which indicated that, if one uses earned value management (EVM) exclusively to manage, it doesn't matter. You get a bit more fidelity and early warning at the work package level, but not much.

But note my conditional.

No one exclusively uses EVM to manage projects and programs. That would be foolish and seems to be the basis of the specious attack on the methodology when I come upon it, especially by baby PMs. The discriminator is the schedule, and the early warning is found there. The place where you foot-and-tie schedule to the WBS is at the work package level. If you are restricted to the control account for reporting you have a guessing game–and gaming of the system–given that there will be many schedule activities to one control account.
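To illustrate the guessing game, here is a hypothetical roll-up sketched in Python, with invented WBS identifiers and percentages, showing how two very different work package conditions can produce the same control account figure.

# A minimal sketch of the many-to-one problem: several work packages, each
# footed to schedule activities, roll up to one control account. Two very
# different schedule conditions produce the same control-account number.
# All identifiers and values are hypothetical.
control_account = "CA-1.2.3"

# Scenario A: one work package badly behind, the others ahead.
scenario_a = {"WP-01": 0.95, "WP-02": 0.40, "WP-03": 0.95}  # percent complete
# Scenario B: all work packages uniformly on plan.
scenario_b = {"WP-01": 0.77, "WP-02": 0.76, "WP-03": 0.77}

def rollup(work_packages: dict[str, float]) -> float:
    """Simple unweighted roll-up to the control account level."""
    return sum(work_packages.values()) / len(work_packages)

print(f"{control_account} scenario A rolls up to {rollup(scenario_a):.2f}")
print(f"{control_account} scenario B rolls up to {rollup(scenario_b):.2f}")
# Both print roughly 0.77: at the control account level the slip in WP-02 is
# invisible; only work package and schedule activity detail reveals it.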

Furthermore, the individual reviewing EVM and schedule will want to ensure that the Performance Measurement Baseline (PMB) and the Integrated Master Schedule (IMS) were not constructed in isolation from one another. There needs to be evidence that the work planned under the cost plan matches the work in time.

Regarding cumulative versus current-dollar submission, the issue is one of accuracy. First, with cumulative-only submissions the current period must be derived by subtracting the prior cumulative figure from the latest one, which introduces rounding errors, errors that are exacerbated if reporting is restricted to the control account level. NDIA IPMD had a long discussion on this intrinsic cumulative-to-cumulative error at a meeting last year, an issue raised by Gary Humphreys of Humphreys & Associates. Second, cumulative submissions often hide retroactive changes. Third, to catch the items in my second point, one must execute cross-checks across different types of data, rather than simply taking a dump from the system of record and rolling it up. The more operations and manipulations performed on the data, the harder it becomes to ensure fidelity and to get everyone to agree on one trusted source, that is, to be reading off of the same page.
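To see the subtraction mechanics at work, here is a minimal sketch in Python; the dollar figures and the rounding level are purely illustrative assumptions.

# A minimal sketch of the cumulative-to-cumulative problem: current-period
# values must be derived by subtracting the prior cumulative figure from the
# latest one, so any rounding applied to the submitted cumulatives lands in
# the derived period values. The figures below are invented for illustration.
true_period_values = [103_437.62, 98_112.48, 121_904.15, 87_665.91]

# Build the true cumulative series, then the series as submitted (rounded).
true_cumulative = []
running = 0.0
for v in true_period_values:
    running += v
    true_cumulative.append(running)

submitted_cumulative = [round(c, -3) for c in true_cumulative]  # nearest $1K

# Derive current-period values the way a receiving system must: by subtraction.
derived_periods = [submitted_cumulative[0]] + [
    submitted_cumulative[i] - submitted_cumulative[i - 1]
    for i in range(1, len(submitted_cumulative))
]

for true_v, derived_v in zip(true_period_values, derived_periods):
    print(f"true period: {true_v:>12,.2f}  derived: {derived_v:>12,.2f}  "
          f"error: {derived_v - true_v:>+11,.2f}")

Each derived period value absorbs the rounding applied to two consecutive cumulative submissions, so the error can approach the full rounding increment, and none of it is visible in the cumulative series itself.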

When I was asked about my opinion on these issues, my response was twofold. First, as the head of a technology company it doesn't matter to me. I can handle data in accordance with the standard DoD schema in any way specified. Second, as a former program management type and as an IT professional with an abiding dislike of inefficient systems, the restrictions proposed are based on the limitations of proprietary systems in use by suppliers, systems that, in my opinion, need to be retired. The DoD and A&D market is somewhat isolated from other market pressures, by design. So the DoD must artificially construct incentives and an ecosystem that pushes businesses (and its own organizations) to greater efficiency and innovation. We don't fly F-4s anymore, so why continue to use IT business systems designed in 1997 that are solely supported by sunk-cost arguments and rent-seeking behavior?

Thus, my recommendation was that it was up to the DoD to determine the information required to achieve their statutory and management responsibilities, and it is up to the software solution providers to provide the, you know, solutions that meet them.

I was also asked if I agreed with another solution provider that the software companies should have another go at the schema prior to publication. My position was consistent in that regard: we don't work the refs. My recommendation to OSD, given that I was in a similar position regarding an earlier initiative along the same lines back when I wore a uniform, is to explore the art of the possible with suppliers. The goals are to reduce data streams, eliminate redundancy, and improve speed. Let the commercial software guys figure out how to make it work.

Current projection is three to four weeks before a final schema is published. We will see if the corresponding documentation will also be provided simultaneously.

DCMA EVAS – Data-driven Assessment and Surveillance

This is a topic for which I cannot write without a conflict of interest since the company that is my day job is the solution provider, so I will make this short and sweet.

First, it was refreshing to see three Hub leads at the NDIA IPMD meeting. These are the individuals in the field who understand the important connection between government acquisition needs and private industry capabilities in the logistics supply chain.

Second, despite a great deal of behind-the-scenes speculation and drama among competitors in the solution provider market, DCMA affirmed that it had selected its COTS solution and that it was working with that provider to work out any minor issues now that Milestone B has been certified and the program is in full implementation.

Third, DCMA announced that the Hubs would be collecting information and that the plan for a central database for EVAS that would combine other DoD data has been put on hold until management can determine the best course for that solution.

Fourth, the Agency announced that the first round of using the automated metrics would begin later this month and that the effort would continue into October.

Fifth, the Agency tamped down some of the fear related to this new process, noting that tripping a metric may simply indicate that additional attention is needed in that area, including cases where it simply needs to be documented that the supplier's System Description deviates from the standard indicator. I think this will be a process of familiarization as the Hubs move out with implementation.

DCMA EVAS, in my opinion, is a significant reform of the way the agency does business. It not only drives process and organizational improvement within the agency by eliminating uneven and arbitrary determinations of contract non-compliance (as well as improvements in data management), but opens a dialogue regarding systems improvement, driving similar changes to the supplier base.

NDAA Section 804

There were a couple of public discussions on NDAA Section 804; if you are not certain what it is, you should go to this link. Having kept track of developments in the NDAA for this coming fiscal year, what I can say is that the final language of Section 804 doesn't say what many thought it said while it was in draft.

What it doesn’t authorize is a broad authority to overrule other statutory requirements for government accountability, oversight, and reporting, including the requirement for earned value management on large programs. This statement is supported by both OSD speakers that addressed the issue in the meeting.

The purpose of Section 804 was to provide the ability to quickly prototype and field new technologies in the wake of 9/11, particularly as it related to identifying, tracking, and preventing terrorist acts. But the rhetoric behind this section, which was widely touted by elected representatives long before the final version of the current NDAA had been approved, implied a broader mandate for more prosaic acquisitions. My opinion, having seen programs like this before (think of the Navy A-12 program), is that if people use this authority too broadly, we will be discussing more significant issues than the minor DCMA program that ends this blog post.

Thus, the message coming from OSD is that Section 804 is no carte blanche, get-out-of-jail card: deciding that lack of management is a substitute for management does not cover you, and failure to obtain timely and necessary program performance information does not mean that it cannot be forensically traced in an audit or investigation, especially if things go south. A word to the wise, while birds of a feather catch cold.

DoD Reorganization

The Department of Defense has been undergoing reorganization, and the old Office of the Undersecretary of Defense for Acquisition, Technology, and Logistics (OUSD(AT&L)) has been broken up and reassigned largely to a new Undersecretary of Defense for Acquisition and Sustainment (USD(A&S)).

As a result of this reorganization there were other points indicated:

a. Day-to-day program management will be pushed to the military services. No one really seems to understand what this means. The services already have PMOs in place that do day-to-day management. The policy part of the old AT&L will go intact to A&S, as will program analysis. The personnel cuts that were earmarked for some DoD departments were largely avoided in the reorganization, except at the SES level, which I will address below.

b. Other Transaction Authority (OTA) and Section 804 procurements are getting a lot of attention, but they seem ripe for abuse. I was actually a member of a panel on Acquisition Reform at the NDIA Training and Simulation Industry Symposium held this past June in Orlando. I thought the focus would be on the recommendations from the 809 Panel but, instead, it turned out to be on OTA and Section 804 acquisitions. What impressed me the most was that even companies that had participated in these types of contracting actions felt that they were unnecessarily loosely composed, which would eventually impede progress upon review and audit of the programs. The consensus in discussions with the audience and other panel members was that the FAR and DFARS already possess sufficient flexibility if Contracting Officers are properly trained to know how to construct such a requirement and still stay between the lines, absent a serious operational need that cannot be met through normal acquisition methods. Furthermore, OTA SME knowledge is virtually non-existent. Needless to say, things like Nunn-McCurdy and the new Congressional reporting requirements in the latest NDAA still need to be met.

c. The emphasis in the department, it was announced, would also shift to a focus on portfolio analysis but, again, no one could speak to exactly what that means. PARCA and the program analysis personnel on the OSD staffs already provide SecDef with information on the entire portfolio of major programs; that is why there is a DoD Central Repository for submission of program data. If the Department is looking to apply principles that provide flexibility in identifying risks and tradeoffs across the portfolio, then that would be most useful and a powerful tool in managing resources. We've seen efforts like Cost as an Independent Variable (CAIV) and other tradeoff methods come and go; it would be nice if the department would reward the identification of programmatic risk early and often in program go/no-go, tradeoff, and early production decisions.

d. To oversee more than $7 trillion of programs, PARCA's expense is $4.5M. The OSD personnel made this point, I think, to emphasize the return on investment in their role regarding oversight, risk identification, and root cause analysis, with an eye to efficiency in the management of DoD programs. This is like an insurance policy and a built-in DoD change agent. But from my outside reading, there was a move by Representative Mac Thornberry, who is Chairman of House Armed Services, to hollow out OSD by eliminating PARCA and much of the AT&L staffs. I had discussions with the staffs of other Congressional members of the Armed Services Committee when this was going on, and the cause seemed to be a lack of understanding of the extent to which DoD has streamlined its acquisition business systems, how key PARCA, DCMA, and the analysis and assessment staffs are to the acquisition ecosystem, and how they foot and tie to the service PEOs and PMOs. Luckily for the taxpayer, it seems that Senate Armed Services members were aware of this and took the language out during markup.

Other OSD Business — Reconciling FAR/DFARS, and Agile

a. DoD is reconciling differences between overlapping FAR and DFARS clauses. Given that DoD is more detailed and specific in identifying reporting and specifying oversight of contracts by dollar threshold, complexity, risk, and contract type, it will be interesting to see how this plays out over time. The example given by Mr. John McGregor of OSD was the difference between the FAR and DFARS clauses regarding the application of earned value management (EVM). The FAR clause is more expansive and cut-and-dried. The DFARS clause distinguishes the level of EVM reporting and oversight (and surveillance) that should apply based on more specific criteria regarding the nature of the program and the contract characteristics.

b. The issue of Agile and how it somehow excuses using estimating, earned value management, risk management, and other proven program management controls was addressed. This contention is, of course, poppycock and Glen Alleman on his blog has written extensively about this zombie idea. The 809 Panel seemed to have been bitten by it, though, where its members were convinced that Agile is a program or project management method, and that there is a dichotomy between Agile and the use of EVM. The prescient point in critiquing this assertion was effectively made by the OSD speakers. They noted that they attend many forums and speak to various groups about Agile, and that there is virtually no consensus about what exactly it is and what characteristics define it, but everyone pretty much recognizes it as an approach to software development. Furthermore, EVM is used today on programs that at least partially use Agile software development methodology and do so very effectively. It’s not like crossing the streams.

Gary Bliss, PARCA – Fair Winds and Following Seas

The blockbuster announcement at the meeting was the planned retirement of Gary Bliss, the longtime and current Director of PARCA, effective 30 September 2018. This was due to the cut in billets at the Senior Executive Service (SES) level. He will be missed.

Mr. Bliss has transformed the way that DoD does business and he has done so by building bridges. I have been attending NDIA IPMD meetings (and under its old PMSC name) for more than 20 years. Over that time, from when I was near the end of my uniformed career in attending the government/joint session, and, later, when I attended full sessions after joining private industry, I have witnessed a change for the better. Mr. Bliss leaves behind an industry that has established collaboration with DoD and federal program management personnel as its legacy for now and into the future.

Before the formation of PARCA all too often there were two camps in the organization, which translated to a similar condition in the field in relation to PMOs and oversight agencies, despite the fact that everyone was on the same team in terms of serving the national defense. The issue, of course, as it always is, was money.

These two camps would sometimes break out in open disagreement and expressed disparagement of the other. Mr. Bliss brought in a gentleman by the name of Gordon Kranz and together they opened a dialogue in meeting PARCA’s mission. This dialogue has continued with Mr. Kranz’s replacement, John McGregor.

The dialogue has revolved around finding root causes for the long delays between development and production in program management; recommending ways of streamlining processes and eliminating impediments; rooting out redundancy, inefficiency, and waste throughout the program and project management supply chain; communicating with industry so that they understand the reasons for particular DoD policies and procedures; obtaining feedback on the effects of those decisions and how they can be implemented to avoid arbitrariness; and providing certainty to those who would seek to provide supplies and services to the national defense, especially innovative ones, by defining the rules of engagement. The focus was on collaborative process improvement, and it has worked. Petty disputes occasionally still arise, but they are the exception to the rule.

Under his watch Mr. Bliss established a common trusted data stream for program management data, and forged policies that drove process improvement from the industrial base through the DoD program office. This was not an easy job. His background as an economist and his long distinguished career in the public service armed him well in this regard. We owe him a debt of gratitude.

We can all hope that the next OSD leadership that assumes that role will be as effective and forward leaning.

Final Thoughts on DCMA Report Revelations

The interest in my last post on the DCMA internal report regarding the IWMS project was very broad, but the comments I received expressed some confusion about what I took away as the lessons learned in my closing paragraphs. The reason for this was the leaked nature of the reports, which alleged breaches of federal statute and other administrative and professional breaches, some of a reputational nature. They are not the final word, and for anyone to draw final conclusions from leaked material of that sort would be premature. But here are some initial lessons learned:

Lesson #1: Do not split requirements and game the system to fall below financial thresholds to avoid oversight and management approval. This is a Contracts 101 issue and everyone should be aware of it.

Lesson #2: Ensure checks and balances in the procurement process are established and maintained. Too much power, under the moniker of acquisition reform and “flexibility”, has been given to CIOs and PMs, allowing them to make decisions that require collaboration, checks, and internal oversight. In normative public sector acquisition environments the end-user does not get to select the contractor, the contract type, the funding sources, or the acquisition method involving fair and open competition, or a deviation from it. Nor, having directed the procurement, should the same individual(s) certify receipt and acceptance. Establishing checks and balances without undermining operational effectiveness requires a subtle hand, in which different specialists working within a matrix organization, with differing chains of command and responsibility, ensure that there is integrity in the process. All members of this team can participate in planning and collaboration for the organization's needs. It appears, though it is not completely proven, that some of these checks and balances did not exist. We do know from the inspections that Contracting Officer's Representatives (CORs) and Contracting Officer's Technical Representatives (COTRs) were not appointed for long-term contracts in many cases.

Lesson #3: Don’t pre-select a solution by a particular supplier. This is done by understanding the organization’s current and future needs and putting that expression in a set of salient characteristics, a performance work statement, or a statement of work. This document is authored to share with the marketplace though a formalized and documented process of discovery, such as a request for information (RFI).

Lesson #4: I am not certain whether the reports indicate that a legal finding on the appropriate color of money is or is not a sufficient defense, but they seem to. This can be a controversial topic within an organization and oftentimes yields differing opinions. Sometimes the situation can be corrected by higher authority substituting the proper money for that fiscal year. Some other examples of Anti-Deficiency Act (ADA) violations can be found via this link, published by the Defense Comptroller. I've indicated from my own experience how, going from one activity to another as a uniformed Navy Officer, I ran into Comptrollers with different opinions on the appropriate color of money for particular types of supplies and services at the same financial thresholds. They can't all have been correct. I guess I am fortunate that over 23 years, 18 of them as a commissioned Supply Corps Officer* and five before that as an enlisted man, I never ran into an ADA violation in any transaction in which I was involved. The organizations I was assigned to had checks and balances to ensure there was not a statutory violation which, I may add, is a federal crime. Thus, no one should cavalierly make this assertion as if it were simply an administrative issue. But not everyone in the chain is responsible, unless misconduct or criminal behavior across that chain contributed to the violation, and I don't see that in these reports. Systemic causes require systemic solutions and education.

Note that all of these lessons learned are taught as basic required knowledge in acquisition classes and in regulation. I also note that, in the reports, there are facts of mitigation. It will be interesting to see what eventually comes out of this.