Potato, Potahto, Tomato, Tomahto: Data Normalization vs. Standardization, Why the Difference Matters

In my vocation I run a technology company devoted to program management solutions that is primarily concerned with taking data and converting it into information to establish a knowledge-based environment. Similarly, in my avocation I deal with the meaning of information and how to turn it into insight and knowledge. This latter activity concerns the subject areas of history, sociology, and science.

In my travels just prior to and since the New Year, I have come upon a number of experts and fellow enthusiasts in these respective fields. The overwhelming number of these encounters have been productive, educational, and cordial. We respectfully disagree in some cases about the significance of a particular approach, or about governance when it comes to project and program management policy, but generally there is a great deal of agreement, particularly on basic facts and terminology. But some areas of disagreement–particularly those that come from left field–tend to be the most interesting because they create an opportunity to clarify a larger issue.

In a recent venue I encountered just such an example, where the issue was the use of the phrase “data normalization.” The objection was that “data normalization” suggested some statistical methodology in reconciling data into a standard schema. Instead, it was suggested, the term “data standardization” was more appropriate.

These phrases do not describe the same thing, but they do describe processes that are symbiotic, not mutually exclusive. So what about data normalization? No doubt there is a statistical use of the term, but we are dealing with the definition as used in digital technology here, just as the use of “standardization” was suggested in the same context. There are many examples of technical terminology that do not have the same meaning when used in different contexts. Here is the definition of normalization applied to data science from Techopedia, which is the proper use of the term in this case:

Normalization is the process of reorganizing data in a database so that it meets two basic requirements: (1) There is no redundancy of data (all data is stored in only one place), and (2) data dependencies are logical (all related data items are stored together). Normalization is important for many reasons, but chiefly because it allows databases to take up as little disk space as possible, resulting in increased performance.

Normalization is also known as data normalization.
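To make the definition concrete, here is a minimal sketch in Python; the tables and field names are invented for illustration. A flat file repeats the same vendor details on every row, while the normalized form stores each vendor once and links purchase orders to it by key:

    # Denormalized rows: the vendor's details are repeated on every record.
    flat_rows = [
        {"po": "PO-001", "vendor_id": "V1", "vendor_name": "Acme", "vendor_city": "Norfolk", "amount": 1200.0},
        {"po": "PO-002", "vendor_id": "V1", "vendor_name": "Acme", "vendor_city": "Norfolk", "amount": 800.0},
        {"po": "PO-003", "vendor_id": "V2", "vendor_name": "Beta", "vendor_city": "San Diego", "amount": 450.0},
    ]

    # Normalized form: vendor attributes are stored in only one place, keyed
    # by vendor_id, and purchase orders carry only the key.
    vendors = {}
    purchase_orders = []
    for row in flat_rows:
        vendors[row["vendor_id"]] = {"name": row["vendor_name"], "city": row["vendor_city"]}
        purchase_orders.append({"po": row["po"], "vendor_id": row["vendor_id"], "amount": row["amount"]})

    print(vendors)           # no redundancy: each vendor appears once
    print(purchase_orders)   # related items stored together, linked by key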

This is pretty basic (and necessary) stuff. I have written at length about data normalization, but I also pair it with two other terms: data rationalization and contextualization. Here is a short definition of rationalization:

What is the benefit of Data Rationalization? To be able to effectively exploit, manage, reuse, and govern enterprise data assets (including the models which describe them), it is necessary to be able to find them. In addition, there is (or should be) a wealth of semantics (e.g. business names, definitions, relationships) embedded within an organization’s models that can be exposed for improved analysis and knowledge transfer. By linking model objects (across or within models) it is possible to discover the higher order conceptual objects for any given object. Conversely, it is possible to identify what implementation artifacts implement a higher order model object. For example, using data rationalization, one can traverse from a conceptual model entity to a logical model entity to a physical model table to a database table, etc. Similarly, Data Rationalization enables understanding of a database table by traversing up through the different model levels.
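A toy sketch of the traversal described above, with hypothetical model objects, might look like this in Python. Each link connects a model object at one level to its implementation at the next, so the chain can be walked from concept to database table:

    # Hypothetical links from higher-order model objects to implementation artifacts.
    links = {
        ("conceptual", "Supplier"): ("logical", "SupplierEntity"),
        ("logical", "SupplierEntity"): ("physical", "SUPPLIER_TBL"),
        ("physical", "SUPPLIER_TBL"): ("database", "dbo.SUPPLIER"),
    }

    def trace_down(level, name):
        """Walk from a model object down to the artifacts that implement it."""
        path = [(level, name)]
        while (level, name) in links:
            level, name = links[(level, name)]
            path.append((level, name))
        return path

    print(trace_down("conceptual", "Supplier"))
    # [('conceptual', 'Supplier'), ('logical', 'SupplierEntity'),
    #  ('physical', 'SUPPLIER_TBL'), ('database', 'dbo.SUPPLIER')]

Reversing the dictionary would support the upward traversal the definition also mentions.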

Finally, we have contextualization. Here is a good definition using Wikipedia:

Context or contextual information is any information about any entity that can be used to effectively reduce the amount of reasoning required (via filtering, aggregation, and inference) for decision making within the scope of a specific application.[2] Contextualisation is then the process of identifying the data relevant to an entity based on the entity’s contextual information. Contextualisation excludes irrelevant data from consideration and has the potential to reduce data from several aspects including volume, velocity, and variety in large-scale data intensive applications
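Again as a minimal, invented illustration: given an entity’s contextual information, contextualization simply excludes the records that do not bear on the decision at hand before any heavier reasoning is applied:

    records = [
        {"program": "A", "metric": "BCWP", "value": 100},
        {"program": "B", "metric": "BCWP", "value": 90},
        {"program": "A", "metric": "headcount", "value": 42},
    ]
    # The entity's contextual information: which program and which measures matter.
    context = {"program": "A", "relevant_metrics": {"BCWP"}}

    relevant = [r for r in records
                if r["program"] == context["program"]
                and r["metric"] in context["relevant_metrics"]]
    print(relevant)  # only the data relevant to the entity remains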

Within the domain of data and computer science, none of these terms involves approximating the accuracy of the data. Nor are there statistical methods involved; what needs to be accomplished must be accomplished precisely. The basic skill required to accomplish these tasks–given that the data is structured and pre-conditioned–is to reconcile the various lexicons from differing sources, much as I reconcile in my avocation the meaning of words and phrases across periods in history and across languages.

In this discussion we are dealing with the issue of different words used to describe a process or phenomenon. Similarly, we find this challenge in data.

So where does this leave data standardization? In terms of data and computer science, this describes a completely different method. Here is a definition from Wikipedia, which is the proper contextual use of the term under “Standard data model”:

A standard data model or industry standard data model (ISDM) is a data model that is widely applied in some industry, and shared amongst competitors to some degree. They are often defined by standards bodies, database vendors or operating system vendors.

In the context of project and program management, particularly as it relates to government data submission and international open standards across vendors in an industry, the relevant practice is the use of a common schema. In this case there is a DoD version of a UN/CEFACT XML file currently set as the standard, but soon to be replaced by a new standard using the JSON file structure.

In any event, what is clear here is that, while standardization is a necessary part of a data policy to allow for sharing of information, the strength of the chosen schema and the instructions regarding it will vary–and this variation will have an effect on the quality of the information shared. But that is not all.

This is where data normalization, rationalization, and contextualization come into play. In order to create data for a standardized format, it is first necessary to convert what is an otherwise opaque set of data, made opaque by differences among sources, into a cohesive lexicon. In data, this is accomplished by reconciling data dictionaries to determine which items are describing the same thing, process, measure, or phenomenon. In a domain like program management, this is a finite set. But it is also specialized knowledge, and it is where the value is added to any end product that is produced. Then, once we know how to identify the data, we must be able to map those terms to the standard schema but, keeping one eye on the use of the data down the line, must also be able to properly structure the data and ensure that its interrelationships are established and/or maintained for its effective use. This is no mean task, and it is why all data transformation methods and companies are not the same.
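As a hedged sketch of what that reconciliation looks like in practice, suppose two source systems name the same earned value measures differently; a mapping table, built from the reconciled data dictionaries, carries both to the standard schema’s terms (all names here are invented):

    # Reconciled data dictionary: source terms -> standard schema terms.
    standard_map = {
        "bcws": "PlannedValue",          # source system A
        "planned_cost": "PlannedValue",  # source system B
        "bcwp": "EarnedValue",
        "earned": "EarnedValue",
    }

    def to_standard(record):
        """Rename a source record's fields into the standard schema's lexicon."""
        return {standard_map.get(k.lower(), k): v for k, v in record.items()}

    print(to_standard({"BCWS": 500, "BCWP": 450}))            # source A
    print(to_standard({"planned_cost": 500, "earned": 450}))  # source B
    # Both yield {'PlannedValue': 500, 'EarnedValue': 450}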

Furthermore, these functions can be accomplished efficiently or inefficiently. The inefficient method is the old-fashioned business intelligence method that has been around since the 1980s and before, where a team of data scientists and analysts deals with data as if it is flat and, essentially, reinvents the wheel in establishing the meaning and proper context of the data. Given enough time and money anything can be accomplished, but brute force labor will not defeat the Second Law of Thermodynamics.

In computing, which comes as close as anything to minimizing the effects of that physical law, we know that data has already been imbued with meaning upon its initial processing. In lieu of brute force labor we apply intelligence and knowledge to accomplish this requirement. This is called normalization, rationalization, and contextualization of data. It requires a small fraction of the time and effort of other methods, and is infinitely more transparent.

Using these methods is also where innovation, efficiency, performance, accuracy, scalability, and the ability to anticipate future requirements based on the latest technology trends come into play. Establishing a seamless flow of data integration allows, for example, more data to be captured and properly structured in a database, which lays the groundwork for the transition from 2D to 3D and 4D (that is, what is often called integrated) program management, as well as more effective analytics.

The term “standardization” also suffers from a weakness in data and computer science that requires that it be qualified. After all, data standardization in an enterprise or organization does not preclude the prescription of a proprietary dataset. In government, this is contrary to both statutory and policy mandates. Furthermore, even given an effective, open standard, there will be a large pool of legacy and other non-conforming data that will still require capture and transformation.

The Section 809 Panel study dealt directly with this issue:

Use existing defense business system open-data requirements to improve strategic decision making on acquisition and workforce issues…. DoD has spent billions of dollars building the necessary software and institutional infrastructure to collect enterprise wide acquisition and financial data. In many cases, however, DoD lacks the expertise to effectively use that data for strategic planning and to improve decision making. Recommendation 88 would mitigate this problem by implementing congressional open-data mandates and using existing hiring authorities to bolster DoD’s pool of data science professionals.

Section 809 Volume 3, Section 9, p. 477

As operating environment companies expose more and more capability into the market through middleware and other open systems methods of visualizing data, the key to a system no longer resides in its ability to produce charts and graphs. The use of Excel as an ad hoc data repository, with its vulnerability to error and manipulation, and its resistance to the establishment of an optimized data management and corporate knowledge environment, is a symptom of the larger issue.

Data and its proper structuring is at the core of organizational success and process improvement. Standardization alone will not address barriers to data optimization. According to RAND studies in 2015 and 2017* these are:

  • Data Quality and Discontinuities
  • Data Silos and Underutilized Repositories
  • Timeliness of Data for use by SMEs and Decision-makers
  • Lack of Access and Contextualization
  • Traceability and Auditability
  • Lack of the Ability to Apply Discovery in the Data
  • The issue of Contractual Technical Data and Proprietary Data

That these issues also exist in private industry demonstrates the universality of the issue. Thus, yes, standardize by all means. But also ensure that the standard is open and that transformation is traceable and auditable from the source system to the standard schema, and then into the target database. Only then will the enterprise, the organization, and the government agency have full ownership of the data it requires to efficiently and effectively carry out its purpose.

*The RAND Corporation studies are “Issues with Access to Acquisition Data and Information in the DoD: Doing Data Right in Weapons System Acquisition” (RR1534, 2017), and “Issues with Access to Acquisition Data and Information in the DoD: Policy and Practice” (RR880, 2015). These can be found here.

Open: Strategic Planning, Open Data Systems, and the Section 809 Panel

Sundays are usually days reserved for music and the group Rhye was playing in the background when this topic came to mind.

I have been preparing for my presentation in collaboration with my Navy colleague John Collins for the upcoming Integrated Program Management Workshop in Baltimore. This presentation will be a non-proprietary/non-commercial talk about understanding the issue of unlocking data to support national defense systems, but the topic has broader interest.

Thus, in advance of that formal presentation in Baltimore, there are issues and principles that are useful to cover, given that data capture and its processing, delivery, and use are at the heart of all systems in government, private industry, and other organizations.

Top Data Trends in Industry and Their Relationship to Open Data Systems

According to Shohreh Gorbhani, Director, Project Control Academy, the following are the top five data trends being pursued by private industry and technology companies. My own comments follow as they relate to open data systems.

  1. Open Technologies that transition from 2D Program Management to 3D and 4D PM. This point is consistent with the College of Performance Management’s emphasis on IPM, but note that the stipulation is the use of open technologies. This is an important distinction technologically, and one that I will explore further in this post.
  2. Real-time Data Capture. This means capturing data in the moment so that the status of our systems is up-to-date without the present delays associated with manual data management and conditioning. This does not preclude the collection of structured, periodic data, but does include the capture of transactions from real-time integrated systems where appropriate.
  3. Seamless Data Flow Integration. From the perspective of companies in manufacturing and consumer products, technologies such as IoT and Cloud are just now coming into play. But, given the underlying premises of items 1 and 2, this also means the proper automated contextualization of data using an open technology approach that flows in such a way as to be traceable.
  4. The use of Big Data. The term has lost a good deal of its meaning because of its transformation into a buzz-phrase and marketing term. But Big Data refers to the expansion in the depth and breadth of available data driven by the economic forces that drive Moore’s Law. What this means is that we are entering a new frontier of data processing and analysis that will, no doubt, break down assumptions regarding the validity and strength of certain predictive analytics. The old assumptions that restrict access to data due to limitations of technology and higher cost no longer apply. We are now in the age of Knowledge Discovery in Data (KDD). The old approach of reporting assumed that we already know what we need to know. The use of data challenges old assumptions and allows us to follow the data where it will lead us.
  5. AI Forecasting and Analysis. No doubt predictive AI will be important as we move forward with machine learning and other similar technologies. But this infant is not yet a rug rat. The initial experiences with AI are that they tend to reflect the biases of the creators. The danger here is that this defeats KDD, which results in stagnation and fugue. But there are other areas where AI can be taught to automate mundane, value-neutral tasks relating to raw data interpretation.

The 809 Panel Recommendation

Given that industry is the driving force behind these trends that will transform the way we view information in our day-to-day work, it is not surprising that the 809 Panel had this to say about existing defense business systems:

“Use existing defense business system open-data requirements to improve strategic decision making on acquisition and workforce issues…. DoD has spent billions of dollars building the necessary software and institutional infrastructure to collect enterprise wide acquisition and financial data. In many cases, however, DoD lacks the expertise to effectively use that data for strategic planning and to improve decision making. Recommendation 88 would mitigate this problem by implementing congressional open-data mandates and using existing hiring authorities to bolster DoD’s pool of data science professionals.”

Section 809 Volume 3, Section 9, p. 477

At one point in my military career, I was assigned as the Materiel, Fuels, and Transportation Officer of Naval Air Station, Norfolk. As a major naval air base, transportation hub, and home to a Naval Aviation Depot, we shipped and received materiel and supplies across the world. In doing so, our transportation personnel would use what at the time was new digital technology to complete an electronic bill of lading that specified what and when items were being shipped, the common or military carrier, the intended recipient, and the estimated date of arrival, among other essential information.

The customer and receiving end of this workflow received an open systems data file that contained these particulars. The file was an early version of open data known as an X12 file, for which the commercial transportation industry was an early adopter. Shipping and receiving activities and businesses used their own types of local software, and there were a number of customized and commercial choices out there, as well as those used by common carriers such as various trucking and shipping firms, the USPS, FedEx, DHL, UPS, and others. The X12 file was the DMZ that made the information open. Software manufacturers, if they wanted to stay relevant in the market, could not impose a proprietary data solution.
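To give a feel for why such a file is open, here is a notional, simplified X12-style fragment (not a valid transaction set; the segment contents are invented) and the trivially small amount of Python needed to read it. Because the delimiters and segment layout are published, any vendor’s software can parse the same file:

    # Notional X12-style content: segments end with "~", elements split on "*".
    raw = "ST*856*0001~BSN*00*SHIP123*20190101~TD5**2*USPS~SE*4*0001~"

    segments = [seg.split("*") for seg in raw.strip("~").split("~")]
    for seg in segments:
        print(seg[0], seg[1:])   # segment ID, then its data elements
    # ST ['856', '0001']
    # BSN ['00', 'SHIP123', '20190101'] ...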

Furthermore, standardization of terminology and concepts ensured that the information was readable and comprehensible wherever the items landed–whether across receiving offices in the United States, Japan, Europe, or even Istanbul. And while DoD needs the skillsets to be able to optimize data, achieving this end-state didn’t require an army of data scientists. It required the right data science expertise in the right places, and the dictates of transportation consumers to move the technology market to provide the solution.

Over the years both industry and government have developed a number of schema standards focused on specific types of data, progressing from X12 to XML and now projected to use JSON-based schemas. Each of them, in their initial iterations, automated the submission of physical reports that had been required either by contract or by operations. These focused on a small subset of the full dataset relating to program management and project controls.

This progression made sense.

When digitized technology is first introduced into an intensive direct-labor environment, the initial focus is to automate the production of artifacts and their underlying processes in order to phase in the technology’s acceptance. This also allows the organization to realize immediate returns on investment and improvements in productivity. But this is the first step, not the final one.

For project controls, the current state consists of the UN/CEFACT XML for program performance management data, and the contract cost and labor data collection file known as the FlexFile. The latter file, given that the recipient is the Office of the Secretary of Defense Cost Assessment and Program Evaluation (OSD CAPE), is clearly one of many feedback loops that support that office’s role in coordinating the planning, programming, budgeting, and evaluation (PPBE) system related to military strategic investments and budgeting, but only one. The program performance information is also a vital part of the PPBE process in evaluation and in future planning.

For most of the U.S. economy, market forces and consumer requirements are the driving force in digital innovation. The trends noted by Ms. Gorbhani can be confirmed through a Google search of any one of the many technology magazines and websites. The 809 Panel, drawn as it was from specialists in industry and government, was tasked “to provide recommendations that would allow DoD to adapt and deliver capability at market speeds, while ensuring that DoD remains true to its commitment to promote competition, provide transparency in its actions, and maintain the integrity of the defense acquisition system.”

Given that the work of the DoD is unique, creating a type of monopsony, it is up to leadership within the Department to create the conditions and mandates necessary to recreate in microcosm the positive effects of market forces. The DoD also has a very special, vital mission in defending the nation.

When an individual business cobbles together its mission statement, it is that mission that defines the elements of data collection that are essential in making decisions. In today’s world, best commercial sector practice is to establish a Master Data Management (MDM) approach in defining data requirements and practices. In the case of DoD, a similar approach would be beneficial. Concurrent with the period of the 809 Panel’s efforts, RAND Corporation delivered a paper in 2017 (link in the previous sentence) that made recommendations related to data governance that are consistent with the 809 Panel’s recommendations. We will be discussing these specific recommendations in our presentation.

Meeting the mission and readiness are the key components of data governance in DoD. Absent such guidance, specialized software solution providers, in particular, will engage in what is called “rent-seeking” behavior. This is an economic term for an “entity (that) seeks to gain added wealth without any reciprocal contribution of productivity.”

No doubt, given the marketing of software solution providers, it is hard for decision-makers to tell what constitutes an open data system. The motivation of a software solution provider is to make its product as “sticky” as possible, and it does that by enticing a customer to commit to proprietary definitions, structures, and database schemas. Usually there are “black-boxed” portions of the software that make traceability impossible and that complicate the issue of who exactly owns the data, as well as the ability of the customer to optimize it and utilize it as the mission dictates.

Furthermore, data visualization components like dashboards are ubiquitous in the market. A cursory stroll through a tradeshow looks like a dashboard smorgasbord combined with different practical concepts of what constitutes “open” and “integration”.

As one DoD professional recently told me, it is hard to tell the software systems apart. To do so it is necessary to understand what underlies the software. Thus, a proposed honest-broker definition of an open data system is useful and the place to start, given that this is not a notional concept: such systems have been successfully established.

The Definition of Open Data Systems

Practical experience in implementing open data systems toward the goal of optimizing essential information from our planning, acquisition, financial, and systems engineering systems informs the following proposed definition, which is based on commercial best practice. This proposal is also based on the principle that the customer owns the data.

  1. An open data system is one based on non-proprietary neutral schemas that allow for the effective capture of all essential elements from third-party proprietary and customized software for reporting and integration necessary to support both internal and external stakeholders.
  2. An open data system allows for complete traceability and transparency from the underlying database structure of the third-party software data, through the process of data capture, transformation, and delivery of data in the neutral schema.
  3. An open data system targets the loading of the underlying source data for analysis and use into a neutral database structure that replicates the structure of the neutral schema. This allows for 100% traceability and audit of data elements received through the neutral schema, and ensures that the receiving organization owns the data.

Under this definition, data from its origination to its destination is more easily validated and traced, ensuring quality and fidelity, and establishing confidence in its value. Given these characteristics, integration of data from disparate domains becomes possible. The tracking of conflicting indicators is mitigated, since open system data allows for its effective integration without the bias of proprietary coding or restrictions on data use. Finally, both government and industry will not only establish ownership of their data–a routine principle in commercial business–but also be free to utilize new technologies that optimize the use of that data.
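A small sketch of the traceability property in items 2 and 3 above: if every element delivered in the neutral schema carries a reference back to its source field, the transformation can be audited end to end. The schema and source names below are invented for illustration:

    def capture(source_table, source_field, value):
        """Carry the value together with a pointer back to its source element."""
        return {"value": value, "lineage": f"{source_table}.{source_field}"}

    # Neutral-schema record built from a third-party system's tables.
    neutral_record = {
        "PlannedValue": capture("VENDOR_EVM", "BCWS", 500.0),
        "EarnedValue":  capture("VENDOR_EVM", "BCWP", 450.0),
    }

    for element, payload in neutral_record.items():
        print(element, payload["value"], "<-", payload["lineage"])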

In closing, Gahan Wilson, a cartoonist whose work appeared in National Lampoon, The New Yorker, Playboy, and other magazines, recently passed away.

When thinking of the barriers to the effective use of data, I came across one of his cartoons in The New Yorker.

Open Data is the key to effective integration and reporting–to the optimal use of information. Once mandated and achieved, our defense and business systems will be better informed and be able to test and verify assumed knowledge, address risk, and eliminate dogmatic and erroneous conclusions. Open Data is the driver of organizational transformation keyed to the effective understanding and use of information, and all that entails. Finally, Open Data is necessary to the mission and planning systems of both industry and the U.S. Department of Defense.

Sledgehammer: Pisano Talks!

My blogging hiatus is coming to an end as I take a sledgehammer to the writer’s block wall.

I’ve traveled far and wide over the last six months to various venues across the country and have collected a number of new and interesting perspectives on the issues of data transformation, integrated project management, and business analytics and visualization. As a result, I have developed some very strong opinions regarding the trends that work and those that don’t regarding these topics and will be sharing these perspectives (with the appropriate supporting documentation per usual) in following posts.

To get things started this post will be relatively brief.

First, I will be speaking along with co-presenter John Collins, who is a Senior Acquisition Specialist at the Navy Engineering & Logistics Office, at the Integrated Program Management Workshop at the Hyatt Regency in beautiful downtown Baltimore’s Inner Harbor 10-12 December. So come on down! (or over) and give us a listen.

The topic is “Unlocking Data to Improve National Defense Systems”. Today anyone can put together pretty visualizations of data from Excel spreadsheets and other sources–and some have made quite a bit of money doing so. But accessing the right data at the right level of detail, transforming it so that its information content can be exploited, and contextualizing it properly through integration will provide the most value to organizations.

Furthermore, our presentation will make a linkage to what data is necessary to national defense systems in constructing the necessary artifacts to support the Department of Defense’s Planning, Programming, Budgeting and Execution (PPBE) process and what eventually becomes the Future Years Defense Program (FYDP).

Traditionally information capture and reporting has been framed as a question of oversight, reporting, and regulation related to contract management, capital investment cost control, and DoD R&D and acquisition program management. But organizations that fail to leverage the new powerful technologies that double processing and data storage capability every 18 months, allowing for both the depth and breadth of data to expand exponentially, are setting themselves up to fail. In national defense, this is a condition that cannot be allowed to occur.

If DoD doesn’t collect this information, which we know from the reports of cybersecurity agencies that other state actors are collecting, we will be at a serious strategic disadvantage. We are in a new frontier of knowledge discovery in data. Our analysts and program managers think they know what they need to be viewing, but adding new perspectives through integration will result in new indicators and predictive analytics that will, no doubt, overtake current practice. Furthermore, that information can now be processed and contribute more timely and better intelligence to the process of strategic and operational planning.

The presentation will be somewhat wonky and directed at policymakers and decisionmakers in both government and industry. But anyone can play, and that is the cool aspect of our community. The presentation will be non-commercial, despite my day job–a line I haven’t crossed up to this point in this blog, though that will be changing to some extent.

Back in early 2018 I became the sole proprietor of SNA Software LLC–an industry technology leader in data transformation, particularly in capturing datasets that traditionally have been referred to as “Big Data”–offering a hybrid point solution built on an open business intelligence framework. Our approach leverages the advantages of COTS (delivering the 80% solution out of the box) with open business intelligence that allows for rapid configuration to adapt the solution to an organization’s needs and culture. Combined with COTS data capture and transformation software–the key to transforming data into information and then combining it to provide intelligence at the right time and to the right place–the latency in access to trusted intelligence is reduced significantly.

Along these lines, I have developed some very specific opinions about how to achieve this transformation–and have put those concepts into practice through SNA and delivered those solutions to our customers. The result has been to reduce both the effort and the time to capture large datasets that originate as pre-processed data, and to cut direct labor and the duration to information delivery by more than 99%. The path to get there is not to apply an army of data scientists and data analysts who treat all data as if it is flat and who reinvent the wheel–only to deliver a suboptimized solution sometime in the future after unnecessarily expending time and resources. This is a devolution to the same labor-intensive business intelligence approaches that we used back in the 1980s and 1990s. The answer is not to throw labor at data that already has its meaning embedded into its information content. The answer is to apply smarts through technology, and that’s what we do.

Further along these lines, if you are using hard-coded point solutions (also called purpose-built software) and knitted-together best-of-breed approaches, chances are that you will find yourself poorly positioned to exploit new technology, and obsolete within the next five years, if not sooner. The model of selling COTS solutions and walking away except for traditional maintenance and support is dying. The new paradigm will be to be part of the solution, and that requires domain knowledge that translates into technology delivery.

More on these points in future posts, but I’ve placed the stake in the ground and we’ll see how they hold up to critique and comment.

Finally, I recently became aware of an extremely informative and cutting-edge website that includes podcasts from thought leaders in the area of integrated program management. It is entitled InnovateIPM and is operated and moderated by Rob Williams. He is a domain expert in project cost development, with over 20 years of experience in the oil, gas, and petrochemical industries. Rob has served in a variety of roles throughout his career and now focuses on cost estimating and Front-End Loading quality assurance. His current role is advanced project cost estimator at Marathon Petroleum’s Galveston Bay Refinery in Texas City.

Rob was also nice enough to continue a discussion we started at a project controls symposium and interviewed me for a podcast. I’ll post additional information once it is available.

Take Me To The River, Part 3: Technical Performance and Risk Management, Digital Elements of Integrated Program Management

Part three of this series of articles on the elements of Integrated Program and Project Management will focus on two additional areas of IPM: technical performance and risk management. Prior to jumping in, however–and given the timeframe over which I’ve written this series–a summary to date is in order.

The first part of our exploration into the IPM digital inventory concerned cost elements. Cost in this sense was broadly defined as any cost element that needs to be of interest to project or program managers and their teams. I first clarified our terms by defining the differences between project and program management–and how those differences will influence our focus. Then I outlined the term cost as falling into the following categories:

  1. Contract costs and the cost categories within the organizational hierarchy;
  2. Cost estimates, “colors” of money where such distinctions exist, and cashflow;
  3. Additional costs that relate to the program or project effort that are not always directly attributed to the effort, such as PMA, furnished materials or labor, corollary and supporting efforts on the part of the customer, and other overhead and G&A type costs;
  4. Contract cost performance under earned value management (EVM); and
  5. Portfolio management considerations and total cost of ownership.

The second part of this exposition concerned schedule elements, that is, time-phased planning and performance that is essential to any project or program effort. The article first discussed the primacy of the schedule in project and program planning and execution, given its ties in defining the basis for the cost elements addressed in the first part of the series. I then discussed the need for integrated planning as the basis for a valid executable schedule and PMB, the detailed elements and citations of the sources of that information in the literature and formal guidance, the role of framing assumptions in the construction of schedule and cost plans with its holistic approach to go/no-go decision-making, and, finally, the role of the schedule in establishing the project and program battle rhythm.

Now, in this final section, we will turn to the other practical elements of IPM, beyond even my expansive view of cost and schedule integration.

Technical Performance Management

Given this paper that resulted from a programmatic effort in the Navy regarding Technical Performance Management (TPM), it is probably not surprising that I will start here. My core paper in the link above represents what I viewed as an initial effort at integration of TPM to determine impacts of that performance within program cost performance (EVM) projections. But this approach was based on the following foundations:

a. That the solution needed to tie technical achievement to EVM so that it represented greater fidelity to performance than what I viewed as indirect and imprecise methods, such as WBS elements that contained partial or tangential relationships to technical performance measures, and more subjective and arbitrary methods, such as percent complete.

b. That the approach needed to be tied to established systems engineering methods of technical risk management.

c. That the solution should be simple to implement and be statistically valid in its results, tested by retrospective analyses that performed forensic what-if analysis against the ultimate results.

One need only look at the extensive bibliography that accompanied my paper to understand that there were clear foundations for TPM, but it remained–and in some quarters remains–a controversial concept that provoked resistance, even though programs clearly must note achievement of technical requirements. For example, the foundations of technical risk management and tracking that the paper cited were in use at what was Martin Marietta for many years. Why, then, the resistance to change?

First, I think, is that the domain of project performance has rested too long in the hands of the EVM community, with its historical foundations in cost and financial management and its risk-averse approach to new innovations. Second, given this history, the natural differences between program management, systems engineering, and earned value SMEs created a situation where there just wasn’t the foundation necessary for any one group to take ownership of this development in systems and business intelligence improvement. Even in industry, such cross-domain initiatives tend initially to garner skepticism, if not outright cynicism, and resistance from personnel unsure of how the new measures will affect the assessment of their work.

But keep in mind that, dating myself a bit, this is the same type of reaction that organizations experienced during the first wave of digitization of work. Each initiative that I witnessed, from the introduction of desktop computers connected to a central server, to the introduction of the first PCs, to the digitization of work products, was met with the common refrain at the time that it was too experimental, or too transient, or too unstable, or too unproven, until it wasn’t any of those things.

I also overstate this resistance a bit. Over the last 20 years organizations within the military services adopted this method–or a variation–of TPM integration, as have some commercial companies. Furthermore, thinking and contributions on TPM have advanced in the intervening years.

The elements of technical performance management can be found in the language of the scope being planned. The brilliant paper authored by Glen B. Alleman, Thomas J. Coonce, and Rick A. Price entitled “Building a Credible Performance Measurement Baseline” establishes the basis for tying project and program performance to technical achievement. These elements are measures of effectiveness (MoEs), measures of performance (MoPs), technical performance measures (TPMs), and key performance parameters and indicators (KPPs and KPIs). Taken together these define the framing assumptions for the project or program.

When properly constructing the systems, procedures, and artifacts from the decomposition of planning documents and performance language, the proper assignment of these elements to the WBS and to specific work packages establishes a strong foundation for tying project and program success to both overall technical performance and the framing assumptions implicit in the effort.

What this means is that there may also be a technical performance baseline, which acts in parallel to the cost-focused performance measurement baseline. This technical performance baseline is tied to the same work that is planned at the work package level. The assessment of progress is further decomposed to look at the timeframe at that point of progress within the context of the integrated master schedule (the IMS). We ask ourselves, as a function of risk: what is the chance of achieving the next threshold in our technical performance plan?

As with all elements of work, our MoEs, MoPs, TPMs, KPPs, and KPIs do not reside at the same level of overall performance management and tracking within the WBS hierarchy. Some can be tracked to the lowest level, usually at work package, some will have contributions from lower levels and be summarized at the control account level, and others are at the total project or program level, with contributors from specific lower levels of the WBS structure.

A common example of what is claimed to be a difficult technical performance measure is the factor of weight in aircraft design and production. Weight is an essential factor and must be in alignment with the mission of the aircraft. For example, if an aircraft is being built for the Navy, chances are high that the expectation is for it to be able to take off from and land on a moving carrier deck. Takeoff requires coming up to airspeed very quickly. Landings are especially hard, since they are essentially controlled crashes augmented by an arresting gear. Airframes, avionics, and engines must operate in a salt water environment that involves a metal ship. The electro-magnetic effects alone, if they are not mitigated in the design and systems on both aircraft and ship, will significantly degrade the ability of the aircraft to operate as intended. Controlling weight in this case is essential, especially when one considers the need for fuel, ordnance, and avoiding being detected and shot down.
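As a toy illustration of how such a measure is tracked against a time-phased technical performance plan (all numbers invented), consider empty weight measured at successive design milestones against planned values and not-to-exceed thresholds:

    # Hypothetical plan: milestone -> (planned weight, not-to-exceed threshold), in lbs.
    plan = {
        "PDR": (31000, 32000),
        "CDR": (30500, 31500),
        "First Flight": (30000, 30800),
    }
    actuals = {"PDR": 31200, "CDR": 31400}   # measurements to date

    for milestone, (planned, threshold) in plan.items():
        actual = actuals.get(milestone)
        if actual is None:
            print(f"{milestone}: no measurement yet")
            continue
        margin = threshold - actual
        status = "within threshold" if margin >= 0 else "THRESHOLD BREACH"
        print(f"{milestone}: actual {actual} vs plan {planned}, margin {margin} ({status})")

The shrinking margin at CDR is exactly the kind of signal that, tied to the WBS elements contributing to weight, feeds risk-informed projections rather than a subjective percent complete.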

In current practice, the process of tracking weight over the life of aircraft design and development is tightly controlled. It is a function of tradeoff analysis and decision-making with contributors from many sub-elements of the WBS hierarchy. Thus, the use of the factor of weight as an argument to defeat the need to tightly integrate technical measures into the performance measurement baseline is a canard. On the contrary, it is an argument for tighter and broader integration of IPM data and, in particular, for tying our systems to risk management, thus making the projections and the basis of our decision-making a function of risk, which is the next topic.

Risk Management Elements and Integration

There is a good deal of literature on risk, so I will confine this section to how risk is addressed in terms of integrated project and program management.

For many subdomains within project and program management, when one mentions the term “risk management” the view often encountered is that the topic at hand is applying Monte Carlo analysis, using pseudo-random numbers, to the integrated master schedule (IMS) to determine the probabilities of a range of task durations and completions. This is known as a Schedule Risk Analysis, or SRA.

Most of the correlations today are based on the landmark paper by Philip M. Lurie and Matthew S. Goldberg with the sexy title “An approximate method for sampling correlated random variables from partially specified distributions”. With Monte Carlo informed by Lurie-Goldberg (for short) we can then make inferences as to alternative critical paths and near-critical paths for time-phasing our work. The contribution of each task, in terms of its criticality and contribution to the critical path, can also be measured. Sensitivity analysis identifies the most critical risk elements.

If the integrated master schedule is truly integrated with resources and cost, Lurie-Goldberg allows us to defeat the single-point-estimate-heavy projections of EVM and calculate a range of cost outcomes by probability distribution. This same type of analysis can be done against the time-phased PMB.
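Here is a deliberately simplified SRA sketch in Python, assuming a four-task network (A, then B and C in parallel, then D) with triangular duration distributions. A real SRA would also correlate the samples, for example via Lurie-Goldberg, which this toy example omits:

    import random
    random.seed(1)

    def sample_durations():
        # (low, high, mode) in days for each task's duration estimate
        return {
            "A": random.triangular(8, 14, 10),
            "B": random.triangular(15, 30, 20),
            "C": random.triangular(12, 28, 18),
            "D": random.triangular(5, 9, 6),
        }

    trials, finishes, b_critical = 10000, [], 0
    for _ in range(trials):
        d = sample_durations()
        path_b = d["A"] + d["B"] + d["D"]   # path through B
        path_c = d["A"] + d["C"] + d["D"]   # path through C
        finishes.append(max(path_b, path_c))
        b_critical += path_b >= path_c      # which path drove the finish?

    finishes.sort()
    print("P50 finish:", round(finishes[trials // 2], 1), "days")
    print("P80 finish:", round(finishes[int(trials * 0.8)], 1), "days")
    print("Path through B critical in", 100 * b_critical // trials, "% of trials")

The same machinery, applied to a resource-loaded schedule, yields the range of cost outcomes described above.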

But that is just one area of risk management, which is known as quantitative risk. Another area of risk which should be familiar to project and program managers is qualitative risk. Qualitative risk analysis for projects and programs involves the following steps:

  1. Risk identification
  2. Risk evaluation
  3. Risk handling, and
  4. Continual risk management

This is a closed-loop system, which produces a risk register, risk rankings, a risk matrix, risk handling and mitigation plans, and a risk handling waterfall chart. These artifacts of risk analysis will also require the monitoring of risk triggers, and cross-referencing to risk ownership.
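A toy qualitative risk register, with invented entries, shows how these artifacts hang together: likelihood and consequence are scored 1-5 and multiplied into a ranking, which is the usual basis for the 5x5 risk matrix, with owner, handling approach, and trigger carried alongside:

    register = [
        {"risk": "Engine supplier slips qualification test", "likelihood": 4, "consequence": 5,
         "owner": "Propulsion IPT", "handling": "mitigate", "trigger": "Test readiness review slips"},
        {"risk": "Integration lab availability", "likelihood": 3, "consequence": 3,
         "owner": "Software lead", "handling": "accept", "trigger": "Lab schedule conflict posted"},
    ]

    # Rank risks by likelihood x consequence, highest first.
    for r in sorted(register, key=lambda r: r["likelihood"] * r["consequence"], reverse=True):
        score = r["likelihood"] * r["consequence"]
        print(f"{score:>2}  {r['risk']}  [{r['handling']}; owner: {r['owner']}; trigger: {r['trigger']}]")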

Once again, though cost impacts are also calculated, along with their probability of manifesting, the strongest tie of risk management begins with the integrated master schedule. Thus, conditional and probabilistic branching will provide the project and program team with a step-by-step what-if analysis that produces alternative schedules, which will in turn provide ranges of cost impact.

Mainstreaming Risk Management and TPM into IPM

In reality, project and program management without technical performance and risk management is simply monitoring and forecasting. Yet these sub-domains are oftentimes confined to a few specialists, or viewed as dichotomous and independent processes under the general duties of the team.

The economic urgency and essentiality of integrated project and program management is the realization that technical achievement of the product, and the assessment and handling of risks along the course of that achievement, are at the core of project and program management.

Back to School Daze Blogging–DCMA Investigation on POGO, DDSTOP, $600 Ashtrays, and Epistemic Sunk Costs

Family summer visits and trips are in the rear view–as well as the simultaneous demands of balancing the responsibilities of a, you know, day job–and so it is time to take up blogging once again.

I will return to my running topic of Integrated Program and Project Management in short order, but a topic of more immediate interest concerns the article that appeared on the website for pogo.org last week entitled “Pentagon’s Contracting Gurus Mismanaged Their Own Contracts.” Such provocative headlines are part and parcel of organizations like POGO, which have an agenda that seems to cross the line between reasonable concern and unhinged outrage with a tinge of conspiracy mongering. But the content of the article itself is accurate and well written, if also somewhat rife with overstatement, so I think it useful to unpack what it says and what it means.

POGO and Its Sources

The article draws on three sources regarding an internal Defense Contract Management Agency (DCMA) IT project known as the Integrated Workflow Management System (IWMS). These consist of a September 2017 preliminary investigative report, an April 2018 internal memo, and a draft of the final report.

POGO begins the article by stating that DCMA administers over $5 trillion in contracts for the Department of Defense. The article erroneously asserts that it also negotiates these contracts, apparently not understanding the process of contract oversight and administration. The cost of IWMS was apparently $46.6M, and the investigation into the management and administration of the program was initiated by the then-Director of DCMA, Lieutenant General Wendy Masiello, shortly before she retired from the government in May 2017.

The implication here, given the headline, seems to be that if there is a problem in internal management within the agency, then that would translate into questioning its administration of the $5 trillion in contract value. I view it differently, given that I understand that there are separate lines of responsibility in the agency that do not overlap, particularly in IT. Of the $46.6M there is a question of whether $17M in value was properly funded. More on this below, but note that, to put things in perspective, $46.6M is .000932% of DCMA’s oversight responsibility. This is aside from the fact that the comparison is not quite correct, given that the CIO had his own budget, which was somewhat smaller and unrelated to the $5 trillion figure. But I think it important to note that POGO’s headline and the introduction of figures, while sounding authoritative, are irrelevant to the findings of the internal investigation and draft report. This is a scare story using scare numbers, particularly given the lack of context. I had some direct experience in my military career with issues inspired by POGO’s founders’ agenda that I will cover below.

In addition to the internal investigation on IWMS, there was also an inspector general (IG) investigation of thirteen IT services contracts that resulted in what can only be described as pedestrian procedural discrepancies that are easily correctable, despite the typically overblown language found in most IG reports. Thus, I will concentrate in this post on the more serious findings of the internal investigation.

My Own Experience with DCMA

A note at this point on full disclosure: I have done business with and continue to do business with DCMA, both as a paid supplier of software solutions, and have interacted with DCMA personnel at publicly attended professional forums and workshops. I have no direct connection, as far as I am aware, to the IWMS program, though given that the assessment is of the IT organization, it is possible that there was an indirect relationship. I have met Lieutenant General Masiello and dealt with some of her subordinates not only during her time at DCMA, but also in some of her previous assignments in the Air Force. I always found her to be an honest and diligent officer and respect her judgment. Her distinguished career speaks for itself. I have talked on the telephone to some of the individuals mentioned in the article on unrelated matters, and was aware of their oversight of some of my own efforts. My familiarity with all of them was both businesslike and brief.

As a supplier to DCMA, my own contracts and the personnel who administer them were, from time to time, affected by the fallout from what I now know to have occurred. Rumors have swirled in our industry regarding the alleged mismanagement of an IT program in DCMA, but until the POGO article, the reasons for things such as a temporary freeze and review of existing IT programs and other actions were viewed as part and parcel of managing a large organization. I guess the explanation is now clear.

The Findings of the Investigation

The issue at hand largely concerns the method of source selection, which may have constituted a conflict of interest, and the type of money that was used to fund the program. In reading the report I was reminded of what Glen Alleman recently wrote in his blog post entitled “DDSTOP: The Saga Continues.” The acronym DDSTOP means: Don’t Do Stupid Things On Purpose.

There is actually an economic behavioral principle for DDSTOP that explains why people make and double down on bad decisions and irrational beliefs. It is called epistemic sunk cost. It is what causes people to double down in gambling (to the great benefit of the house), to persist in mistaken beliefs, and, as stated in the link above, to “persist with the option which they have already invested in and resist changing to another option that might be more suitable regarding the future requirements of the situation.” The findings seem to document a situation that fits this last description.

In going over the findings of the report, it appears that the IWMS program violated the following:

a. Contractual efforts in the program that were appropriate for the use of Research, Development, Test and Evaluation (RDT&E) funds, as opposed to the O&M (Operations and Maintenance) funds actually applied. This distinction is what the U.S. Department of Defense calls “color of money.”

b. Amounts expended on contract that exceeded the authorized funding documents, a finding largely based on the determination regarding the appropriate color of money. This would constitute a serious violation known as an Anti-Deficiency Act violation which, in layman’s terms, is designed to punish public employees for the misappropriation of government funds.

c. Expended amounts of O&M that exceeded the authorized levels.

d. Poor or non-existent program management and cost performance management.

e. Inappropriate contracting vehicles that, taken together, sidestepped more stringent oversight, aside from the award of a software solutions contract to the same company that defined the agency’s requirements.

Some of these findings are procedural, and some, particularly the Anti-Deficiency Act (ADA) violations, are serious. In the Contracting Officer’s rulebook, you can withstand pedestrian procedural and administrative findings that are part and parcel of running an intensive contracting organization that acquires a multitude of supplies and services under deadline. But an ADA violation is the deadly one, since it is a violation of statute.

As a result of these findings, the recommendation is for DCMA to lose acquisition authority above the DoD micro-purchase level ($10,000). Organizationally and procedurally, this is a significant and mission-disruptive recommendation.

The Role and Importance of DCMA

DCMA performs an important role in contract compliance and oversight to ensure that public monies are spent properly and for the intended purpose. They perform this role mostly on contracts that are negotiated and entered into by other agencies and the military services within the Department of Defense, where they are assigned contract administration duties. Thus, the fact that DCMA’s internal IT acquisition systems and procedures were problematic is embarrassing.

But some perspective is necessary because there is a drive by some more extreme elements in Congress and elsewhere that would like to see the elimination of the agency. I believe that this would be a grave mistake. As John F. Kennedy is quoted as having said: “You don’t tear your fences down unless you know why they were put up.”

For those of you who were not around prior to the formation of DCMA or its predecessor organization, the Defense Contract Management Command (DCMC), it is important to note that the formation of the agency is a result of acquisition reform. Prior to 1989 the contract administration services (CAS) capabilities of the military services and various DoD offices varied greatly in capability, experience, and oversight effectiveness. Some of these duties had been assigned to what is now the Defense Logistics Agency (DLA), but major acquisition contracts remained with the Services.

For example, when I was on active duty as a young Navy Supply Corps Officer as part of the first class that was to be the Navy Acquisition Corps, I was taught cradle-to-grave contracting. That is, I learned to perform customer requirements development, economic analysis, contract planning, development of a negotiating position, contract negotiation, and contract administration–soup to nuts. The expense involved in developing and maintaining the skill set required of personnel to maintain such a broad-based expertise is unsustainable. For analogy, it is as if every member of a baseball club must be able to play all nine positions at the same level of expertise; it is impossible.

Furthermore, for contract administration a defense contractor would have contractual obligations for oversight in San Diego, where I was stationed, that were different from those on contracts awarded in Long Beach or Norfolk or any of the other locations where a contracting office was located. In addition, the military services, having their own organizational cultures, provided additional variations that created a plethora of unique requirements that added cost, duplication, inconsistency, and inter-organizational conflict.

This assertion is more than anecdotal. A series of studies were commissioned in the 1980s (the findings of which were subsequently affirmed) to eliminate duplication and inconsistency in the administration of contracts, particularly major acquisition programs. Thus, DCMC was first established under DLA and subsequently became its own agency. Having inherited many of the contracting field offices, the agency has struggled to consolidate operations so that CAS is administered in a consistent manner across contracts. Because contract negotiation and program management still reside in the military services, there is a natural point of conflict between the services and the agency.

In my view, this conflict is a healthy one, as all power in the hands of a single individual, such as a program manager, would lead to more fraud, waste, and abuse, not less. Internal checks and balances are necessary in proper public administration, where some efficiency is sacrificed to accountability. It is not just the goal of government to “make the trains run on time”, but to perform oversight of the public’s money so that there is accountability in its expenditure, and integrity in systems and procedures. In the case of CAS, it is to ensure that what is being procured actually gets delivered in conformance to the contract terms and conditions designed to reduce the inherent risk in complex acquisition programs.

In order to do its job effectively, DCMA requires innovative digital systems to allow it to perform its CAS function. As a result, the agency must also possess an acquisition capability. Given the size of the task at hand in performing CAS on over $5 trillion of contract effort, the data involved is quite large, and the number of personnel geographically distributed. The inevitable comparisons to private industry will arise, but few companies in the world have to perform this level of oversight on such a large economic scale, which includes contracts comprising every major supplier to the U.S. Department of Defense, involving detailed knowledge of the management control systems of those companies that receive the taxpayer’s money. Thus, this is a uniquely difficult job. When one understands that in private industry the standard failure rate of IT projects is more than 70 percent, then one cannot help but be unimpressed by these findings, given the challenge.

Assessing the Findings and Recommendations

There is a reason why internal oversight documents of this sort stay confidential: these are preliminary/draft findings, there are two sides to every story, and revisions may follow. In addition, reading these findings without the appropriate supporting documentation can lead one to the wrong impressions and conclusions. But it is important to note that this was an internally generated investigation. The checks and balances of management oversight that should occur did occur. But let’s take a close look at what the reports indicate so that we can draw some lessons. I also need to mention here that POGO’s presentation of the specific issues in this program as a “poster child” for cost overruns and schedule slippage displays a vast ignorance of DoD procurement systems on the part of the article’s author.

Money, Money, Money

The core issue in the findings revolves around the proper color of money, which seems to hinge on the definition of Commercial-Off-The-Shelf (COTS) software and the effort that was expended using the two main types of money that apply to the core contract: RDT&E and O&M.

Let’s take the last point first. It appears that the IWMS effort consisted of a combination of COTS and custom software. This would require acquisition, software familiarization, and development work. It appears that the CIO was essentially running a proof-of-concept to see what would work, and then incrementally transitioned to developing the solution.

What is interesting is that there is currently an initiative in the Department of Defense to do exactly what the DCMA CIO did on his own initiative in introducing a new technological approach to create IWMS. It is called Other Transactional Authority (OTA). This specific prototyping authority was not codified until the 2016 NDAA, which gave it statutory footing under 10 U.S.C. 2371b. This doesn’t excuse the actions that led to the findings, but it is interesting that the CIO, in taking an incremental approach to finding a solution, also did exactly what was recommended in the 2016 GAO report that POGO references in their article.

Furthermore, as a career Navy Supply Corps Officer, I have often gotten into esoteric discussions in contracting regarding the proper color of money. Despite the assertion of the investigation, there is a lot of room for interpretation in the DoD guidance, not to mention a stark contrast in interpreting the proper roles of RDT&E and O&M in the procurement of business software solutions.

When I was on the NAVAIR staff and at OSD, I ran into the difference in military service cultures: what Air Force financial managers often specified as RDT&E, Navy financial managers would never approve, insisting that only O&M dollars applied regardless of whether development took place. Given that there was an Air Force flavor to the internal investigation, I would be interested to know whether the opinion of the investigators in making an ADA determination would withstand objective scrutiny by a panel of government comptrollers.

I am certain, given the differing mix of military and civil service cultures at DCMA–and the mixed colors of money that applied to the effort–that the legal review that was sought was intended to resolve the issue. One of the principles of law is that when you rely upon legal advice to take an action, you have a defense, unless your state of mind and the corollary actions that you took indicate that you manipulated the system to obtain a result you intended, in violation of the law. I just do not see that here, based on what has been presented in the materials.

It is entirely possible that an inadvertent ADA violation occurred by default because of an improper interpretation of the use of the monies involved. This does not rise to the level of a scandal. But going back to the confusion that I faced in my own experiences on active duty, I certainly hope that this investigation is not used as a precedent for reviewing all contracts under an approach that accepts a post-hoc alternative interpretation by another individual, who just happens to be an inspector, long after a reasonable legal determination was made, regardless of how erroneous the new expert finds the opinion. This is not an argument against accountability, but absent corruption or criminal intent, a legal finding is a valid defense and should stand as the final determination for that case.

In addition, this interpretation of RDT&E vs. O&M relies upon an interpretation of COTS. I daresay that even those who throw that term around, and who are familiar with the FAR, do not fully understand what constitutes COTS when the line between adaptability and point solutions is being blurred by new technology.

Where the criticism is very much warranted is in those areas where the budget authority would have been exceeded in any event–and it is here that the ADA determination is most damning. It is one thing to disagree on the color of money that applies to different contract line items, but it is another to completely lack financial control.

Part of the reason for the lack of financial control was the absence of good contracting practices and of program management discipline.

Contracts 101

While I note that the CIO took an incremental approach to IWMS–what a prudent manager would do–what was lacking was a cohesive vision and a well-informed culture of compliance with acquisition policy that would avoid even the appearance of impropriety and favoritism. Under the OTA authority that I referenced above as a new aspect of acquisition reform, the successful implementation of a proof-of-concept does not guarantee the incumbent provider continued business–salient characteristics of the solution are publicized and the opportunity advertised under free and open competition.

After all, everyone has their favorite applications and, even inadvertently, an individual can act improperly because of selection bias. The procurement procedures are established to prevent abuse and favoritism. As a solution provider I have fumed quite often when a selection was made without competition based on market surveys or on use of a non-mandatory GSA contract, which usually turns out to be a smokescreen for pre-selection.

There are two areas of fault on IWMS from the perspective of acquisition practice, and another in relation to program management.

The first is the initial selection of Apprio, which had laid out the initial requirements and subsequently failed to provide the required integration functionality; the second is the selection of Discover Technologies under a non-mandatory GSA Blanket Purchase Agreement (BPA) contract through a sole source action. Furthermore, the contract type was not appropriate to the task at hand, and the arbitrary selection of Discover precluded the agency from finding a better solution more fit to its needs.

The use of the GSA BPA allowed managers, however, to essentially split the requirements to stay below more stringent management guidelines–an obvious violation of acquisition regulation that will get you removed from your position. This leads us to what I think is the root cause of all of these clearly avoidable errors in judgment.

Program Management 101

Personnel in the agency familiar with the requirements to replace the aging procurement management system understood from the outset that the total cost would probably fall somewhere between $20M and $40M. Yet every effort was made to reduce the apparent risk by splitting requirements and by failing to apply a programmatic approach to a clearly complex undertaking.

This would have required the agency to take the steps to establish an acquisition strategy, open the requirement based on a clear performance work statement to free and open competition, and then to establish a program management office to manage the effort and to allow oversight of progress and assessment of risks in a formalized environment.

The establishment of a program management organization would have prevented the lack of financial control, and would have put in place sufficient oversight by senior management to ensure progress and the achievement of organizational goals. In short, a good deal of the decision-making was based on doing stupid things on purpose.

The Recommendations

In reviewing the recommendations of the internal investigation, I think my own involvement in a very similar issue in 1985 will establish a baseline for comparison.

As I indicated earlier, in the early 1980s, as a young Navy commissioned officer, I was part of the first class of what was to be the Navy Acquisition Corps, stationed at the Supply Center in San Diego, California. I had served as a contracting intern and, after extensive education through the University of Virginia Darden School of Business, the extended Federal Acquisition Regulation (FAR) courses that were given at the time at Fort Lee, Virginia, and coursework provided by other federal acquisition organizations and colleges, I attained my warrant as a contracting officer. I also worked on acquisition reform issues, some of which were eventually adopted by the Navy and DoD.

During this time NAS Miramar was the home of Top Gun. In 1984 Congressman Duncan Hunter (the elder, not the currently indicted junior of the same name, though from the same San Diego district), inspired by news of a $7,600 coffee maker and a $435 hammer publicized by the founders of POGO, was given documents by a disgruntled employee at the base regarding the acquisition of replacement E-2C ashtrays at a cost of $300 each. He presented them to the Base Commander, which launched an investigation.

I served on the JAG investigation under the authority of the Wing Commander regarding the acquisitions and then, upon the firing of virtually the entire chain of command at NAS Miramar, which included the Wing Commander himself, became the Officer-in-Charge of Supply Center San Diego Detachment NAS Miramar. Under Navy Secretary Lehman’s direction I was charged with determining the root cause of the acquisition abuses and given 60-90 days to take immediate corrective action and clear all possible discrepancies.

I am not certain who initiated the firings of the chain of command. From talking with senior personnel at the time, it appeared to have been instigated in a fit of pique by the sometimes volcanic Secretary of Defense Caspar Weinberger. While I am sure that Secretary Weinberger experienced some emotional release through that action, placed in perspective, his blanket firing of the chain of command was, in my opinion, poorly advised and counterproductive. It was also grossly unfair, given what my team and I found as the root cause.

First of all, the ashtray was misrepresented in the press as a $600 ashtray because, during the JAG investigation, I had sent a sample ashtray to the Navy industrial activity at North Island with a request to tell me what the fabrication of one ashtray would cost and to provide the industrial production curve that would reduce the unit price to a reasonable level. The figure of $600 was to fabricate one. A “whistleblower” at North Island took this slice of information out of context and leaked it to the press. So the $300 ashtray, which was bad enough, became the $600 ashtray.

Second, the disgruntled employee who gave the files to Congressman Hunter had been laterally assigned out of her position as a contracting officer by the Supply Officer because of the very reason that the pricing of the ashtray was not reasonable, among other unsatisfactory performance measures that indicated that she was not fit to perform those duties.

Third, there was a systemic issue in the acquisition of odd parts. For some reason there was an ashtray in the cockpit of the E-2C. These aircraft are able to stay in the air for an extended period of time. A pilot had actually decided to light up during a local mission and, his attention diverted, lost control of the aircraft and crashed. Secretary Lehman ordered corrective action. The corrective action taken by the squadron at NAS Miramar was to remove the ashtrays from the cockpits and store them in a hangar locker.

Fourth, there was an issue of fraud. During inspection the spare ashtrays were removed and deposited in the scrap metal dumpster on base. The tech rep for the DoD supplier on base retrieved the ashtrays and sold them back to the government at the price to fabricate one, given that the supply system had not experienced enough demand to keep them in stock.

Fifth, back to the systemic issue. When an aircraft is to be readied for deployment there can be no holes representing missing items in the cockpit. A deploying aircraft in this condition is grounded and a high priority “casualty report,” or CASREP, is generated. The CASREP was referred to purchasing, which then paid $300 for each ashtray. The contracting officer, however, feeling pressured by the high priority requisition, did not do due diligence in questioning the supplier on the cost of the ashtray. In addition, given that several aircraft were deploying, there were a number of these requisitions, which should have led the contracting officer to look into the matter more closely to determine price reasonableness.

Furthermore, I found that buying personnel were not properly trained, that systems and procedures were not established or enforced, that the knowledge of the FAR was spotty, and that procurements did not go through multiple stages of review to ensure compliance with acquisition law, proper documentation, and administrative procedure.

Note that in the end this “scandal” was born of a combination of systemic issues, poor decision-making, lack of training, employee discontent, and incompetence.

I successfully corrected the issues at NAS Miramar during the prescribed time set by the Secretary of the Navy, worked with the media to instill public confidence in the system, built up morale, established better customer service, reduced procurement acquisition lead times (PALT), recommended necessary disciplinary action where it seemed appropriate, particularly in relation to the problematic employee, recovered monies from the supplier, referred the fraud issues to Navy legal, and turned over duties to a new chain of command.

NAS Miramar procurement continued to do its necessary job and is still there.

What the higher chain of command did not do was to take away the procurement authority of NAS Miramar. It did not eliminate or reduce the organization. It did not close NAS Miramar.

It requires leadership and focus to take effective corrective action to not only fix a broken system, but to make it better while the corrective actions are being taken. As I outlined above, DCMA performs an essential mission. As it transitions to a data-driven approach and works to reduce redundancy and inefficiency in its systems, it will require more powerful technologies to support its CAS function, and the ability to acquire those technologies to support that function.

Take Me To The River, Part 2, Schedule Elements–A Digital Inventory of Integrated Program Management Elements

Recent speaking engagements at various forums have interrupted the flow of this series on IPM elements. At these venues I was engaged in discussions regarding this topic, as well as the effects of acquisition reform on the IT, program, and project management communities in the DoD and A&D marketplace.

For this post I will restrict the topic to what are often called schedule elements, though that is a nebulous term. Also, one should not conclude that because I am dealing with this topic after cost elements, it is somehow inferior in importance to those elements. On the contrary, planning and scheduling are integral to applying resources and costs and to tracking cost performance, and in our systemic analysis their activities, artifacts, and elements are antecedent to cost element considerations.

The Relative Position of Schedule

But the takeaway here is this: under no circumstances should any program or project manager believe that cost and schedule systems represent a dichotomy, nor a hierarchy, of disciplines. They are interdependent and the behavior noted in one will be manifested in the other.

This is important to keep in mind, because the software industry, more than any other, has been responsible for reinforcing and solidifying this (erroneous) perspective. During the first generation of desktop application development, software solutions were built to automate the functions of traditional line and staff functions. This made a great deal of sense.

From a sales and revenue perspective, it is easier to sell a limited niche software “tool” to an established customer base that will ensure both quick acceptance and immediate realization of productivity and labor savings. The connection from the purchase to ROI was easily traceable in the time span and at the level of the person performing their workaday tasks.

Thus, solutions were built to satisfy the needs of cost analysts, schedule analysts, systems engineers, cost estimators, and others. Where specific solutions left gaps, spreadsheet solutions such as Microsoft Excel were employed to fill them. It was in no one’s interest to go beyond their core competency. Once a dominant incumbent or set of incumbents (a monopoly or oligopoly) inhabited a niche, they employed the usual strategies for “stickiness” to defend territory and raise barriers to new entries.

What was not anticipated by many organizations was that once you automate a function, the nature of the system–if one is to implement the most effective organizational structure–is transformed to conform to the most efficient flow and use of data, and to its resulting transformation into information and intelligence. Oftentimes the skill set to use that intelligence does not exist, because the insights and synergy that come from larger, more comprehensive, more credible, and more accurate datasets were not anticipated in adjusting the organizational structure.

This is changing and must change, because the old way of using limited sets of data in the age of big(ger) data that provide a more comprehensive view of business conditions is not tenable. At least, not if a company or organization wants to stay relevant or profitable.

Characteristics and Basic Elements of the Project Schedule

If you were to perform a Google search of “project schedule” while reading this post, you would find a number of definitions, some of which overlap. For example, the PMBOK defines a schedule as, quite simply, “the planned dates for performing activities and the planned dates for meeting milestones.”

Thus our elements include planned dates, activities, and milestones. But is that all? Under this definition, any kind of plan, from a minor household renovation or upgrade to the building of an aircraft carrier, would contain only these elements.

I don’t think so.

For complex projects and programs, which are the focus of this blog, our definition of a project schedule is a bit more comprehensive. If you go to A Guide for DoD Program Managers, mentioned in my last post, you will find even less specificity.

The reason for this is that what we define as a project schedule is part and parcel of the planning phase of a project, which is then further specified in the specific time-phased planning elements for execution of the project through its lifespan into production. It is the schedule that ties together all of the disciplines in putting together a project–acquisition, systems engineering, cost estimating, and project performance management.

A refrain I have heard in attending schedule-focused conferences over the years, and in talking to program management colleagues, is that:

a. It is hard to find a good scheduler, and

b. Constructing a schedule is more of an art than a science.

I can only say that this cedes the field to a small cadre of personnel who perform an essential function, but who do so with few objective tests of effectiveness or accountability–until it is too late.

But the reality is quite different from the fuzzy perception of schedule that is often assumed. All critical path method (CPM) schedules describe the same phenomena, though the lexicon will vary based on the specific proprietary application employed.

In government-focused and large commercial projects, the schedule is the heart of planning and execution. In the DoD world it is known as the Integrated Master Schedule (IMS), which utilizes the inherent bottom-up relationships of elements to determine the critical path. The main sources regarding the IMS have a great deal of overlap, but tend to be either aspirational (and unfortunately not prescriptive in defining the basic characteristics of an IMS) or to reflect the “art over science” approach. For those following along, these are the DoD Integrated Master Plan and Integrated Master Schedule Preparation and Use Guide of 21 October 2005, the NAVAIR Integrated Master Schedule (IMS) Guidebook of February 2010, and the NDIA Planning and Scheduling Excellence Guide (PASEG) of 9 March 2016 (unfortunately no current direct link).

The key elements that comprise an IMS, in addition to those we identified under the PMBOK, are that it is a networked schedule consisting of specific durations assigned to specific work tasks that must be accomplished in discrete work packages. In most cases these durations will be derived either by some kind of fixed, manual method or through the inherent optimization algorithm applied by the CPM application. More on this below. These work packages are discrete, meaning that they represent the full scope of the work that must be accomplished during the specified duration for the creation of an end product. Discrete work is distinguished from level of effort (LOE) work, the latter being effort that is always expended, such as administrative and management tasks, that is not directly tied to the accomplishment of an end product.

These work packages are tied together to illustrate antecedent and progressive work through predecessor and successor relationships. Long term planning activities, which cannot be fleshed out until more immediate work is completed, are set aside as placeholders called planning packages. Each of the elements tracked in the IMS is based on the presentation of established criteria that define completion, events, and specific accomplishments.

The most comprehensive IMSs consist of detailed planning that include resources and elements of cost.

Detailed Elements of the IMS

Given these general elements, the best source of identifying the key elements of detailed schedules is also found in Department of Defense documents. The core document in this case is the Data Item Description for the IMS numbered as DI-MGMT-81650. The latest one is dated March 30, 2005. There are a minimum of 32 data elements, some of these already mentioned and which I will not repeat in this post since they are pretty well listed and identified in the source document.

For those not familiar with these documents, Data Item Descriptions (or DiDs–gotta love acronyms) represent the detailed technical documents for artifacts involved in the management of DoD-related operations. Thus, this provides us with a pretty good inventory of elements to source. But there are others that are implied.

For example, the 81650 DiD identifies an element known as “methodology.” What this means is that each scheduling application has an optimization engine, where the true differences in schedule construction and intellectual property reside. The elements that affect these calculations are time-based and duration-based elements, those related to float and slack, and those related to resources.

These time-based elements consist of early start, early finish, late start, and late finish. Duration-based elements consist of shortest time, longest time, and greatest rank weight. An additional element related to schedule float identifies minimum slack. Resources are further delineated by the greatest work content and the greatest cumulative resource content.

I would note that the NDIA PASEG adds some sub-elements to this list that are based on the algorithmic result of the schedule engines and, thus, tends to ignore the antecedent salient elements of validating the optimization engine found above. These additional sub-elements are total float, free float, soft constraints, hard constraints, and–also found in the aforementioned DiD–program, task, and resource calendars.
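Since these elements can seem abstract, here is a minimal sketch in Python of the forward and backward passes that produce them: early and late start/finish, total float, and free float over a toy network. The task names, durations, and dependencies are invented for illustration; real scheduling engines layer calendars, constraints, lags, and resource logic on top of this core.

```python
# A minimal critical path method (CPM) sketch illustrating the time-based
# elements above. The task network is a toy example; names are illustrative.

tasks = {
    # name: (duration, [predecessors]); listed so predecessors come first
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (1, ["B", "C"]),
}

# Forward pass: early start (ES) and early finish (EF).
es, ef = {}, {}
for t in tasks:
    dur, preds = tasks[t]
    es[t] = max((ef[p] for p in preds), default=0)
    ef[t] = es[t] + dur

# Backward pass: late start (LS) and late finish (LF).
project_end = max(ef.values())
ls, lf = {}, {}
for t in reversed(list(tasks)):
    dur, _ = tasks[t]
    succs = [s for s in tasks if t in tasks[s][1]]
    lf[t] = min((ls[s] for s in succs), default=project_end)
    ls[t] = lf[t] - dur

for t in tasks:
    total_float = ls[t] - es[t]
    succs = [s for s in tasks if t in tasks[s][1]]
    free_float = min((es[s] for s in succs), default=project_end) - ef[t]
    flag = "critical" if total_float == 0 else ""
    print(f"{t}: ES={es[t]} EF={ef[t]} LS={ls[t]} LF={lf[t]} "
          f"TF={total_float} FF={free_float} {flag}")
```

Tasks with zero total float fall on the critical path; most of what a CPM engine does is elaboration of this basic calculation.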

Normally, this is where a survey would end–with schedule-specific data elements focused on the details of the schedule. But we’re going to challenge our assumptions a bit more.

Framing Assumptions of Schedules and Programs

The essential document that provides a definition of the term “framing assumption” was published by the RAND Corporation in 2014, entitled Identifying Acquisition Framing Assumptions Through Structured Deliberation, by Mark V. Arena and Lauren A. Mayer. The definition of a framing assumption is “any explicit or implicit assumption that is central in shaping cost, schedule, or performance expectations.”

As I have explored in my prior post, the use of the term “cost” is a fuzzy one. To some it means earned value management, which measures a small part of the costs of development and ownership of a system. To others it means total cost of ownership. Schedule is an implicit part of this definition, and then we have performance expectations, which I will deal with in a separate post.

But we can apply the concept of framing assumptions in two ways.

The first applies to the assumed purpose of the schedule. Why do we construct one? This goes back to my earlier statement that “…the schedule…ties together all of the disciplines in putting together a project–acquisition, systems engineering, cost estimating, and project performance management.”

For the NDIA PASEG the IMS is a “tool, not just a report” that “provides an ever-changing window into the progress (or lack of it) of current work effort. The strategic mission of the schedule is to point out future risks and opportunities.”

For the NAVAIR IMS Guide the IMS “At a top level…contain(ing) the networked, detailed tasks necessary to ensure successful program execution…” that “capture(s) project tasks and task relationships”, “show(s) the magnitude and how long each task will take”, “show(s) resources, durations, and constraints for each task” and “show(s) the critical path.”

For the DiD 81650 “The Integrated Master Schedule (IMS) is an integrated schedule containing the networked, detailed tasks necessary to ensure successful program execution.”

But the most comprehensive definition that goes to the core of the purpose of an IMS can be found in paragraph 1.2 of the DoD Integrated Master Plan and Integrated Master Schedule Preparation and Use Guide (IMP/IMS Guide). The elements of this purpose are worth transcribing, because if we have a requirement and cannot ask the “So What?” question–that is, if we cannot effectively determine why something must be done–then it probably does not need to be done (or we need to apply rigor in the development of our expertise).

For what the IMP/IMS Guide does is clearly tie the schedule to the programmatic framing assumptions (used in the context in which RAND meant it) from initial acquisition through planning. Thus, the Integrated Master Plan (IMP) is firmly established as an antecedent and intermediate planning process (not merely an artifact or tool), that results in the program R&D execution process.

Taken as a whole, this combination of processes and resulting artifacts:

a. Provides offerors and acquiring activities with detailed execution planning, organization, and scheduling information that sets realistic expectations for the resulting contract action.

b. Serves as the execution plan for how the supplier will meet the contract’s performance requirements within cost and schedule constraints.

c. Provides a basis for integrating all of the functions involved in development and deployment of the system being acquired and, after award, sets the framing assumptions of the program.

d. Provides the basis for determining and assessing progress, identifying risks, determining the basis for contractual award fees and penalties, assessing progress on Key Performance Parameters (KPPs) and Technical Performance Measures (TPMs), determining alternative paths to project completion, and determining opportunities for innovation and new acquisitions not apparent at the time of the award.

What all of this means is that the Integrated Master Schedule is too important to be left to the master scheduler. Yes, the schedule is a “tool” to those at the most basic tactical level in work execution. Yes, it is also an artifact and record.

But, more importantly, it is the comprehensive notional representation of the project’s or program’s scope, effort, progress, and assessment.

Private and Government-focused Industry Practice

A word has to be said here about the difference between purely private industry practice in managing large projects and programs, and the practice of those industries focused on public sector acquisition, toward which these posts are skewed.

In the listing of schedule elements earlier there is reference to resources and elements of cost, yet this is an area where standard practice diverges. In private industry the application of resource assignments to specific work is standard practice and is found in the IMS.

In companies focused on the public sector and DoD, the practice is to establish a different set of data outside of the schedule to manage resources. Needless to say this creates problems of validation of data across disparate systems related to the lowest level of planning and execution of a project or program. The basis for it, I think, relates to viewing the schedule as a “tool” and not the basis for project execution. This “tool” mindset also allows for separate “earned value engines” that oftentimes do not synchronize with the execution of the schedule, not only undermining the practical value of both, but also creating systems complexity and inefficiency where none need exist.

Another gap found in many areas of public acquisition concerns the development of an integrated master plan antecedent to the integrated master schedule. The cause here, once again, I believe, is viewing the discipline of systems engineering as separate–somehow walled off from the continuing assessment of program execution–though that assumption is not supported by program phasing and milestone planning and achievement.

From the perspective of Integrated Program/Project Management, these considerations cannot be ignored, and so our inventory of essential data elements must include elements from these practices.

But Wait! There’s More!

Most discussions at conferences and professional meetings will usually stop at this point–viewing cost and schedule integration as the essence of IPM–with “cost” limited to EVM. Some will add some “oh by the ways” such as technical performance and risk. I will address these in the next post as well.

But there are also other systems and processes that are relevant to our inventory. But what I have covered thus far in this series should challenge you if you have been paying attention.

I tackled cost first because of the assumptions implicit in equating it with EVM, and then went on to demonstrate that there are other elements of cost that provide a more comprehensive view. This is not to denigrate the value of EVM, since it is an essential process in project management, but to demonstrate that its analytics are not comprehensive and, as with any complex system, require the contribution of additional information, depending on the level and type of work performance and progress being recorded and assessed.

In this post I have tackled the IMS, and have demonstrated that it is not a supplementary process, but central to all other processes and actions being taken in the execution of the project or program. Many times people enter the schedule from an assessment of cost performance–tracing cost drivers to specific schedule activities and then tasks. But this has it backwards, and is based on the best technology available sometime in the late 1990s.

It is the schedule that brings together all relevant information from our execution and control processes and systems. It seems to me that perhaps the first place one goes is the schedule, that the first element to trace are those related to schedule slippage and unexpected resource consumption, and then to trace these to contract cost impact.

But, of course, there is more–and these other elements may turn out to be of greater consequence than just cost and schedule considerations. More on these in my next post.

In Closing: Battle Rhythm and the Plans of the Day and Week

When I was on active duty in the Navy we planned our days and weeks around a Plan of the Day or Plan of the Week. This is a posted agenda so that the entire ship or command understands the major events that affect its operations. It establishes focus on the main events at hand and fosters communication both laterally and vertically within the chain of command.

As one rises in rank and responsibility it is important to understand the operational tempo of the unit or ship, its systems, and its subsystems. This is important in avoiding crisis management. This is known as Battle Rhythm.

Baked into the schedule (assuming proper construction and effective integrated product teaming) are the major events, milestones, and expected achievement of the program or project. Thus, there are events that should be planned around and anticipation of these items on a daily, weekly, biweekly, monthly, quarterly, and major milestone basis.
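As a notional illustration only, the sketch below buckets hypothetical program events into the review cadences just described; the event names and their assigned cadences are my own invention, not drawn from any guide.

```python
# A hedged sketch of a program "battle rhythm": grouping schedule-derived
# events into review cadences so nothing becomes a rear-view-mirror surprise.
# All event names and cadences here are hypothetical.
from collections import defaultdict

events = [
    ("stand-up: work package status", "daily"),
    ("IMS status and critical path review", "weekly"),
    ("EVM variance analysis", "monthly"),
    ("risk board", "monthly"),
    ("milestone readiness review", "quarterly"),
]

battle_rhythm = defaultdict(list)
for name, cadence in events:
    battle_rhythm[cadence].append(name)

for cadence in ("daily", "weekly", "monthly", "quarterly"):
    for name in battle_rhythm.get(cadence, []):
        print(f"{cadence:>9}: {name}")
```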

Given an effective battle rhythm, a PM should never complain about performance and progress indicators “looking into the rear view mirror”. If that is the case, then perhaps the PM should look at the effectiveness and timeliness of the underlying project and program systems. Thus, when a PMO complains of information and intelligence being too late to be actionable, it is actually describing a condition of ineffective, latent, and disjointed information and intelligence systems.

Thus, our next step in our next post is to identify more salient IPM elements that cut to the heart of the matter.

The Stories of My Death Are Greatly Exaggerated — And A Customer Bill of Rights in Software

I have been quite busy of late–with a good deal of travel mixed in–and so my posts have been stacked up in various states of completion. Thus, given a holiday and more travel next week, my postings will be fairly close to one another. If you missed my recent post on a digital IPM inventory please follow this link.

This post is somewhat focused on business owners, especially those in the technology industry not enamored of flimflam or used car salesman tactics. But it is also of interest to any organization or individuals who procure or are thinking of procuring software and their associated services.

Buying Software is Sometimes Challenging

Buying software and applying it to one’s business processes is fraught with risk and sometimes frustration, even for technology companies. We are in the midst of another technology bubble with a plethora of new products and companies being introduced virtually every week vying for attention.

Some of these use more traditional methods of development and delivery, and others more innovative methods. The hot topics remain Big(ger) Data and Cloud, though the definitions of these terms vary considerably in practice. Thus, when looking at the market, companies and organizations need to be mindful of their needs, their expectations, and whether the purpose of acquiring the new technology is simply to improve performance over a legacy system, to increase one’s knowledge capabilities, to improve productivity, or to drive organizational change with the technology in the vanguard of that change.

To the manager, each of these decisions will determine the amount of risk that she or he can tolerate. Even after acquisition the manager, depending on the scope of the change, must assess and manage risk. I have yet to see a transition from one technology to another that was completely bump-free–and this extends from being on both sides of the table. I have been lucky thus far not to have experienced failure, but the level of effort required to lay claim to that record has varied greatly from organization to organization. In most cases this is a people problem, where the technological challenges are easy to address but the tolerance of the organization for change, or rapid change, is somewhat limiting.

The explosion of software companies and solutions in each market has created an environment where more entities are vying for the same dollars. This is particularly true where solutions are focused on some niche within a larger vertical. I have seen this in the project and program management discipline, which even has a number of competitors that offer limited tools to address specific concerns. Needless to say, this is not my preferred vision or approach, but all of them offer some value that must be assessed against one’s needs.

From a market perspective, however, this has created a hyper-competitive environment. Consumers may mistakenly believe that this is a good development–and if the focus were on competing to make products better and more effective, I would agree. But it is the “hyper” part of that word that creates dysfunction.

Hyper-competition and dysfunctional markets

What this means is that when anyone begins to break away from the pack (or displace less agile incumbents) there are always the less ethical members of the industry who resort to character assassination and rumor campaigns containing innuendo, misinformation–and outright slander–to try to tilt the balance. Unable to successfully articulate their own vision (if they even have one) and the relative merits of their own products, they see their only choice–usually in desperation–as denigrating others in an attempt to bring their target down to their level. Oftentimes they will enlist partisans in the market–even disgruntled employees–to give a sheen of “truthiness” to their whisper campaign. The best liars use half-truths.

This being said, my own personal experience has been mostly positive. I have a very good professional relationship with the overwhelming majority of competitors and semi-competitors, and while we are aware of the issue of competition between us, we are able to respect and socialize in a civil manner.

After all, these aren’t just “competitors”–they are people. I learn and converse with them about, among other things, their concerns and perspectives on child rearing (or grandchild rearing), home renovation, recent travel, social perspectives, customer challenges, and the comedy of life. I learn about their experiences and often come away with a bit of knowledge that I did not possess before–applying the practice of lifetime learning to my social interactions–and I enjoy their company.

When a couple of them found themselves without a job, especially after the last financial crisis and recession, I assisted them in finding new employment and wrote glowing recommendations based on my first-hand experience in dealing with and observing them, even though we were competitors. If I had had a matching position at the time, I probably would have brought the best of them into my own company.

But with the good comes the bad.

Over the course of time I have occasionally learned a number of things about myself or my firm that I somehow had missed, and which seem to conflict with reality.

For example, I heard (and was asked by a customer about its veracity at the time) that my company was close to insolvency. This rumor made the rounds for about three years, and I was surprised by its persistence. (The reality: I acquired full control of my company and we’ve experienced years of significant growth, though being privately owned, and hence financially opaque to the market, provides the environment for this kind of misinformation). At one point a document was anonymously posted on-line that seemed authoritative and embarrassing, combined with a concerted whisper campaign to spin its significance to support the first rumor. (It turned out that it too went the way of the first rumor). Recently I learned that a large customer was so dissatisfied that they were dropping our product and looking for alternatives–another fact of which I was unaware. (Senior management of the customer, after hearing about the rumor, invited anyone inclined to believe what was said to give him a call so he could provide the facts).

But I don’t feel picked on.

Those aforementioned competitors with whom I have civil relations have from time to time been targets of some of these same bad actors in our market. Over lunch or in side conversations we often share the latest slander being spread. Oftentimes it is there that I learn of the latest whisper campaign–except that, unlike those who fuel the rumor, they identify the source.

Why is this important? Because of experience and standards of practice.

First, experience: during the last speculative real estate bubble, financial institutions engaged in cutthroat competition, resorting to unethical business conduct to undermine their competitors. The response of those that were targeted was to retaliate in kind. Pretty soon the entire industry went into the toilet, and there were no good reputations left among those still standing when the bottom dropped out. There are many other such examples in which entire markets are held in low esteem due to poorly regulated or transgressive business practices.

When you throw mud at least some of it will stick to you.

Second, standards of practice: all of the firms and government organizations in our industry must conform to a standard of business ethics. Even the appearance of a conflict of interest or unethical behavior is sanctioned. It appears on websites and it is enforced by management. So when you hear “shhh, hey…did you hear…” and the price to this bit of information is maintaining the anonymity of the source, then you know that someone is engaged in unethical practices and you should break the cycle.

So what is one to do? Well, you can get into the gutter and sling mud as well–but refer to the note on mud sticking to the thrower above (as well as the example of the financial services industry). Aggressive legal action is also an option, but anyone who thinks that the legal system is anything but Byzantine–and very expensive–is fooling themselves. It’s usually unnecessary except in the most extreme cases. Litigation comes with its own perceptions.

A Customer Bill of Rights

The purpose of a whisper campaign in business is to eliminate consideration of the target from the competitive range. Unfamiliarity of any sort–with the company, with a new or innovative technology or approach, or with normal operational security in business–simply provides a rich environment for the spinner of tales to operate. It thrives in darkness. Given the risk, fear, and trepidation involved in acquiring a software solution, such rumors oftentimes become a consideration that creates delays and frustration, since the rumor must be addressed.

Thus, there is another approach: it is to use sunlight to cleanse the market of the most dirty of practices.

My response to our industry–the software industry–is a Customer Bill of Rights. These are principles and practices to which I abide. I challenge all other software solutions providers in my market to sign on to these same principles. I think the consulting companies that inhabit our market would benefit from these as well.

One–To truthfully represent our own product capabilities and limitations so that the customer can make an assessment of whether the products meet their needs in relation to other products on the market. While criticism and differences will arise regarding competing products and visions, our marketing will not be based on degrading the work of competitors. Our message will focus on differences in functionality and technical approach.

Two–To provide as much transparency as possible to our prospective customers into our existing customers’ experience and satisfaction levels. This will include, where existing customers have not requested privacy, allowing for unfiltered communication with our present customer base. For existing customers who wish to remain anonymous, we will continue to respect their privacy.

Three–Since we are not afraid of our products’ performance, to always demonstrate their capabilities in a live environment that replicates as closely as possible the perspective of the customer.

Four–To provide to any prospective customer who requests it, an evaluation copy of our products to test and evaluate to determine if they will best meet their needs, and to provide support for their effort as with any other established customer. In addition to the software itself, the necessary information for evaluation will include full documentation and training materials related to our products so that we meet our goal of full transparency.

Five–In seeking to fully understand customer requirements, to have the moral courage to admit those cases where our products will not satisfy those requirements. This includes honestly identifying competing products or alternative products that may meet customer needs where our own products do not.

Six–To focus on satisfying our customers’ needs by treating every customer as if they are the most important customer, regardless of size or significance to the bottom line. Any customer questions and issues will be addressed immediately and respectfully. At the core of this commitment is to ensure that customer expectations are understood and fully addressed.

Seven–To commit ourselves to constant product and process improvement with a focus on customer satisfaction in seeking new and innovative solutions. As such, while doing our own due diligence in being mindful of our market and its competitive benchmarks, we will also treat our competitors with respect, understanding through our own efforts the work that is usually involved in running and operating a business. Where that respect is not reciprocated our policy will be to simply avoid engaging with those individuals or companies, and sanctioning them as necessary, as anyone caring about business ethics would do.

I am confident that the more my customers and the market knows personally about me the better. Let the chips fall where they may. I am equally confident that the more the market knows about my business team, our products, and our reputation, the more they will like what they experience and see. My personnel and I make no claims of infallibility, but we work hard to address our shortfalls–our most challenging competitor is ourselves and the goals that we strive to meet.


Take Me to the River, Part 1, Cost Elements – A Digital Inventory of Integrated Program Management Elements

In a previous post I recommended a venue focused on program managers to define what constitutes integrated program management. Since that time I have been engaged with thought leaders and influencers in both government and industry, many of whom came to a similar conclusion independently, agree with this proposition, and are working to bring it about.

My own interest in this discussion is from the perspective of maximizing the information ecosystem that underlies and describes the systems known as projects and programs. But what do I mean by this? This is more than a gratuitous question, because oftentimes the information essential to defining project and program performance and behavior is intermixed with, and therefore diluted and obfuscated by, that of the overall enterprise.

Project vs. Program

What I mean by the term project in this context is an organization that is established around a defined effort of fixed duration (a defined beginning and projected end) that is specifically planned and organized for the development and deployment of a particular end item, state, or result, with an identified set of resources assigned and allocated to achieve its goals.

A program is defined as a set of interrelated projects and sub-projects, also of fixed duration, that is specifically planned and organized not only for the development and deployment of a particular end item, state, or result, but also for continuing this role through sustainment (including configuration control), with an identified set of resources assigned and allocated to achieve its goals. As such, the program management team is also the first-level life-cycle manager of the end item, state, or result, and participates with other levels of the organization in these activities. (More on life-cycle costs below).

Note the difference in scope and perspective, though oftentimes we use these terms interchangeably.

For shorthand, a small project of short duration operates at the tactical level of planning. A larger project, which because of size, complexity, duration, and risk approaches the definition of a program, operates at the operational level, as do most programs. Larger and more complex programs that will affect the core framing assumptions of the enterprise align their goals to the strategic level of planning. Thus, there are differences in scale, complexity and, hence, data points that can be captured at these various levels.

Another aspect of the question of establishing an integrated digital project and program management environment is sufficiency of data, which relates directly to scale. Sufficiency in this regard is defined as whether there is enough data to establish a valid correlation and, hopefully, draw a causation. Micro-economic foundations–and models–often fail because of insufficient data. This is important to keep in mind as we inventory the type of data available to us and its significance. Oftentimes additional data points can make up for those cases where there is insufficiency in the depth and quality of a more limited set of data points. Doing so will also mitigate subjectivity, especially in smaller efforts.
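To make the sufficiency point concrete, the synthetic sketch below shows how an estimated correlation swings widely at small sample sizes and stabilizes as data points accumulate. It assumes Python 3.10+ for statistics.correlation, and the data are simulated, not drawn from any program.

```python
# A minimal illustration of data sufficiency: the same underlying
# relationship yields an unstable correlation estimate at small samples.
import random
import statistics

random.seed(42)

def sample_correlation(n: int) -> float:
    """Simulate n observations of a relationship whose true correlation is ~0.45."""
    xs = [random.gauss(0, 1) for _ in range(n)]
    ys = [0.5 * x + random.gauss(0, 1) for x in xs]
    return statistics.correlation(xs, ys)

# Small samples swing widely; larger samples converge toward the true value.
for n in (5, 20, 200, 2000):
    print(f"n={n:5d}  r={sample_correlation(n):+.2f}")
```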

Thus, in constructing a project or program, regardless of its level of planning, we often begin by monitoring the most basic elements. These are usually described as cost, schedule, performance, and risk, though I will discuss and identify other contributors that can be indexed.

This first post will concentrate on the first set of elements–those that constitute cost. In looking at these, however, we will find that the elements within this category are a bit broader than what is currently used in determining project and program performance.

Contract Costs

When we refer to costs in project and program management we oftentimes are referring to those direct and indirect costs expended by the supplier over the course of the effort, particularly in Cost Plus contractual efforts. The breakout of cost from a data perspective places it into subcategories:

Note that these are costs within the contract itself, as a cohesive, self-identifying entity. But there are other costs associated with our contracts which feed into program and project management. These are necessary to identify and capture if we are to take an holistic approach to these disciplines.

The costs that are anticipated by the contract are based on cost estimates, which need to be funded. These funded costs will be allocated to particular lines in the contract (CLINs), whether these be supporting contract efforts or deliverables. Thus, additional elements of our digital inventory include these items but lead us to our next categories.
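As a simple sketch of that linkage, the hypothetical snippet below allocates funded estimates to contract line items and checks the totals against funding by appropriation type. The CLIN numbers, descriptions, and dollar amounts are invented for illustration.

```python
# A hypothetical sketch of allocating funded cost estimates to contract
# line items (CLINs) and checking allocations against available funding.
funding = {"RDT&E": 1_200_000.0, "O&M": 300_000.0}

clins = [
    # (CLIN, description, appropriation, allocated amount)
    ("0001", "Software development", "RDT&E", 900_000.0),
    ("0002", "Integration test support", "RDT&E", 250_000.0),
    ("0003", "Licenses and maintenance", "O&M", 280_000.0),
]

allocated: dict[str, float] = {}
for clin, desc, appn, amount in clins:
    allocated[appn] = allocated.get(appn, 0.0) + amount

for appn, total in allocated.items():
    remaining = funding[appn] - total
    status = "OK" if remaining >= 0 else "OVER-ALLOCATED"
    print(f"{appn}: funded={funding[appn]:,.0f} allocated={total:,.0f} "
          f"remaining={remaining:,.0f} [{status}]")
```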

Cost Estimates, Colors of Money, and Cash Flow

Cost estimates are the basis for determining the entire contract effort, and eventually make it into the project and program cost plan. Once cost estimates are applied and progress is tracked through the collection of actual costs, these elements are further traced to project and program activities, products, commodities, and other business categories, such as the indirect costs identified on the right hand side of the chart above.

Our cost plans need to be financed, as with any business entity. Though the most complex projects often are financed by some government entity because of their scale and impact, private industry–even among the largest companies–must obtain financing for the efforts at hand, whether these come from internal or external sources.

Thus two more elements present themselves: “colors” of money–that is, money provided for a specific purpose within the project and program cost plan, which may also be made available for only a limited period of time–and the availability of money sufficient to execute particular portions of the project or program, that is, cash flow.

The phase of the project or program will determine the type of money that is made available. These types are also contained in the costs identified in the next section, but include, from a government financing perspective, Research, Development, Test and Evaluation (RDT&E) money, Procurement, Operations and Maintenance (O&M), and Military Construction (MILCON) dollars. By Congressional appropriation and authorization, each of these types of money may be provided for particular programs, and each type of authorization has a specific period in which it can be committed, obligated, and expended before it expires. The type of money provided also aligns with the phase of the project or program: whether it be in development, production, deployment and acquisition, sustainment, or retirement.
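A minimal sketch of how these availability windows might be modeled is below. The periods shown are the commonly cited default obligation windows for each appropriation type; treat them as assumptions to verify against current appropriations law rather than as authoritative values.

```python
# Hypothetical sketch of appropriation "colors of money" and their default
# availability periods for new obligations (verify against current statute).
# Requires Python 3.9+ for the tuple[date, date] annotation.
from datetime import date

APPROPRIATION_YEARS = {
    "O&M": 1,          # Operations and Maintenance
    "RDT&E": 2,        # Research, Development, Test and Evaluation
    "Procurement": 3,
    "MILCON": 5,       # Military Construction
}

def obligation_window(appn: str, fiscal_year: int) -> tuple[date, date]:
    """Return the (start, end) dates in which funds may be obligated.
    U.S. fiscal years run 1 October through 30 September."""
    years = APPROPRIATION_YEARS[appn]
    start = date(fiscal_year - 1, 10, 1)
    end = date(fiscal_year - 1 + years, 9, 30)
    return start, end

# Example: FY2018 RDT&E dollars may be obligated from 1 Oct 2017 to 30 Sep 2019.
print(obligation_window("RDT&E", 2018))
```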

These costs will be captured in reporting that shows actual and projected rates of expenditure, tied to procurement, material management, and resource management systems.

Additional Relative Costs

As with all efforts, the supplier is not the only entity to incur costs on a development project or program. The customer also incurs costs, which must be taken into account in determining the total cost of the effort.

For anyone who has undergone any kind of major effort on their home, or even had to get other workaday things done, like deciding when to change the tires on the car or when to get to the dentist, it is implicitly understood that there is more to timing and completing these items than the cost of new kitchen cabinets, tires, or a filling. One must decide to take time off from work. One must look to one’s own cash flow to see if there are sufficient funds not only for the merchant, but for all of the sundry and associated tasks that must be done in preparation for and after the task’s completion. To choose to do one thing is to choose not to do another–an opportunity cost. Other people may be involved in the decision. Perhaps children are in the household and a babysitter is required. Perhaps home life is so disrupted that another temporary abode is necessary on a short term basis.

All of these are costs that one must take into account, and at the individual level we do these calculations and plan these activities as a matter of fact.

In customer-supplier relationships the former incurs costs above the contract costs, which must be taken into account by the customer project or program executive. In the Department of Defense an associated element is called program management administration (PMA). For private entities this falls into allocated G&A and Overhead costs, aside from direct labor and material costs, but in all cases these are costs that have come about due to the decision to undertake the specific effort.

Other elements of cost on the customer side are contractually furnished facilities, property, material or equipment, and testing and evaluation costs.

Contract Cost Performance: Earned Value Management

I will discuss EVM in more detail in a later installment of this element inventory, but mention must be made of it here, since to exclude it would be grossly remiss.

At its core, EVM is a financial measure of what has been physically achieved against a performance management baseline (PMB), tying actual costs and completion of work together through a work breakdown structure (WBS). It is focused on the contract level of performance, which in some cases may constitute the entire project, though not necessarily the entire effort for the program.

Linkages to the other cost elements I have delineated elsewhere in this post range from strong to non-existent. Thus, while EVM is an essential means of linking contractual achievement to work accomplishment–and, at various levels of fidelity, to actual technical achievement–it does not capture all of the costs in our data inventory.

An essential overview of what it does capture is best summed up in the following diagram, taken from the Defense Acquisition University (DAU) site:

Commercial EVM implementations, while not necessarily using the same terminology or as highly structured a process, possess a similar structure: in developmental efforts, costs and achievement are allocated against baseline costs at the level of work packages (oftentimes schedule tasks in resource-loaded schedules) under an integrated WBS, with Management Reserve not included as part of the baseline.

Also note that commercial efforts often include their internal costs as part of the overall contractual effort in assessing earned value against actual work achievement, while government contracting efforts tend to exclude these inherent costs. That being said, it is not that there is no cost control over these elements, since strict ceilings often apply to PMA and other such costs; it is that contract cost performance does not take these costs, among others, into account.

Furthermore, the chart above provides us with additional sub-elements in our inventory that are essential in capturing data at the appropriate level of our project and program hierarchy.

Thus, for IPM, EVM is one of many elements that are part of our digital inventory–and one that provides a linkage to other non-cost elements (WBS). But in no way should it be viewed as capturing all essential costs associated with a contractual effort, let alone the more expansive project or program effort.

Portfolio Management and Life-Cycle Costs

There is another level of management that is essential in thinking about project and program management: the program executive level. In the U.S. military services these executives are called Program Executive Officers (PEOs). In private industry they are often product managers, CIOs, and others who represent the link between the program management teams and the business operations side of the organization. This is also the level of management organized to oversee a number of individual projects and programs that are interrelated by mission, commodity, or purpose. As such, it often concentrates on issues across the portfolio of projects and programs.

The main purpose of the portfolio management level is to ensure that project and program efforts are aligned with the strategic goals of the organization, which includes an understanding of the total cost of ownership.

In carrying out this purpose, one of the functions of portfolio management is to identify risks that may manifest within projects and programs, and to determine the most productive use of limited resources across them, since they are essentially competing for the same dollars. This includes cost estimates and re-allocations to address ontological, aleatory, and epistemic risk.

Furthermore, the portfolio level is also concerned with the life-cycle factors of the item under development, so that there is an effective hand-off at the production and sustainment phases. The key here is to ensure that each project or program, while focused on the more immediate goals of execution, continues to meet the organization’s objectives in terms of life-cycle costs, and remains effective against the framing assumptions under which it was established.

But here we are focusing on cost, and so the costs involved are trade-off costs and opportunities, assessments of return on investment, and the aforementioned total cost of ownership of the end item or system. The costs that contribute to the total cost of ownership include all of the development costs, external and internal program management costs, procurement costs, operations and support costs, maintenance and life extension costs, and system retirement costs.
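A simple roll-up makes the point, using the categories just listed and purely illustrative numbers:

```python
# Purely illustrative numbers (in $M); the categories mirror those above.
life_cycle_costs = {
    "development": 120.0,
    "external_program_management": 8.0,
    "internal_program_management": 5.0,
    "procurement": 300.0,
    "operations_and_support": 450.0,
    "maintenance_and_life_extension": 90.0,
    "system_retirement": 15.0,
}

tco = sum(life_cycle_costs.values())
print(f"Total cost of ownership: ${tco:,.1f}M")
# Operations and support typically dominates, which is why portfolio-level
# life-cycle visibility matters long after the development phase ends.
```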

Conclusion

I believe that the survey of cost elements presented in this initial post illustrates that present digital project and program management systems are limited and immature–capturing and evaluating only a small portion of the total amount of available data.

These gaps make it impossible, for example, to determine the relative significance of any one element–and the analytics that can be derived from it–over another; not to mention the inability to provide the linkage among these absent elements that would garner insights into cause-and-effect and predictive behavior, so that we have enough time to influence the outcome.

It is also clear that, when we strive to define what constitutes integrated project and program management, we must learn what is of most importance to the PM in performing those duties that are viewed as essential to success, and which are not yet captured in our analytical and predictive systems.

Only when our systems reach the level of cohesiveness and comprehensiveness needed to provide the organizational insight and intelligence essential to project or program management will PMs ignore them at their own risk. In getting there we must first identify what can be captured from the activities that contribute to our efforts.

My next post will identify essential elements related to planning and scheduling.


Note: I am indebted to Defense Acquisition University’s resources in my research across many of my postings and link to them for the edification of the reader. For more insight into many of the points raised in this post I would recommend that readers familiarize themselves with A Guide for DoD Program Managers.


Don’t Stop Thinking About Tomorrow–Post-Workshop Blogging…and some Low Comedy

It’s been a while since I posted to my blog due to meetings and–well–the day job, but some interesting things occurred during the latest meeting of the Integrated Program Management Division (IPMD) of the National Defense Industrial Association (NDIA) that I think are of interest. (You have to love acronyms to be part of this community.)

Program Management and Integrated Program Management

First off is the initiative by the Program Management Working Group to gain greater participation by program managers with an eye to more clearly define what constitutes integrated program management. As readers of this blog know, this is a topic that I’ve recently written about.

The Systems Engineering discipline is holding its 21st Annual Systems Engineering Conference in Tampa this year from October 22nd to the 25th. IPMD will collaborate, contributing a track dedicated to program management. The organizations have issued a call for papers and topics of interest. (Full disclosure: I volunteered this past week to participate as a member of the PM Working Group.)

My interest in this topic is based on my belief, drawn from years of wide-ranging experience as a warranted government contracting officer, program manager, business manager, CIO, staff officer, and logistics officer, that there is much more to defining IPM than viewing it through the prism of any particular discipline. Furthermore, doing so will require collaboration and cooperation among a number of project management disciplines.

This is a big topic where, I believe, no one group or individual has all of the answers. I’m excited to see where this work goes.

Integrated Digital Environment

Another area of interest that I’ve written about in the past involves two different–but related–initiatives on the part of the Department of Defense to collect information from its suppliers. That information is necessary to the Department’s oversight role, not only to ensure accountability of public expenditures, but also to assist in project cost and schedule control, risk management, and cost estimation, particularly as it relates to risk-sharing, cost-type R&D contracted project efforts.

Two major staffs in the Offices of the Undersecretary of Defense have decided to go with a JSON-type schema for, on the one hand, cost estimating data, and on the other, integrated cost performance, schedule, and risk data. Each initiative seeks to replace the existing schemas in place.

Both have been wrapped around the axle on getting industry to move from form-based reporting and data sharing to a data-agnostic solution that meets the goals of reducing redundancy in data transmission, reducing the number of submissions and data streams, and moving toward one version of the truth–one that allows SMEs on both sides of the table to concentrate on data analysis and interpretation in jointly working toward successful project completion and end-item deployment.
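To illustrate what a data-agnostic approach buys both sides of the table, here is a sketch using an entirely hypothetical JSON record. The field names are mine, purely for illustration, and do not reflect the actual schemas under development:

```python
import json

# A hypothetical JSON-style submission (not the actual DoD schema),
# illustrating a typed stream of records rather than a rendered form.
submission = """
{
  "contract_id": "HQ0000-00-C-0000",
  "period_end": "2018-06-30",
  "wbs_elements": [
    {"wbs": "1.1", "bcws": 1000.0, "bcwp": 900.0, "acwp": 950.0},
    {"wbs": "1.2", "bcws": 500.0, "bcwp": 450.0, "acwp": 500.0}
  ]
}
"""

data = json.loads(submission)

# With data rather than forms, both government and supplier can run the
# same automated checks before anyone spends analyst time on the numbers:
for element in data["wbs_elements"]:
    assert {"wbs", "bcws", "bcwp", "acwp"} <= element.keys(), element
print(f'{data["contract_id"]}: {len(data["wbs_elements"])} WBS elements validated')
```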

As with the first item, I am not a disinterested individual in this topic. Back when I wore a uniform I helped to construct DoD policy to create an integrated digital environment. I’ve written about this experience previously in this blog, so I won’t bore you with the details, but the need for data sharing on cost-type efforts acknowledges the reality of the linkage between our defense economic and industrial base and the art of the possible in deploying defense-related end items. The same relationship exists for civilian federal agencies with the non-defense portion of the U.S. economy. Needless to say, a good many commercial firms unrelated to defense are going the same way.

From speaking with individuals working these initiatives, I think the issue here is two-fold.

The first is, I think, that too much deference is being given to solution providers and to some industry stakeholders, influenced by those providers, in “working the refs” through the data. The effect of doing so not only protects entrenched interests, it also gets in the way of innovation, allowing the slowest among the group to hold up the train in favor of–to put it bluntly–learning their jobs on the job at the expense of efficiency and effectiveness. As I expressed in a side conversation with an industry leader, all too often companies–who, after all, are the customer–have allowed the possible to be defined by the limitations and inflexibility of their solution providers. At some point that dysfunctional relationship must end–and in the case of comments clearly identified as working the refs, they should be ignored. Put your stake in the ground and let innovation and market competition sort it out.

Secondly, cost estimating, which is closely tied to accounting and financial management, is the newer effort and is considered tangential to other, more mature, performance management systems. My own firm is involved in producing a solution in support of this process: collecting data related to these reports (known collectively in DoD as the 1921 reports), placing that data in a common data lake, and exploring with organizations what it tells us, since we are only now learning its value. This is classical KDD–Knowledge Discovery in Databases–and a worthwhile exercise.

I’ve also advocated going one step further in favor of the collection of financial performance data (known as the Contract Funds Status Report), which is an essential reporting requirement, but am frustrated to find no one willing to take ownership of the guidance regarding data collection. The tragedy here is that cost performance, known broadly as Earned Value Management, is a technique that relates the value of work performed to other financial and project planning measures (a baseline and actuals). But in a business (or any enterprise), the fuel that drives the engine is financial, and two essential measures are margin and cash-flow. The CFSR is a report of program cash-flow and financial execution. It is an early measure of whether a program will execute its work in any given time-frame, and provides a reality check on the statistical measures of performance against baseline. It is also a necessary logic check for comptrollers and other budget decision-makers.
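As a sketch of the kind of logic check a CFSR enables, assume (hypothetically) cumulative planned expenditures and a cumulative funding profile by quarter; cash-flow risk shows up regardless of what the EVM indices say:

```python
# Hypothetical quarterly profiles (cumulative, in $M).
forecast_spend = [10.0, 25.0, 45.0, 70.0]    # planned burn
funds_authorized = [15.0, 30.0, 40.0, 80.0]  # funding profile

for quarter, (spend, funds) in enumerate(zip(forecast_spend, funds_authorized), start=1):
    if spend > funds:
        print(f"Q{quarter}: planned burn ${spend}M exceeds funding ${funds}M "
              f"-- execution at risk regardless of CPI/SPI")
    else:
        print(f"Q{quarter}: funded through planned work (${funds - spend}M headroom)")
```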

Thus, as it relates to data, there has been some push-back against a settled schema, with the government instead accepting flat files and converting the data to the appropriate format. I see this as an acceptable interim solution, but not an ultimate one. It is essential to collect both cost estimating and contract funds status information to perform any number of operations that relate to “actionable” intelligence: having the right executable money at the right time, a reality check against statistical and predictive measures, value analysis, and measures of ROI in development, just to name a few.

I look forward to continuing this conversation.

To Be or Not to Be Agile

The Section 809 Panel, the latest iteration of acquisition reform panels, has recommended that performance management using earned value not be mandated for efforts using Agile. It goes on, however, to assert that program executives “should approve appropriate project monitoring and control methods, which may include EVM, that provide faith in the quality of data and, at a minimum, track schedule, cost, and estimate at completion.”

Okay…the panel is then mute on what those monitoring and control measures will be. Significantly, if only subtly, the #NoEstimates crowd took a hit, since the panel recommends and specifies data quality, schedule, cost, and EAC. Sounds a lot like a form of EVM to me.

I must admit to being a skeptic when it comes to swallowing the Agile doctrine whole. Its micro-economic foundations are weak, and much of it sounds like ideology–bad ideology at best and disproved ideology at worst (specifically the woo-woo about self-organization…think of the last speculative bubble and the resulting financial crisis and depression along these lines).

When it comes to named methodologies I am somewhat from Missouri. I apply (and have, in previous efforts in the Dark Ages back when I wore a uniform, applied) Kanban, teaming, adaptive development (enhanced greatly today by modern low-code technology), and short sprints that result in releasable modules. But keep in mind that these things were out there long before they were grouped under a common heading.

Perhaps Agile is now a convenient catch-all for best practices. But if that is the case, then software development projects using this redefined version of Agile deserve no special dispensation. That said, I was schooled a bit by an Agile program manager during a side conversation, and I am always open to understanding things better and revising my perspectives. It’s just that there never was a Waterfall/Agile dichotomy, just as there never really was a Spiral/Waterfall dichotomy. These were simply convenient development models, geared to the technology of the moment, for describing a process.

There are very good people on the job exploring these issues on the Agile Working Group in the IPMD and I look forward to seeing what they continue to come up with.

Rip Van Winkle Speaks!

The only disappointing presentation occurred on the second and last day of the meeting. It seemed we were treated to a voice from somewhere around the year 2003 that, in what can only be described as performance art involving free association, talked about wandering the desert, achieving certification for a piece of software (which virtually all of the software providers in the room have successfully navigated at one time or another), discovering that cost and schedule performance data can be integrated (ignoring the work of the last ten years on the part of, well, a good many people in the room), noting that there is this process known as the Integrated Baseline Review (which, again, a good many people in the room had collaborated on to both define and make workable), and–lo and behold–that the software industry uses schemas and APIs to capture data (known in Software Development 101 as ETL). He then topped off his meander with an unethical excursion into product endorsement, selected through an opaque process.

For this last, the speaker was either unaware or didn’t care (usually called tone-deafness) that the event’s expenses were sponsored by a software solution provider (not mine). But it is also as if the individual speaking was completely unaware of the work behind the many topics I’ve listed above, ignoring and undermining the hard work of the other stakeholders that make up our community.

On the whole an entertaining bit of poppycock, which leads me to…

A Word about the Role of Professional Organizations (Somewhat Inside Baseball)

In this blog, and in my interactions with other professionals at–well–professional conferences, I check my self-interest at the door and publicly take a non-commercial stance. It is a position that is expected and, I think, appreciated. For those who follow me on social networks like LinkedIn, posts from my WordPress blog originate from a separate source than the commercial announcements linked to my page, which originate from my company.

If there are exhibitor areas, as some conferences and workshops do have, that is one thing. That’s where we compete and play; and in private side conversations customers and strategic partners will sometimes use the opportunity as a convenience to discuss future plans and specific issues that are clearly business-related. But these are the exceptions to the general rule, and there are a couple of reasons for this, especially at this venue.

One is that, while it is a large market, it is a small community, and virtually everyone at the regular meetings and conferences I attend already knows that I am the CEO and owner of a small software company. But the IPMD is neutral ground. It is a place where government and industry stakeholders, who in other roles and circumstances are in a contractual or competing relationship, come together to hash out the processes and procedures that will hopefully improve the discipline of program and project management. It is also a place of discovery, where policies, new ideas, and technologies can be vetted in an environment of collaboration.

Another reason for taking a neutral stance is simply because it is both the most ethical and productive one. Twenty years ago–and even in some of the intervening years–self-serving behavior was acceptable at the IPMD meetings where both leadership and membership used the venue as a basis for advancing personal agendas or those of their friends, often involving backbiting and character assassination. Some of those people, few in number, still attend these meetings.

I am not unfamiliar with the last, having been a target at one point of a couple of them; at the end of the day, such assertions turned out to be without merit, undermining the credibility of the individuals involved and rightfully calling into question the quality of their character. Such actions cannot help but undermine the credibility, and pollute the atmosphere, of the organization with which they associate as well.

Finally, the companies and organizations that sponsor these meetings–which are not cheap to organize, which I know from having done so in the past–deserve to have the benefit of acknowledgment. It’s just good manners to play nice when someone else is footing the bill–you gotta dance with those that brung you. I know my competitors and respect them (with perhaps one or two exceptions). We even occasionally socialize with each other and continue long-term friendships and friendly associations. Burning bridges is just not my thing.

On the whole, however, the NDIA IPMD meetings–and this one in particular–were productive and positive, focused on the future and on professional development. That’s where, I think, we as a community need to be and need to stay. I always learn something new and get my dose of reality from a broad-based perspective. In getting here, the leadership of the organization (and the vast majority of the membership) is to be commended, as are the recent past and current members of the Department of Defense, especially since the formation of the Performance Assessments and Root Cause Analysis (PARCA) office.

In closing, there were other items of note discussed, along with what can only be described as the best pair of keynote addresses that I’ve heard in one meeting. I’ll have more to say about some of the concepts and ideas that were presented there in future posts.

Here It Is–Integrated Project Management and Its Definition

I was recently at a customer site and, while discussing the topic of this post, I noticed, displayed prominently on the bookshelf behind my colleague, a book entitled “Project Management Using Earned Value” by Gary Humphreys. It is a book that I have on my shelf as well, and it is required reading for personnel in my company.

I told my colleague: “One of the problems with our ability to define IPM is the conceit embedded in the title of that book behind you.”

My colleague expressed some surprise at my intentionally provocative comment, but he too felt that EVM had taken on a role that was beyond its intent, and so asked for more clarification. Thus, this post is meant to flesh out some of these ideas.

But before I continue, here was my point: while the awkward wording of the title unintentionally creates a syllogism that can be read as suggesting that applying earned value will result in project management–an invalid conclusion based on a specious assumption–there are practitioners who would lend credence to that idea.

Some History in Full Disclosure

Before I begin, some full disclosure is in order. When I was on active duty in the United States Navy I followed my last mentor to the Pentagon; he felt that my perspective on acquisition and program management would be best applied on the staff of the Undersecretary of Defense for Acquisition and Technology, which subsequently also came to include logistics.

The presence of a uniformed member of the Armed Forces was unusual for that staff at the time (1996). My boss, Dan Czelusniak, a senior SES who was a highly respected leader, program manager, engineer, thought leader, and, for me, mentor, had first brought me onto his staff at the U.S. Navy Naval Air Systems Command in PEO(A), and gave me free rein to largely define my job.

For that assignment I combined previously separate duties: Program Manager of an initiative to develop a methodology for assessing technical performance measurement; Business Manager of the PEO, which led me to earned value management and its integration with other program indicators and systems; developer of a risk assessment system for the PEO’s programs in support of establishing a DoD financial management reserve; support to the program managers and their financial managers in the budget hearing process; and CIO for the programs, identifying and introducing new information technologies in their support.

While there, I had decided to retire from the service after more than 22 years on active duty, but my superiors felt that I had a few more ideas to contribute to the DoD, and did what they could to convince me to stay on a while longer. Having made commitments in my transition, I set my date in the future, but agreed to do the obligatory Pentagon tour of duty to cap my career. Dan had moved over to Undersecretary of Defense for Acquisition and Technology (USD(A&T)) and decided that he wanted me on the OUSD(A&T) staff.

As in PEO(A), Mr. Czelusniak gave me the freedom to define my position, with the approval of my immediate superior, Mr. Gary Christle. I chose the title of Lead Action Officer, Integrated Program Management. Mr. Christle, also a brilliant public servant and thought leader, widely heralded in the EVM community, asked me with a bemused expression, “What is integrated program management?” I responded: “I don’t know yet, sir, but I intend to find out.” Though I did not have a complete definition, I had the seed of an idea.

My initiatives on the staff began with an exploration of data and information. My thinking along these lines early in my career was influenced by a book entitled “Logistics in the National Defense” by retired Admiral Henry E. Eccles, written in 1959. It is a work that still resonates today and established the important concept that “logistics serves as the bridge between a nation’s economy and its forces and defines the operational reach of the joint force commander.” The U.S. Army site referenced for this quote calls him the Clausewitz of logistics.

Furthermore, my work as Program Manager of the Technical Performance Management project, and earlier assignments as program manager of IT and IM projects, gave me insights into the interrelationships of essential data being collected as a matter of course in R&D efforts–insights that would provide the basis for a definition of IPM.

In concluding my career on the OSD staff, I produced two main products, among others: a methodology for the integration of technical performance risk in project management performance, and the policy of moving toward what became the DoD-wide policy for an Integrated Digital Environment (IDE). This last initiative was produced with significant contributions from the staff of the Deputy Assistant Secretary of Defense for Command, Control, Communications, and Intelligence (C3I) as well as additional work by my colleague on the A&T staff, Reed White.

Products from IDE included the adoption of the ANSI X12 839 transaction set. Its successors, such as the DCARC UN/CEFACT XML and other similar initiatives, are based on that same concept and policy, though, removed by many years, the individuals involved may be only vaguely aware of that early policy–or of the controversy that had to be overcome in its publication, given its relatively common-sense aspects from today’s perspective.

The Present State

Currently there are at least four professional organizations that have attempted to tackle the issue of integrated program and project management. These are the Project Management Institute, NDIA’s Integrated Program Management Division, the College of Performance Management, and AACE International (formerly the American Association of Cost Engineers). There are also other groups focused on systems engineering, contracting, and cost estimating that contribute to the literature.

PMI is an expansive organization and, oftentimes, the focus of the group is on the aspirational goals of those who wish to obtain a credential in the discipline. The other groups tend to emphasize their roots in earned value management or cost engineering as the basis for a definition of IPM. The frustration of many professionals in the A&D and DoD world is that the essential input and participation of the program manager–needed to define the essential data that underlies IPM, which goes beyond the seas of separation that divide islands of data and expertise–is missing.

Things didn’t used to be this way.

When I served at NAVAIR and the Pentagon the jointly-sponsored fall Integrated Program Management Conference held in Tyson’s Corner, Virginia, would draw more than 600 attendees. Entire contingents from the military systems commands and program offices–as well as U.S. allied countries–would attend, lending the conference a synergy and forward-looking environment not found in other venues. Industries outside of aerospace and defense would also send representatives and contribute to the literature.

As anyone engaged in a scientific or engineering effort can attest, sharing expertise and perspectives among other like professionals from both industry and government is essential to developing a vital, professional, and up-to-date community of knowledge.

During the intervening years, the public’s overreaction, and the resulting political reaction, to a few isolated embarrassing incidents at other professional conferences, along with constraints on travel and training budgets, have contributed to a noticeable drop in attendance at these essential venues. But I think there is also an internal contributing factor within the organizations themselves: each views itself and its discipline as the nexus of IPM. Thus, to PMI, a collection of KPIs is the definition of IPM. To CPM and NDIA IPMD, earned value management is the link to IPM, and to AACEI, Total Cost Management is the basis for IPM.

All of them cannot be correct and none possesses an overwhelming claim.

The present state finds members of each of these groups–all valuable subject matter experts, leaders, and managers in their areas of concentration–essentially talking to themselves and to each other, insulated in a bubble. There is little challenge in convincing another EVM SME that EVM is the basis for the integration of other disciplines. What is not being done is making a convincing case to program managers based on the merits.

A Modest Recommendation

Subsequent to the customer meeting that sent me, once again, to contemplate IPM, Gordon Kranz, President of Enlightened Integrated Program Management LLC, posted the following question to LinkedIn:

Integrated Program Management – What is it?  Systems Engineering? Earned Value Management? Agile Development? Lean? Quality? Logistics? Building Information Modeling? …

He then goes on to list some basic approaches that may lead to answering that question. Still, the question stands.

Mr. Kranz was the Deputy Director for Earned Value Management policy at the Office of the Secretary of Defense from 2011 to 2015. During his term in that position I witnessed more innovation, and more improvement in relations between government and industry–resulting in process improvements in accountability and transparency–than I had seen come out of that office over the previous ten years. He brings with him a wealth of knowledge concerning program management from both government and private industry. Now a private consultant, Gordon asks a question that goes to the heart of the debate, addressing the assertions of those who claim to be the nexus of IPM.

So what is Integrated Project or Program Management? Am I any closer to answering that question than when Gary Christle first asked it of me over twenty years ago?

I think so, but I abstain from answering the question, only because in the end it is the community of program management, in its respective verticals, that must ultimately answer it. Only the participation and perspectives of practicing program managers and corporate management will determine the definition of IPM and the elements that underlie it. Self-interested software publishers, of which I am one, cannot be allowed to define and frame the definition, as much as it is tempting to do so.

These elements must be specific and must address the most recent misunderstandings that have arisen in the PM discipline, such as that there is a dichotomy between EVM and Agile–a subject fit for a different blog post.

So here is my modest recommendation: that the leaders of the program management community from the acquisition organizations in both industry and government–where the real power to make decisions resides, and where the discussions that sparked this blog post began–find a sponsor for an IPM workshop that addresses this topic, with the goal of answering the core question. Make no mistake–despite my deference in this post, I intend to be part of the conversation in defining that term. But, in my opinion, no one individual or small group of specialized SMEs is qualified to do so.

Furthermore, doing so, I believe, is essential to the very survival of these areas of expertise, particularly given our ability to deploy more powerful information systems that allow us to process larger sets of data. The paradox is that more powerful processing of bigger data yields a level of precision that reveals the need for fewer, not more, predictive indicators, and for less isolated line-and-staff specialized expertise. Discovery-driven project management is here today, bridging islands of data and providing intelligence in new and better ways that allow for a more systemic approach to project management.

Thus, in this context, a robust definition of Integrated Project Management is an essential undertaking for the discipline.