Take Me to the River, Part 1, Cost Elements – A Digital Inventory of Integrated Program Management Elements

In a previous post I recommended a venue focused on program managers to define what constitutes integrated program management. Since that time I have been engaged with thought leaders and influencers in both government and industry, many of whom independently came to a similar conclusion, agree with this proposition, and are working to bring it about.

My own interest in this discussion is from the perspective of maximization of the information ecosystem that underlies and describes the systems known as projects and programs. But what do I mean by this? This is more than a gratuitous question, because oftentimes the information essential to defining project and program performance and behavior is intermixed with, and therefore diluted and obfuscated by, that of the overall enterprise.

Project vs. Program

What I mean by the term project in this context is an organization that is established around a defined effort of fixed duration (a defined beginning and projected end) that is specifically planned and organized for the development and deployment of a particular end item, state, or result, with an identified set of resources assigned and allocated to achieve its goals.

A program is defined as a set of interrelated projects and sub-projects, also of fixed duration, that is specifically planned and organized not only for the development and deployment of a particular end item, state, or result, but also for its sustainment (including configuration control), with an identified set of resources assigned and allocated to achieve its goals. As such, the program management team is also the first-level life-cycle manager of the end item, state, or result, and participates with other levels of the organization in these activities. (More on life-cycle costs below.)

Note the difference in scope and perspective, though oftentimes we use these terms interchangeably.

For shorthand, a small project of short duration operates at the tactical level of planning. A larger project, which because of size, complexity, duration, and risk approaches the definition of a program, operates at the operational level, as do most programs. Larger and more complex programs that will affect the core framing assumptions of the enterprise align their goals to the strategic level of planning. Thus, there are differences in scale, complexity and, hence, data points that can be captured at these various levels.

Another aspect of the question of establishing an integrated digital project and program management environment is sufficiency of data, which relates directly to scale. Sufficiency in this regard is defined as whether there is enough data to establish a valid correlation and, hopefully, draw a causation. Micro-economic foundations–and models–often fail because of insufficient data. This is important to keep in mind as we inventory the type of data available to us and its significance. Oftentimes additional data points can make up for those cases where there is insufficiency in the depth and quality of a more limited set of data points. Doing so will also mitigate subjectivity, especially in smaller efforts.
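To make the notion of sufficiency concrete, here is a minimal sketch in Python, assuming the uncertainty of a correlation estimate can be approximated with the standard Fisher z-transform; the observed correlation and the sample sizes are invented for illustration.

```python
import numpy as np

def correlation_ci(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for a Pearson correlation (Fisher z)."""
    z = np.arctanh(r)                 # transform r to an approximately normal scale
    se = 1.0 / np.sqrt(n - 3)         # standard error depends only on sample size
    return np.tanh(z - z_crit * se), np.tanh(z + z_crit * se)

# The same observed correlation is far more trustworthy with more data points.
for n in (10, 30, 100, 500):
    lo, hi = correlation_ci(r=0.60, n=n)
    print(f"n={n:4d}  r=0.60  95% CI: ({lo:+.2f}, {hi:+.2f})")
```

With ten data points the interval spans nearly the whole range from no correlation to near-certainty; with five hundred it is tight enough to support an inference, which is the point about scale made above.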

Thus, in constructing a project or program, regardless of its level of planning, we often begin by monitoring the most basic elements. These are usually described as cost, schedule, performance, and risk, though I will discuss and identify other contributors that can be indexed.

This first post will concentrate on the first set of elements–those that constitute cost. In looking at these, however, we will find that the elements within this category are a bit broader than what is currently used in determining project and program performance.

Contract Costs

When we refer to costs in project and program management we are oftentimes referring to those direct and indirect costs expended by the supplier over the course of the effort, particularly in Cost Plus contractual efforts. The breakout of cost from a data perspective places it in subcategories.

Note that these are costs within the contract itself, as a cohesive, self-identifying entity. But there are other costs associated with our contracts which feed into program and project management. These must be identified and captured if we are to take a holistic approach to these disciplines.

The costs that are anticipated by the contract are based on cost estimates, which need to be funded. These funded costs will be allocated to particular lines in the contract (CLINs), whether these be supporting contract efforts or deliverables. Thus, additional elements of our digital inventory include these items but lead us to our next categories.
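As a concrete sketch of what this portion of the inventory looks like as data, consider the following Python fragment. The CLIN numbers, descriptions, and dollar figures are hypothetical, and a real contract record would carry many more attributes.

```python
from dataclasses import dataclass

@dataclass
class CLIN:
    """A hypothetical contract line item: funded estimate versus actuals to date."""
    number: str
    description: str
    funded: float     # funds allocated to this line from the cost estimate
    actuals: float    # actual costs collected to date

clins = [
    CLIN("0001", "Prime mission product", 12_500_000.0, 4_100_000.0),
    CLIN("0002", "Systems engineering support", 2_300_000.0, 900_000.0),
    CLIN("0003", "Data deliverables", 450_000.0, 60_000.0),
]

print(f"Funded:  ${sum(c.funded for c in clins):,.0f}")
print(f"Actuals: ${sum(c.actuals for c in clins):,.0f}")
```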

Cost Estimates, Colors of Money, and Cash Flow

Cost estimates are the basis for determining the entire contract effort, and eventually make their way into the project and program cost plan. Once cost estimates are applied and progress is tracked through the collection of actual costs, these elements are further traced to project and program activities, products, commodities, and other business categories, such as indirect costs.

Our cost plans need to be financed, as with any business entity. Though the most complex projects often are financed by some government entity because of their scale and impact, private industry–even among the largest companies–must obtain financing for the efforts at hand, whether these come from internal or external sources.

Thus two more elements present themselves: “colors” of money, that is, money provided for a specific purpose within the project and program cost plan, which may also be made available for only a limited period of time; and the availability of money sufficient to execute particular portions of the project or program, that is, cash flow.

The phase of the project or program will determine the type of money that is made available. These types are also contained in the costs identified in the next section, but include, from a government financing perspective, Research, Development, Test and Evaluation (RDT&E) money, Procurement, Operations and Maintenance (O&M), and Military Construction (MILCON) dollars. By Congressional appropriation and authorization, each of these types of money may be provided for particular programs, and each type of authorization has a specific period in which it can be committed, obligated, and expended before it expires. The type of money provided also aligns with the phase of the project or program: whether it is still in development, production, deployment and acquisition, sustainment, or retirement.

These costs will appear in reporting of actual and projected rates of expenditure, tied to procurement, material management, and resource management systems.

Additional Related Costs

As with all efforts, the supplier is not the only entity to incur costs on a development project or program. The customer also incurs costs, which must be taken into account in determining the total cost of the effort.

Anyone who has undergone a major effort on their home, or even had to get workaday things done, like deciding when to change the tires on the car or when to get to the dentist, implicitly understands that there is more effort in timing and determining the completion of these items than the cost of new kitchen cabinets, tires, or a filling. One must decide to take time off from work. One must look to one’s own cash flow to see if there are sufficient funds not only for the merchant, but for all of the sundry and associated tasks that must be done in preparation for and after the task’s completion. To choose to do one thing is to choose not to do another–an opportunity cost. Other people may be involved in the decision. Perhaps children are in the household and a babysitter is required. Perhaps home life is so disrupted that another abode is necessary on a short-term basis.

All of these are costs that one must take into account, and at the individual level we do these calculations and plan these activities as a matter of course.

In customer-supplier relationships the former incurs costs above the contract costs, which must be taken into account by the customer project or program executive. In the Department of Defense an associated element is called program management administration (PMA). For private entities this falls into allocated G&A and Overhead costs, aside from direct labor and material costs, but in all cases these are costs that have come about due to the decision to undertake the specific effort.

Other elements of cost on the customer side are contractually furnished facilities, property, material or equipment, and testing and evaluation costs.

Contract Cost Performance: Earned Value Management

I will discuss EVM in more detail in a later installment of this element inventory, but mention must be made of it here, since to exclude it would be grossly remiss.

At its core, EVM is a financial measure of the value of work physically achieved against a performance management baseline (PMB), tying actual costs to the completion of work through a work breakdown structure (WBS). It is focused on the contract level of performance, which in some cases may constitute the entire project, though not necessarily the entire effort for the program.
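To make those mechanics concrete, here is a minimal sketch in Python of the standard earned value indices computed from cumulative-to-date figures. The dollar amounts are invented, and the EAC formula shown is only one of several in common use.

```python
def evm_metrics(bcws, bcwp, acwp, bac):
    """Standard earned value indices from cumulative-to-date figures."""
    return {
        "CV": bcwp - acwp,         # cost variance: value earned vs. cost incurred
        "SV": bcwp - bcws,         # schedule variance, expressed in dollars
        "CPI": bcwp / acwp,        # cost efficiency to date
        "SPI": bcwp / bcws,        # schedule efficiency to date
        "EAC": bac * acwp / bcwp,  # one common estimate-at-completion formula
    }

# Illustrative figures for a control account rolled up through the WBS.
print(evm_metrics(bcws=1_000_000, bcwp=900_000, acwp=1_100_000, bac=10_000_000))
# CPI ~0.82 and SPI 0.90: work is costing more, and arriving later, than planned.
```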

Linkages to the other cost elements I have delineated elsewhere in this post range from strong to non-existent. Thus, while EVM is an essential means of linking contractual achievement to work accomplishment, and, at various levels of fidelity, to actual technical achievement, it does not capture all of the costs in our data inventory.

An essential overview in understanding what it does capture is best summed up in a diagram available on the Defense Acquisition University (DAU) site.

Commercial EVM elements, while not necessarily using the same terminology or highly structured process, possess a similar structure in allocating costs and achievement against baseline costs in developmental efforts to work packages (oftentimes schedule tasks in resource-loaded schedules) under an integrated WBS structure with Management Reserve not included as part of the baseline.

Also note that commercial efforts often include their internal costs as part of the overall contractual effort in assessing earned value against actual work achievement, while government contracting efforts tend to exclude these inherent costs. That being said, it is not that there is no cost control over these elements, since strict ceilings often apply to PMA and other such costs; it is that contract cost performance does not take these costs, among others, into account.

Furthermore, the DAU chart provides us with additional sub-elements in our inventory that are essential in capturing data at the appropriate level of our project and program hierarchy.

Thus, for IPM, EVM is one of many elements that are part of our digital inventory–and one that provides a linkage to other non-cost elements (WBS). But in no way should it be viewed as capturing all essential costs associated with a contractual effort, let alone the more expansive project or program effort.

Portfolio Management and Life-Cycle Costs

There is another level of management that is essential in thinking about project and program management, and that is the program executive level. In the U.S. military services these are called Program Executive Officers (PEOs). In private industry they are often product managers, CIOs, and other positions that often represent the link between the program management teams and the business operations side of the organization. Thus, this is also the level of management organized to oversee a number of individual projects and programs that are interrelated based on mission, commodity, or purpose. As such, this level of management often concentrates on issues across the portfolio of projects and programs.

The main purpose of the portfolio management level is to ensure that project and program efforts are aligned with the strategic goals of the organization, which includes an understanding of the total cost of ownership.

In fulfilling this purpose, one of the functions of portfolio management is to identify risks that may manifest within projects and programs, and to determine the most productive use of limited resources across them, since they are essentially competing for the same dollars. This includes cost estimates and re-allocations to address ontological, aleatory, and epistemic risk.

Furthermore, the portfolio level is also concerned with the life-cycle factors of the item under development, so that there is an effective hand-off at the production and sustainment phases. The key here is to ensure that each project or program, focused as it is on the more immediate goals of execution, continues to meet the goals of the organization in terms of life-cycle costs and in terms of its effectiveness in meeting the established goals essential to its framing assumptions.

But here we are focusing on cost, and so the costs involved are trade-off costs and opportunities, assessments of return on investment, and the aforementioned total cost of ownership of the end item or system. The costs that contribute to the total cost of ownership include all of the development costs, external and internal program management costs, procurement costs, operations and support costs, maintenance and life extension costs, and system retirement costs.
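As a simple illustration of how these categories roll up, here is a minimal sketch in Python; every figure is invented, and in practice the operations and support category frequently dominates the total, which is precisely why the portfolio level cannot ignore it.

```python
# Hypothetical life-cycle cost categories, mirroring the list above (all $M, invented).
tco = {
    "development": 120.0,
    "program management (external and internal)": 15.0,
    "procurement": 310.0,
    "operations and support": 480.0,
    "maintenance and life extension": 95.0,
    "system retirement": 20.0,
}

total = sum(tco.values())
for category, cost in tco.items():
    print(f"{category:44s} ${cost:6.1f}M  ({cost / total:5.1%})")
print(f"{'total cost of ownership':44s} ${total:6.1f}M")
```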

Conclusion

I believe that the survey of cost elements presented in this initial post illustrates that present digital project and program management systems are limited and immature–capturing and evaluating only a small portion of the total amount of available data.

These gaps make it impossible, for example, to determine the relative significance of any one element–and the analytics that can be derived from it–over another; not to mention the inability to provide the linkage among these absent elements that would garner insights into cause-and-effect and predictive behavior so that we have enough time to influence the outcome.

It is also clear that, when we strive to define what constitutes integrated project and program management, we must learn what is of most importance to the PM in performing those duties that are viewed as essential to success, and which are not yet captured in our analytical and predictive systems.

Only when our systems reach the level of cohesiveness and comprehensiveness in providing organizational insight and intelligence essential to project or program management will PMs ignore them at their own risk. In getting there we must first identify what can be captured from the activities that contribute to our efforts.

My next post will identify essential elements related to planning and scheduling.


Note: I am indebted to Defense Acquisition University’s resources in my research across many of my postings and link to them for the edification of the reader. For more insight into many of the points raised in this post I would recommend that readers familiarize themselves with A Guide for DoD Program Managers.


Don’t Stop Thinking About Tomorrow–Post-Workshop Blogging…and some Low Comedy

It’s been a while since I posted to my blog due to meetings and–well–the day job, but some interesting things occurred during the latest meeting of the National Defense Industrial Association’s (NDIA) Integrated Program Management Division (IPMD) that I think are of interest. (You have to love acronyms to be part of this community).

Program Management and Integrated Program Management

First off is the initiative by the Program Management Working Group to gain greater participation by program managers with an eye to more clearly define what constitutes integrated program management. As readers of this blog know, this is a topic that I’ve recently written about.

The Systems Engineering discipline is holding its 21st Annual Systems Engineering Conference in Tampa this year from October 22nd to the 25th. The IPMD will collaborate and will be hosting a track dedicated to program management. The organizations have issued a call for papers and topics of interest. (Full disclosure: I volunteered this past week to participate as a member of the PM Working Group).

My interest in this topic is based on my belief, from years of wide-ranging experience in duties as a warranted government contracting officer, program manager, business manager, CIO, staff officer, and logistics officer, that defining IPM requires much more than viewing it through the prism of any particular discipline. Furthermore, doing so will require collaboration and cooperation among a number of project management disciplines.

This is a big topic where, I believe, no one group or individual has all of the answers. I’m excited to see where this work goes.

Integrated Digital Environment

Another area of interest that I’ve written about in the past involves two different–but related–initiatives on the part of the Department of Defense to collect information from their suppliers. This information is necessary to their oversight role: not only to ensure accountability of public expenditures, but also to assist in project cost and schedule control, risk management, and cost estimation, particularly as it relates to risk-sharing cost-type R&D contracted project efforts.

Two major staffs in the Offices of the Undersecretary of Defense have decided to go with a JSON-type schema for, on the one hand, cost estimating data, and on the other, integrated cost performance, schedule, and risk data. Each initiative seeks to replace the schemas currently in place.

Both have been wrapped around the axle on getting industry to move from form-based reporting and data sharing to a data-agnostic solution that meets the goals of reducing redundancy in data transmission, reducing the number of submissions and data streams, and moving toward one version of the truth, allowing SMEs on both sides of the table to concentrate on data analysis and interpretation in jointly working toward the goal of successful project completion and end-item deployment.
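To make the idea of data-focused submission concrete, here is a sketch in Python of the kind of record such a schema might carry. The field names, contract number, and figures are all hypothetical; the actual DoD schemas are not reproduced here.

```python
import json

# A hypothetical performance record; one JSON document of this kind replaces a
# form-based submission, and both sides of the table parse the same data.
report = {
    "contract_id": "HQ0000-00-C-0000",   # invented identifier
    "period_ending": "2018-06-30",
    "wbs_element": "1.2.3",
    "bcws": 1_000_000.0,
    "bcwp": 900_000.0,
    "acwp": 1_100_000.0,
}
print(json.dumps(report, indent=2))
```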

As with the first item, I am not a disinterested individual in this topic. Back when I wore a uniform I helped to construct DoD policy to create an integrated digital environment. I’ve written about this experience previously in this blog, so I won’t bore you with the details, but the need for data sharing on cost-type efforts acknowledges the reality of the linkage between our defense economic and industrial base and the art of the possible in deploying defense-related end items. The same relationship exists for civilian federal agencies with the non-defense portion of the U.S. economy. Needless to say, a good many commercial firms unrelated to defense are going the same way.

The issue here is two-fold, I think, from speaking with individuals working these issues.

The first is, I think, that too much deference is being given to solution providers and some industry stakeholders, influenced by those providers, in “working the refs” through the data. The effect of doing so not only slows down the train and protects entrenched interests, it also gets in the way of innovation, allowing the slowest among the group to hold up the train in favor of–to put it bluntly–learning their jobs on the job at the expense of efficiency and effectiveness. As I expressed in a side conversation with an industry leader, all too often companies–who, after all, are the customer–have allowed their view of the possible to be defined by the limitations and inflexibility of their solution providers. At some point that dysfunctional relationship must end–and in the case of comments clearly identified as working the refs, they should be ignored. Put your stake in the ground and let innovation and market competition sort it out.

Secondly, cost estimating, which is closely tied to accounting and financial management, is new to this arena and considered tangential to other, more mature, performance management systems. My own firm is involved in producing a solution in support of this process, collecting data related to these reports (known collectively in DoD as the 1921 reports), placing that data in a common data lake, and exploring with organizations what it tells us, since we are only now learning its value. This is classical KDD–Knowledge Discovery in Data–and a worthwhile exercise.

I’ve also advocated going one step further in favor of the collection of financial performance data (known as the Contract Funds Status Report), which is an essential reporting requirement, but am frustrated to find no one willing to take ownership of the guidance regarding data collection. The tragedy here is that cost performance, known broadly as Earned Value Management, is a technique that relates the value of work performed to other financial and project planning measures (a baseline and actuals). But in a business (or any enterprise), the fuel that drives the engine is finance, and two essential measures are margin and cash flow. The CFSR is a report of program cash flow and financial execution. It is an early measure of whether a program will execute its work in any given time-frame, and provides a reality check on the statistical measures of performance against baseline. It is also a necessary logic check for comptrollers and other budget decision-makers.

Thus, as it relates to data, there has been some push-back against a settled schema, where the government accepts flat files and converts the data to the appropriate format. I see this as an acceptable transient solution, but not an ultimate one. It is essential to collect both cost estimating and contract funds status information to perform any number of operations that relate to “actionable” intelligence: having the right executable money at the right time, a reality check against statistical and predictive measures, value analysis, and measures of ROI in development, just to name a few.

I look forward to continuing this conversation.

To Be or Not to Be Agile

The Section 809 Panel, which is the latest iteration of acquisition reform panels, has recommended that performance management using earned value not be mandated for efforts using Agile. It goes on, however, to assert that program executives “should approve appropriate project monitoring and control methods, which may include EVM, that provide faith in the quality of data and, at a minimum, track schedule, cost, and estimate at completion.”

Okay…the panel is then silent on what those monitoring and control measures will be. Significantly, if only subtly, the #NoEstimates crowd took a hit, since the panel recommends and specifies data quality, schedule, cost, and EAC. Sounds a lot like a form of EVM to me.

I must admit to being a skeptic when it comes to swallowing the Agile doctrine whole. Its micro-economic foundations are weak and much of it sounds like ideology–bad ideology at best and disproved ideology at worst (specifically related to the woo-woo about self-organization…think of the last speculative bubble and resulting financial crisis and depression along these lines).

When it comes to named methodologies I am somewhat from Missouri. I apply (and, in previous efforts back in the Dark Ages when I wore a uniform, have applied) Kanban, teaming, adaptive development (enhanced greatly today by modern low-code technology), and short sprints that result in releasable modules. But keep in mind that these things were out there long before they were grouped under a common heading.

Perhaps Agile is now a convenient catch-all for best practices. But if that is the case then software development projects using this redefined version of Agile deserve no special dispensation. That said, I was schooled a bit by an Agile program manager during a side conversation and am always open to understanding things better and revising my perspectives. It’s just that there never was a Waterfall/Agile dichotomy, just as there never really was a Spiral/Waterfall dichotomy. These were simply convenient development models, geared to the technology of the moment, used to describe a process.

There are very good people on the job exploring these issues on the Agile Working Group in the IPMD and I look forward to seeing what they continue to come up with.

Rip Van Winkle Speaks!

The only disappointing presentation occurred on the second and last day of the meeting. It seemed we were treated to a voice from somewhere around the year 2003 that, in what can only be described as performance art involving free association, talked about wandering the desert, achieving certification for a piece of software (which virtually all of the software providers in the room have successfully navigated at one time or another), discovering that cost and schedule performance data can be integrated (ignoring the work of the last ten years on the part of, well, a good many people in the room), learning that there is a process known as the Integrated Baseline Review (which, again, a good many people in the room had collaborated on to both define and make workable), and–lo and behold–finding that the software industry uses schemas and APIs to capture data (known in Software Development 101 as ETL). He then topped off his meander with an unethical excursion into product endorsement, selected through an opaque process.

For this last, the speaker was either unaware or didn’t care (usually called tone-deafness) that the event’s expenses were sponsored by a software solution provider (not mine). It is also as if the individual speaking was completely unaware of the work behind the many topics that I’ve listed above, ignoring and undermining the hard work of the other stakeholders who make up our community.

On the whole an entertaining bit of poppycock, which leads me to…

A Word about the Role of Professional Organizations (Somewhat Inside Baseball)

In this blog, and in my interactions with other professionals at–well–professional conferences, I check my self-interest at the door and publicly take a non-commercial stance. It is a position that is expected and, I think, appreciated. For those who follow me on social networks like LinkedIn, posts from my WordPress blog originate from a separate source from the commercial announcements linked to my page that originate from my company.

If there are exhibitor areas, as some conferences and workshops do have, that is one thing. That’s where we compete and play; and in private side conversations customers and strategic partners will sometimes use the opportunity as a convenience to discuss future plans and specific issues that are clearly business-related. But these are the exceptions to the general rule, and there are a couple of reasons for this, especially at this venue.

One is that, while this is a large market, it is a small community, and virtually everyone at the regular meetings and conferences I attend already knows that I am the CEO and owner of a small software company. But the IPMD is neutral ground. It is a place where government and industry stakeholders, who in other roles and circumstances are in a contractual or competing relationship, come together to work out the best way of hashing out processes and procedures that will hopefully improve the discipline of program and project management. It is also a place of discovery, where policies, new ideas, and technologies can be vetted in an environment of collaboration.

Another reason for taking a neutral stance is simply because it is both the most ethical and productive one. Twenty years ago–and even in some of the intervening years–self-serving behavior was acceptable at the IPMD meetings where both leadership and membership used the venue as a basis for advancing personal agendas or those of their friends, often involving backbiting and character assassination. Some of those people, few in number, still attend these meetings.

I am not unfamiliar with the last, having been a target at one point by a couple of them. At the end of the day, such assertions turned out to be without merit, undermining the credibility of the individuals involved and rightfully calling into question the quality of their character. Such actions cannot help but undermine the credibility, and pollute the atmosphere, of the organization with which they associate, as well.

Finally, the companies and organizations that sponsor these meetings–which are not cheap to organize, which I know from having done so in the past–deserve to have the benefit of acknowledgment. It’s just good manners to play nice when someone else is footing the bill–you gotta dance with those that brung you. I know my competitors and respect them (with perhaps one or two exceptions). We even occasionally socialize with each other and continue long-term friendships and friendly associations. Burning bridges is just not my thing.

On the whole, however, the NDIA IPMD meetings–and this one, in particular–were productive and positive, focused on the future and on professional development. That’s where, I think, we as a community need to be and need to stay. I always learn something new and get my dose of reality from a broad-based perspective. In getting here the leadership of the organization (and the vast majority of the membership) is to be commended, as well as the recent past and current members of the Department of Defense, especially since the formation of the Performance Assessments and Root Cause Analysis (PARCA) office.

In closing, there were other items of note discussed, along with what can only be described as the best pair of keynote addresses that I’ve heard in one meeting. I’ll have more to say about some of the concepts and ideas that were presented there in future posts.

Post-Workshop Talking Blues — No Bucks, No Buck Rogers: Cashflow Analysis in Projects (Somewhat Wonkish)

When I used this analogy the week before last during the latest Integrated Project Management Workshop in the D.C. area I was accused of dating myself–and perhaps it is true. For those wondering, the quote was popularized by the 1983 movie The Right Stuff, which was based on the 1979 book of the same title written by Tom Wolfe. The book and movie were about the beginnings of the U.S. space program, culminating in the creation of NASA and the Project Mercury program.


It goes without saying that while I was familiar as a boy with Project Mercury, and followed the seven astronauts as did the rest of the country, transfixed by the prospect of space exploration during the days of the New Frontier, Buck Rogers was from the childhood of my father’s generation, first through its radio program, and then through the serials released to movie theaters during the 1930s.

The point of the quote, of course, is that Project Mercury’s success was based on its ability to obtain funding; no doubt the Mercury 7 astronauts so inspired the imagination of the nation that even the most parsimonious Member of Congress could not help but provide the program with sufficient funding for success. This was also the era of the “space race” with the Soviet Union, which helped to spur funding.

The lesson of “No Bucks, No Buck Rogers” also applies to project management, but not just in the use of imagery and marketing to gain funding. Instead, the principle applies through a more mundane part of the discipline: financial management and the relationship between cash flow and project performance.

What I am referring to as cash flow is not the burn rate of expenditures against an end point, but the intersection of sufficient money at the right time programmed in accordance with the project plan (in alignment with both the IMS and PMB), and informed by project performance.

To those unfamiliar with this method it sounds similar to earned value management, but it is not. EVM informs our decision, but the analysis is not the same.

First, in using this analysis the cumulative actual cost of work performed (ACWP in earned value) should be compared to accrued expenditures for the project. These figures will not be exact, but will provide an indication whether accruals to date have been in line with what was forecasted. In government contracting and project management, these figures will also be somewhat off because earned value figures do not include fee or profit, while financial management figures do. Understanding the profit center from which the financial expenditures are being accrued will allow for a reconciliation of these differences.
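As a minimal sketch of that reconciliation, assuming a single known fee rate supplied by the analyst (all figures invented):

```python
def reconcile(acwp, accruals, fee_rate):
    """Compare cumulative ACWP (cost only) to financial accruals (cost plus fee)."""
    acwp_with_fee = acwp * (1 + fee_rate)   # put both figures on the same basis
    return acwp_with_fee, accruals - acwp_with_fee

acwp_plus_fee, delta = reconcile(acwp=4_000_000, accruals=4_500_000, fee_rate=0.08)
print(f"ACWP grossed up for fee:        ${acwp_plus_fee:,.0f}")
print(f"Accruals less grossed-up ACWP:  ${delta:+,.0f}")
# A small delta is expected noise; a persistent, growing delta is worth investigating.
```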

Secondly, if projected accruals against the project plan begin to deviate, it is an early indication of programmatic risk being manifested in the physical expenditures of the project. For example, if management anticipates that there will be a delay in project execution in some area, they may decide to defer acquisition of spare parts used in the construction of a component, or they may delay the award of a subcontract that was meant to augment staff in an area requiring specialized expertise.

Third, and conversely, deviations of expenditures for needed materials or manpower may adversely affect project execution, and provide an early warning that such shortages or misalignments will move project accomplishment to the right. For example, a company may have underestimated the combined Procurement Action Lead Time (PALT) and delivery of critical materials, which will now arrive much later than anticipated. This misalignment will cascade through the schedule and future planned work.

For both of these previous conditions, the proper determination of cause-and-effect is essential, since either may appear to suggest the opposite cause.

Fourth, variances in performance either in earned value achievement or schedule performance may require an adjustment to the type of money being provided. For example, when a project fails to execute and risk is manifested in terms of cost and/or schedule, financial management and budgeting personnel, always under pressure to apply excess funds to more immediate needs, may mistakenly believe that a budget mark (a decrease) is appropriate since the allocated money will not be executed in the current time-frame.

But this is not necessarily the case. Performance management data tracks the performance measurement baseline (PMB) for the life of the project, but funding has a finite period in which it can be executed. In government contracting it is not uncommon for there to be different “colors” of money: Research, Development, Test & Evaluation (RDT&E), Procurement, Operations and Maintenance (O&M), and others. Furthermore, these types of appropriations have different expiration dates: two years for RDT&E, three years for procurement, and one year for O&M. The financial management plan takes into account the life of the money allocated to the project, as well as the costs of activities necessary to project execution. The time frame for financial execution is shorter and, therefore, more sensitive to risks or variances than project plans that are projected across a longer period of time.
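Here is a minimal sketch in Python of that expiration logic, using the obligation periods just cited. The fiscal years are invented, and the model deliberately ignores real-world nuances such as the distinction between expired and canceled accounts.

```python
from dataclasses import dataclass

@dataclass
class Appropriation:
    """A 'color' of money and its period of availability for obligation."""
    color: str
    first_fy: int   # first fiscal year of availability
    years: int      # obligation period, per the text above

    def expired(self, fy: int) -> bool:
        return fy >= self.first_fy + self.years

funds = [
    Appropriation("RDT&E", 2018, 2),        # two-year money
    Appropriation("Procurement", 2017, 3),  # three-year money
    Appropriation("O&M", 2018, 1),          # one-year money
]
for f in funds:
    status = "EXPIRED" if f.expired(2019) else "available"
    print(f"{f.color:12s} FY{f.first_fy} money is {status} in FY2019")
```

Running this shows the O&M money already expired while the other colors remain available, which is exactly the sensitivity to time that makes the financial plan more fragile than the PMB.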

For an R&D program experiencing risk during a particular portion of its PMB, for example, a variance this year may require not only a steady funding profile, but a larger expenditure to handle risk. Marking two-year RDT&E money in its first year in this case would be a mistake, of course, but failing to anticipate the proper level of risk-adjusted expenditures may undermine the ability of the project to recover and execute, causing it to fall into a spiral of compounding misalignments and variances from which it may never recover.

Thus, what we can see is that, oftentimes, the availability of cash–and the right kind of cash at the right time–will have as much impact on project execution as the factors of technical and engineering risk. Furthermore, tracking and reconciling the financial plan against actual accomplishment will provide a very detailed early indicator into project performance since it is sensitive to deviations in the fiscal plan.

Postscript.

For those not savvy about the cultural reference, Buck Rogers originated as a radio “space opera,” became a movie serial in the 1930s, and later became a TV series in 1950. For the record, I was not around yet when these were popular, though I did watch the reruns on Saturday mornings in the 1960s and early 1970s.


Ground Control from Major Tom — Breaking Radio Silence: New Perspectives on Project Management

Since I began this blog I have used it as a means of testing out and sharing ideas about project management and information systems, as well as to cover occasional thoughts about music, the arts, and the meaning of wisdom.

My latest hiatus from writing was due to the fact that I was otherwise engaged in a different sort of writing–tech writing–and in pursuing some mathematical explorations related to my chosen vocation, aside from running a business and–you know–living life.  There are only so many hours in the day.  Furthermore, when one writes over time about any one topic it seems that one tends to repeat oneself.  I needed to break that cycle so that I could concentrate on bringing something new to the table.  After all, it is not as if this blog attracts a massive audience–and purposely so.  The topics on which I write are highly specialized and the members of the community who tend to follow this blog and send comments tend to be specialized as well.  I air out thoughts here that are sometimes only vaguely conceived so that they can be further refined.

Now that that is out of the way, radio silence is ending until, well, the next contemplation or massive workload that turns into radio silence.

Over the past couple of months I’ve done quite a bit of traveling, and so have some new perspectives and trends that I noted and would like to share, and which will be the basis (in all likelihood) of future, more in-depth posts.  But here is the list that I have compiled:

a.  The time of niche analytical “tools” as acceptable solutions among forward-leaning businesses and enterprises is quickly drawing to a close.  Instead, more comprehensive solutions that integrate data across domains are taking the market and disrupting even large players that have not adapted to this new reality.  The economics are too strong to stay with the status quo.  In the past the barrier to integration of more diverse and larger sets of data was the high cost of traditional BI, with its armies of data engineers and analysts providing marginal value that did not always square with the cost.  Now virtually any data can be accessed and visualized.  The best solutions provide pre-built domain knowledge for targeted verticals, and these will lead and win the day.

b.  Along these same lines, apps and services designed around the bureaucratic end-of-month chart submission process are running into the new paradigm among project management leaders that this cycle is inadequate, inefficient, and ineffective.  The incentives are changing to reward actual project management in lieu of project administration.  The core fallacy of apps that provide standard charts based solely on users’ perceptions of looking at data is that they assume the PM domain knows what it needs to see.  The new paradigm is instead to provide a range of options based on the knowledge that can be derived from data.  Thus, while the options in the new solutions provide the standard charts and reports that have always informed management, KDD (knowledge discovery in databases) principles are opening up new perspectives in understanding project dynamics and behavior.

c.  Earned value is *not* the nexus of Integrated Project Management (IPM).  I’m sure many of my colleagues in the community will find this statement to be provocative, only because it is what they are thinking but have been hesitant to voice.  A big part of their hesitation is that the methodology is always under attack by those who wish to avoid accountability for program performance.  Thus, let me make a point about Earned Value Management (EVM) for clarity–it is an essential methodology in assessing project performance and the probability of meeting the constraints of the project budget.  It also contributes data essential to project predictive analytics.  What the data shows from a series of DoD studies (currently, sadly, unpublished), however, is that it is planning (via an Integrated Master Plan) and scheduling (via an Integrated Master Schedule) that first tie together the essential elements of the project, and that will record the baking in of risk within the project.  Risk manifested in poorly tying contract requirements, technical performance measures, and milestones to the plan, and then manifested in poor execution, will first be recorded in schedule (time-based) performance.  This is especially true for firms that apply resource-loading in their schedules.  By the time this risk translates into and is recorded in EVM metrics, the project management team is performing risk handling and mitigation to blunt the impact on the performance measurement baseline (the money).  So this still raises the question: what is IPM?  I have a few ideas and will share those in other posts.

d.  Along these lines, there is a need for a Schedule (IMS) Gold Card that provides the essential basis of measurement of programmatic risk during project execution.  I am currently constructing one in collaboration with others and will put out a few ideas.

e.  Finally, there is still room for a lot of improvement in project management.  For all of the gurus, methodologies, consultants, body shops, and tools that are out there, according to PMI, more than a third of projects fail to meet project goals, almost half fail to meet budget expectations, less than half finish on time, and almost half experience scope creep, which, I suspect, probably caused “failure” to be redefined and under-reported in their figures.  The assessment for IT projects is also consistent with this report, with CIO.com reporting that more than half of IT projects fail in terms of meeting performance, cost, and schedule goals.  From my own experience and that of my colleagues, the need to solve the standard 20-30% slippage in schedule and similar overrun in costs is an old refrain.  So too is the frustration that it takes 23 years to deploy a new aircraft.  A 0.5 CPI or SPI (to use EVM terminology) is not an indicator of success.  What this indicates, instead, is that there need to be some adjustments and improvements in how we do business.  The first would be to adjust incentives to encourage and reward the identification of risk in project performance.  The second is to deploy solutions that effectively access and provide information to the project team that enables them to address risk.  As with all of the points noted in this post, I have some other ideas in this area that I will share in future posts.

Onward and upward.

Post-Blogging NDIA Blues — The Latest News (Project Management Wonkish)

The National Defense Industrial Association’s Integrated Program Management Division (NDIA IPMD) just had its quarterly meeting here in sunny Orlando, where we braved the depths of sub-60-degree Fahrenheit temperatures to start out each day.

For those not in the know, these meetings are an essential coming together of policy makers, subject matter experts, and private industry practitioners regarding the practical and mundane state-of-the-practice in complex project management, particularly focused on the concerns of the federal government and the Department of Defense.  The end result of these meetings is to publish white papers and recommendations regarding practice to support continuous process improvement and the practical application of project management practices–allowing for a cross-pollination of commercial and government lessons learned.  This is also the intersection where innovations among the large and small are given an equal vetting and an opportunity to introduce new concepts and solutions.  This is an idealized description, of course, and most of the petty personality conflicts, competition, and self-interest that plague any group of individuals coming together under a common set of interests also play out here.  But generally the days are long and the workshops generally produce good products that become the de facto standard of practice in the industry.  Furthermore, the control that keeps the more ruthless personalities in check is the fact that, while it is a large market, the complex project management community tends to be a relatively small one, which reinforces professionalism.

The “blues” in this case is not so much borne of frustration or disappointment but, instead, from the long and intense days that the sessions offer.  The biggest news from an IT project management and application perspective was twofold. The data stream used by the industry in sharing data in an open systems manner will be simplified.  The other was the announcement that the technology used to communicate will move from XML to JSON.

Human-readable formatting to data-focused formatting.  Under Kendall’s Better Buying Power 3.0 the goal of the Department of Defense (DoD) has been to incorporate better practices from private industry where they can be applied.  I don’t see initiatives for greater efficiency and reduction of duplication going away in the new Administration, regardless of what a new initiative is called.

In case this is news to you, the federal government buys a lot of materials and end items–billions of dollars worth.  Accountability must be put in place to ensure that the money is properly spent to acquire the things being purchased.  Where technology is pushed and where there are no commercial equivalents that can be bought off the shelf, as in the systems purchased by the Department of Defense, there are measures of progress and performance (given that the contract is under a specification) that are submitted to the oversight agency in DoD.  This is a lot of data, and to be brutally frank the method and format of delivery have been somewhat chaotic, inefficient, and duplicative.  The Department moved to address this with a somewhat modest requirement of open-systems submission of an application-neutral XML file under the standards established by the UN/CEFACT XML organization.  This was called the Integrated Program Management Report (IPMR).  This move garnered some improvement where it has been applied, but contracts are long-term, so incorporating improvements through new contractual requirements tends to take time.  Plus, there is always resistance to change.  The Department is moving to accelerate addressing these inefficiencies in their data streams by eliminating the unnecessary overhead associated with formatting data for paper forms and dealing with data as, well, data.  Great idea and bravo!  The rub here is that in making the change, the Department has proposed dropping XML as the technology used to transfer data and moving to JSON.

XML to JSON. Before I spark another techie argument about the relative merits of each, there are some basics to understand here.  First, XML is a language, while JSON is simply a data exchange format.  This means that XML is specifically designed to deal with hierarchical and structured data that can be queried, and where validation and fidelity checks within the data are inherent in the technology.  Furthermore, XML is known to scale while maintaining the integrity of the data, and it is intended for use with relational databases.  XML is also hard to break: it is meant for editing and will maintain its structure and integrity afterward.

The counter argument encountered is that JSON is new! and uses fewer characters! (which usually turns out to be inconsequential), and people are talking about it for Big Data and NoSQL! (but this happened after the fact and the reason for shoehorning it this way is discussed below).

So does it matter?  Yes and no.  As a supplier specializing in delivering solutions that normalize and rationalize data across proprietary file structures and leverage database capabilities, I don’t care.  I can adapt quickly and will have a proof-of-concept solution out within 30 days of receiving the schema.

The risk here, which applies to DoD and the industry, is that the decision to go to JSON is made only because it is the shiny new thing used by gamers and social networking developers.  There has also been a move to adapt to other uses because of the history of significant security risks that had been found in Java, so much so that an entire Wikipedia page is devoted to them.  Oracle just killed off Java applets, though Java hangs on.  JSON, of course, isn’t Java, but it was designed from birth as JavaScript Object Notation (hence the acronym JSON), with the purpose of handling relatively small bits of data across web servers in a number of proprietary settings.

To address JSON’s deficiencies relative to XML, a number of tools have been and are being developed to replicate the fidelity and reliability found in XML.  Whether this is sufficient to be effective against a structured LANGUAGE remains to be seen.  Much of the overhead that techies complain about in XML is due to the native functionality related to the power it brings to the table.  No doubt, a bicycle is simpler than a Formula One racer–and this is an apt comparison.  Claiming “simpler” doesn’t pass the “So What?” test knowing the business processes involved.  The technology needs to be fit to the solution.  The purpose of data transmission using APIs is not only to make data easy to produce but for it to–you know–achieve the goals of normalization and rationalization so that it can be used on the receiving end, which is where the consumer (which we usually consider to be the customer) sits.

At the end of the day the ability to scale and handle hierarchical, structured data will rely on the quality and strength of the schema and the tools that are published to enforce its fidelity and compliance.  Otherwise consuming organizations will be receiving a dozen different proprietary JSON files, and that does not address the present chaos but simply adds to it.  These issues were aired out during the meeting and it seems that everyone is aware of the risks and that they can be addressed.  Furthermore, as the schema is socialized across solutions providers, it will be apparent early whether the technology will be able to handle the project performance data resulting from the development of a high performance aircraft or a U.S. Navy destroyer.
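As a sketch of what tools that “enforce fidelity and compliance” can look like in practice, here is a toy check in Python using the third-party jsonschema package; the schema and records are invented, and any published DoD schema would be far richer than this.

```python
from jsonschema import ValidationError, validate  # pip install jsonschema

# A toy schema enforcing types, required fields, and a ban on proprietary extras.
schema = {
    "type": "object",
    "required": ["wbs_element", "acwp"],
    "properties": {
        "wbs_element": {"type": "string"},
        "acwp": {"type": "number", "minimum": 0},
    },
    "additionalProperties": False,
}

records = [
    {"wbs_element": "1.2.3", "acwp": 1_100_000.0},   # conforms
    {"wbs_element": "1.2.3", "acwp": "N/A"},         # wrong type: rejected
]
for record in records:
    try:
        validate(instance=record, schema=schema)
        print("accepted:", record)
    except ValidationError as err:
        print("rejected:", err.message)
```

This is the discipline that comes natively with a validating XML parser and an XSD; with JSON it has to be built and policed deliberately.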

The (Contract) is parent to the (Project)

It’s been a late spring filled with travel and tragedy.  Blogging has taken a hiatus, except for AITS.org, which I highly encourage you to check out.  My next item will be posted there the first week of July.  The news from Orlando is that we are united and strong as a community, facing down both crackpots and opportunists, and so it is back to work.

At a recent conference one of the more interesting conversations surrounded the difference between contract and project management.  To many people these are one and the same–and a simple Google search reinforces this perception–but, I think, this is a misconception.

The context of the discussion was interesting in that it occurred during an earned value management-focused event.  EVM pitches itself as the glue that binds together the parts of project management that further constitutes integrated project management, but I respectfully disagree.  If we ignore the self-promotion of this position and like good engineers stick to our empiricist approach, we will find that EVM is a method of deriving the financial value of effort within a project.  It is also a pretty good indicator of cost risk manifestation.  This last shouldn’t be taken too far.

A recent DoD study, which is not yet published, demonstrated that early warning cannot be had from EVM even when diving into the details.  Instead, it was integration and traceability at the work package level, tied to schedule activities, that allowed slips in schedule (and the associated impact of the bow wave) to be traced against the integrated master schedule (IMS), which then served as the window to early warning.  So within the limited context of project performance, EVM itself is just one of many points of entry to eventually get to the answer.  This answer, of course, needs to be both timely and material.

Material in this case refers to the ability to understand the relevance and impact of the indicator.  The latest buzz-phrase to this condition is “actionable” but that’s just a marketing ploy to make a largely esoteric and mundane evolution sound more exciting.  No indicator by itself is ever actionable.  In some cases the best action is no action.  Furthermore, a seemingly insignificant effort may have asymmetrical impacts that threaten the project itself.  This is where risk enters the picture.

When speaking of risk, all too often the discussion comes down to Monte Carlo simulation.  For the project professional situated within the earned value domain, this is a convenient way to pigeonhole the concept and keep it bounded within familiar pathways, but it does little to add new information.  When applied within this context the power of Monte Carlo is limited to a range of probable outcomes within the predictive capabilities of EVM and the IMS.  This is not to minimize the importance of applying the method to these artifacts but, instead, a realization that it is a limited application.
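For the record, here is what that bounded application looks like in miniature: a Monte Carlo pass over three sequential schedule activities, each with an invented triangular (optimistic, most likely, pessimistic) duration. The output is a range of probable completion dates, which is useful, but only as good as the artifacts feeding it.

```python
import numpy as np

rng = np.random.default_rng(7)

# Three sequential activities: (optimistic, most likely, pessimistic) days, invented.
activities = [(20, 30, 55), (10, 15, 40), (25, 35, 80)]
trials = 100_000
total = sum(rng.triangular(lo, mode, hi, trials) for lo, mode, hi in activities)

print(f"deterministic 'most likely' path: {sum(m for _, m, _ in activities)} days")
print(f"P50 simulated duration: {np.percentile(total, 50):.0f} days")
print(f"P80 simulated duration: {np.percentile(total, 80):.0f} days")
```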

For risk also includes factors that are external to these measurements.  Oftentimes this is called qualitative risk, but that is an all too familiar categorization that makes it seem fuzzy.  These external factors are usually the driving environment factors that limit the ability of the project to adapt.  These factors also incorporate the framing assumptions underlying the justification for the project effort.  Thus, we are led to financing and the conditions needed to achieve the next milestone for financing.  In government project management, this is known as the budget hearing cycle, and it can be both contentious and risky.

Thus, as with the title of this post, the project is really the child of the contract.  Yet when speaking of contract management the terms are often intertwined, or are relegated to the prosaic legalese of contract clauses and, in government, to the Federal Acquisition Regulation (FAR).  But that does not constitute contract management.

This is where our discussions became interesting, because one need invoke only one element not incorporated into consideration to prove the point.  Let’s take the Contract Budget Base (CBB).  This number is made up of the negotiated contract cost (NCC) plus authorized unpriced work (AUW).  Since existing systems act as if these elements are external to consideration, ephemeral tools or spreadsheets are used to augment the tracking of AUW and the incorporation of its impact on the CBB, though the risk of incorrectly tracking and incorporating this work is immeasurably greater than that of any single work package or control account in the more closely monitored performance measurement baseline (PMB).  The same goes for management reserve (MR); and even within the PMB itself, undistributed budget (UB), work authorization documents (WADs), and change order tracking and impact analysis are often afterthoughts.
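As a minimal sketch of the arithmetic involved (all figures invented for illustration):

```python
ncc = 95_000_000.0   # negotiated contract cost
auw = 7_500_000.0    # authorized unpriced work
cbb = ncc + auw      # contract budget base, as defined above

mr = 4_000_000.0     # management reserve, held outside the PMB
pmb = cbb - mr       # performance measurement baseline
ub = 2_000_000.0     # undistributed budget, still inside the PMB

print(f"CBB = NCC + AUW = ${cbb:,.0f}")
print(f"PMB = CBB - MR  = ${pmb:,.0f} (of which ${ub:,.0f} is undistributed)")
```

An AUW tracking error flows straight into the CBB and everything below it, which is why handling it in an ephemeral spreadsheet is so much riskier than a variance in any single control account.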

But back to the contract itself: the highest elements of the contract are the total allocated budget (TAB) and profit/fee.  But this is simply shorthand for the other elements that affect the TAB.  For example, some contracts have clauses that provide incentives and/or penalties tied to technical achievement or milestones, yet our project systems act as if these conditions are unanticipated events that fall from the sky.  Only by augmenting project management indicators are these important contract management elements anticipated and their impacts assessed.

In my own experience, in looking at the total contract, I have seen projects fail for want of the right “color” of money being provided within the window for decisive impact on risk manifestation.  Thus, cashflow–and the manner in which cashflow is released to fund a project–enters the picture.  But more to the point, I have seen the decision regarding cashflow made based on inadequate or partial data that was collected at a level of the structure that was largely irrelevant.  When looking at the life-cycle management of a system–another level up in our hierarchy–our need for awareness–and the information systems that can augment that awareness–becomes that much more acute.

The point here is that, while we are increasingly concerned about the number of angels dancing on the head of the EVM pin, we are ignoring other essential elements of project success.  When speaking of integrated project management, we are speaking of only slightly expanding our attention span in understanding the project ecosystem–and yet even those moderate efforts meet resistance.  Given new technology, it is time to begin incorporating the elements that go well beyond the integration of cost, schedule, and bounded schedule risk.


Don’t Know Much…–Knowledge Discovery in Data

A short while ago I found myself in an odd venue where a question was posed about my being an educated individual, as if it were an accusation.  Yes, I replied, but then, after giving it some thought, I made some qualifications to my response.  Educated regarding what?

It seems that, despite a little more than a century of public education and the widespread adoption of advanced education in the United States, along with the resulting advent of widespread literacy, we haven’t entirely come to grips with what it means.  For the question of being an “educated person” has its roots in an outmoded concept–an artifact of the 18th and 19th centuries–where education was delineated, and availability determined, by class and profession.  Perhaps this is the basis for the large strain of anti-intellectualism and science denial in the society at large.

Virtually everyone today is educated in some way.  Being “educated” means nothing–it is a throwaway question, an affectation.  The real question is whether the relevant education meets the needs of the subject being addressed.  An interesting exploration of this very topic can be found at Sam Harris’ blog, in the discussion he held with amateur historian Dan Carlin.

In reviewing my own education, it is obvious that there are large holes in what I understand about the world around me, some of them ridiculously (and frustratingly) prosaic.  This shouldn’t be surprising.  For even the most well-read person is ignorant about–well–virtually everything in some manner.  Wisdom is reached, I think, when you accept that there are a few things you know for certain (or know with a high probability and level of confidence), and that the vast library of knowledge that remains–encompassing anything from a particular domain to the entire universe–is what you don’t know.

To sort out a well-read dilettante from someone who can largely be depended upon to speak with some authority on a topic, educational institutions, trade associations, trade unions, trade schools, governmental organizations, and professional organizations have established systems of credentials.  No system is entirely perfect, and I am reminded (even discounting fraud and incompetence) that half of all doctors and lawyers–two professions that have effectively insulated themselves from rigorous scrutiny and accountability, almost to the level of a protected class–graduated in the bottom half of their class.  Still, we can sort out a real brain surgeon from someone who once took a course in brain physiology when we need medical care (to borrow an example from Sam Harris in the same link above).

Furthermore, in the less potentially life-threatening disciplines we find more variation.  There are credentialed individuals who constantly get things wrong.  Among economists, for example, I am more likely to follow those who got the last financial crisis and housing market crash right (Joe Stiglitz, Dean Baker, Paul Krugman, and others), and those who have adjusted their models based on that experience (Brad DeLong, Mark Thoma, etc.), than those who have maintained ideological conformity and continuity despite the evidence.  Science–both the hard and the so-called soft sciences–demands that careful analysis and corroborating evidence be tied to any assertion in its most formalized contexts.  Even well-accepted theories within a profession are contingent–open to new information and discovery that may modify, append, or displace them.  At the same time, we can find polymaths and self-taught individuals who have equaled or exceeded their credentialed peers.  In the end, the proof is in the pudding.

My point here is threefold.  First, in most cases we don’t know what we don’t know.  Second, complete certainty is not something that exists in this universe, except perhaps at death.  Third, we are now entering a world where new technologies allow us to discover new insights in accessing previously unavailable or previously opaque data.

One must look back at the revolution in information over the last fifty years and its resulting effect on knowledge to see what this means in our day-to-day existence.  When I was a small boy in school we largely relied on the published written word.  Books and periodicals were the major means of imparting information, aside from collocated collaborative working environments, the spoken word, and the old media of magazines, radio, and television.  Information was hard to come by–libraries were limited in their collections, and there were centers of particular domain knowledge segmented by geography.  Furthermore, after the introduction of television, society developed trusted sources and gatekeepers to keep the cranks and flimflam out.

Today, new media–including all forms of digitized information–has expanded and accelerated the means of transmitting information.  Unlike old media, new media–social networking included–has fewer gatekeepers: editors, fact checkers, domain experts, and credentialed trusted sources who ensure quality control, reliability, and fidelity of information, and who provide context.  It’s the wild west of information, and those wooed by the voodoo of self-organization contribute to the high risk associated with relying on information provided through these sources.  Thus, organizations and individuals who wish to stay within the fact-based community have had to sort out reliable, trusted sources and, even then, develop–for lack of a better shorthand–BS detectors.  There are two purposes to this exercise: to expand the use of the available data and leverage the speed afforded by new media, and to ensure that the data is reliable and can reliably tell us something important about our subject of interest.

At the level of the enterprise, the sector, or the project management organization, we are similarly faced with a situation in which the scope of data that can be converted into information is rapidly expanding.  Unlike the larger information market, data at this microeconomic level is more controlled.  But because it records isolated events or small sample sizes, data at this level often suffers from low statistical significance, and the challenge has been to derive importance from data where significance is minimal.
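
A toy example makes the problem concrete.  The sketch below, using invented cost-variance observations, shows how wide a confidence interval becomes with only a handful of data points–precisely the significance problem just described.

```python
import math
import statistics

# Invented monthly cost-variance observations (in $K) from a small effort.
observations = [-120, 45, -80, 30, -60]

mean = statistics.mean(observations)                    # -37.0
sem = statistics.stdev(observations) / math.sqrt(len(observations))
t_crit = 2.776  # two-sided 95% critical value for n - 1 = 4 degrees of freedom
low, high = mean - t_crit * sem, mean + t_crit * sem
print(f"Mean variance {mean:.0f}K, 95% CI ({low:.0f}K, {high:.0f}K)")
# With five data points the interval spans roughly -126K to +52K: far too
# wide to say whether the effort is actually overrunning.
```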

Furthermore, our business systems, because of the limitations of the selected technology, have been self-limiting.  I come across organizations all the time that cannot imagine the incorporation and integration of additional data sets, largely because the limitations of their chosen software solution have inculcated that approach–that belief–into the larger corporate culture.  We do not know what we do not know.

Unfortunately, it’s what you do not know that, more often than not, will play a significant role in your organization’s destiny, just as an individual who is more self-aware is better prepared to deal with the challenges that manifest themselves as risk and its resultant probabilities.  Organizations must become more aware and look at things differently, especially since so many of the more conventional means of determining risks and opportunities seem to be failing to keep up with the times–a pace now governed by the capabilities of new media.

This is the imperative of applying knowledge discovery in data at the organizational and enterprise level.  It requires shifting one’s worldview from focusing on the limitations of “tools”–how they paint a screen, whether data is displayed across the x or y axis, what shade of blue indicates good performance, how many keystrokes it takes to perform an operation, and all manner of glorified PowerPoint minutiae–to a focus on data: the ability of solutions to incorporate more data, more efficiently and more quickly, from a wider range of sources, processed more effectively, so that it is converted into information that can inform decision making at the most decisive moment.
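
As a closing illustration–and only as a sketch, with invented WBS identifiers, field names, and figures–here is what that data-first orientation looks like at its most elementary: joining cost and schedule records by WBS element so that two stovepiped sources yield one decision-ready view.

```python
# All WBS identifiers, field names, and figures below are invented.
cost_data = {  # from a notional EVM engine, in $K
    "1.1": {"bcws": 500, "bcwp": 450, "acwp": 520},
    "1.2": {"bcws": 300, "bcwp": 310, "acwp": 290},
}
schedule_data = {  # from a notional scheduling tool
    "1.1": {"float_days": -5},
    "1.2": {"float_days": 12},
}

for wbs, cost in cost_data.items():
    sched = schedule_data.get(wbs, {})
    cpi = cost["bcwp"] / cost["acwp"]  # cost performance index
    spi = cost["bcwp"] / cost["bcws"]  # schedule performance index
    print(f"WBS {wbs}: CPI={cpi:.2f} SPI={spi:.2f} "
          f"float={sched.get('float_days', 'n/a')} days")
```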