Take Me To The River, Part 2, Schedule Elements–A Digital Inventory of Integrated Program Management Elements

Recent speaking engagements at various forums have interrupted the flow of this series on IPM elements. At these venues I was engaged in discussions on this topic, as well as on the effects of acquisition reform on the IT, program, and project management communities in the DoD and A&D marketplace.

For this post I will restrict the topic to what are often called schedule elements, though that is a nebulous term. Also, one should not conclude that, because I am dealing with this topic after cost elements, it is somehow inferior in importance. On the contrary, planning and scheduling are integral to applying resources and costs and to tracking cost performance, and in our systemic analysis their activities, artifacts, and elements are antecedent to cost element considerations.

The Relative Position of Schedule

But the takeaway here is this: under no circumstances should any program or project manager believe that cost and schedule systems represent either a dichotomy or a hierarchy of disciplines. They are interdependent, and behavior noted in one will be manifested in the other.

This is important to keep in mind because the software industry, more than any other, has been responsible for reinforcing and solidifying this (erroneous) perspective. During the first generation of desktop application development, software solutions were built to automate the work of traditional line and staff functions. This made a great deal of sense.

From a sales and revenue perspective, it is easier to sell a limited niche software “tool” to an established customer base that will ensure both quick acceptance and immediate realization of productivity and labor savings. The connection from the purchase to ROI was easily traceable in the time span and at the level of the person performing their workaday tasks.

Thus, solutions were built to satisfy the needs of cost analysts, schedule analysts, systems engineers, cost estimators, and others. Where specific solutions left gaps, spreadsheet solutions such as Microsoft Excel were employed to fill them. It was in no one’s interest to go beyond their core competency. Once a dominant incumbent or set of dominant incumbents (an effective monopoly or oligopoly) inhabited a niche, they employed the usual strategies for “stickiness” to defend territory and raise barriers to new entrants.

What was not anticipated by many organizations was that once you automate a function, the nature of the system, if one is to implement the most effective organizational structure, is transformed to conform to the most efficient flow and use of data and its resulting transformation into information and intelligence. Oftentimes the skill set to use that intelligence does not exist, because the insights and synergy gained from larger, more comprehensive, and more credible and accurate datasets were not anticipated when the organizational structure was adjusted.

This is changing and must change, because in the age of big(ger) data, which provides a more comprehensive view of business conditions, the old way of using limited sets of data is not tenable. At least, not if a company or organization wants to stay relevant or profitable.

Characteristics and Basic Elements of the Project Schedule

If you were to perform a Google search of “project schedule” while reading this post, you would find a number of definitions, some of which overlap. For example, the PMBOK defines a schedule as, quite simply, “the planned dates for performing activities and the planned dates for meeting milestones.”

Thus our elements include planned dates, activities, and milestones. But is that all? Under this definition, any kind of plan, from a minor household renovation or upgrade to building an aircraft carrier would contain only these elements.

I don’t think so.

For complex projects and programs, which are the focus of this blog, our definition of a project schedule is a bit more comprehensive. If you go to A Guide for DoD Program Managers, mentioned in my last post, you will find even less specificity.

The reason for this is that what we define as a project schedule is part and parcel of the planning phase of a project, which is then further detailed in the time-phased planning elements for execution of the project through its lifespan into production. It is the schedule that ties together all of the disciplines in putting together a project–acquisition, systems engineering, cost estimating, and project performance management.

In attending schedule-focused conferences over the years, and in talking to program management colleagues, I repeatedly hear the refrain that:

a. It is hard to find a good scheduler, and

b. Constructing a schedule is more of an art than a science.

I can only say that this cedes the field to a small cadre of personnel who perform an essential function, but who do so with few objective tests of effectiveness or accountability–until it is too late.

But the reality is quite different from the fuzzy perception of schedule that is often assumed. All critical path method (CPM) schedules describe the same phenomena, though the lexicon will vary based on the specific proprietary application employed.

In government-focused and large commercial projects, the schedule is the heart of planning and execution. In the DoD world it is known as the Integrated Master Schedule (IMS), which utilizes the inherent bottom-up relationships of its elements to determine the critical path. The main sources regarding the IMS have a great deal of overlap, but tend to be either aspirational (and unfortunately not prescriptive in defining the basic characteristics of an IMS) or to reflect the “art over science” approach. For those following along, these are the DoD Integrated Master Plan and Integrated Master Schedule Preparation and Use Guide of 21 October 2005, the NAVAIR Integrated Master Schedule (IMS) Guidebook of February 2010, and the NDIA Planning and Scheduling Excellence Guide (PASEG) of 9 March 2016 (unfortunately no current direct link).

The key elements that comprise an IMS, in addition to those we identified under the PMBOK, are that it is a networked schedule consisting of specific durations assigned to specific work tasks that must be accomplished in discrete work packages. In most cases these durations are derived either by some fixed, manual method or through the optimization algorithm applied by the CPM application. More on this below. These work packages are discrete, meaning that they represent the full scope of the work that must be accomplished during the specified duration for the creation of an end product. Discrete work is distinguished from level of effort (LOE) work, the latter being effort that is always expended, such as administrative and management tasks, and that is not directly tied to the accomplishment of an end product.

These work packages are tied together to illustrate antecedent and progressive work, showing predecessor and successor relationships. Long-term planning activities, which cannot be fleshed out until more immediate work is completed, are set aside as placeholders called planning packages. Each of the elements tracked in the IMS is based on established criteria that define completion, events, and specific accomplishments.
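To make these elements concrete, here is a minimal sketch in Python of how the schedule elements just described might be represented in a data model. The class names, fields, and sample tasks are my own illustrative choices, not those of any particular scheduling application or of the DID discussed below.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional


class WorkType(Enum):
    DISCRETE = "discrete"          # produces a defined end product
    LEVEL_OF_EFFORT = "loe"        # ongoing support/management effort
    PLANNING_PACKAGE = "planning"  # far-term placeholder, not yet detailed


@dataclass
class WorkPackage:
    id: str
    description: str
    duration_days: int                                      # assigned duration
    work_type: WorkType
    predecessors: List[str] = field(default_factory=list)   # antecedent work
    milestone: bool = False                                  # zero-duration event
    completion_criteria: Optional[str] = None                # what "done" means


# A tiny networked schedule: discrete work, a milestone, an LOE task,
# and a far-term planning package awaiting detailed planning.
ims = [
    WorkPackage("WP-010", "Preliminary design", 20, WorkType.DISCRETE,
                completion_criteria="PDR package delivered"),
    WorkPackage("WP-020", "Detailed design", 30, WorkType.DISCRETE,
                predecessors=["WP-010"],
                completion_criteria="CDR package delivered"),
    WorkPackage("MS-001", "Critical Design Review", 0, WorkType.DISCRETE,
                predecessors=["WP-020"], milestone=True),
    WorkPackage("LOE-001", "Program management support", 50,
                WorkType.LEVEL_OF_EFFORT),
    WorkPackage("PP-100", "Qualification testing (to be detailed)", 45,
                WorkType.PLANNING_PACKAGE, predecessors=["MS-001"]),
]
```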

The most comprehensive IMSs consist of detailed planning that include resources and elements of cost.

Detailed Elements of the IMS

Given these general elements, the best source of identifying the key elements of detailed schedules is also found in Department of Defense documents. The core document in this case is the Data Item Description for the IMS numbered as DI-MGMT-81650. The latest one is dated March 30, 2005. There are a minimum of 32 data elements, some of these already mentioned and which I will not repeat in this post since they are pretty well listed and identified in the source document.

For those not familiar with these documents, Data Item Descriptions (or DiDs–gotta love acronyms) represent the detailed technical documents for artifacts involved in the management of DoD-related operations. Thus, this provides us with a pretty good inventory of elements to source. But there are others that are implied.

For example, the 81650 DiD identifies an element known as “methodology.” What this means is that each scheduling application has an optimization engine, which is where the true differences in schedule construction and intellectual property reside. The elements that affect these calculations are time-based, duration-based, related to float and slack, and related to resources.

These time-based elements consist of early start, early finish, late start, late finish. Duration-based elements consist of shortest time, longest time, greatest rank weight. An additional element related to schedule float identifies minimum slack. Resources are further delineated by the greatest work content and the greatest cumulative resource content.

I would note that the NDIA PASEG adds some sub-elements to this list that are based on the algorithmic result of the schedule engines and, thus, tends to ignore the antecedent salient elements of validating the optimization engine found above. These additional sub-elements are total float, free float, soft constraints, hard constraints, and–also found in the aforementioned DiD–program, task, and resource calendars.
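To ground these engine-level elements, the following is a bare-bones sketch of the textbook critical path calculation: a forward pass for early start and early finish, a backward pass for late start and late finish, total float as the difference between them, and free float as the slack available before any successor’s early start is delayed. The task network and durations are invented, and real scheduling engines layer calendars, constraints, lags, and resource logic on top of this core.

```python
# Minimal CPM forward/backward pass: durations and predecessors only
# (no calendars, lags, constraints, or resource leveling).

tasks = {
    "A": {"duration": 5, "preds": []},
    "B": {"duration": 10, "preds": ["A"]},
    "C": {"duration": 7, "preds": ["A"]},
    "D": {"duration": 3, "preds": ["B", "C"]},
}

order = ["A", "B", "C", "D"]  # topologically sorted task IDs

# Forward pass: early start (ES) and early finish (EF)
es, ef = {}, {}
for t in order:
    es[t] = max((ef[p] for p in tasks[t]["preds"]), default=0)
    ef[t] = es[t] + tasks[t]["duration"]

project_finish = max(ef.values())

# Backward pass: late finish (LF) and late start (LS)
ls, lf = {}, {}
for t in reversed(order):
    succs = [s for s in order if t in tasks[s]["preds"]]
    lf[t] = min((ls[s] for s in succs), default=project_finish)
    ls[t] = lf[t] - tasks[t]["duration"]

# Total float (TF) and free float (FF); zero total float marks the critical path.
for t in order:
    total_float = ls[t] - es[t]
    succs = [s for s in order if t in tasks[s]["preds"]]
    free_float = min((es[s] for s in succs), default=project_finish) - ef[t]
    flag = "  <- critical path" if total_float == 0 else ""
    print(f"{t}: ES={es[t]} EF={ef[t]} LS={ls[t]} LF={lf[t]} "
          f"TF={total_float} FF={free_float}{flag}")
```

Running this identifies A-B-D as the critical path, with task C carrying three days of total and free float.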

Normally, this is where a survey would end–with schedule-specific data elements focused on the details of the schedule. But we’re going to challenge our assumptions a bit more.

Framing Assumptions of Schedules and Programs

The essential document defining the term “framing assumption” was published by the RAND Corporation in 2014, entitled Identifying Acquisition Framing Assumptions Through Structured Deliberation, by Mark V. Arena and Lauren A. Mayer. The definition of a framing assumption is “any explicit or implicit assumption that is central in shaping cost, schedule, or performance expectations.”

As I have explored in my prior post, the use of the term “cost” is a fuzzy one. To some it means earned value management, which measures a small part of the costs of development and ownership of a system. To others it means total cost of ownership. Schedule is an implicit part of this definition, and then we have performance expectations, which I will deal with in a separate post.

But we can apply the concept of framing assumptions in two ways.

The first applies to the assumed purpose of the schedule. Why do we construct one? This goes back to my earlier statement that “…the schedule…ties together all of the disciplines in putting together a project–acquisition, systems engineering, cost estimating, and project performance management.”

For the NDIA PASEG the IMS is a “tool, not just a report” that “provides an ever-changing window into the progress (or lack of it) of current work effort. The strategic mission of the schedule is to point out future risks and opportunities.”

For the NAVAIR IMS Guide the IMS “At a top level…contain(ing) the networked, detailed tasks necessary to ensure successful program execution…” that “capture(s) project tasks and task relationships”, “show(s) the magnitude and how long each task will take”, “show(s) resources, durations, and constraints for each task” and “show(s) the critical path.”

For the DiD 81650 “The Integrated Master Schedule (IMS) is an integrated schedule containing the networked, detailed tasks necessary to ensure successful program execution.”

But the most comprehensive definition, which goes to the core of the purpose of an IMS, can be found in paragraph 1.2 of the DoD Integrated Master Plan and Integrated Master Schedule Preparation and Use Guide (IMP/IMS Guide). The elements of this purpose are worth transcribing, because if we have a requirement and cannot answer the “So What?” question, that is, if we cannot effectively determine why something must be done, then it probably does not need to be done (or we need to apply rigor in the development of our expertise).

For what the IMP/IMS Guide does is clearly tie the schedule to the programmatic framing assumptions (used in the context in which RAND meant it) from initial acquisition through planning. Thus, the Integrated Master Plan (IMP) is firmly established as an antecedent and intermediate planning process (not merely an artifact or tool) that results in the program R&D execution process.

Taken as a whole, the IMP/IMS process and its resulting artifacts:

a. Provides offerors and acquiring activities with detailed execution planning, organization, and scheduling information that sets realistic expectations for the resulting contract action.

b. Serves as the execution plan for how the supplier will meet the contract’s performance requirements within cost and schedule constraints.

c. Provides a basis for integrating all of the functions involved in development and deployment of the system being acquired and, after award, sets the framing assumptions of the program.

d. Provides the basis for determining and assessing progress, identifying risks, determining the basis for contractual award fees and penalties, assessing progress on Key Performance Parameters (KPPs) and Technical Performance Measures (TPMs), determining alternative paths to project completion, and determining opportunities for innovation and new acquisitions not apparent at the time of the award.

What all of this means is that the Integrated Master Schedule is too important to be left to the master scheduler. Yes, the schedule is a “tool” to those at the most basic tactical level in work execution. Yes, it is also an artifact and record.

But, more importantly, it is the comprehensive notional representation of the project’s or program’s scope, effort, progress, and assessment.

Private and Government-focused Industry Practice

A word has to be mentioned here about the difference between practice in purely private industry when managing large projects and programs, and practice in those industries focused on public sector acquisition, toward which these posts are skewed.

In the listing of schedule elements earlier there is reference to resources and elements of cost, yet this is an area where standard practice diverges. In private industry the application of resource assignments to specific work is standard practice and is found in the IMS.

In companies focused on the public sector and DoD, the practice is to establish a different set of data outside of the schedule to manage resources. Needless to say, this creates problems in validating data across disparate systems at the lowest level of planning and execution of a project or program. The basis for it, I think, is viewing the schedule as a “tool” and not as the basis for project execution. This “tool” mindset also allows for separate “earned value engines” that oftentimes do not synchronize with the execution of the schedule, not only undermining the practical value of both, but also creating systems complexity and inefficiency where none need exist.

Another gap found in many areas of public acquisition concerns the development of an integrated master plan antecedent to the integrated master schedule. The cause here, once again, I believe, is viewing the discipline of systems engineering as separate, somehow walled off from the continuing assessment of program execution, though that assumption is not supported by program phasing and milestone planning and achievement.

From the perspective of Integrated Program/Project Management, these considerations cannot be ignored, and so our inventory of essential data elements must include elements from these practices.

But Wait! There’s More!

Most discussions at conferences and professional meetings will usually stop at this point–viewing cost and schedule integration as the essence of IPM–with “cost” limited to EVM. Some will add some “oh by the ways” such as technical performance and risk. I will address these in the next post as well.

But there are also other systems and processes that are relevant to our inventory. But what I have covered thus far in this series should challenge you if you have been paying attention.

I tackled cost first because of the assumptions implicit in equating it with EVM, and then went on to demonstrate that there are other elements of cost that provide a more comprehensive view. This is not to denigrate the value of EVM, which is an essential process in project management, but to demonstrate that its analytics are not comprehensive and, as with any complex system, require the contribution of additional information, depending on the level and type of work performance and progress being recorded and assessed.

In this post I have tackled the IMS, and have demonstrated that it is not a supplementary process, but central to all other processes and actions taken in the execution of the project or program. Many times people enter the schedule from an assessment of cost performance–tracing cost drivers to specific schedule activities and then tasks. But this has it backwards, an approach based on the best technology available sometime in the late 1990s.

It is the schedule that brings together all relevant information from our execution and control processes and systems. It seems to me that the schedule is perhaps the first place one should go, that the first elements to trace are those related to schedule slippage and unexpected resource consumption, and that only then should these be traced to contract cost impact.

But, of course, there is more–and these other elements may turn out to be of greater consequence than just cost and schedule considerations. More on these in my next post.

In Closing: Battle Rhythm and the Plans of the Day and Week

When I was on active duty in the Navy we planned our days and weeks around a Plan of the Day or Plan of the Week. This is a posted agenda so that the entire ship or command understands the major events that affect its operations. It establishes focus on the main events at hand and fosters communication both laterally and vertically within the chain of command.

As one rises in rank and responsibility it is important to understand the operational tempo of the unit or ship, its systems, and subsystems. This is important in avoiding crisis management. This is known as Battle Rhythm.

Baked into the schedule (assuming proper construction and effective integrated product teaming) are the major events, milestones, and expected achievements of the program or project. Thus, there are events that should be planned around and anticipated on a daily, weekly, biweekly, monthly, quarterly, and major-milestone basis.

Given an effective battle rhythm, a PM should never complain about performance and progress indicators “looking into the rear view mirror.” If that is the case, then perhaps the PM should look at the effectiveness and timeliness of the underlying project and program systems. Thus, when a PMO complains of information and intelligence being too late to be actionable, it is actually describing a condition of ineffective, latent, and disjointed information and intelligence systems.

Thus, our next step in our next post is to identify more salient IPM elements that cut to the heart of the matter.

Take Me to the River, Part 1, Cost Elements – A Digital Inventory of Integrated Program Management Elements

In a previous post I recommended a venue focused on program managers to define what constitutes integrated program management. Since that time I have been engaged with thought leaders and influencers in both government and industry, many of whom came to a similar conclusion independently, agree with this proposition, and are working to bring it about.

My own interest in this discussion is from the perspective of maximization of the information ecosystem that underlies and describes the systems known as projects and programs. But what do I mean by this? This is more than a gratuitous question, because oftentimes the information essential to defining project and program performance and behavior is intermixed with, and therefore diluted and obfuscated by, that of the overall enterprise.

Project vs. Program

What I mean by the term project in this context is an organization established around a defined effort of fixed duration (a defined beginning and projected end) that is specifically planned and organized for the development and deployment of a particular end item, state, or result, with an identified set of resources assigned and allocated to achieve its goals.

A program is defined as a set of interrelated projects and sub-projects, also of fixed duration, that is specifically planned and organized not only for development and deployment, but also to continue that role through sustainment (including configuration control) of a particular end item, state, or result, with an identified set of resources assigned and allocated to achieve its goals. As such, the program management team is also the first-level life-cycle manager of the end item, state, or result, and participates with other levels of the organization in these activities. (More on life-cycle costs below.)

Note the difference in scope and perspective, though oftentimes we use these terms interchangeably.

For shorthand, a small project of short duration operates at the tactical level of planning. A larger project, which because of size, complexity, duration, and risk approaches the definition of a program, operates at the operational level, as do most programs. Larger and more complex programs that will affect the core framing assumptions of the enterprise align their goals to the strategic level of planning. Thus, there are differences in scale, complexity and, hence, data points that can be captured at these various levels.

Another aspect of the question of establishing an integrated digital project and program management environment is sufficiency of data, which relates directly to scale. Sufficiency in this regard is defined as whether there is enough data to establish a valid correlation and, hopefully, draw a causation. Micro-economic foundations–and models–often fail because of insufficient data. This is important to keep in mind as we inventory the type of data available to us and its significance. Oftentimes additional data points can make up for those cases where there is insufficiency in the depth and quality of a more limited set of data points. Doing so will also mitigate subjectivity, especially in smaller efforts.
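As a quick numerical illustration of the sufficiency point, the sketch below repeatedly samples an invented noisy linear relationship at two sample sizes and shows how widely the correlation estimate swings when the dataset is small. The data are made up purely for illustration.

```python
import random
import statistics


def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


def sample_correlation(n, rng):
    """Correlation from n points of a noisy y = x + noise relationship."""
    xs = [rng.uniform(0, 10) for _ in range(n)]
    ys = [x + rng.gauss(0, 4) for x in xs]
    return pearson(xs, ys)


rng = random.Random(42)
for n in (6, 60):
    estimates = [sample_correlation(n, rng) for _ in range(200)]
    print(f"n={n:>2}: correlation estimates range from "
          f"{min(estimates):+.2f} to {max(estimates):+.2f}")
```

With only six points the estimated correlation bounces between strongly negative and strongly positive values; with sixty points it settles into a much narrower band around the true relationship.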

Thus, in constructing a project or program, regardless of its level of planning, we often begin by monitoring the most basic elements. These are usually described as cost, schedule, performance, and risk, though I will discuss and identify other contributors that can be indexed.

This first post will concentrate on the first set of elements–those that constitute cost. In looking at these, however, we will find that the elements within this category are a bit broader than what is currently used in determining project and program performance.

Contract Costs

When we refer to costs in project and program management we are oftentimes referring to those direct and indirect costs expended by the supplier over the course of the effort, particularly in Cost Plus contractual efforts. The breakout of cost from a data perspective places it in subcategories:

Note that these are costs within the contract itself, as a cohesive, self-identifying entity. But there are other costs associated with our contracts which feed into program and project management. These are necessary to identify and capture if we are to take a holistic approach to these disciplines.

The costs that are anticipated by the contract are based on cost estimates, which need to be funded. These funded costs will be allocated to particular lines in the contract (CLINs), whether these be supporting contract efforts or deliverables. Thus, additional elements of our digital inventory include these items but lead us to our next categories.

Cost Estimates, Colors of Money, and Cash Flow

Cost estimates are the basis for determining the entire contract effort, and eventually make it into the project and program cost plan. Once cost estimates are applied and progress is tracked through the collection of actual costs, these elements are further traced to project and program activities, products, commodities, and other business categories, such as the indirect costs identified on the right hand side of the chart above.

Our cost plans need to be financed, as with any business entity. Though the most complex projects often are financed by some government entity because of their scale and impact, private industry–even among the largest companies–must obtain financing for the efforts at hand, whether these come from internal or external sources.

Thus two more elements present themselves: “colors” of money, that is, money provided for a specific purpose within the project and program cost plan, which may also be made available for only a limited period of time; and the availability of that money sufficient to execute particular portions of the project or program, that is, cash flow.

The phase of the project or program will determine the type of money that is made available. These types are also contained in the costs identified in the next section, but include, from a government financing perspective, Research, Development, Test and Evaluation (RDT&E) money, Procurement, Operations and Maintenance (O&M), and Military Construction (MILCON) dollars. By Congressional appropriation and authorization, each of these types of money may be provided for particular programs, and each type of authorization has a specific period in which it can be committed, obligated, and expended before it expires. The type of money provided also aligns with the phase of the project or program: whether it is still in development, production, deployment and acquisition, sustainment, or retirement.
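As a simple illustration of how this “colors of money” constraint can be checked in a data-driven system, the sketch below uses the commonly cited default periods of availability for new obligations (one year for O&M, two for RDT&E, three for Procurement, five for MILCON). Treat these as illustrative assumptions; the governing appropriation language controls for any real program.

```python
# Illustrative periods of availability for new obligations, in fiscal years.
# Commonly cited defaults only; the governing appropriations act controls.
OBLIGATION_PERIODS = {
    "O&M": 1,
    "RDT&E": 2,
    "PROCUREMENT": 3,
    "MILCON": 5,
}


def can_obligate(appropriation: str, appropriated_fy: int, current_fy: int) -> bool:
    """True if funds of this color are still available for new obligations."""
    years_available = OBLIGATION_PERIODS[appropriation]
    return appropriated_fy <= current_fy < appropriated_fy + years_available


# Example: FY2016 RDT&E dollars are available in FY2016-2017, expired in FY2018.
for fy in (2016, 2017, 2018):
    print(fy, can_obligate("RDT&E", appropriated_fy=2016, current_fy=fy))
```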

These costs will be reflected in reporting of actual and projected rates of expenditure, tied to procurement, material management, and resource management systems.

Additional Relative Costs

As with all efforts, the supplier is not the only entity to incur costs on a development project or program. The customer also incurs costs, which must be taken into account in determining the total cost of the effort.

Anyone who has undergone any kind of major effort on their home, or even had to get other workaday things done, like deciding when to change the tires on the car or when to get to the dentist, implicitly understands that there is more effort in timing and determining the completion of these items than the cost of new kitchen cabinets, tires, or a filling. One must decide to take time off from work. One must look to their own cash flow to see if they have sufficient funds not only for the merchant, but for all of the sundry and associated tasks that must be done in preparation for and after the task’s completion. To choose to do one thing is to choose not to do another–an opportunity cost. Other people may be involved in the decision. Perhaps children are in the household and a babysitter is required. Perhaps home life is so disrupted that another temporary abode is necessary on a short-term basis.

All of these are costs that one must take into account, and at the individual level we do these calculations and plan these activities as a matter of fact.

In customer-supplier relationships the customer incurs costs above the contract costs, which must be taken into account by the customer project or program executive. In the Department of Defense an associated element is called program management administration (PMA). For private entities these fall into allocated G&A and overhead costs, aside from direct labor and material costs, but in all cases these are costs that have come about due to the decision to undertake the specific effort.

Other elements of cost on the customer side are contractually furnished facilities, property, material or equipment, and testing and evaluation costs.

Contract Cost Performance: Earned Value Management

I will discuss EVM in more detail in a later installment of this element inventory, but mention must be made of it here, since to exclude it would be grossly remiss.

At its core, EVM is a financial measure of the value of what has been physically achieved against a performance management baseline (PMB), tying actual costs and completion of work together through a work breakdown structure (WBS). It is focused on the contract level of performance, which in some cases may constitute the entire project, though not necessarily the entire effort for the program.

Linkages to the other cost elements I have delineated elsewhere in this post range from strong to non-existent. Thus, while EVM is an essential means of linking contractual achievement to work accomplishment that, at various levels of fidelity, is tied to actual technical achievement, it does not capture all of the costs in our data inventory.
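For readers less familiar with the mechanics, here is a minimal sketch of the standard contract-level earned value arithmetic using the traditional BCWS/BCWP/ACWP terminology. The figures are invented, and the estimate at completion shown is only the simple CPI-based formula, one of several in common use.

```python
# Standard earned value arithmetic at a point in time (invented numbers).
bac = 1_000_000.0   # budget at completion (the PMB total, excluding MR)
bcws = 400_000.0    # budgeted cost of work scheduled (planned value)
bcwp = 360_000.0    # budgeted cost of work performed (earned value)
acwp = 420_000.0    # actual cost of work performed

cost_variance = bcwp - acwp          # negative = overrunning cost
schedule_variance = bcwp - bcws      # negative = behind plan
cpi = bcwp / acwp                    # cost efficiency to date
spi = bcwp / bcws                    # schedule efficiency to date
eac_cpi = bac / cpi                  # simple CPI-based estimate at completion
vac = bac - eac_cpi                  # variance at completion

print(f"CV={cost_variance:,.0f}  SV={schedule_variance:,.0f}")
print(f"CPI={cpi:.2f}  SPI={spi:.2f}")
print(f"EAC (BAC/CPI)={eac_cpi:,.0f}  VAC={vac:,.0f}")
```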

An essential overview in understanding what it does capture is best summed up in the following diagram taken from the Defense Acquisition University (DAU) site:

Commercial EVM elements, while not necessarily using the same terminology or highly structured process, possess a similar structure in allocating costs and achievement against baseline costs in developmental efforts to work packages (oftentimes schedule tasks in resource-loaded schedules) under an integrated WBS structure with Management Reserve not included as part of the baseline.

Also note that commercial efforts often include their internal costs as part of the overall contractual effort in assessing earned value against actual work achievement, while government contracting efforts tend to exclude these inherent costs. That being said, it is not that there is no cost control over these elements, since strict ceilings often apply to PMA and other such costs; it is that contract cost performance does not take these costs, among others, into account.

Furthermore, the chart above provides us with additional sub-elements in our inventory that are essential in capturing data at the appropriate level of our project and program hierarchy.

Thus, for IPM, EVM is one of many elements that are part of our digital inventory–and one that provides a linkage to other non-cost elements (the WBS). But in no way should it be viewed as capturing all essential costs associated with a contractual effort, let alone the more expansive project or program effort.

Portfolio Management and Life-Cycle Costs

There is another level of management that is essential in thinking about project and program management, and that is the program executive level. In the U.S. military services these are called Program Executive Officers (PEOs). In private industry they are often product managers, CIOs, and other positions that often represent the link between the program management teams and the business operations side of the organization. Thus, this is also the level of management organized to oversee a number of individual projects and programs that are interrelated based on mission, commodity, or purpose. As such, this level of management often concentrates on issues across the portfolio of projects and programs.

The main purpose of the portfolio management level is to ensure that project and program efforts are aligned with the strategic goals of the organization, which includes an understanding of the total cost of ownership.

In performing this purpose one of the functions of portfolio management is to identify risks that may manifest within projects and programs, and to determine the most productive use of limited resources across them, since they are essentially competing for the same dollars. This includes cost estimates and re-allocations to address ontological, aleatory, and epistemic risk.

Furthermore, the portfolio level is also concerned with the life-cycle factors of the item under development, so that there is effective hand-off at the production and sustainment phases. The key here is to ensure that each project or program, which is focused on the more immediate goals of project and program execution, continues to meet the goals of the organization in terms of life-cycle costs, and its effectiveness in meeting the established goals essential to the project or program’s framing assumptions.

But here we are focusing on cost, and so the costs involved are trade-off costs and opportunities, assessments of return on investment, and the aforementioned total cost of ownership of the end item or system. The costs that contribute to the total cost of ownership include all of the development costs, external and internal program management costs, procurement costs, operations and support costs, maintenance and life extension costs, and system retirement costs.
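As a simple illustration of the point that contract cost performance is only a slice of ownership cost, the sketch below rolls up the categories just listed (with invented figures) and shows the share that a contract-level EVM report would typically see. The assumption that only the development contract is EVM-reported is mine, for illustration.

```python
# Invented life-cycle figures (in $M) for the cost categories listed above.
life_cycle_costs = {
    "development (contract)": 450,
    "program management (external and internal)": 60,
    "procurement": 1_200,
    "operations and support": 2_500,
    "maintenance and life extension": 800,
    "system retirement": 90,
}

total_ownership_cost = sum(life_cycle_costs.values())

# Assume, for illustration, that only the development contract effort is
# covered by contract-level EVM reporting.
evm_visible = life_cycle_costs["development (contract)"]

print(f"Total cost of ownership: ${total_ownership_cost:,}M")
print(f"Visible to contract EVM: ${evm_visible:,}M "
      f"({evm_visible / total_ownership_cost:.0%} of ownership cost)")
```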

Conclusion

I believe that the survey of cost elements presented in this initial post illustrates that present digital project and program management systems are limited and immature–capturing and evaluating only a small portion of the total amount of available data.

These gaps make it impossible, for example, to determine the relative significance of any one element–and the analytics that can be derived from it–over another; not to mention the inability to provide the linkage among these absent elements that would garner insights into cause-and-effect and predictive behavior so that we have enough time to influence the outcome.

It is also clear that, when we strive to define what constitutes integrated project and program management, we must learn what is of most importance to the PM in performing those duties that are viewed as essential to success, and which are not yet captured in our analytical and predictive systems.

Only when our systems reach the level of cohesiveness and comprehensiveness in providing organizational insight and intelligence essential to project or program management will PMs ignore them at their own risk. In getting there we must first identify what can be captured from the activities that contribute to our efforts.

My next post will identify essential elements related to planning and scheduling.

 

Note: I am indebted to Defense Acquisition University’s resources in my research across many of my postings and link to them for the edification of the reader. For more insight into many of the points raised in this post I would recommend that readers familiarize themselves with A Guide for DoD Program Managers.

 

Don’t Stop Thinking About Tomorrow–Post-Workshop Blogging…and some Low Comedy

It’s been a while since I posted to my blog due to meetings and–well–the day job, but some interesting things occurred during the latest meeting of the Integrated Program Management Division (IPMD) of the National Defense Industrial Association (NDIA) that I think are of interest. (You have to love acronyms to be part of this community.)

Program Management and Integrated Program Management

First off is the initiative by the Program Management Working Group to gain greater participation by program managers with an eye to more clearly define what constitutes integrated program management. As readers of this blog know, this is a topic that I’ve recently written about.

The Systems Engineering discipline is holding their 21st Annual Systems Engineering Conference in Tampa this year from October 22nd to the 25th. IPMD will collaborate and will be giving a track dedicated to program management. The organizations have issued a call for papers and topics of interest. (Full disclosure: I volunteered this past week to participate as a member of the PM Working Group).

My interest in this topic is based on my belief, formed over years of wide-ranging experience in duties from warranted government contracting officer to program manager, business manager, CIO, staff officer, and logistics officer, that there is much more to the equation in defining IPM than viewing it through the prism of any particular discipline. Furthermore, doing so will require collaboration and cooperation among a number of project management disciplines.

This is a big topic where, I believe, no one group or individual has all of the answers. I’m excited to see where this work goes.

Integrated Digital Environment

Another area of interest that I’ve written about in the past involves two different–but related–initiatives on the part of the Department of Defense to collect information from their suppliers. This information is necessary to the Department’s oversight role, not only to ensure accountability of public expenditures, but also to assist in project cost and schedule control, risk management, and cost estimation, particularly as these relate to risk-sharing cost-type R&D contracted project efforts.

Two major staffs in the Offices of the Undersecretary of Defense have decided to go with a JSON-type schema for, on the one hand, cost estimating data, and on the other, integrated cost performance, schedule, and risk data. Each initiative seeks to replace the existing schemas in place.

Both have been wrapped around the axle on getting industry to move from form-based reporting and data sharing to a data-agnostic solution that meets the goals of reducing redundancy in data transmission, reducing the number of submissions and data streams, and moving toward one version of the truth that allows SMEs on both sides of the table to concentrate on data analysis and interpretation in jointly working toward the goal of successful project completion and end-item deployment.

As with the first item, I am not a disinterested individual in this topic. Back when I wore a uniform I helped to construct DoD policy to create an integrated digital environment. I’ve written about this experience previously in this blog, so I won’t bore with details, but the need for data sharing on cost-type efforts acknowledges the reality of the linkage between our defense economic and industrial base and the art of the possible in deploying defense-related end items. The same relationship exists for civilian federal agencies with the non-defense portion of the U.S. economy. Needless to say, a good many commercial firms unrelated to defense are going the same way.

The issue here is two-fold, I think, from speaking with individuals working these issues.

The first is, I think, that too much deference is being given to solution providers and some industry stakeholders, influenced by those providers, in “working the refs” through the data. The effect of doing so not only slows down the train and protects entrenched interests, it also gets in the way of innovation, allowing the slowest among the group to hold up the train in favor of–to put it bluntly–learning their jobs on the job at the expense of efficiency and effectiveness. As I expressed in a side conversation with an industry leader, all too often companies–who, after all, are the customer–have allowed their view of the possible to be defined by the limitations and inflexibility of their solution providers. At some point that dysfunctional relationship must end–and in the case of comments clearly identified as working the refs, they should be ignored. Put your stake in the ground and let innovation and market competition sort it out.

Secondly, cost estimating, which is closely tied to accounting and financial management, is new to this kind of data collection and considered tangential to other, more mature, performance management systems. My own firm is involved in producing a solution in support of this process, collecting data related to these reports (known collectively in DoD as the 1921 reports), placing that data in a common data lake, and exploring with organizations what it tells us, since we are only now learning what it can tell us. This is classical KDD–Knowledge Discovery in Data–and a worthwhile exercise.

I’ve also advocated going one step further in favor of the collection of financial performance data (known as the Contract Funds Status Report), which is an essential reporting requirement, but am frustrated to find no one willing to take ownership of the guidance regarding its data collection. The tragedy here is that cost performance, known broadly as Earned Value Management, is a technique relating the value of work performed to other financial and project planning measures (a baseline and actuals). But in a business (or any enterprise), the fuel that drives the engine is finance-related, and two essential measures are margin and cash flow. The CFSR is a report of program cash flow and financial execution. It is an early measure of whether a program will execute its work in any given time-frame, and it provides a reality check on the statistical measures of performance against baseline. It is also a necessary logic check for comptrollers and other budget decision-makers.

Thus, as it relates to data, there has been some push-back against a settled schema, where the government accepts flat files and converts the data to the appropriate format. I see this as an acceptable transient solution, but not an ultimate one. It is essential to collect both cost estimating and contract funds status information to perform any number of operations that relate to “actionable” intelligence: having the right executable money at the right time, a reality check against statistical and predictive measures, value analysis, and measures of ROI in development, just to name a few.
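As a hedged sketch of what that transient flat-file arrangement implies, the example below mechanically restructures a hypothetical form-style flat file into a nested, schema-friendly JSON document. The field names and layout are invented for illustration and are not the actual DoD cost or IPM schemas; the point is only that form-based submissions can be converted into a data-agnostic format without losing content.

```python
import csv
import io
import json

# Hypothetical flat-file submission: one row per WBS element per period.
flat_file = io.StringIO(
    "wbs,period,bcws,bcwp,acwp\n"
    "1.1,2018-06,100,90,95\n"
    "1.1,2018-07,120,115,130\n"
    "1.2,2018-06,80,80,78\n"
)

# Restructure into a nested, schema-friendly form keyed by WBS element.
submission = {"contract": "HYPOTHETICAL-0001", "wbs_elements": {}}
for row in csv.DictReader(flat_file):
    element = submission["wbs_elements"].setdefault(row["wbs"], [])
    element.append({
        "period": row["period"],
        "bcws": float(row["bcws"]),
        "bcwp": float(row["bcwp"]),
        "acwp": float(row["acwp"]),
    })

print(json.dumps(submission, indent=2))
```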

I look forward to continuing this conversation.

To Be or Not to Be Agile

The Section 809 Panel, which is the latest iteration of acquisition reform panels, has recommended that performance management using earned value not be mandated for efforts using Agile. It goes on, however, to assert that program executives “should approve appropriate project monitoring and control methods, which may include EVM, that provide faith in the quality of data and, at a minimum, track schedule, cost, and estimate at completion.”

Okay…the panel is then mute on what those monitoring and control measures will be. Significantly, if only subtly, the #NoEstimates crowd took a hit, since the panel recommends and specifies data quality, schedule, cost, and EAC. Sounds a lot like a form of EVM to me.

I must admit to being a skeptic when it comes to swallowing the Agile doctrine whole. Its micro-economic foundations are weak and much of it sounds like ideology–bad ideology at best and disproved ideology at worst (specifically related to the woo-woo about self-organization…think of the last speculative bubble and resulting financial crisis and depression along these lines).

When it comes to named methodologies I am somewhat from Missouri. I apply (and have applied, in previous efforts in the Dark Ages back when I wore a uniform) Kanban, teaming, adaptive development (enhanced greatly today by modern low-code technology), and short sprints that result in releasable modules. But keep in mind that these things were out there long before they were grouped under a common heading.

Perhaps Agile is now a convenient catch-all for best practices. But if that is the case then software development projects using this redefined version of Agile deserve no special dispensation. But I was schooled a bit by an Agile program manager during a side conversation and am always open to understanding things better and revising my perspectives. It’s just that there was never a Waterfall/Agile dichotomy, just as there never really was a Spiral/Waterfall dichotomy. These were simply convenient development models to describe processes geared to the technology of the moment.

There are very good people on the job exploring these issues on the Agile Working Group in the IPMD and I look forward to seeing what they continue to come up with.

Rip Van Winkle Speaks!

The only disappointing presentation occurred on the second and last day of the meeting. It seemed we were treated to a voice from somewhere around the year 2003 that, in what can only be described as performance art involving free association, talked about wandering the desert, achieving certification for a piece of software (which virtually all of the software providers in the room have successfully navigated at one time or another), discovering that cost and schedule performance data can be integrated (ignoring the work of the last ten years on the part of, well, a good many people in the room), that there was this process known as the Integrated Baseline Review (which, again, a good many people in the room had collaborated on to both define and make workable), and–lo and behold–the software industry uses schemas and APIs to capture data (known in Software Development 101 as ETL). He then topped off his meander with an unethical excursion into product endorsement, selected through an opaque process.

For this last, the speaker was either unaware or didn’t care (usually called tone-deafness) that the event’s expenses were sponsored by a software solution provider (not mine). But it is also as if the individual speaking was completely unaware of the work behind the many topics I’ve listed above, ignoring and undermining the hard work of the other stakeholders that make up our community.

On the whole an entertaining bit of poppycock, which leads me to…

A Word about the Role of Professional Organizations (Somewhat Inside Baseball)

In this blog, and in my interactions with other professionals at–well–professional conferences, I check my self-interest at the door and publicly take a non-commercial stance. It is a position that is expected and, I think, appreciated. For those who follow me on social networks like LinkedIn, posts from my WordPress blog originate from a separate source from the commercial announcements linked to my page, which originate from my company.

If there are exhibitor areas, as some conferences and workshops do have, that is one thing. That’s where we compete and play; and in private side conversations customers and strategic partners will sometimes use the opportunity as a convenience to discuss future plans and specific issues that are clearly business-related. But these are the exceptions to the general rule, and there are a couple of reasons for this, especially at this venue.

One is that, while it is a large market, it is a small community, and virtually everyone at the regular meetings and conferences I attend already knows that I am the CEO and owner of a small software company. But the IPMD is neutral ground. It is a place where government and industry stakeholders, who in other roles and circumstances are in contractual or competing relationships, come to hash out the processes and procedures that will hopefully improve the discipline of program and project management. It is also a place of discovery, where policies, new ideas, and technologies can be vetted in an environment of collaboration.

Another reason for taking a neutral stance is simply because it is both the most ethical and productive one. Twenty years ago–and even in some of the intervening years–self-serving behavior was acceptable at the IPMD meetings where both leadership and membership used the venue as a basis for advancing personal agendas or those of their friends, often involving backbiting and character assassination. Some of those people, few in number, still attend these meetings.

I am not unfamiliar with the last–having been a target at one point by a couple of them but, at the end of the day, such assertions turned out to be without merit, undermining the credibility of the individuals involved, rightfully calling into question the quality of their character. Such actions cannot help but undermine the credibility and pollute the atmosphere of the organization in which they associate, as well.

Finally, the companies and organizations that sponsor these meetings–which are not cheap to organize, which I know from having done so in the past–deserve to have the benefit of acknowledgment. It’s just good manners to play nice when someone else is footing the bill–you gotta dance with those that brung you. I know my competitors and respect them (with perhaps one or two exceptions). We even occasionally socialize with each other and continue long-term friendships and friendly associations. Burning bridges is just not my thing.

On the whole, however, the NDIA IPMD meetings–and this one, in particular–were productive and positive, focused on the future and on professional development. That’s where, I think, we as a community need to be and need to stay. I always learn something new and get my dose of reality from a broad-based perspective. In getting here the leadership of the organization (and the vast majority of the membership) is to be commended, as are the recent past and current members of the Department of Defense, especially since the formation of the Performance Assessments and Root Cause Analysis (PARCA) office.

In closing, there were other items of note discussed, along with what can only be described as the best pair of keynote addresses that I’ve heard in one meeting. I’ll have more to say about some of the concepts and ideas that were presented there in future posts.

One-Trick Pony — Software apps and the new Project Management paradigm

Recently I have been engaged in an exploration and discussion regarding the utilization of large amounts of data and how applications derive importance from that data.  In an on-line discussion with the ever insightful Dave Gordon, I first postulated that we need to transition into a world where certain classes of data are open so that the qualitative content can be normalized.  This is what for many years was called the Integrated Digital Environment (IDE for short).  Dave responded with his own post at the AITS.org blogging alliance, countering that while such standards are necessary in very specific and limited applications, that modern APIs provide most of the solution.  I then responded directly to Dave here, countering that IDE is nothing more than data neutrality.  Then also at AITS.org I expanded on what I proposed to be a general approach in understanding big data, noting the dichotomy in the software approaches that organize the external characteristics of the data to generalize systems and note trends, as opposed to those that are focused on the qualitative content within the data.

It should come as no surprise then, given these differences in approaching data, that we also find similar differences in the nature of applications that are found on the market.  With the recent advent of on-line and hosted solutions, there are literally thousands of applications in some categories of software that propose to do one thing with data–focused, one-trick pony applications that can be mixed and matched to somehow provide an integrated solution.

There are several problems with this sudden explosion of applications of this nature.

The first is in the very nature of the explosion.  This is a classic tech bubble, albeit limited to a particular segment of the software market, and it will soon burst.  As soon as consumers find that all of that information traveling over the web with the most minimal of protections is compromised by the next trophy hack, or that too many software providers have entered the market prematurely–not understanding the full needs of their targeted verticals–it will hit like the last one in 2000.  It only requires a precipitating event that triggers a tipping point.

You don’t have to take my word for it.  Just type in a favorite keyword into your browser now (and I hope you’re using VPN doing it) for a type of application for which you have a need–let’s say “knowledge base” or “software ticket systems.”  What you will find is that there are literally hundreds if not thousands of apps built for this function.  You cannot test them all.  Basic information economics, however, dictates that you must invest some effort in understanding the capabilities and limitations of the systems on the market.  Surely there are a couple of winners out there.  But basic economics also dictates that 95% of those presently in the market will be gone in short order.  Being the “best” or the “best value” does not always win in this winnowing out.  Certainly chance, the vagaries of your standing in the search engine results, industry contacts–virtually any number of factors–will determine who is still standing and who is gone a year from now.

Aside from this obvious problem with the bubble itself, the approach of the application makers harkens back to an earlier generation of one-off applications that attempt to achieve integration through marketing while actually achieving, at best, only old-fashioned interfacing.  In the world of project management, for example, organizations can little afford to revert to the division of labor, which is what would be required to align with these approaches in software design.  It’s almost as if, having made their money in an earlier time, software entrepreneurs cannot extend themselves beyond their comfort zones to take advantage of the last ten software generations that provide new, more flexible approaches to data optimization.  All they can think to do is party like it’s 1995.

For the new paradigm in project management is to get beyond the traditional division of labor.  For example, is scheduling such a highly specialized discipline rising to the level of a profession that it is separate from all of the other aspects of project management?  Of course not.  Scheduling is a discipline–a sub-specialty actually–that is inextricably linked to all other aspects of project management in a continuum.  The artifacts of the process of establishing project systems and controls constitutes the project itself.

No doubt there are entities and companies that still ostensibly organize themselves into specialties as they did twenty years ago: cost analysts, schedule analysts, risk management specialists, among others.  But given that the information from these systems (schedule, cost management, project financial management, risk management, technical performance, and all the rest) can be integrated at the appropriate level of their interrelationships to provide us a cohesive, holistic view of the complex system that we call a project, is such division still necessary?  In practice the industry has already moved to position itself for integration, realizing the urgency of making the shift.

For example, to utilize an application to query cost management information in 1995 was a significant achievement during the first wave of software deployment that mimicked the division of labor.  In 2015, not so much.  Introducing a one-trick pony EVM “tool” in 2015 is laziness–hoping to turn back the clock in ignoring the obsolescence of such an approach–regardless of which slick new user interface is selected.

I recently attended a project management meeting of senior government and industry representatives.  During one of my side sessions I heard a colleague propose the discipline of Project Management Analyst in lieu of previously stove-piped specialties.  His proposal is a breath of fresh air in an industry that develops and manufactures the latest aircraft and space technology, but has hobbled itself with systems and procedures designed for an earlier era that no longer align with the needs of doing business.  I believe the timely deployment of systems has suffered as a result during this period of transition.

Software must lead, and accelerate the transition to the new integration paradigm.

Thus, in 2015 the choice is not between data that adheres to conventions of data neutrality, or to those that utilize data access via APIs, but in favor of applications that do both.

It is not between different hard-coded applications that provide the old “what-you-see-is-what-you-get” approach.  It is instead between such limited hard-coded applications, and those that provide flexibility so that business managers can choose among a nearly unlimited palette of choices of how and which data, converted into information, is made available to the user or classes of users based on their role and need to know; aggregated at the appropriate level of detail for the consumer to derive significance from the information being presented.

It is not between “best-of-breed” and “mix-and-match” solutions that leverage interfaces to achieve integration.  It is instead between such solution “consortiums” that drive up implementation and sustainment costs, bringing with them high overhead, against those that achieve integration by leveraging the source of the data itself, reducing the number of applications that need to be managed, allowing data to be enriched in an open and flexible environment, achieving transformation into useful information.

Finally, the choice isn’t among applications that save their attributes in a proprietary format so that the customer must commit themselves to a proprietary solution.  Instead, it is between such restrictive applications and those that open up data access, clearly establishing that it is the consumer that owns the data.

Note: I have made minor changes from the original version of this post for purposes of clarification.

Over at AITS.org — The Need for an Integrated Digital Environment Strategy in Project Management

To be an effective project manager, one must possess a number of skills in order to successfully guide the project to completion. This includes having a working knowledge of the information coming from multiple sources and the ability to make sense of that information in a cohesive manner. This is so that, when brought together, it provides an accurate picture of where the project has been, where it is in its present state, and what actions must be taken to keep it (or bring it back) on track….Read More