The Need for an Integrated Digital Environment (IDE) Strategy in Project Management*

Putting the Pieces Together

To be an effective project manager, one must possess a number of skills to guide a project successfully to completion. Among them are a working knowledge of the information coming from multiple sources and the ability to synthesize that information coherently, so that, taken together, it provides an accurate picture of where the project has been, where it stands at present, and what actions must be taken to keep it on track (or bring it back).

Looking Back to the Dawn of Digital

Starting in about the early 1990s, with the first wave of PC-based digitization of business systems, software focused on automating functions along the specializations defined by the division of labor. Early deployments during the initial tech bubble therefore favored so-called “best-of-breed” approaches, in which the applications that performed each specific function best were knitted together to form the fabric of a strategic tool-set.

Project management organizations would usually begin by selecting a scheduling application (Microsoft® Project, Oracle Primavera, Deltek Open Plan, Artemis ProjectView, etc.) and then select other applications that mirrored the skill-sets required, some of them chosen by C-level managers: financial management, resource and personnel management, acquisition management, cost performance, risk management, configuration control, and others.

Once these systems were in place, a need arose, similar to but significantly different from that of pre-digital days, for a method of integrating and controlling them, one that relied heavily on manual systems and procedures. Labor effort shifted from ensuring the validity of information as it related to the work of project management to the validation, reconciliation, and management of data under rigorous non-automated controls vulnerable to human error.

Where It Went Wrong

In a perverse way, then, digitization reduced the value of the specialist’s labor. But this was not the end of the transformation. The economic basis for digitization was, and is, improved productivity and the reduction of labor overhead. Accordingly, organizations, having acquired the requisite technology and armed with their business plans, reduced their workforces and even the level (and salaries) of the knowledge workers who remained. Implicit in those business plans was the assumption that, although the software often provided only an 80% or 90% solution, the technology deployed would eventually be developed to fill the gaps. This has not happened.

Instead, the gaps have been filled by custom one-off solutions: manual systems that supply the data required for PowerPoint reports, Microsoft® Excel spreadsheets, and customized Access applications. Widespread sub-optimization is now the rule, with demands on a smaller workforce requiring that people, many of whom lack the required skills because they arrived after the transformation, interpret and understand the significance of the data, reconcile the data, and manage the data.

Fulfilling the Vision

To break the cycle of increasing complexity and lack of interoperability created by best-of-breed and one-off solutions, in an environment built on the reduction of available labor and resources, I believe it is imperative for organizations to take the next step and finally realize what was expected from the beginning: the development of an integrated digital environment (IDE) strategy.

Since I first worked on this concept as a senior U.S. Navy Commander almost 18 years ago, the definition has morphed to the point of being almost undefinable, so let me say what I mean by IDE: the ability to receive, transform, and integrate essential project management data regardless of proprietary source, and then to aggregate and deconstruct that data so that it can be delivered on a near real-time basis to the appropriate level of project management, in a form useful to that level. The characteristics of what this means are enumerated below, following a brief sketch of the aggregation idea.
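To make “aggregate and deconstruct” concrete, here is a minimal sketch in Python of rolling the same cost records up to whatever WBS depth a given management level needs; the record layout and figures are hypothetical, not drawn from any particular tool:

```python
from collections import defaultdict

# Hypothetical cost records keyed by WBS path; the field names are
# illustrative, not taken from any tool's schema.
records = [
    {"wbs": "1.1.1", "actual_cost": 120_000.0},
    {"wbs": "1.1.2", "actual_cost": 80_000.0},
    {"wbs": "1.2.1", "actual_cost": 200_000.0},
]

def roll_up(records, level):
    """Aggregate actual cost to the requested WBS depth, so one data set
    serves both the control account manager (deep) and the executive
    (shallow) without re-entry or reconciliation."""
    totals = defaultdict(float)
    for rec in records:
        key = ".".join(rec["wbs"].split(".")[:level])
        totals[key] += rec["actual_cost"]
    return dict(totals)

print(roll_up(records, level=2))  # {'1.1': 200000.0, '1.2': 200000.0}
```

The records themselves never change; only the view does, which is the point: every level of management looks at the same underlying data.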

First, that transformation be defined by the acceptance of industry-adopted data schemas that normalize the constituent parts of the data, whether schedule, cost, risk, or financial, so that the syntax supporting each discipline is objectively consistent. For example, the U.S. Department of Defense and the aerospace and defense industry have spearheaded the adoption of an international schema for the standardization of earned value and schedule information, with risk and other data to be included over time. While ostensibly designed to support DoD-type work, the schema, particularly as it relates to project schedule data, is expansive enough to allow for adoption and use by other commercial industries and across national economies.
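The practical payoff of a common schema is that one parser serves every compliant tool. As a toy sketch, assume a standardized XML document of this general shape; the element and attribute names are invented for illustration and are not those of any published standard:

```python
import xml.etree.ElementTree as ET

# Toy document in the spirit of a standardized cost/schedule schema.
# Element and attribute names are hypothetical.
doc = """
<ProjectData>
  <Task id="1.1" bcws="100000" bcwp="90000" acwp="95000"/>
  <Task id="1.2" bcws="50000" bcwp="50000" acwp="48000"/>
</ProjectData>
"""

# Because every compliant tool emits the same names, this loop works
# regardless of which vendor produced the file.
for task in ET.fromstring(doc).iter("Task"):
    cost_variance = float(task.get("bcwp")) - float(task.get("acwp"))
    print(task.get("id"), cost_variance)
```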

Second, that integration be established either through acceptance of the standard data schema, with a corresponding consistent manner of hosting that data in a relational database, or through direct access to the appropriate data via data communication protocols that forgo proprietary development or hard-coding to establish their connections (that is, the use of standard protocols for communicating with and accessing data). This approach precludes methods such as internal data swapping, transfer, or interfacing that require constant manual reconciliation.
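As a minimal sketch of the relational half of this characteristic, here are schedule and cost data hosted side by side in one store and joined through standard SQL rather than a vendor-specific interface; the table and column names are hypothetical, and Python’s built-in sqlite3 stands in for whatever database an organization actually uses:

```python
import sqlite3

# Two disciplines' data in one relational store under a shared schema.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE schedule (task_id TEXT PRIMARY KEY, finish_date TEXT);
    CREATE TABLE cost (task_id TEXT, actual_cost REAL);
    INSERT INTO schedule VALUES ('1.1', '2025-06-30'), ('1.2', '2025-09-30');
    INSERT INTO cost VALUES ('1.1', 95000), ('1.2', 48000);
""")

# One standard-SQL join replaces the manual reconciliation of two
# separately exported files.
for row in con.execute("""
        SELECT s.task_id, s.finish_date, c.actual_cost
        FROM schedule AS s JOIN cost AS c USING (task_id)"""):
    print(row)
```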

Third, the use of new open-systems user-interface technologies that forgo one-trick-pony solutions and instead allow for the configuration of solutions that fill the gap and achieve integration of data in an automated fashion, fulfilling the goals of improved productivity and the elimination of both the labor dedicated to data reconciliation and management and the one-off solutions built on Excel and PowerPoint.
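The essence of “configuration rather than coding” can be sketched in a few lines: each audience’s report is declared as data, and a single generic routine produces all of them, so no per-tier spreadsheet or one-off script accumulates. The report names, columns, and threshold below are hypothetical:

```python
# Each report is configuration, not code: a column list plus a row filter.
REPORTS = {
    "executive": {"columns": ["task", "cost_variance"],
                  "keep": lambda r: abs(r["cost_variance"]) > 10_000},
    "analyst":   {"columns": ["task", "bcwp", "acwp", "cost_variance"],
                  "keep": lambda r: True},
}

def build_report(name, rows):
    """Render any configured report from the same integrated rows."""
    cfg = REPORTS[name]
    return [{c: r[c] for c in cfg["columns"]} for r in rows if cfg["keep"](r)]

rows = [
    {"task": "1.1", "bcwp": 90_000, "acwp": 95_000, "cost_variance": -5_000},
    {"task": "1.2", "bcwp": 50_000, "acwp": 30_000, "cost_variance": 20_000},
]
print(build_report("executive", rows))  # only the significant variance survives
```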

Fourth, that the technology used to achieve these characteristics be repeatable and sustainable: once configured and initially proved out, the environment should run on digitized technologies without manual intervention except by exception.

Fifth, that the integration of data from disparate sources process the data so that it is usable as information, leveraging integration to provide new parametric techniques and leading indicators based on the insight afforded by data that is properly associated at the optimum logical level. The first digital wave automated what it could, constrained by the swim lanes of specialization. The next wave must break those constraints to do what automation does best, quantitative data processing, and thereby free people to do what they do best: qualitative assessment.
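Once cost and schedule data live in one associated store, standard earned value indices fall out as leading indicators with no reconciliation step. The formulas below are the textbook EVM definitions; the figures are hypothetical:

```python
def leading_indicators(bcws, bcwp, acwp, bac):
    """Textbook earned value indices from integrated cost/schedule data."""
    cpi = bcwp / acwp                  # cost performance index
    spi = bcwp / bcws                  # schedule performance index
    ieac = acwp + (bac - bcwp) / cpi   # independent estimate at completion
    return cpi, spi, ieac

cpi, spi, ieac = leading_indicators(
    bcws=150_000, bcwp=140_000, acwp=143_000, bac=1_000_000)
print(f"CPI={cpi:.2f}, SPI={spi:.2f}, IEAC=${ieac:,.0f}")
```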

*This post first appeared on December 2, 2014 at AITS.org. Since that time AITS has changed ownership, and the temporary use of the content of that site by the authors who contributed their work has expired, with rights reverting to the content creators. The new owners of AITS have decided to remove all previous content, including this and other articles and posts. In the interest of maintaining the internet record, I have decided to republish those works that are most apropos to the subject of this blog, with minor edits as needed. Most of the issues raised back then remain current today. As with any edited work, I would like to thank John Friscia, who was my editor at CAI’s Accelerating IT Success during the period that I was a regular contributor to that site. His patience with my desire to explore unconventional approaches to IT project management issues, and his commitment to good writing, made me sound more prescient and contributed to making me a better writer. One final note: as a former member of the National Writers Union, my heartfelt bravo to the members of the Writers Guild for standing up for the rights of creators of content and artistic works everywhere.