Solid Like a Rock: The Modern Power Platform, Modular Open Systems, and PPM – A Use Case

My team and I were recently approached by an organization asking about our experience with systems integration, with Critical Path Method (CPM) scheduling at the center. Such integration is a foundational part of PPM, but many practitioners miss the subtleties of establishing interrelationships across relevant cross-domain datasets in a way that creates valid, actionable intelligence within this domain.

The core of success is applying a coherent, comprehensive automated solution to the set of processes and practices used to prioritize, plan, execute, monitor, and govern multiple projects and programs and their associated data. This discipline is known as project and portfolio management (PPM).

When constructing a large, complex project or group of projects, we begin with the project concept, project objectives, framing assumptions, stakeholder identification and read-in, and the identification of risks. This progression then extends to produce success criteria (within the context of key performance indicators or KPIs), the integrated master plan (IMP), the work breakdown structure (WBS), identification of resources within the plan, and finally the integrated master schedule (IMS). Earned value management (EVM), which may or may not apply, will then follow as an assessment of the value of the work being accomplished based on the performance measurement baseline (PMB).
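The EVM assessment mentioned above reduces to a handful of standard formulas against the PMB. A minimal sketch in Python, with illustrative numbers rather than data from any real project:

```python
# Core earned value metrics, computed from budget at completion (BAC),
# planned value (PV), earned value (EV), and actual cost (AC).
# The input values below are invented for illustration.

def evm_metrics(bac, pv, ev, ac):
    """Compute standard EVM indicators from the performance measurement baseline."""
    cv = ev - ac        # cost variance (negative = over cost)
    sv = ev - pv        # schedule variance (negative = behind plan)
    cpi = ev / ac       # cost performance index
    spi = ev / pv       # schedule performance index
    eac = bac / cpi     # estimate at completion (simple CPI method)
    return {"CV": cv, "SV": sv, "CPI": cpi, "SPI": spi, "EAC": eac}

m = evm_metrics(bac=1_000_000, pv=400_000, ev=350_000, ac=420_000)
# Here CV = -70,000 and SV = -50,000: the project is over cost and behind schedule.
```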

Among these artifacts, the IMP is the single most important for capturing and understanding the entire contractual and project scope: it identifies program events, accomplishments, and accomplishment criteria, and it provides the opportunity for insight into proper integration across elements.

This is especially true in projects in which technical risk and performance are identified as key factors in the project success criteria. The IMP is the necessary step for capturing those factors, which can then be reflected in detailed schedule task performance within the IMS.

In the marketplace, there are few choices of CPM scheduling applications powerful enough to support complex projects. Among these are Microsoft Project, Oracle P6, and Open Plan Professional. There are some other entries that claim to use AI or other “modern” methods to analyze sequences of events, but the three listed provide reliable and understandable results that allow for effective management of the schedule activities and the underlying tasks. In the most sophisticated implementations, schedules will be resource-loaded.
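Under the hood, all of these applications rest on the same CPM computation: a forward pass for earliest dates, a backward pass for latest dates, and zero total float marking the critical path. A minimal sketch on a toy four-task network (the network and durations are invented for illustration):

```python
# Critical path method on a small precedence network.
# tasks: {name: (duration, [predecessor names])}; durations must be positive.

def cpm(tasks):
    """Return (project finish, set of critical tasks)."""
    # Forward pass: earliest start (es) and earliest finish (ef).
    es, ef, remaining = {}, {}, dict(tasks)
    while remaining:
        for name, (dur, preds) in list(remaining.items()):
            if all(p in ef for p in preds):
                es[name] = max((ef[p] for p in preds), default=0)
                ef[name] = es[name] + dur
                del remaining[name]
    finish = max(ef.values())
    # Backward pass: latest start (ls) and latest finish (lf).
    # With positive durations, descending ef is a valid reverse topological order.
    ls, lf = {}, {}
    for name in sorted(tasks, key=lambda n: ef[n], reverse=True):
        succs = [s for s, (_, preds) in tasks.items() if name in preds]
        lf[name] = min((ls[s] for s in succs), default=finish)
        ls[name] = lf[name] - tasks[name][0]
    # Zero total float (ls == es) marks the critical path.
    critical = {n for n in tasks if ls[n] == es[n]}
    return finish, critical

network = {
    "A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]),
    "D": (2, ["B", "C"]),
}
finish, critical = cpm(network)   # finish = 9, critical path A -> C -> D
```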

Achieving full integration of PPM elements across subdomains requires extending the core features of these CPM scheduling applications to realize their full informative and business value. This includes integrating risk identification and management capabilities that include, but go beyond, simple Monte Carlo schedule risk analysis. It also includes automated cost and schedule analysis of the alignment between schedule activities and resource execution and distribution, as well as deploying strategies and measures for risk handling.
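The Monte Carlo schedule risk analysis referenced above can be sketched in a few lines: sample three-point duration estimates repeatedly and read percentiles off the simulated distribution of project finish. The two parallel paths and the estimates below are invented for illustration:

```python
# Monte Carlo schedule risk analysis on two parallel paths of tasks.
# Each task carries a three-point (optimistic, most-likely, pessimistic) estimate.

import random

random.seed(42)  # fixed seed so the sketch is reproducible

path_1 = [(2, 3, 6), (4, 5, 9)]   # e.g., design -> build (illustrative)
path_2 = [(3, 4, 7), (2, 2, 5)]   # e.g., procure -> install (illustrative)

def simulate_finish():
    # The project finishes when the longer of the two parallel paths completes.
    # random.triangular takes (low, high, mode).
    return max(
        sum(random.triangular(o, p, m) for o, m, p in path_1),
        sum(random.triangular(o, p, m) for o, m, p in path_2),
    )

trials = sorted(simulate_finish() for _ in range(10_000))
p80 = trials[int(0.8 * len(trials))]   # 80th-percentile ("P80") finish
```

The P80 value, rather than the single deterministic finish date, is what risk-adjusted schedule commitments are typically made against.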

Additional extensions include analytical queries that determine weaknesses in the schedule and whether foundational elements are properly tick-and-tied (schedule health). The ability to trace schedule tasks to specific work and technical performance measures provides the means to rapidly identify areas that require immediate action.
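A few such schedule-health checks, loosely in the spirit of DCMA 14-point-style assessments, can be sketched as follows. The task schema and the 44-day duration threshold are illustrative assumptions, not any particular tool's format:

```python
# Simple schedule-health checks: broken logic links, dangling tasks,
# and excessive durations. Input schema is invented for illustration.

def schedule_health(tasks):
    """tasks: {name: {"duration": int, "preds": [names]}} -> list of findings."""
    findings = []
    names = set(tasks)
    has_successor = {n: False for n in tasks}
    for name, t in tasks.items():
        for p in t["preds"]:
            if p not in names:
                findings.append(f"{name}: broken link to missing predecessor {p!r}")
            else:
                has_successor[p] = True
        if not t["preds"]:
            findings.append(f"{name}: no predecessors (dangling start)")
        if t["duration"] > 44:
            findings.append(f"{name}: duration {t['duration']} exceeds 44 working days")
    for name, linked in has_successor.items():
        if not linked:
            findings.append(f"{name}: no successors (dangling finish)")
    return findings

demo = {
    "Design": {"duration": 20, "preds": []},
    "Build":  {"duration": 60, "preds": ["Design"]},
    "Test":   {"duration": 10, "preds": ["Build", "Integrate"]},
}
issues = schedule_health(demo)
```

In practice, a single start and finish milestone are legitimately open-ended; a production implementation would whitelist those rather than flag them.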

Whether capturing data elements from across different CPM scheduling applications or operating in a uniform scheduling environment, the solution must provide comprehensive reporting and Gantt and visual analytical charting of key factors and elements, including EVM, systems engineering, contract compliance, and other relevant elements within the PPM ecosystem.

Applying Modern Systems Design to Integration in PPM

The most effective way to achieve integration across the PPM ecosystem is through the deployment of a modern power platform.

Key capabilities and components of power platform technology include:

  • Low-code/no-code app deployment: visual configuration designers to create apps without heavy hand-coding.
  • Integration layer: prebuilt schemas, connectors and tools to connect SaaS, on-prem systems, databases, and custom APIs.
  • Data platform and modeling: a common data model based on open data principles that honor data sovereignty, metadata-driven storage, and low-code data manipulation.
  • Analytics and dashboards: embedded BI/reporting to turn app data into actionable insights.
  • Workflow automation: event- and trigger-driven automation (including RPA for UI automation).
  • Governance, security, and lifecycle: role-based access, environment separation (dev/test/prod), ALM, monitoring, and audit.
  • Extensibility: custom code extensions, SDKs, plug-ins, and support for CI/CD and developer tooling.
  • Marketplace/connectors: pre-configured COTS functionality, reusable components, templates, and third-party integrations.

When we combine this technology with a modular open systems approach (MOSA) in application design and open data governance, we are able to realize the full intrinsic and business value of data while also achieving maximum flexibility.

These principles first evolved within the systems engineering and model-based engineering communities. But the same benefits identified for physical components within systems also apply to computer systems that control and analyze human systems, such as those in PPM. Taking this approach also allows for greater integration between technical performance in systems research and development and the various other subdomains.

The benefits are significant; they include:

  • Faster innovation: Modular components and open data enable parallel development, third‑party extensions, and rapid replacement of parts without system-wide redesign.
  • Reduced vendor lock‑in: Standardized interfaces and governance let organizations mix vendors and swap modules, lowering dependence on single suppliers.
  • Lower total cost of ownership: Reuse, incremental upgrades, and competitive procurement reduce lifecycle costs.
  • Improved resilience and reliability: Fault isolation via modularity and the ability to hot‑swap components or roll back to previous modules improves uptime.
  • Scalability and flexibility: Easily scale capacity or add capabilities (e.g., new analytics or data-capture modules) by plugging in compatible modules.
  • Interoperability and integration: Standard interfaces and open data models simplify integrating third‑party analytics, reporting services, and partner systems.
  • Faster regulatory and market response: Modular upgrades and open data make it easier to meet new compliance requirements or enable new services.
  • Better analytics and optimization: Open, governed data enables advanced ML/AI, cross‑system optimization (load forecasting, predictive maintenance), and transparent KPIs.
  • Enhanced security posture: Clear module boundaries and standardized interfaces simplify security reviews; data governance enforces access controls, provenance, and auditability.
  • Ecosystem and marketplace development: Standards + open data foster third‑party marketplaces for modules, apps, and services, driving innovation and value capture.
  • Sustainability and resource efficiency: Modular reuse and incremental upgrades extend system life and reduce waste, supporting circular‑economy practices (component reuse, upgrades).

A Practical Use Case: Microsoft Project

The discussion mentioned in the first paragraph of this post presents an ideal use case for this approach. Microsoft has announced that it plans to retire Microsoft Project Online on September 30, 2026.

What this means is that organizations that had invested in this CPM scheduling application will need to make a decision: stay within the Microsoft Project environment, or look at the other non-Microsoft CPM applications mentioned earlier. For public project management organizations, further complexity is added by the source of the schedule data: whether it is organic, supplier-provided, or a hybrid that requires both organic and contracted work.

As I have stated in my earlier posts, I run and operate a software company by the name of SNA Software LLC. Our Proteus Envision suite is composed of modern power platform technology, is built using MOSA principles, and automates data capture and transformation in accordance with open data governance principles.

Rather than a niche application focused on some portion of the project and portfolio domain, our solutions are built to leverage these modern technologies to achieve integration. With the recent FAR overhaul, which simplifies many of the regulatory requirements on PPM systems, such as earned value management (EVM) for contracts below $50M, an open system that supports a nimble and modular approach is needed. Technical performance, schedule and resource management, and risk management become paramount.

With the implementation of the Cybersecurity Maturity Model Certification (CMMC) program, off-premises cloud usage is also a concern, given the recent controversy regarding Azure GCC High FedRAMP. Organizations need a flexible set of options when a foundational solution is suddenly no longer available, doesn't meet expectations, or ages out. Does the agency use on-premises solutions or a commercial cloud environment?

The use case here is to apply applications that automate the capture and transformation of data from any CPM scheduling application. Doing so allows organizations to forgo direct labor in transformation, avoiding error-prone, long-lead-time, brute-force data engineering, as well as the misuse of Excel as a systems management solution, which silos data and creates bespoke single points of failure.
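The capture-and-transform step described here amounts to mapping each tool's export onto one canonical schema, with validation at the boundary. A minimal sketch, in which both export shapes and all field names are invented for illustration (they are not the actual Microsoft Project or P6 formats):

```python
# Map two hypothetical scheduling-tool exports onto one canonical record
# shape, validating each record before it flows downstream.

from datetime import date

CANONICAL_FIELDS = ("task_id", "name", "start", "finish")

def from_tool_a(row):
    # Hypothetical "Tool A" export fields: UID / Task_Name / Start_Date / Finish_Date.
    return {
        "task_id": str(row["UID"]),
        "name": row["Task_Name"],
        "start": date.fromisoformat(row["Start_Date"]),
        "finish": date.fromisoformat(row["Finish_Date"]),
    }

def from_tool_b(row):
    # Hypothetical "Tool B" export fields: activity_id / activity_name / early dates.
    return {
        "task_id": row["activity_id"],
        "name": row["activity_name"],
        "start": date.fromisoformat(row["early_start"]),
        "finish": date.fromisoformat(row["early_finish"]),
    }

def capture(rows, adapter):
    records = [adapter(r) for r in rows]
    # Validate shape and basic date logic at the boundary.
    for rec in records:
        assert set(rec) == set(CANONICAL_FIELDS) and rec["finish"] >= rec["start"]
    return records

merged = (
    capture([{"UID": 1, "Task_Name": "Design",
              "Start_Date": "2026-01-05", "Finish_Date": "2026-02-13"}], from_tool_a)
    + capture([{"activity_id": "A100", "activity_name": "Build",
                "early_start": "2026-02-16", "early_finish": "2026-05-29"}], from_tool_b)
)
```

Once every source lands in the same canonical shape, reporting, schedule health, and risk analysis can run uniformly regardless of which CPM tool produced the data.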

The combination of a modern power platform, MOSA, and open data governance is what the current environment demands. At the core of this approach is the overriding importance of data: its accuracy, transparency, scalability, and integration. Without good data, the application of new AI solutions will fail to meet expectations and deliver return on investment.

In summary: The Present Challenges in PPM

The most important issues in the PPM domain today revolve around the following:

  • The appropriate application and use of artificial intelligence solutions: the most useful utilization of this promising technology in an ecosystem that requires rigor.
  • The shift away from PPM domain silos: not only in terms of data or analytics, but also in terms of developing and expanding the business acumen of the workforce to be able to effectively use these advanced technologies.
  • The continued importance of assessment and management methods informed by powerful and flexible solutions in the area of large-scale project management.
  • The need for flexibility: to prevent lock-in of proprietary data solutions in a rapidly developing technology environment, and in identifying modular systems solutions to provide upgrades or interoperability rapidly.
  • The rising importance of other PPM indicators: especially those such as technical performance, risk management, and resource execution measures that presage the traditional down-the-line performance indicators in EVM.
  • The utilization of cloud or on-premises deployments, or a combination of the two, to address bandwidth and scaling issues as relevant datasets become larger and more complex with integration.
  • Finding strategies to overcome suboptimization in organizations resulting from rice bowls, fiefdoms, and silo-building.

Meeting these challenges and finding solutions to them will require collaboration and systems thinking combined with supporting technologies.

Red Queen Race: Project Management and Running Against Time

“Well, in our country,” said Alice, still panting a little, “you’d generally get to somewhere else—if you run very fast for a long time, as we’ve been doing.”

“A slow sort of country!” said the Queen. “Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast, as that!” —Through the Looking-Glass and What Alice Found There, Chapter 2, Lewis Carroll

There have been a number of high-profile examples over the last several years of project management failure and success. In the former category, the initial rollout of the Affordable Care Act marketplace web portal was one; the causes of its faults took a while to understand, absent political bias. The reasons, as the linked article shows, are prosaic and basic to the discipline of project management.

(more…)

Big Data and the Repository of Babel

In 1941, the Argentine writer Jorge Luis Borges (1899-1986) published a short story entitled “The Library of Babel.” In the story Borges imagines a universe, known as the Library, which is described by the story’s narrator as made up of adjacent hexagonal rooms.

Each of the rooms of the library is poorly lit, with one side acting as the entrance and exit, and four of the five remaining walls of the rooms containing bookshelves whose books are placed in a completely uniform style, though the books’ contents are completely random.

(more…)

The Need for an Integrated Digital Environment (IDE) Strategy in Project Management*

Putting the Pieces Together

To be an effective project manager, one must possess a number of skills to successfully guide the project to completion. These include a working knowledge of the information coming from multiple sources and the ability to make sense of that information in a cohesive manner, so that, when brought together, it provides an accurate picture of where the project has been, where it is at present, and what actions must be taken to keep it (or bring it back) on track.

(more…)

Shake it Out – Embracing the Future of Program Management – Part Two: Private Industry Program and Project Management in Aerospace, Space, and Defense

In my previous post, I focused on Program and Project Management in the Public Interest, and the characteristics of its environment, especially from the perspective of the government program and acquisition disciplines. The purpose of this exploration is to lay the groundwork for understanding the future of program management—and the resulting technological and organizational challenges that are required to support that change.

The next part of this exploration is to define the motivations, characteristics, and disciplines of private industry equivalencies. Here there are commonalities, but also significant differences, that relate to the relationship and interplay between public investment, policy and acquisition, and private business interests.

(more…)

Shake it Out – Embracing the Future in Program Management – Part One: Program and Project Management in the Public Interest

I heard the song from which I derived the title to this post sung by Florence and the Machine and was inspired to sit down and write about what I see as the future in program management.

Thus, my blogging radio silence has ended as I begin to process and share my observations and essential achievements over the last couple of years.

My company—the conduit that provides the insights I share here—is SNA Software LLC. We are a small, veteran-owned company and we specialize in data capture, transformation, contextualization, and visualization. We do it in a way that removes significant effort from these processes, ensures reliability and trust, incorporates off-the-shelf functionality that provides insight, and empowers the user by leveraging the power of open systems, especially in program and project management.

Program and Project Management in the Public Interest

There are two aspects to the business world that we inhabit: commercial and government; both, however, usually relate to some aspect of the public interest, which is our forte.

There are also two concepts about this subject to unpack.

(more…)

Innervisions: The Connection Between Data and Organizational Vision

During my day job I provide a number of fairly large customers with support to determine their needs for software that meets the criteria from my last post. That is, I provide software that takes an open data systems approach to data transformation and integration. My team and I deliver this capability with an open user interface based on Windows and .NET components, augmented by time-phased and data management functionality that puts SMEs back in the driver's seat of what they need in terms of analysis and data visualization. In virtually all cases our technology obviates the need for the extensive, time-consuming, and costly services of a data scientist or software developer.

(more…)

Potato, Potahto, Tomato, Tomahto: Data Normalization vs. Standardization, Why the Difference Matters

In my vocation I run a technology company devoted to program management solutions that is primarily concerned with taking data and converting it into information to establish a knowledge-based environment. Similarly, in my avocation I deal with the meaning of information and how to turn it into insight and knowledge. This latter activity concerns the subject areas of history, sociology, and science.

In my travels just prior to and since the New Year, I have come upon a number of experts and fellow enthusiasts in these respective fields. The overwhelming majority of these encounters have been productive, educational, and cordial. We respectfully disagree in some cases about the significance of a particular approach, or about governance when it comes to project and program management policy, but generally there is a great deal of agreement, particularly on basic facts and terminology. But some areas of disagreement, particularly those that come from left field, tend to be the most interesting because they create an opportunity to clarify a larger issue.

In a recent venue I encountered this last example where the issue was the use of the phrase data normalization. The issue at hand was that the use of “data normalization” suggested some statistical methodology in reconciling data into a standard schema. Instead, it was suggested, the term “data standardization” was more appropriate.

(more…)

Ring Out the Old, Ring in the New: Data Transformation Podcasting

Robin Williams at Innovate IPM interviewed me a few weeks ago and has a new podcast up to cap off the year. Our discussion began as a wide-ranging one but, as it turned out, settled on digital transformation and the changes and developments I have seen in this area over the last three decades.

(more…)

Back to School Daze Blogging–DCMA Investigation on POGO, DDSTOP, $600 Ashtrays, and Epistemic Sunk Costs

Family summer visits and trips are in the rear view–as well as the simultaneous demands of balancing the responsibilities of a, you know, day job–and so it is time to take up blogging once again.

I will return to my running topic of Integrated Program and Project Management in short order, but a topic of more immediate interest concerns the article that appeared on the website for pogo.org last week entitled "Pentagon's Contracting Gurus Mismanaged Their Own Contracts." Such provocative headlines are part and parcel of organizations like POGO, which have an agenda that seems to cross the line between reasonable concern and unhinged outrage, with a tinge of conspiracy-mongering. But the content of the article itself is accurate and well written, if also somewhat rife with overstatement, so I think it useful to unpack what it says and what it means.

(more…)