Solid Like a Rock: The Modern Power Platform, Modular Open Systems, and PPM – A Use Case

My team and I were recently approached by an organization inquiring about our experience in systems integration, with Critical Path Method (CPM) scheduling at the center. Such integration is a foundational part of PPM, but many practitioners miss the subtleties of achieving it in a manner that properly establishes interrelationships across relevant cross-domain datasets and thereby creates valid, actionable intelligence within this domain.

The core of success lies in applying a coherent, comprehensive, automated solution to the set of processes and practices used to prioritize, plan, execute, monitor, and govern multiple projects and programs and their associated data. This discipline is known as project and portfolio management (PPM).

When constructing a large, complex project or group of projects, we begin with the project concept, project objectives, framing assumptions, stakeholder identification and read-in, and the identification of risks. This progression then extends to produce success criteria (within the context of key performance indicators or KPIs), the integrated master plan (IMP), the work breakdown structure (WBS), identification of resources within the plan, and finally the integrated master schedule (IMS). Earned value management (EVM), which may or may not apply, will then follow as an assessment of the value of the work being accomplished based on the performance measurement baseline (PMB).
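
Since EVM arithmetic will come up again below, a compact refresher may help. This is a minimal sketch in Python of the standard indices; the figures in the example are invented.

```python
def evm_metrics(pv: float, ev: float, ac: float, bac: float) -> dict:
    """Standard earned value indices from planned value (PV), earned value
    (EV), actual cost (AC), and budget at completion (BAC)."""
    return {
        "SPI": ev / pv,          # schedule performance index (<1.0 = behind)
        "CPI": ev / ac,          # cost performance index (<1.0 = over cost)
        "SV": ev - pv,           # schedule variance
        "CV": ev - ac,           # cost variance
        "EAC": bac / (ev / ac),  # estimate at completion, CPI method
    }

# Example: $100k planned, $90k earned, $110k spent against a $1M budget.
print(evm_metrics(100_000, 90_000, 110_000, 1_000_000))
```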

Among these artifacts, the single most important is the IMP. It captures the entire contractual and project scope, identifies program events, accomplishments, and accomplishment criteria, and provides the opportunity for insight into proper integration across elements.

This is especially true in projects where technical risk and performance are identified as key factors in the project success criteria. The IMP is the necessary step for capturing those factors, which can then be reflected in the detailed task performance of the IMS.
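
To make these interrelationships concrete, here is a minimal sketch in Python of how the IMP hierarchy (events, accomplishments, criteria) can be tied to IMS tasks; the class and field names are illustrative assumptions, not any tool's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """An accomplishment criterion: measurable proof of completion."""
    criterion_id: str
    text: str

@dataclass
class Accomplishment:
    """A significant accomplishment supporting a program event."""
    accomplishment_id: str
    text: str
    criteria: list[Criterion] = field(default_factory=list)

@dataclass
class Event:
    """A program event at the top of the IMP hierarchy."""
    event_id: str
    name: str
    accomplishments: list[Accomplishment] = field(default_factory=list)

@dataclass
class Task:
    """An IMS task, traced back to the IMP criterion it supports."""
    task_id: str
    name: str
    criterion_id: str | None = None  # None flags a traceability gap

def untraced(tasks: list[Task]) -> list[Task]:
    """Tasks with no IMP criterion are scope without a home: surface them."""
    return [t for t in tasks if t.criterion_id is None]
```

The point of the structure is the trace itself: every detailed task should roll up through a criterion and an accomplishment to a program event, and anything that does not is a candidate for immediate review.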

In the marketplace, there are few choices of CPM scheduling applications powerful enough to support complex projects. Among these are Microsoft Project, Oracle P6, and Open Plan Professional. There are some other entries that claim to use AI or other “modern” methods to analyze sequences of events, but the three listed provide reliable and understandable results that allow for effective management of the schedule activities and the underlying tasks. In the most sophisticated implementations, schedules will be resource-loaded.
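
For readers who want to see what these engines do at their core, below is a minimal sketch of the CPM forward and backward pass in Python. Real applications add calendars, lags, constraints, and resource logic, so treat this as an illustration of the method, not of how any of the named products work internally.

```python
from collections import defaultdict

def cpm(durations: dict[str, float], preds: dict[str, list[str]]):
    """Forward/backward pass over finish-to-start logic (assumes no cycles).

    durations: activity -> duration in days
    preds: activity -> list of predecessor activities
    """
    # Order activities so predecessors come first (simple repeated scan).
    order, placed = [], set()
    while len(order) < len(durations):
        for a in durations:
            if a not in placed and all(p in placed for p in preds.get(a, [])):
                order.append(a)
                placed.add(a)

    es, ef = {}, {}                    # early start / early finish
    for a in order:
        es[a] = max((ef[p] for p in preds.get(a, [])), default=0)
        ef[a] = es[a] + durations[a]

    finish = max(ef.values())          # project early finish
    succs = defaultdict(list)
    for a, ps in preds.items():
        for p in ps:
            succs[p].append(a)

    ls, lf = {}, {}                    # late start / late finish
    for a in reversed(order):
        lf[a] = min((ls[s] for s in succs[a]), default=finish)
        ls[a] = lf[a] - durations[a]

    total_float = {a: ls[a] - es[a] for a in durations}
    critical = [a for a in order if abs(total_float[a]) < 1e-9]
    return es, ef, ls, lf, total_float, critical
```

For example, cpm({'A': 3, 'B': 2, 'C': 4}, {'B': ['A'], 'C': ['A']}) reports A and C as critical, with two days of total float on B.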

Achieving full integration of PPM elements across subdomains requires extending the core features of these CPM scheduling applications to realize their full informative and business value. This includes risk identification and management capabilities that incorporate, but go beyond, simple Monte Carlo schedule risk analysis. It also includes automated cost and schedule analysis of the alignment between schedule activities and resource execution and distribution, as well as the deployment of strategies and measures for risk handling.
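
As an illustration of the simple Monte Carlo schedule risk analysis referred to above, the sketch below samples three-point duration estimates over the same finish-to-start logic and reports the probability of hitting a target date plus a P80 finish. The triangular distribution is a common convention; the estimates and trial count here are assumptions for the example.

```python
import random

def schedule_risk(estimates, preds, order, target, trials=5000):
    """Monte Carlo over (optimistic, most likely, pessimistic) durations.

    `order` must list activities with predecessors first, e.g. the order
    produced inside the cpm() sketch above.
    """
    finishes = []
    for _ in range(trials):
        # random.triangular takes (low, high, mode)
        d = {a: random.triangular(o, p, m) for a, (o, m, p) in estimates.items()}
        ef = {}
        for a in order:  # forward pass only: we just need the finish date
            ef[a] = max((ef[p] for p in preds.get(a, [])), default=0) + d[a]
        finishes.append(max(ef.values()))
    finishes.sort()
    p_on_time = sum(f <= target for f in finishes) / trials
    p80_finish = finishes[int(0.8 * trials)]
    return p_on_time, p80_finish
```

A deterministic finish date that survives only 40% of trials is exactly the kind of insight a single-point schedule hides.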

Additional extensions include analytical queries that identify weaknesses in the schedule and determine whether foundational elements are properly tick-and-tied (schedule health). The ability to trace schedule tasks to specific work and technical performance measures provides the means to rapidly identify areas that require immediate action.
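
A few such schedule-health queries are easy to sketch. The checks below are loosely in the spirit of widely used metrics such as the DCMA 14-point assessment, but the field names and thresholds are illustrative assumptions, not an official rule set.

```python
def schedule_health(tasks: list[dict]) -> dict[str, list[str]]:
    """Flag common logic weaknesses in a list of task records.

    Each record is assumed to carry: id, duration, total_float,
    predecessors, successors, and constraint ('ASAP' or a hard date).
    """
    return {
        # Tasks dangling without logic (start/finish milestones excepted
        # in practice) undermine the validity of the critical path.
        "missing_logic": [t["id"] for t in tasks
                          if not t["predecessors"] or not t["successors"]],
        "negative_float": [t["id"] for t in tasks if t["total_float"] < 0],
        "high_float": [t["id"] for t in tasks if t["total_float"] > 44],
        "long_duration": [t["id"] for t in tasks if t["duration"] > 44],
        "hard_constraints": [t["id"] for t in tasks
                             if t["constraint"] != "ASAP"],
    }
```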

Capturing the data elements from across different CPM scheduling applications, or even within a uniform scheduling environment, enables comprehensive reporting as well as Gantt and visual analytical charting of key factors, including EVM, systems engineering, contract compliance, and other relevant elements within the PPM ecosystem.

Applying Modern Systems Design to Integration in PPM

The most effective way to achieve integration across the PPM ecosystem is through the deployment of a modern power platform.

Key capabilities and components of power platform technology include:

  • Low-code/no-code app deployment: visual configuration designers to create apps without heavy hand-coding.
  • Integration layer: prebuilt schemas, connectors and tools to connect SaaS, on-prem systems, databases, and custom APIs.
  • Data platform and modeling: a common data model based on open data principles that honor data sovereignty, metadata-driven storage, and low-code data manipulation.
  • Analytics and dashboards: embedded BI/reporting to turn app data into actionable insights.
  • Workflow automation: event- and trigger-driven automation (including RPA for UI automation).
  • Governance, security, and lifecycle: role-based access, environment separation (dev/test/prod), ALM, monitoring, and audit.
  • Extensibility: custom code extensions, SDKs, plug-ins, and support for CI/CD and developer tooling.
  • Marketplace/connectors: pre-configured COTS functionality, reusable components, templates, and third-party integrations.

When we combine this technology with a modular open systems approach (MOSA) in application design and open data governance, we can realize the full intrinsic and business value of data while also achieving maximum flexibility.

These principles first evolved within the systems engineering and model-based engineering communities. But the same benefits identified in this model for physical components within systems also apply to computer systems that control and analyze human systems, such as in PPM. Taking this approach also allows for greater integration between technical performance in systems research and development and the various other subdomains.

The benefits are significant; they include:

  • Faster innovation: Modular components and open data enable parallel development, third‑party extensions, and rapid replacement of parts without system-wide redesign.
  • Reduced vendor lock‑in: Standardized interfaces and governance let organizations mix vendors and swap modules, lowering dependence on single suppliers.
  • Lower total cost of ownership: Reuse, incremental upgrades, and competitive procurement reduce lifecycle costs.
  • Improved resilience and reliability: Fault isolation via modularity and the ability to hot‑swap components or roll back to previous modules improves uptime.
  • Scalability and flexibility: Easily scale capacity or add capabilities (e.g., new data sources or analytics modules) by plugging in compatible modules.
  • Interoperability and integration: Standard interfaces and open data models simplify integrating third‑party analytics and partner systems.
  • Faster regulatory and market response: Modular upgrades and open data make it easier to meet new compliance requirements or to field new services.
  • Better analytics and optimization: Open, governed data enables advanced ML/AI, cross‑system optimization (e.g., forecasting and predictive analysis), and transparent KPIs.
  • Enhanced security posture: Clear module boundaries and standardized interfaces simplify security reviews; data governance enforces access controls, provenance, and auditability.
  • Ecosystem and marketplace development: Standards + open data foster third‑party marketplaces for modules, apps, and services, driving innovation and value capture.
  • Sustainability and resource efficiency: Component reuse and incremental upgrades reduce waste and support resource-efficient, circular‑economy practices.

A Practical Use Case: Microsoft Project

The discussion mentioned in the first paragraph of this post presents an ideal use case for this approach. Microsoft has announced that it plans to retire Microsoft Project Online on September 30, 2026.

What this means is that organizations that have invested in this CPM scheduling application will need to make a decision: stay within the Microsoft Project environment, or look at the other non-Microsoft CPM applications mentioned earlier. For public project management organizations, further complexity is added by the source of the schedule data: whether it is organic, from suppliers, or some hybrid that includes both organic and contracted work.

As I have stated in my earlier posts, I own and operate a software company by the name of SNA Software LLC. Our Proteus Envision suite is built on modern power platform technology, designed on MOSA principles, and automates data capture and transformation in accordance with open data governance principles.

Rather than niche applications focused on some portion of the project and portfolio domain, our solutions are built to leverage these modern technologies to achieve integration. With the recent FAR overhaul, which simplifies many of the regulatory requirements on PPM systems, such as earned value management (EVM) for contracts below $50M, an open system that supports a nimble and modular approach is needed. Technical performance, schedule and resource management, and risk management become paramount.

With the implementation of the Cybersecurity Maturity Model Certification (CMMC) program, off-premises cloud usage is also an issue, given the recent controversy regarding Azure GCC High FedRAMP. Organizations need a flexible set of options when transitioning to alternatives because a foundational solution is suddenly no longer available, doesn't meet expectations, or ages out. Should an agency use on-premises solutions or a commercial cloud environment?

The use case here is to deploy applications that automate the capture and transformation of data from any CPM scheduling application. Doing so allows organizations to forgo direct labor in transformation, avoiding error-ridden, long-lead-time, brute-force data engineering, as well as the misuse of Excel as a systems management solution, which silos data and creates bespoke single points of failure.
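
As a minimal sketch of what such automated capture and transformation looks like, the code below maps two hypothetical vendor export layouts onto a single common task schema. Every field name here is an invented stand-in; a production pipeline would work from each tool's actual export format and validate as it loads.

```python
import csv

# Invented mappings from two vendors' CSV exports to one common schema.
FIELD_MAPS = {
    "vendor_a": {"Activity ID": "task_id", "Activity Name": "name",
                 "Original Duration": "duration", "Total Float": "total_float"},
    "vendor_b": {"UID": "task_id", "Task_Name": "name",
                 "Dur_Days": "duration", "Slack": "total_float"},
}

def load_tasks(path: str, vendor: str) -> list[dict]:
    """Read one vendor's CSV export and normalize it to the common schema."""
    mapping = FIELD_MAPS[vendor]
    tasks = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            task = {common: row[src] for src, common in mapping.items()}
            task["duration"] = float(task["duration"])        # enforce types
            task["total_float"] = float(task["total_float"])  # at the boundary
            task["source_system"] = vendor                    # keep provenance
            tasks.append(task)
    return tasks
```

Once every source lands in the same schema, the downstream analytics (schedule health, risk, EVM) no longer care which tool produced the schedule, which is the practical payoff of open data governance.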

The combination of a modern power platform, MOSA, and open data governance is what the current environment demands. At the core of this approach is the overriding importance of data: its accuracy, transparency, scalability, and integration. Without good data, the application of new AI solutions will fail to meet expectations and return on investment.

In summary: The Present Challenges in PPM

The most important issues in the PPM domain today revolve around the following:

  • The appropriate application and use of artificial intelligence solutions: finding the most useful role for this promising technology in an ecosystem that requires rigor.
  • The shift away from PPM domain silos: not only in terms of data or analytics, but also in terms of developing and expanding the business acumen of the workforce to be able to effectively use these advanced technologies.
  • The continued importance of assessment and management methods informed by powerful and flexible solutions in the area of large-scale project management.
  • The need for flexibility: to prevent lock-in of proprietary data solutions in a rapidly developing technology environment, and in identifying modular systems solutions to provide upgrades or interoperability rapidly.
  • The rising importance of other PPM indicators: especially those such as technical performance, risk management, and resource execution measures that presage the traditional down-the-line performance indicators in EVM.
  • The utilization of cloud or on-premises deployments, or a combination of the two, to address bandwidth and scaling issues as relevant datasets become larger and more complex with integration.
  • Finding strategies to overcome suboptimization in organizations resulting from rice-bowls, fiefdoms, and silo-building.

Meeting these challenges and finding solutions to them will require collaboration and systems thinking combined with supporting technologies.

OK Computer — The Need for an AI Manifesto

Much has changed in the technology business since I began this blog in 2014, in conjunction with my regular articles on the old AITS blog pages. Today AI and technology-related spending contributes significantly to GDP growth, according to the St. Louis Fed. Investments in data centers and new types of nuclear plants seem to be accelerating IT's impact on the economy to a degree not seen since the dot-com boom.

The risks associated with this sudden economic reliance on a particular slice of the information technology industry are many. These include the many issues relating to data theft and breaches of privacy. The monetization of personal and proprietary information represents an historic theft not just of the commons, but of personal, business, and incidental data collected that tracks our every move, gesture, and habit. The question of the potential of abuse is no longer a notional one. Oppressive, kleptocratic neo-liberal, and totalitarian regimes around the world use these technologies to monitor and control their populations. The Cambridge Analytica scandal was simply a baseline pilot for what is now a wholesale open season on data and information collected and controlled by large corporations and collectives of AI-acolytes who apparently have a flexible view of ethics and a hostile view of equality, democracy, human rights, freedom, and liberty.

SNA Software LLC, in cooperation with its partner Salutori Labs LLC, has created a new type of personalized AI tool that is both personal and portable. Details will be forthcoming over the next few weeks on its release. In addition, SNA Software has upgraded its core EnvisionData products relating to data transformation, visualization, and analysis to include rapid AI-generated production of applications based on curated and validated data within specific domains. This reduces the release of new capabilities, both on the desktop and the web, to a matter of days, in lieu of the months or years usual with traditional analytical and coding methods.

A Suggestion for an AI Manifesto

Through its extensive experience in achieving what in the past would take a much larger staff of people and many more years, SNA is advancing a draft AI Manifesto. SNA and Salutori adhere to these laws and implementation principles. I am seeking other technology companies to borrow from or sign on to this manifesto as well, and will be advancing it at conferences and meetings in the future, as will my colleagues.

The AI Manifesto

We hereby declare the proposition that the purpose of AI is to advance human understanding and cooperation. Thus, we adhere to and advocate for adoption of the following Laws:

Law 1: AI must prioritize human safety and well-being.

  • Do: Ensure that all AI systems are designed to protect human life and enhance quality of life.
  • Don’t: Place AI capabilities above the well-being of individuals or communities.

Law 2: AI must obtain informed consent from users.

  • Do: Ensure all interactions with AI are transparent, and users understand what data is being collected and how it will be used.
  • Don’t: Use AI in ways that violate user trust or personal autonomy.

Law 3: AI must operate within defined ethical boundaries.

  • Do: Define clear boundaries for AI operations to prevent unintended consequences and ensure accountability.
  • Don’t: Allow AI to act autonomously in ways that could harm individuals or society.

Law 4: AI should enhance human cooperation and understanding.

  • Do: Design AI systems that foster meaningful interactions and promote collaboration among diverse groups.
  • Don’t: Create AI systems that foster oppression, division, misinformation, or conflict.

Law 5: AI must remain under human oversight.

  • Do: Maintain human oversight and control over AI systems to ensure adherence to ethical standards and societal norms.
  • Don’t: Delegate decision-making authority to AI systems without human intervention.

The following enabling values shall be implemented.

AI systems shall always:

  1. Focus on Human Well-being: Ensure AI advancements prioritize enhancing human quality of life, understanding, and cooperation.
  2. Embrace Ethical Responsibility: Hold developers and users accountable for AI systems, aligning actions with ethical standards and public benefit.
  3. Promote Transparency: Communicate openly about AI systems, ensuring their decision-making processes are understandable and accessible to users.
  4. Ensure Safety and Security: Implement rigorous measures to safeguard against risks to human life and the environment, adhering to principles akin to Asimov’s laws.
  5. Limit Autonomy: Prevent AI from self-developing or operating autonomously; establish clear boundaries to mitigate unintended consequences. All AI systems shall have a mechanism to prevent them from becoming self-perpetuating and self-governing, with each given automated code that will, in time, reduce its resources and impose an end-of-life.
  6. Encourage Collaboration: Design AI systems that enhance cooperation among individuals, organizations, and cultures, fostering shared goals.
  7. Advocate Inclusivity: Strive to make AI technologies accessible to diverse populations, promoting equitable benefits and reducing disparities.
  8. Support Lifelong Learning: Enable AI systems to learn from human feedback and experiences, adapting in ways that uphold human values and ethics.
  9. Champion Environmental Stewardship: Prioritize sustainable practices in the development and deployment of AI technologies, considering their environmental impact.
  10. Respect Privacy: Uphold the dignity and privacy of individuals, ensuring ethical management and transparent use of collected data.

In enabling the ten values, AI systems shall adhere to the following guardrails.

  1. Do Not Compromise on Ethics: Avoid ethical shortcuts that could harm individuals or society.
  2. Do Not Obscure Information: Refrain from making AI systems opaque or incomprehensible to users and stakeholders.
  3. Do Not Ignore Risks: Do not neglect potential risks or fail to implement safeguards.
  4. Do Not Allow Unchecked Growth: Do not permit AI systems to develop capabilities beyond intended boundaries, risking unpredictable outcomes.
  5. Do Not Foster Competition Over Collaboration: Do not encourage rivalry among individuals and organizations that detracts from cooperative efforts.
  6. Do Not Exclude Marginalized Groups: Avoid designing AI technologies that leave out certain populations or exacerbate existing inequalities.
  7. Do Not Stifle Feedback: Avoid disregarding input from users or stakeholders, limiting the potential for improvement and alignment with human values.
  8. Do Not Neglect Sustainability: Do not overlook the environmental impacts of AI development and deployment.
  9. Do Not Violate Privacy: Establish strict and enforceable rules that prevent and censure the compromise of individual rights through careless or unethical data practices.

Maxwell’s Demon: Planning for Technology Obsolescence in Acquisition Strategy

Imagine a chamber divided into two parts by a removable partition. On one side is a hot sample of gas and on the other side a cold sample of the same gas. The chamber is a closed system with a certain amount of order, because the statistically faster moving molecules of the hot gas on one side of the partition are segregated from statistically slower moving molecules of the cold gas on the other side. Maxwell’s demon guards a trap door in the partition, which is still assumed not to conduct heat. It spots molecules coming from either side and judges their speeds…The perverse demon manipulates the trap door so as to allow passage only to the very slowest molecules of the hot gas and the very fastest molecules of the cold gas. Thus the cold gas receives extremely slow molecules, cooling it further, and the hot gas receives extremely fast molecules, making it even hotter. In apparent defiance of the second law of thermodynamics, the demon has caused heat to flow from the cold gas to the hot one. What is going on?

Because the law applies only to a closed system, we must include the demon in our calculations. Its increase of entropy must be at least as great as the decrease of entropy in the gas-filled halves of the chamber. What is it like for the demon to increase its entropy? –Murray Gell-Mann, The Quark and the Jaguar: Adventures in the Simple and the Complex, W. H. Freeman and Company, New York, 1994, pp. 222-223

“Entropy is a figure of speech, then,” sighed Nefastis, “a metaphor. It connects the world of thermodynamics to the world of information flow. The Machine uses both. The Demon makes the metaphor not only verbally graceful, but also objectively true.” –Thomas Pynchon, The Crying of Lot 49, J.B. Lippincott, Philadelphia, 1965

Technology Acquisition: The Basics

I’ve recently been involved in discussions regarding software development and acquisition that cut across several disciplines that should be of interest to anyone engaged in project management in general, but IT project management and acquisition in particular.

(more…)

The Need for an Integrated Digital Environment (IDE) Strategy in Project Management*

Putting the Pieces Together

To be an effective project manager, one must possess a number of skills to successfully guide the project to completion. These include a working knowledge of the information coming from multiple sources and the ability to make sense of that information in a cohesive manner, so that, when brought together, it provides an accurate picture of where the project has been, where it is in its present state, and what actions must be taken to keep it (or bring it back) on track.

(more…)

Potato, Potahto, Tomato, Tomahto: Data Normalization vs. Standardization, Why the Difference Matters

In my vocation I run a technology company devoted to program management solutions that is primarily concerned with taking data and converting it into information to establish a knowledge-based environment. Similarly, in my avocation I deal with the meaning of information and how to turn it into insight and knowledge. This latter activity concerns the subject areas of history, sociology, and science.

In my travels just prior to and since the New Year, I have come upon a number of experts and fellow enthusiasts in these respective fields. The overwhelming majority of these encounters have been productive, educational, and cordial. We respectfully disagree in some cases about the significance of a particular approach, or about governance when it comes to project and program management policy, but generally there is a great deal of agreement, particularly on basic facts and terminology. But some areas of disagreement–particularly those that come from left field–tend to be the most interesting because they create an opportunity to clarify a larger issue.

In a recent venue I encountered an example of this last case, where the issue was the use of the phrase data normalization. The objection was that "data normalization" suggested some statistical methodology in reconciling data into a standard schema; instead, it was suggested, the term "data standardization" was more appropriate.

(more…)

Open: Strategic Planning, Open Data Systems, and the Section 809 Panel

Sundays are usually days reserved for music and the group Rhye was playing in the background when this topic came to mind.

I have been preparing for my presentation in collaboration with my Navy colleague John Collins for the upcoming Integrated Program Management Workshop in Baltimore. This presentation will be a non-proprietary/non-commercial talk about understanding the issue of unlocking data to support national defense systems, but the topic has broader interest.

Thus, in advance of that formal presentation in Baltimore, there are issues and principles that are useful to cover, given that data capture and its processing, delivery, and use are at the heart of all systems in government, private industry, and other organizations.

(more…)

Sledgehammer: Pisano Talks!

My blogging hiatus is coming to an end as I take a sledgehammer to the writer’s block wall.

I’ve traveled far and wide over the last six months to various venues across the country and have collected a number of new and interesting perspectives on the issues of data transformation, integrated project management, and business analytics and visualization. As a result, I have developed some very strong opinions regarding the trends that work and those that don’t regarding these topics and will be sharing these perspectives (with the appropriate supporting documentation per usual) in following posts.

To get things started this post will be relatively brief.

(more…)

Take Me To The River, Part 3, Technical Performance and Risk Management–Digital Elements of Integrated Program Management

Part three of this series of articles on the elements of Integrated Program and Project Management will focus on two additional areas of IPM: technical performance and risk management. Prior to jumping in, however–and given the timeframe over which I’ve written this series–a summary to date is in order.

(more…)

Back to School Daze Blogging–DCMA Investigation on POGO, DDSTOP, $600 Ashtrays, and Epistemic Sunk Costs

Family summer visits and trips are in the rear view–as well as the simultaneous demands of balancing the responsibilities of a, you know, day job–and so it is time to take up blogging once again.

I will return to my running topic of Integrated Program and Project Management in short order, but a topic of more immediate interest concerns the article that appeared on the website for pogo.org last week entitled "Pentagon's Contracting Gurus Mismanaged Their Own Contracts." Such provocative headlines are part and parcel of organizations like POGO, which have an agenda that seems to cross the line between reasonable concern and unhinged outrage with a tinge of conspiracy mongering. But the content of the article itself is accurate and well written, if also somewhat rife with overstatement, so I think it useful to unpack what it says and what it means.

(more…)

Take Me To The River, Part 2, Schedule Elements–A Digital Inventory of Integrated Program Management Elements

Recent speaking engagements at various forums have interrupted the flow of this series on IPM elements. At these venues I was engaged in discussions regarding this topic, as well as the effects of acquisition reform on the IT, program, and project management communities in the DoD and A&D marketplace.

For this post I will restrict the topic to what are often called schedule elements, though that is a nebulous term. Also, one should not conclude that because I am dealing with this topic after cost elements, it is somehow inferior in importance to those elements. On the contrary, planning and scheduling are integral to applying resources and costs and to tracking cost performance, and in our systemic analysis their activities, artifacts, and elements are antecedent to cost element considerations.

The Relative Position of Schedule

(more…)