As many of my colleagues in project management know, I wrote a series of articles on the application of technical performance risk in project management back in 1997, one of which made me an award recipient from the institution now known as the Defense Acquisition University. Over the years various researchers and project organizations have asked me whether I have any additional thoughts on the subject, and my response up until now has been: no. From a practical standpoint, other responsibilities took me away from the domain of determining the best way of recording technical achievement in complex projects. Furthermore, I felt that the field was not ripe for further development until there were mathematical and statistical methods that could better approach the behavior of complex adaptive systems.
Big Time — Elements of Data Size in Scaling
I’ve run into additional questions about scalability. It is important to understand the concept in terms of assessing software against data size, since there are various aspects to approaching the issue.
Unlike situations where data is already sorted and structured as part of the core functionality of the software service being provided, here we are dealing with an environment in which many third-party software “tools” put data into proprietary silos. These silos act as barriers to optimizing data use and gaining corporate intelligence. The goal is to apply in real terms the concept that the customers generating the data (or the stakeholders who pay for the data) own the data and should have full use of it across domains. In project management and corporate governance this is an essential capability.
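To make this concrete, here is a minimal sketch of what pulling data out of tool silos into a single open store might look like. The tool exports, field names, and mappings are hypothetical illustrations, not any vendor’s actual format:

```python
# A minimal sketch of normalizing siloed exports into one open, queryable
# store. All tool names, fields, and mappings here are hypothetical.
import sqlite3

# Stand-ins for two tools' proprietary exports, each with its own field names.
scheduler_rows = [{"TaskID": "1.1", "PctComp": "45"}]
cost_tool_rows = [{"wbs": "1.1", "phys_pct": "40"}]

# Per-tool mappings from native fields into a common, open schema.
sources = [
    (scheduler_rows, {"TaskID": "task_id", "PctComp": "percent_complete"}),
    (cost_tool_rows, {"wbs": "task_id", "phys_pct": "percent_complete"}),
]

# Land everything in one open store that the data's owners control and can query.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE status (task_id TEXT, percent_complete REAL)")
for rows, field_map in sources:
    for row in rows:
        common = {new: row[old] for old, new in field_map.items()}
        con.execute("INSERT INTO status VALUES (:task_id, :percent_complete)", common)

for task_id, pct in con.execute("SELECT task_id, percent_complete FROM status"):
    print(task_id, pct)
```

Once the data lands in a common schema, any cross-domain question becomes a query rather than a negotiation with a vendor.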
The Monster Mash — Zombie Ideas in Project and Information Management
I just completed a number of meetings and discussions with thought leaders in the area of complex project management this week, and I was struck by a number of zombie ideas in project management, especially related to information, that just won’t die. The term zombie idea is usually attributed to the Nobel economist Paul Krugman, from his excellent and highly engaging (as well as brutally honest) posts at the New York Times, but for those not familiar, a zombie idea is “a proposition that has been thoroughly refuted by analysis and evidence, and should be dead — but won’t stay dead because it serves a political purpose, appeals to prejudices, or both.”
The Future — Data Focus vs. “Tools” Focus
The title in this case is from the Leonard Cohen song.
Over the last few months I’ve come across this issue quite a bit, and it goes to the heart of where software technology is leading us. The basic question can be boiled down to whether software should be thought of as a set of “tools” or as an overarching solution that can handle data in the way the organization requires. It is a fundamental question because what we call Big Data, despite all of the hoopla, is really a relative term that changes with hardware, storage, and software scalability. What was Big Data in 1997 is not Big Data in 2016, and will not be Big Data in 2030.
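To illustrate just how relative the term is, here is a toy sketch that picks a processing strategy based on data size measured against the memory of the machine at hand; the thresholds and strategy labels are my own illustrative assumptions, not a standard:

```python
# A toy illustration that "big" is relative to the hardware doing the work.
# Thresholds and strategy labels are illustrative assumptions only.

def processing_strategy(data_size_bytes, mem_budget_bytes):
    """Pick an approach based on data size relative to available memory."""
    if data_size_bytes < 0.1 * mem_budget_bytes:
        return "load in memory"        # trivially small for this machine
    if data_size_bytes < mem_budget_bytes:
        return "stream in chunks"      # large, but a single node copes
    return "distribute across nodes"   # only now is it "big" for this hardware

two_gb = 2 * 1024**3
print(processing_strategy(two_gb, 128 * 1024**2))  # 1997-class RAM: distribute across nodes
print(processing_strategy(two_gb, 64 * 1024**3))   # modern server: load in memory
```

The same two gigabytes that would have demanded a distributed effort in 1997 is an in-memory afterthought today.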
For What It’s Worth — More on the Materiality and Prescriptiveness Debate and How it Affects Technological Solutions
The underlying basis of the materiality vs. prescriptiveness debate that I previously wrote about lies in two areas: contractual compliance, especially in the enforcement of public contracts, and the desired outcomes under the establishment of a regulatory regime within an industry. Sometimes these purposes are in agreement, and sometimes they are in conflict and work at cross-purposes to one another.
Stay Open — Open and Proprietary Databases (and Why It Matters)
The last couple of weeks have been fairly intense workwise, and so blogging has lagged a bit. Along the way the matter of databases came up at a customer site: what constitutes open data and what comprises proprietary data. The reason why this issue matters to customers rests on several foundations.
First, in any particular industry or niche a wide variety of specialized apps has blossomed, largely due to Moore’s Law. Looking at the number of hosted and web apps alone can be quite overwhelming, particularly given the opaqueness of what one is buying at any particular time when it comes to software technology.
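As a concrete illustration of what “open” buys a customer, consider this small sketch using an in-memory SQLite database. The table and column names are hypothetical, but the point stands: any SQL-speaking tool can ask the question, with no vendor application in the loop:

```python
# A minimal sketch of open data access: a documented schema plus standard SQL
# means any tool can read the data. Table and column names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE milestones (name TEXT, baseline_date TEXT, forecast_date TEXT)"
)
con.execute("INSERT INTO milestones VALUES ('CDR', '2016-09-01', '2016-10-15')")

# Any reporting or BI tool can pose this question directly against the schema.
for (name,) in con.execute(
    "SELECT name FROM milestones WHERE forecast_date > baseline_date"
):
    print(name, "is forecasting later than its baseline")
```

A proprietary silo, by contrast, makes the vendor’s application the only door to the same answer.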
New Directions — Fourth Generation apps, Agile, and the New Paradigm
The world is moving forward and Moore’s Law is accelerating in interesting ways on the technology side, which opens new opportunities, especially in software. In the past I have spoken of the flexibility of Fourth Generation software, that is, software that doesn’t rely on structured hardcoding but instead focuses on the data to deliver information to the user in more interesting and essential ways. I work in this area for my day job, and so using such technology has tipped over more than a few rice bowls.
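A simplified sketch of the contrast might look like the following. The metric names and rules are illustrative assumptions; the point is that the data-driven version changes behavior by editing data rather than rewriting code:

```python
# An illustrative contrast between hardcoded logic and a data-driven design.
# Metric names and rules are hypothetical examples, not a product's API.

# Hardcoded: adding a new metric means rewriting and redeploying the function.
def report_hardcoded(record):
    print("Cost variance:", record["bcwp"] - record["acwp"])
    print("Schedule variance:", record["bcwp"] - record["bcws"])

# Data-driven: the report is itself data; a new metric is just a new row.
REPORT_SPEC = [
    ("Cost variance", lambda r: r["bcwp"] - r["acwp"]),
    ("Schedule variance", lambda r: r["bcwp"] - r["bcws"]),
]

def report_data_driven(record, spec):
    for label, rule in spec:
        print(f"{label}: {rule(record)}")

record = {"bcws": 100.0, "bcwp": 90.0, "acwp": 110.0}
report_data_driven(record, REPORT_SPEC)
```

Extending the second version touches only REPORT_SPEC, which is the kind of flexibility I mean when the focus moves from code to data.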
Do You Believe in Magic? — Big Data, Buzz Phrases, and Keeping Feet Planted Firmly on the Ground
My alternative title for this post was “Money for Nothing,” which is along the same lines. I have been engaged in discussions regarding Big Data, which has become a bit of a buzz phrase of late in both business and government. Under the current drive to maximize the value of existing data, every data source, stream, lake, and repository (and the list goes on) has been subsumed by this concept. So, at the risk of being a killjoy, let me point out that not all large collections of data are “Big Data.” Furthermore, once a category of data gets tagged as Big Data, one seems to depart further from the world of reality in determining how to approach and use the data. So for those of you who find yourselves in this situation, let’s take a collective deep breath and engage our critical thinking skills.
The Water is Wide — Data Streams and Data Reservoirs
I’ve had many opportunities lately to focus, in a practical way, on data quality and approaches to data. There is some criticism in our industry about using metaphors to describe concepts in computing.
Like any form of literature, however, there are good and bad metaphors. Opposing them in general, I think, is contrarian posing. Metaphors, after all, often allow us to discover insights into an otherwise opaque process, clarifying in our mind’s eye what is being observed by deriving similarities to something more familiar. Strong metaphors allow us to identify analogues among the phenomena being observed, providing a ready path to establishing a hypothesis. Once the metaphor has served this purpose, we can test that hypothesis to see whether it contributes to understanding.
The Song Remains the Same (But the Paradigm Is Shifting) — Data Driven Assessment and Better Software in Project Management
Probably the biggest DoD-centric project management news this past week was the unofficial announcement by Frank Kendall, the Undersecretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)), that the threshold for mandatory detailed surveillance of programs would be raised from the present $20M to $100M. While earned value management implementation and reporting will still be required on programs based on dollar value, risk, and other key factors, especially the $20M threshold for R&D-type projects, the raising of the threshold for mandatory surveillance reviews was seen as good news all around for reducing some regulatory burden. The big proviso in this announcement, however, is that it goes into effect later this summer, and that if the data in reporting submissions show inconsistencies and other anomalies that call into question the validity of performance management data, then all bets are off and the surveillance regime is once again imposed, though by exception.
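By way of illustration only, the kind of automated consistency check that might flag a questionable submission could look like the sketch below; the field names and rules are my own assumptions, not the government’s actual validation suite:

```python
# A hypothetical sketch of basic sanity checks on performance-management data.
# Field names and rules are illustrative assumptions only.

def validate_submission(rows):
    """Yield (task_id, problem) for rows failing basic consistency checks."""
    for r in rows:
        if r["bcwp_cum"] > r["budget_at_completion"]:
            yield r["task_id"], "cumulative earned value exceeds total budget"
        if any(r[k] < 0 for k in ("bcws_cum", "bcwp_cum", "acwp_cum")):
            yield r["task_id"], "negative cumulative value"

rows = [
    {"task_id": "1.1", "bcws_cum": 50.0, "bcwp_cum": 120.0,
     "acwp_cum": 60.0, "budget_at_completion": 100.0},
]
for task_id, problem in validate_submission(rows):
    print(task_id, "->", problem)
```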