Days of Future Passed — Legacy Data and Project Parametrics

I’ve had a lot of discussions lately on data normalization, including being asked the question of what constitutes normalization when dealing with legacy data, specifically in the field of project management.  A good primer can be found at About.com, but there are also very good older papers out on the web from various university IS departments.  The basic principles of data normalization today consist of finding a common location in the database for each value, reducing redundancy, properly establishing relationships among the data elements, and providing flexibility so that the data can be properly retrieved and further processed into intelligence in such a way that the objects produced possess significance.
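To make those principles a bit more concrete, here is a minimal sketch, assuming a flat legacy extract with entirely hypothetical column names and values: the project attributes that repeat on every row are split out into their own table, each value gets a single home, and the relationship between project and periodic results is made explicit.

```python
# Minimal sketch of normalizing a flat legacy extract into related tables.
# All column names and values here are hypothetical, for illustration only.
import sqlite3

# Flat legacy extract: the project name and contract number repeat on every row.
legacy_rows = [
    ("Project A", "HQ0001-15-C-0001", "1.1.1", "2015-01", 120.0, 115.0),
    ("Project A", "HQ0001-15-C-0001", "1.1.2", "2015-01", 80.0, 95.0),
    ("Project A", "HQ0001-15-C-0001", "1.1.1", "2015-02", 130.0, 128.0),
]

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Each fact gets one home: project attributes live once in `project`,
# periodic results live in `performance`, related by a foreign key.
cur.executescript("""
CREATE TABLE project (
    project_id  INTEGER PRIMARY KEY,
    name        TEXT UNIQUE NOT NULL,
    contract_no TEXT NOT NULL
);
CREATE TABLE performance (
    project_id  INTEGER NOT NULL REFERENCES project(project_id),
    wbs_element TEXT NOT NULL,
    period      TEXT NOT NULL,
    planned     REAL,
    actual      REAL,
    PRIMARY KEY (project_id, wbs_element, period)
);
""")

for name, contract, wbs, period, planned, actual in legacy_rows:
    cur.execute("INSERT OR IGNORE INTO project (name, contract_no) VALUES (?, ?)",
                (name, contract))
    project_id = cur.execute("SELECT project_id FROM project WHERE name = ?",
                             (name,)).fetchone()[0]
    cur.execute("INSERT INTO performance VALUES (?, ?, ?, ?, ?)",
                (project_id, wbs, period, planned, actual))

# The relationships are now explicit and the data retrievable without redundancy.
for row in cur.execute("""
    SELECT p.name, f.wbs_element, f.period, f.planned, f.actual
    FROM performance f JOIN project p USING (project_id)
    ORDER BY f.period, f.wbs_element"""):
    print(row)
conn.close()
```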

The reason this question is so important is that our legacy data is of such a size and complexity that it falls into the broad category of Big Data.  The condition of the data itself varies widely in terms of quality and completeness.  Without understanding the context, interrelationships, and significance of the elements of the data, the empirical approach to project management is threatened, since our ability to use this data to establish trends and perform parametric analysis is limited.

A good paper that deals with this issue was authored by Alleman and Coonce, though it was limited to Earned Value Management (EVM).  I would argue that EVM, especially in the types of industries in which the discipline is used, is pretty well structured already.  The challenge is in the other areas that are probably of more significance in getting a fuller understanding of what is happening in the project: the areas of schedule, risk, and technical performance measures.

In looking at the Big Data that has been normalized to date–and I have participated with others in putting a significant dent in this area–it is apparent that processes in these other areas lack discipline, consistency, completeness, and veracity.  By normalizing data in sub-specialties that have experienced an erosion in enforcing standards of quality and consistency, technology becomes a driver for process improvement.

A greybeard in IT project management once said to me (and I am not long in joining that category): “Data is like water, the more it flows downstream the cleaner it becomes.”  What he meant is that the more data is exposed in the organizational stream, the more it is questioned and becomes part of our closed feedback loop: constantly being queried, verified, utilized in decision making, and validated against reality.  Over time, more sophisticated and reliable statistical methods can be applied to the data, especially performance data of one sort or another, methods that take periodic volatility into account in trending and provide us with a means of ensuring credibility in using the data.
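As a purely illustrative sketch of the kind of method meant here (the monthly cost performance index values below are invented), even a simple exponentially weighted moving average with a volatility flag keeps a single noisy period from dominating the trend:

```python
# Illustrative sketch only: smoothing a noisy monthly performance index so a
# single volatile period does not dominate the trend. The CPI values below are
# hypothetical; any periodic performance series could be substituted.
from statistics import mean, stdev

monthly_cpi = [1.02, 0.98, 1.01, 0.98, 0.99, 1.06, 0.97, 0.95, 0.96, 0.94]

def ewma(series, alpha=0.3):
    """Exponentially weighted moving average: recent periods weigh more,
    but one-off spikes are damped rather than taken at face value."""
    smoothed = [series[0]]
    for x in series[1:]:
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

trend = ewma(monthly_cpi)
mu, sigma = mean(monthly_cpi), stdev(monthly_cpi)

for period, (raw, smooth) in enumerate(zip(monthly_cpi, trend), start=1):
    flag = "volatile" if abs(raw - mu) > 2 * sigma else ""
    print(f"period {period:2d}  raw {raw:.2f}  trend {smooth:.3f}  {flag}")
```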

In my last post on Four Trends in Project Management, I posited that the question was not one of more or less data, but of utilizing data more effectively and identifying what is significant, and therefore “better,” data.  I recently heard this line repeated back to me as a means of arguing against providing data.  That conclusion is a misreading of what I was proposing.  Reporting data at one level in today’s environment is no more work than reporting at any other particular level of a project hierarchy.  So cost is no longer a valid point for objecting to data submission (unless, of course, the party taking that position is willing to admit to the deficiencies in their IT systems or the unreliability of their data).

Our projects must be measured against the framing assumptions under which they were first formed, as well as the established measures of effectiveness, measures of performance, and measures of technical achievement.  In order to view these factors one must have access to data originating from a variety of artifacts: the Integrated Master Schedule, the Schedule and Cost Risk Analysis, and the systems engineering/technical performance plan.  I would propose that project financial execution metrics are also essential to a complete, integrated view of our projects.

There may be other supplemental data that is necessary as well.  For example, the NDIA Integrated Program Management Division has proposed a revision to what is known as the Integrated Baseline Review (IBR).  For the uninitiated, this is a process in which both the supplier and government customer project teams come together, review the essential project artifacts that underlie project planning and execution, and gain a full understanding of the project baseline.  The reporting systems that identify the data to be reported against the baseline are identified and verified at this review.  But there are also artifacts submitted here that contain data that is relevant to the project and worthy of continuing assessment, which would obviate the need for manual assessments and reviews down the line.

We don’t yet know the answer to these data questions and won’t until all of the data is normalized and analyzed.  Then the wheat can be separated from the chaff, and a more precise set of data can be identified for submittal, normalized, and placed in an analytical framework.  This will give us more precise and timely information, so that project stakeholders can make decisions about handling any risks that manifest themselves during the window in which they can be handled (or determine that they cannot be handled).  As the farmer says in the Chinese proverb:  “We shall see.”

Brother Can You (Para)digm? — Four of the Latest Trends in Project Management

At the beginning of the year we are greeted with the annual list of hottest “project management trends” prognostications.  We are now three months into the year, and I think it worthwhile to note the latest developments that have come up in project management meetings, conferences, and in the field.  Some of these are in alignment with what you may have seen in earlier articles, but these are the four that I find most significant thus far, and there may be a couple of surprises for you here.

a.  Agile and Waterfall continue to duke it out.  As the term Agile is adapted and modified to real-world situations, the cult purists become shriller in attempting to enforce the Manifesto that may not be named.  In all seriousness, it is not as if most of these methods had not been used previously, and many of them, like scrum, also have their roots in Waterfall and earlier methods.  A great on-line overview and book on the elements of scrum can be found at Agile Learning Labs.  There is also a wide body of knowledge out there concerning social and organizational behavior that is useful in applying what works and what doesn’t.  For example, the observational science behind span of control, team building, the structure of the team in supporting organizational effectiveness, and the use of sprints in avoiding the perpetual death-spiral of adding requirements without defining “done” underpins best practices that identify successful teams (depending on how you define success, keeping in mind that a successful team that produces the product often still fails as a going concern, and thus falls into obscurity).

All that being said, if you want to structure these best practices into a cohesive methodology, call it Agile, Waterfall, or Harry, and can make money at it while helping people succeed in a healthy work environment, all power to you.  In IT, however, it is this last point that makes this particular controversy seem like we’ve been here before.  When woo-woo concepts like #NoEstimates and self-organization are thrown about, the otherwise useful and empirical nature of the enterprise gives way to magical thinking and ideology.  The mathematics of unsuccessful IT projects has not changed significantly since the shift to Agile.  From what one can discern from the so-called studies on the market, which are mostly anecdotal or based on unscientific surveys, somewhere north of 50% of IT projects fail, with failure defined as being behind schedule, over cost, or failing to meet functionality requirements.

Given this, Agile seems to be the latest belle of the ball, and virtually any process improvement introducing scrum, teaming, and sprints seems to get the tag.  Still, there is much blood and thunder being expended for a result that amounts to a mathematical chance of success no better (and probably worse) than a coin flip.  I think for the remainder of the year the more acceptable and structured portions of Agile will get the nod.

b.  Business technology is now driving process.  This trend, I think, is why process improvements like Agile, which claim to be a panacea, cannot deliver on their promises.  As best practices they can help organizations avoid a net negative, but they rarely provide a net positive.  Applying new processes and procedures while driving blind will still run you off the road.  The big story in 2015, I think, is the ability to handle big data and to integrate that data in a manner that more clearly reveals context to business stakeholders.  For years in A&D, DoD, governance, and other verticals engaged in complex, multi-year project management, we have seen the push and pull of interests regarding the amount of data that is delivered or reported.  With new technologies this is no longer an issue.  Delivering a 20GB file has virtually the same marginal cost as delivering a 10GB file.  Sizes smaller than 1GB aren’t even worth talking about.

Recently I heard someone refer to the storage space required for all this immense data (it’s immense, I tell you!).  Well, storage is cheap, and large amounts of data can be accessed through virtual repositories using APIs and smart methods of normalization that integrate data at the level defined by the systems’ interrelationships; a rough sketch of this idea appears after this list.  There is more than one way to skin this cat, and more methods for handling bigger data are coming on-line every year.  Thus, the issue is not more or less data, but better data, regardless of the size of the underlying file or table structure or the amount of information.  The first go-round of this process will require that all of the data already available in repositories be surveyed to determine how to optimize the information it contains.  Then, once the data has been transformed into intelligence, the best manner of delivery must be determined so that it provides both significance and context to the decision maker.  For many organizations, this is the question that will be answered in 2015 and into 2016.  At that point it is the data that will dictate the systems and procedures needed to take advantage of this powerful advance in business intelligence.

c.  Cross-functional teams will soon morph into cross-functional team members.  As data originating from previously stove-piped competencies is integrated into a cohesive whole, the skillsets necessary to understand the data, convert it into intelligence, and act appropriately on that intelligence will begin to shift toward a broader, multi-disciplinary understanding.  Businesses and organizations will soon find that they can no longer afford the specialist who only understands cost, schedule, risk, or any one of the other specialties dictated by the old line-and-staff and division-of-labor practices of the 20th century.  Businesses and organizations that place short-term, shareholder, and equity-holder interests ahead of the business will soon find themselves out of business in this new world.  The same will apply to organizations that continue to suppress and compartmentalize data.  This is because a cross-functional individual who can maximize the use of this new information paradigm requires education and development.  Achieving this goal dictates the establishment of a learning organization, which requires investment and a long-term view.  A learning organization develops its members to become competent in each aspect of the business, with development including successive assignments of greater responsibility and complexity.  For the project management community, we will increasingly see the introduction of more Business Analysts and, I think, the introduction of the competency of Project Analyst to displace, at first, both cost analyst and schedule analyst.  Other competency consolidation will soon follow.

d.  The new cross-functional competencies, Business Analysts and Project Analysts, will take on an increasing role in the design and deployment of technology solutions in the business.  This takes us full circle in our feedback loop that begins with big data driving process.  We are already seeing organizations that have implemented the new technologies and are taking advantage of new insights not only introducing new multi-disciplinary competencies, but also introducing new technologies that adapt the user environment to the needs of the business.  Once the business and project analyst has determined how to interact with the data and the systems necessary to the decision-making process that follows, adaptable technologies that do not impose hard-coded “one size fits all” user interfaces are finding, and will continue to find, wide acceptance.  With fewer off-line and one-off utilities needed to fill the gaps left by the deficiencies of inflexible, hard-coded business applications, innovative approaches to analysis can be mainstreamed into the organization.  Once again, we are already seeing this effect in 2015, and the trend will only accelerate as possessing greater technological knowledge becomes an essential element of being an analyst.
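As promised above in item b., here is a rough, hypothetical sketch of what normalizing data from multiple systems to a canonical schema at the point of integration might look like.  The tool names, field names, and values are all invented for illustration; the point is that each source is exposed through a thin adapter rather than being copied wholesale into one giant file.

```python
# Rough sketch: map differently named fields from two hypothetical upstream
# systems into a single canonical schema at the point of integration.
from typing import Dict, Iterable, List

CANONICAL_FIELDS = ("project", "period", "metric", "value")

# Field mappings for two hypothetical upstream systems.
SOURCE_MAPPINGS: Dict[str, Dict[str, str]] = {
    "cost_tool":     {"proj_name": "project", "fiscal_month": "period",
                      "measure": "metric", "amount": "value"},
    "schedule_tool": {"program": "project", "status_date": "period",
                      "kpi": "metric", "kpi_value": "value"},
}

def normalize(source: str, records: Iterable[dict]) -> List[dict]:
    """Rename each source's fields into the canonical schema, dropping extras."""
    mapping = SOURCE_MAPPINGS[source]
    out = []
    for rec in records:
        canon = {mapping[k]: v for k, v in rec.items() if k in mapping}
        missing = set(CANONICAL_FIELDS) - set(canon)
        if missing:
            raise ValueError(f"{source}: missing fields {missing}")
        out.append(canon)
    return out

# Hypothetical extracts from each system, integrated into one canonical view.
cost_extract = [{"proj_name": "Project A", "fiscal_month": "2015-03",
                 "measure": "ACWP", "amount": 1250.0, "currency": "USD"}]
schedule_extract = [{"program": "Project A", "status_date": "2015-03",
                     "kpi": "BEI", "kpi_value": 0.92}]

combined = normalize("cost_tool", cost_extract) + normalize("schedule_tool", schedule_extract)
print(combined)
```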

Despite dire predictions regarding innovation, it appears that we are on the cusp of another rapid shift in organizational transformation.  The new world of big data comes with both great promise and great risks.  For project management organizations, the key to taking advantage of its promise and minimizing its risks is to stay ahead of the transformation by embracing it and leading the organization to position itself to reap its benefits.