Over at AITS.org — The Human Equation in Project Management

Approaches to project management have focused on the systems, procedures, and software put in place to determine progress and likely outcomes. These outcomes are usually expressed in terms of cost, schedule, and technical achievement against the project requirements and framing assumptions: the oft-cited three-legged stool of project management. What is often missing are measures related to human behavior within the project systems environment. In this article at AITS.org, I explore this oft-ignored dimension.

Over at AITS.org — Open a Window: Using Data and Self-Awareness to Remove Organizational Blind Spots

As I’ve written in the past, while I get over my recent writer’s block, all of the interesting articles on project management can be found at AITS.org. My latest post deals with the use of data in approaching the organizational Johari Window. Please check it out.

Brother, Can You (Para)digm? — Four of the Latest Trends in Project Management

At the beginning of the year we are greeted with the annual list of hottest “project management trends” prognostications. We are now three months into the year, and I think it worthwhile to note the latest developments that have come up in project management meetings, conferences, and in the field. Some of these align with what you may have seen in earlier articles, but these are the four that I find most significant thus far, and there may be a couple of surprises for you here.

a.  Agile and Waterfall continue to duke it out. As the term Agile is adapted and modified to real-world situations, the cult purists become shriller in attempting to enforce the Manifesto that may not be named. In all seriousness, it is not as if most of these methods had not been used previously, and many of them, like Scrum, have their roots in Waterfall and earlier methods. A great online overview and book on the elements of Scrum can be found at Agile Learning Labs. But there is a wide body of knowledge out there concerning social and organizational behavior that is useful in sorting out what works from what doesn’t. For example, the observational science behind span of control, team building, team structure in support of organizational effectiveness, and the use of sprints to avoid the perpetual death spiral of adding requirements without defining “done” identifies the practices of successful teams (depending on how you define success, keeping in mind that a team that successfully produces the product often still fails as a going concern, and thus falls into obscurity).

All that being said, if you want to structure these best practices into a cohesive methodology, call it Agile, Waterfall, or Harry, and can make money at it while helping people succeed in a healthy work environment, more power to you. In IT, however, it is this last point that makes this particular controversy seem like we’ve been here before. When woo-woo concepts like #NoEstimates and self-organization are thrown about, the very useful and empirical nature of the enterprise gives way to magical thinking and ideology. The mathematics of unsuccessful IT projects has not changed significantly since the shift to Agile. From what one can discern from the so-called studies on the market, which are mostly anecdotal or based on unscientific surveys, somewhere north of 50% of IT projects fail, with failure defined as coming in behind schedule, running over cost, or failing to meet functionality requirements.

Given this, Agile seems to be the latest belle of the ball, and virtually any process improvement introducing Scrum, teaming, and sprints seems to get the tag. Still, there is much blood and thunder being expended for a result that amounts to the same mathematical chance of success as a coin flip, and probably less. I think that for the remainder of the year the more acceptable and structured portions of Agile will get the nod.

b.  Business technology is now driving process. This trend, I think, is why process improvements like Agile, which claim to be a panacea, cannot deliver on their promises. As best practices they can help organizations avoid a net negative, but they can rarely provide a net positive. Applying new processes and procedures while driving blind will still run you off the road. The big story in 2015, I think, is the ability to handle big data and to integrate that data in a manner that more clearly reveals context to business stakeholders. For years in A&D, DoD, governance, and other verticals engaged in complex, multi-year project management, we have seen the push and pull of interests regarding the amount of data that is delivered or reported. With new technologies this is no longer an issue. Delivering a 20GB file has virtually the same marginal cost as delivering a 10GB file, and sizes smaller than 1GB aren’t even worth talking about.

Recently I heard someone fret about the storage space required for all this immense data (it’s immense, I tell you!). Well, storage is cheap, and large amounts of data can be accessed through virtual repositories using APIs and smart methods of normalizing data that requires integration at the level defined by the systems’ interrelationships. There is more than one way to skin this cat, and more methods for handling bigger data come on-line every year. Thus, the issue is not more or less data, but better data, regardless of the size of the underlying file or table structure or the amount of information. The first go-round of this process will require that all of the data already available in repositories be surveyed to determine how to optimize the information it contains. Then, once that information is transformed into intelligence, the best manner of delivery must be determined so that it provides both significance and context to the decision maker. For many organizations, this is the question that will be answered in 2015 and into 2016. At that point it is the data that will dictate the systems and procedures needed to take advantage of this powerful advance in business intelligence.
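To make the normalization point concrete, here is a minimal sketch in Python of what mapping two stove-piped feeds onto a common schema can look like. Everything in it (the field names, the source layouts, and the normalize_* helpers) is hypothetical and invented for illustration; real feeds would be mapped according to the actual interrelationships of the source systems.

```python
# Minimal sketch: normalize project data from two stove-piped sources
# into one common schema. All field names and layouts are hypothetical.

def normalize_cost_record(rec):
    """Map a record from a (hypothetical) cost system onto the common schema."""
    return {
        "task_id": rec["WBS_ELEMENT"].strip(),
        "period": rec["FISCAL_MONTH"],            # e.g. "2015-03"
        "actual_cost": float(rec["ACWP"]),        # actual cost of work performed
        "source": "cost_system",
    }

def normalize_schedule_record(rec):
    """Map a record from a (hypothetical) scheduling tool onto the same schema."""
    return {
        "task_id": rec["activity_id"].strip(),
        "period": rec["status_date"][:7],         # keep the year-month only
        "percent_complete": float(rec["pct_complete"]),
        "source": "schedule_tool",
    }

# Once both feeds share task_id and period, they can be joined and compared,
# regardless of how large the underlying files or tables happen to be.
cost_feed = [{"WBS_ELEMENT": "1.2.3 ", "FISCAL_MONTH": "2015-03", "ACWP": "125000"}]
schedule_feed = [{"activity_id": "1.2.3", "status_date": "2015-03-31", "pct_complete": "40"}]

combined = ([normalize_cost_record(r) for r in cost_feed]
            + [normalize_schedule_record(r) for r in schedule_feed])

for row in combined:
    print(row)
```

The design choice worth noticing is that the hard work is in the mapping functions, not the storage: once every source answers to the same keys, the size of the underlying repository becomes a secondary concern, which is the point of the paragraph above.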

c.  Cross-functional teams will soon morph into cross-functional team members. As data originating from previously stove-piped competencies is integrated into a cohesive whole, the skillsets necessary to understand the data, convert it into intelligence, and act appropriately on that intelligence will shift toward a broader, multi-disciplinary understanding. Businesses and organizations will soon find that they can no longer afford the specialist who understands only cost, schedule, risk, or any one of the other specialties dictated by the old line-and-staff and division-of-labor practices of the 20th century. Businesses and organizations that place short-term shareholder and equity-holder interests ahead of the business will soon find themselves out of business in this new world. The same will apply to organizations that continue to suppress and compartmentalize data. This is because a cross-functional individual who can maximize the use of this new information paradigm requires education and development. Achieving this goal dictates the establishment of a learning organization, which requires investment and a long-term view. A learning organization develops its members to become competent in each aspect of the business, with development including successive assignments of greater responsibility and complexity. For the project management community, we will increasingly see the introduction of more Business Analysts and, I think, the introduction of the competency of Project Analyst to displace, at first, both the cost analyst and the schedule analyst. Other competency consolidations will soon follow.

d.  The new cross-functional competencies, Business Analysts and Project Analysts, will take on an increasing role in the design and deployment of technology solutions in the business. This takes us full circle in a feedback loop that begins with big data driving process. Organizations that have implemented the new technologies and are taking advantage of new insights are not only introducing new multi-disciplinary competencies, but also introducing new technologies that adapt the user environment to the needs of the business. Once the business or project analyst has determined how to interact with the data and the systems necessary to the decision-making process, adaptable technologies that avoid the hard-coded “one size fits all” user interface are finding, and will continue to find, wide acceptance (a simple sketch of the idea follows below). Retiring the off-line and one-off utilities that have been used to fill the gaps left by inflexible, hard-coded business applications will allow innovative approaches to analysis to be mainstreamed into the organization. Once again, we are already seeing this effect in 2015, and the trend will only accelerate as greater technological knowledge becomes an essential element of being an analyst.
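Here is the sketch promised above: a toy Python example of a configuration-driven report, as opposed to a hard-coded “one size fits all” view. The column names, threshold, and REPORT_CONFIG structure are all hypothetical; the point is simply that the analyst, not the programmer, decides what the tool shows.

```python
# Minimal sketch: a report driven by analyst-supplied configuration rather
# than a hard-coded layout. All names and thresholds are hypothetical.

REPORT_CONFIG = {
    "columns": ["task_id", "actual_cost", "percent_complete"],
    "flag_if_over": {"actual_cost": 100000},   # flag rows exceeding a threshold
}

def render_report(rows, config):
    """Print only the columns the analyst configured, flagging exceptions."""
    for row in rows:
        flagged = any(
            row.get(col, 0) > limit
            for col, limit in config["flag_if_over"].items()
        )
        values = [str(row.get(col, "")) for col in config["columns"]]
        print(("! " if flagged else "  ") + " | ".join(values))

rows = [
    {"task_id": "1.2.3", "actual_cost": 125000, "percent_complete": 40},
    {"task_id": "1.2.4", "actual_cost": 80000, "percent_complete": 65},
]
render_report(rows, REPORT_CONFIG)
```

Changing what the report shows means editing the configuration, not the application, which is why this style of tool displaces the off-line, one-off utilities mentioned above.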

Despite dire predictions regarding innovation, it appears that we are on the cusp of another rapid shift in organizational transformation. The new world of big data comes with both great promise and great risk. For project management organizations, the key to taking advantage of that promise while minimizing the risk is to stay ahead of the transformation, embracing it and leading the organization to position itself to reap the benefits.