I Get By With A Little Help… — Avoiding NIH in Project Management

…from my colleagues, friends, family, associates, advisors, mentors, subcontractors, consultants, employees.  And not necessarily in that order.

The term NIH in this context is not referring to the federal agency.  It is shorthand, instead, for “Not Invented Here”.  I was reminded of this particular mindset when driving through an old neighborhood where I served as a community organizer.  At one of the meetings of a local board, which was particularly dysfunctional (and where I was attempting to reform their dysfunction), a member remarked:  “I am tired of hearing about how this or that particular issue was handled somewhere else.”  Yes, I thought, why would we possibly want to know how Portland, or D.C., or Boston, or Denver, or Phoenix–or any of the number of other places faced with the same issue–effectively or ineffectively dealt with it before us?  What could they possibly teach us?

When we deal with a project management organization, we are dealing with a learning system.  Hopefully an effectively adaptive system.  The qualifier here is important.  The danger with any tightknit group is to fall into the dual traps of Groupthink and NIH.  The first describes conformity behavior within small groups, based on the observations of William H. Whyte and the studies of his successors.  The second is the mindset that the issues faced by the group are unique to it, and so models, tools, experience, and proven statistical and leading indicators do not apply.

A project management organization (or any complex human organization) is one that adapts to pressures from its environment.  It is one with the ability to learn, since it is made up of entities that can create and utilize intelligence and information, and so it is distinct from biological systems, which adapt over time through sexual and natural selection.  Here is also an important point:  while biological evolution occurs over long spans of time, we don’t see the dead ends and failures of adaptation until the story is written–at least, not outside the microbiological field, where the evolution of viruses and bacteria occurs rapidly.  So for large animals and major species it appears to be a Panglossian world, which it definitely is not.

When we take Panglossian thinking into the analogies that we find in social and other complex adaptive systems, the fallacies in our thinking can be disastrous and cause great unnecessary suffering.  I am reminded here of the misuse of the concept of self-organization in complex systems and of the term “market” in economics.  Organizations and social structures can “self-organize” not only into equilibrium but also into spirals of failure and death.  Extremely large and complex organizations like nation-states and societies are replete with such examples: from Revolutionary France to Czarist Russia, to recent examples in Africa and the Near East.  In economics, “the market” determines price.  The inability of the market to self-regulate–and the nature of self-organization–resulted in the bursting of the housing bubble in the first decade of this century, precipitating a financial crisis.  This is the most immediate example of a systemic death spiral of global proportions, one resolved only with a great deal of intervention by rational actors.

So when I state: hopefully an effectively adaptive system, I mean one that does not adapt itself by small steps into unnecessary failure or wasted effort.  (As our business, financial, economic, and political leaders did in the early 2000s.)  Along these same lines, when we speak of fitness in surviving challenges (or fitness in biological evolution), we do not imply the “best” of something.  Fitness is simply a convenient term to describe the survivors after all of the bodies have been counted.  In nature one can survive through plain luck, through inherited capabilities or characteristics suited to the environment, through favorable chance isolation or local conditions–the list is extensive.  Many of these same factors also apply to social and complex adaptive systems, but on a shorter timescale and with a higher degree of traceable proximate cause-and-effect, depending on the size and scale of the organization.

In project management systems, it is important to establish the closed-loop systems necessary to gain feedback from the environment and determine whether the organization is effectively navigating toward its goals against a plan.  But it is also necessary to have systems in place that allow for leveraging both organizational and competency knowledge, as well as third-party solutions.  That is, broadening the base in applying intelligence.
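To make the closed-loop idea concrete, here is a minimal sketch of a plan-versus-actual feedback check in the style of earned value management.  The class, function, and field names are illustrative assumptions, not anything prescribed in this essay; the point is only that the loop compares performance against the plan and emits signals that can trigger corrective action.

```python
# A minimal sketch of closed-loop feedback against a plan, using
# earned-value-style variances.  All names here are illustrative.

from dataclasses import dataclass


@dataclass
class PeriodStatus:
    planned_value: float  # budgeted cost of work scheduled
    earned_value: float   # budgeted cost of work performed
    actual_cost: float    # actual cost of work performed


def variances(status: PeriodStatus) -> dict:
    """Compare performance against the plan; negative values are the
    feedback signals that should prompt corrective action."""
    return {
        "schedule_variance": status.earned_value - status.planned_value,
        "cost_variance": status.earned_value - status.actual_cost,
    }


period = PeriodStatus(planned_value=100.0, earned_value=90.0, actual_cost=110.0)
signals = variances(period)
# Here both variances are negative: the work is behind schedule and
# over cost, so the loop should feed back into replanning.
```

The specific metrics matter less than the loop itself: measure, compare against the plan, and act on the difference.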

This includes not only education, training, mentoring, and the retention and use of a diversified mix of experienced knowledge workers, but also borrowing solutions from outside the organization.  It means being open to all of the tools available for avoiding NIH.  Chances are, though the time, place, and local circumstances may differ, someone has faced something very similar before somewhere else.  With information becoming so ubiquitous, there is very little excuse for restricting our sources.

Given this new situation, our systems must now possess the ability to apply qualitative selection criteria in identifying analogous information, tempered with judgment in identifying the differences between situations where they exist.  And since most systems–including systems of failure–organize themselves into types and circumstances that can be generalized into assumptions, we should be able to leverage both the differences and the similarities to develop a shortcut that doesn’t require repeating all of the previous steps (and their failures).

In closing, I think it important to note that failure here is defined as the inability of the organization to come up with an effective solution to the problem at hand, where one is possible.  I am not referring to failure as the inability to achieve an intermediate goal.  In engineering and other fields of learning, including business, failure is oftentimes a necessary part of the process, especially when pushing technologies where previous examples and experience do not apply.  The lessons learned from a failed test in that situation, for example, can be extremely valuable.  But a failed test that resulted from the unwillingness of the individuals in the group to consider similar experience or results, due to NIH, is inexcusable.