Do You Know Where You’re Going To? — SecDef Ash Carter talks to Neil DeGrasse Tyson…and some thoughts on the international technology business

It’s time to kick off my 2017 blogging activity, and my readers have asked about my absence from this blog.  Because of the depth and research required by some of the issues that I consider essential, most of my blogging energy has been going to contributions elsewhere.  I strongly recommend that you check out the site if you haven’t already.  A great deal of useful PM information and content can be found there–and they have a strong editorial staff, so what does get to publication is pretty well sourced.  My next post on the site is scheduled for 25 January.  I will link to it once it becomes available.

For those of us just getting back into the swing of things after the holidays, there were a number of interesting events during that time that I didn’t get a chance to note.  Among these is that SecDef Ash Carter appeared on an episode of Neil DeGrasse Tyson’s excellent show “StarTalk“ (unfortunately behind a subscription wall), which airs on the National Geographic Channel.

Secretary Carter had some interesting things to say, among them:

a. His mentors in science, many of whom were veterans of the Second World War, instilled in him the concept of public service and giving back to the country.

b.  His experience under former SecDef Perry, when he was Assistant Secretary of Defense for International Security Policy, taught him that the DoD needed to be the “petri dish” for R&D in new technologies.

c.  That the DoD’s approach has been to leverage R&D from the international technology industry into new capabilities, given that many good ideas and developments occur outside of the United States.

d.  He encouraged more scientists to serve in the federal government and the Department of Defense, even if for a short while, to get a perspective on how things work at that level.

e.  He doesn’t believe that the biggest source of instability will necessarily be nation states; rather, small groups of individuals–given that destructive power is becoming portable–will be the emerging threat that his successor will face.

f. That maintaining the U.S. technological edge is essential to guaranteeing international stability and peace.

Secretary Carter’s recognition, in particular, that the technology industry is an international one strikes a personal chord with me, since my present vocation has caused me to introduce new capabilities into the U.S. market built from technologies that were developed by a close European ally.  The synergy that this meeting of the minds has created has begun to have a positive impact on the small portion of the market that my firm inhabits, changing the way people do business and shifting the focus from “tools” as the source of information to data, and what the data suggests.

This is not to say that cooperation in the international technology market is not fraught with the same rocks and shoals found in any business area.  But it is becoming increasingly apparent that new information technologies can be used as a means of evening the playing field because of the asymmetrical nature of information itself, which then lends itself to leverage given relatively small amounts of effort.

This also points to the importance of keeping an open mind and encouraging international trade, especially among our allies that are liberal democracies.  Recently my firm was the target of a protest on a government contract in which this connection to international trade was used as a means of questioning whether the firm was, indeed, a bona fide U.S. business.  The answer under U.S. law is a resounding “yes”–and that first decision was upheld on appeal.  For what we have done is–under U.S. management–leveraged technology first developed elsewhere; extended its capabilities; designed, developed, and localized it for the U.S. market; and in the process created U.S. jobs and improved U.S. processes.  This is a good deal all around.

Back in the day when I wore a U.S. Navy uniform during the Cold War, many of us in the technology and acquisition specialties looked to reform our systems and introduce innovative methods from wherever we could find them, whether they came from private industry or other government agencies.  When we came upon resistance because something was “the way it always was done,” our characterization of that attitude was “NIH”–that is, “Not Invented Here.”  NIH was a term that, in shorthand, described an invalid counterargument against process improvement that did not rely on merits or evidence.

And so it is today.  The world is always changing, but given new technologies the rate of change is constantly accelerating.  Adapting and adopting the best technologies available will continue to give us the advantage as a nation.  It simply requires openness and the ability to identify innovation when we see it.

Sunday Music Interlude — Lydia Loveless performing “Somewhere Else”

Lydia Loveless, though merely 25 years old, has been on the music scene in a big way for about six years, wowing critics and music lovers with her alt-country songs about life and living, which fuse elements of traditional country, rock, singer/songwriter, and punk.  She hails from Coshocton, Ohio, where she grew up on a farm and where her father ran a local honky-tonk for a while.  A member of a musical family, she performed in the band “Carson Drew”–which drew its inspiration from the father in the Nancy Drew book series–along with her father, Parker Chandler, and older sisters, Eleanor Sinacola and Jessica.

She released her first album, The Only Man, in 2010.  It was greeted by favorable reviews, especially on the alt-country scene.  A little more than a year later she released the album Indestructible Machine on Bloodshot Records.  This album of her original music dealt with growing up in an insular rural town, dangerous relationships, and country staples such as isolation, drinking, and depression.  The hard edge of her lyrics–which SPIN, writing of the “Ohio hellion,” praised for their “utter lack of bullshit”–appealed to a wider audience, and her music was greeted with rave reviews across the critical music spectrum.

She followed up Indestructible Machine with the EP Boy Crazy, which further solidified her musical cred and served as a segue to the full album Somewhere Else.  Anyone who doubted that Loveless was a major talent was converted by this album.  This past August she followed that one up with another gem entitled Real.  This album, like her previous efforts, has garnered almost universal praise.

As she has matured, her voice, laced with a Midwest twang, has revealed great depth and control.  At the core of her multi-faceted talent is her ability to exploit an expansive vocal range–one greater than that found in most rock and country singers.  Depending on the topic at hand she travels–sometimes in the same song–from a singer with considerable pipes who can belt out a controlled and sustained melody, to a verbal intimacy that expresses raw, scratchy emotion like a youthful Patti Smith.  Her lyrics are mature beyond her years and reveal an openness and emotional vulnerability that only the most talented singers can maintain.  It is a high-wire act by someone seemingly barely aware of what she is doing–and we can only hope that she continues to eschew the artifice of self-awareness that, even among the most talented, can devolve into self-parody and archness.

Here she is performing “Somewhere Else” on Audiotree Live.

Back in the Saddle Again — Putting the SME into the UI Which Equals UX

“Any customer can have a car painted any colour that he wants so long as it is black.”  — Statement by Henry Ford in “My Life and Work”, by Henry Ford, in collaboration with Samuel Crowther, 1922, page 72

The Henry Ford quote, which he made half-jokingly to his sales staff in 1909, is relevant to this discussion because the information sector has developed along the lines of the auto and many other industries.  The statement was only half-joking because Ford’s cars could, in fact, be had in three colors.  But in 1909 Henry Ford had found a massive market niche that would allow him to sell inexpensive cars to the masses.  His competition wasn’t so much other auto manufacturers, many of whom catered to the whims of the more affluent members of society, but the main means of individualized transportation at the time–the horse and buggy.  Color mattered less to this market than simplicity and utility.

Since the widespread adoption of the automobile and the expansion of the market–with multiple competitors, high-speed roadways, a more affluent society anchored by a middle class, and the impact of industrial and information systems development in shaping societal norms–the automobile consumer has, over time, become more varied and sophisticated.  Today automobiles offer a number of new features (and color choices)–from backup cameras, to blind-spot and back-up warning signals, to lane control, automatic headlight adjustment, and many other innovations.  Enhancements to industrial production that began with the introduction of robotics into the assembly line in the late 1970s and early 1980s, through to the adoption of Just-in-Time (JiT) and Lean principles in overall manufacturing, provide consumers a multitude of choices.

We are seeing a similar evolution in information systems, which leads me to the title of this post.  During the first waves of information systems development and their introduction into our governing and business systems, software was typically developed first to address an activity that had been completed manually.  There would be a number of entries into a niche market (or, for more robustly capitalized enterprises, into an entire vertical).  The software would be fairly simplistic, the features limited, the objects (the way the information is presented and organized on the screen, the user selections, and the charts, graphs, and analytics allowed to enhance information visibility) well defined, and the UI (user interface) structured along the lines of familiar formats and views.

Including the input of the SME in this process, without a specific solicitation of advice, was considered both intrusive and disruptive.  After all, software development was largely an activity confined to a select and highly trained specialty involving sophisticated coding languages, where a good deal of talent was required for code to be considered “elegant”.  I won’t go into a full definition of elegance here, which I’ve addressed in previous posts, but in short it is this: the fewest bits of code possible that both maximize computing power and provide the greatest flexibility for any given operation or set of operations.

This is no mean feat, and a great number of software applications are produced in this way.  Since the elegance of any line of code varies widely by developer and organization, the process of update and enhancement can involve a great deal of human capital and time.  Thus, the general rule has been that the more sophisticated a software application is, the more effort change requires and the less flexibility the application possesses.  Need a new chart?  We’ll update you next year.  Need a new set of views or user settings?  We’ll put your request on the road-map and maybe you’ll see something down the road.

It is not as if the needs and requests of users have always been ignored.  Most software companies try to satisfy the needs of their customers, balancing the demands of the market against available internal resources.  Software websites such as UXmatters have, as in this article, advocated ways that the SME (subject-matter expert) can be placed at the center of the design process.

The introduction of fourth-generation adaptive software environments–that is, those systems that leverage underlying operating environments and objects such as .NET and WinForms, that are open to any data through OLE DB and ODBC, and that leave the UI open to simple configuration languages that place these underlying capabilities at the feet of the user–puts the SME at the center of the design process in practice.

This is a development in software as significant as the introduction of JiT and Lean in manufacturing, since it removes both the labor and time-intensiveness involved in rolling out software solutions and enhancements.  Furthermore, it goes one step beyond these processes by allowing the SME to roll out multiple software solutions from one common platform that is only limited by access to data.  It is as if each organization and SME has a digital printer for software applications.

Under this new model, software application manufacturers have a flexible environment to pre-configure the 90% solution to target any niche or market, allowing their customers to fill in any gaps or adapt the software as they see fit.  There is still IP involved in the design and construction of the “canned” portion of the solution, but the SME can be placed into the middle of the design process for how the software interacts with the user–and to do so at the localized and granular level.
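To make this division of labor concrete, here is a minimal sketch–my own illustration in Python, not any particular vendor’s product.  The generic engine below plays the role of the “canned” 90% solution, while the declarative configuration–the part an SME could edit without programming–determines which fields appear and which analytics are derived.  The field names and earned-value formulas are simply illustrative assumptions.

```python
# Hypothetical example: a declarative configuration an SME could edit,
# driving a generic rendering engine. Field names and formulas are made up.

FORM_CONFIG = {
    "title": "Contract Status",
    "fields": [
        {"name": "bcws", "label": "Budgeted Cost of Work Scheduled"},
        {"name": "bcwp", "label": "Budgeted Cost of Work Performed"},
        {"name": "acwp", "label": "Actual Cost of Work Performed"},
    ],
    # Derived analytics the SME can add without touching engine code.
    "derived": {
        "cpi": "bcwp / acwp",
        "spi": "bcwp / bcws",
    },
}

def render(config, record):
    """Generic engine: walks the configuration instead of hard-coding a UI."""
    lines = [config["title"]]
    for field in config["fields"]:
        lines.append(f'{field["label"]}: {record[field["name"]]}')
    for name, expr in config.get("derived", {}).items():
        value = eval(expr, {}, record)  # toy expression evaluator
        lines.append(f"{name.upper()}: {value:.2f}")
    return "\n".join(lines)

print(render(FORM_CONFIG, {"bcws": 100.0, "bcwp": 90.0, "acwp": 95.0}))
```

The point of the sketch is that adding a new chart or metric becomes a configuration change rather than a software release–the “We’ll update you next year” cycle collapses to minutes.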

This is where we transform UI into UX, that is, the total user experience.  So what is the difference?  In the words of Dain Miller in a Web Designer Depot article from 2011:

UI is the saddle, the stirrups, and the reins.

UX is the feeling you get being able to ride the horse, and rope your cattle.

As we adapt software applications to meet the needs of users, the SME can help answer many of the questions that have vexed software implementations for years: user perceptions of and reactions to the software, real and perceived barriers to acceptance, and variations in levels of training among users, among others.  Flexible adaptation of the UI will allow software applications to be more successfully localized–not only to meet the business needs of the organization and the user, but to socialize the solution in ways that are still being discovered.

In closing this post, a bit of full disclosure is in order.  I am directly involved in such efforts through my day job, and the effects that I am noting are not simply notional or aspirational.  This is happening today and, as it expands throughout industry, will disrupt the way in which software is designed, developed, sold, and implemented.

Keep on Keeping On — Feedspot Ranks “Life, Project Management, and Everything” among Top 50 PM Blogs

Feedspot has published its rankings of the top 50 project management blogs, and this site found itself squarely near the middle, ranked at number 24.  For those in the project management discipline, I strongly recommend that you check out the list and bookmark all of them.  They have a number of interesting lists by topic, so it is worth exploring beyond just PM.  I read just about every one of the blogs noted on the PM list on a regular basis; they are both a source of inspiration and discovery.

A big thanks to those of you who read this blog and to the Feedspot editors for the acknowledgment.  A great deal of research and work goes into each of the posts on this blog, whether about project management or other subjects that pique my interest.  I would like to post more frequently, but my day job and making a living impose demands that limit my ability to write.  Regardless, don’t give up on me during my short periods of writing hiatus.  In all probability, I am working some problem of interest and am not yet ready to share my results for vetting.

Sunday Contemplation — Finding Wisdom — Daniel Dennett in “Darwin’s Dangerous Idea”


“The Darwinian Revolution is both a scientific and a philosophical revolution, and neither revolution could have occurred without the other. As we shall see, it was the philosophical prejudices of the scientists, more than their lack of scientific evidence, that prevented them from seeing how the theory could actually work, but those philosophical prejudices that had to be overthrown were too deeply entrenched to be dislodged by mere philosophical brilliance. It took an irresistible parade of hard-won scientific facts to force thinkers to take seriously the weird new outlook that Darwin proposed…. If I were to give an award for the single best idea anyone has ever had, I’d give it to Darwin, ahead of Newton and Einstein and everyone else. In a single stroke, the idea of evolution by natural selection unifies the realm of life, meaning, and purpose with the realm of space and time, cause and effect, mechanism and physical law. But it is not just a wonderful scientific idea. It is a dangerous idea.”

Daniel Dennett is the Co-Director of the Center for Cognitive Studies and Austin B. Fletcher Professor of Philosophy at Tufts University.  He is also known as “Dawkins’ Bulldog” for his pointed criticism of what he viewed as unnecessary revisions to Darwinian theory by Stephen Jay Gould–also a previous subject of this blog–and others.  In popular culture he has been numbered among the “Four Horsemen” of the so-called “New Atheism”.  His intellectual and academic achievements are many, and his insights into evolution, social systems, cognition, consciousness, free will, philosophy, and artificial intelligence are extremely influential.

Back in 1995, when I was a newly minted Commander in the United States Navy, I happened across an intriguing book in a Jacksonville, Florida bookshop during a temporary duty assignment.  The book was entitled Darwin’s Dangerous Idea: Evolution and the Meanings of Life.  I opened it that afternoon during a gentle early spring Florida day and found myself astounded and my mind liberated, as if chains which I had not previously noticed, but which had bound my mind, had been broken and released me, so great was the influence of the philosophical articulation of this “dangerous idea”.

Here, for the first time, was a book that took what we currently know about the biological sciences and placed it within the context of other scientific domains–and did so in a highly organized, articulate, and readable manner.  The achievement of the book was not so much in deriving new knowledge, but in presenting an exposition of the known state of the science and tracing its significance and impact–no mean achievement given the complexity of the subject matter and the depth and breadth of knowledge being covered.  The subject matter, of course, is highly controversial only because it addresses subjects that engender the most fear: the facts of human origins, development, nature, biological interconnectedness, and the inevitability of mortality.

Dennett divides his thesis into three parts: the method of developing the theory and its empirical proofs, its impact on the biological sciences, and its impact on other disciplines, especially regarding consciousness, philosophy, sociology, and morality.  He introduces and develops several concepts, virtually all of which have since become cornerstones in human inquiry, and not only among the biological sciences.

Among these are the concepts of design space, of natural selection behaving as an algorithm, of Darwinism acting as a “universal acid” that transforms the worldview of everything it touches, and of skyhooks and “just-so” stories–fallacious and magical ways of thinking that have no underlying empirical foundation to explain natural phenomena–as opposed to cranes, the legitimate, bottom-up mechanisms that do the actual explanatory lifting.

The concept of the design space has troubled many, though not most, evolutionary biologists and physicists, only because Dennett posits a philosophical position in lieu of a mathematical one.  This does not necessarily undermine his thesis, simply because one must usually begin with a description of a thesis before one can determine whether it can be disproven.  Furthermore, Dennett is a philosopher of the analytical school, and so the scope of his work is designed from that perspective.

But there are examples that approach an analogue of design space in physics–those that visualize space-time and general relativity, as at this site.  It is not a stretch to understand that our reality–the design space that the earth inhabits, among many alternative types of design spaces that may exist relating to biological evolution–can eventually be mathematically formulated.  Given that our knowledge of comparative planetary and biological physics is still largely speculative, the analogy for now is sufficient and understandable.  It also gives a new cast to the concept of adaptation, away from the popular (and erroneous) notion of “survival of the fittest”, since fitness is based on the ability to adapt to environmental pressures and to find niches that may exist in that environment.  As we trace the effects of climate change on species, we will be witnessing firsthand the brutal concept of design space.

Going hand-in-hand with design space is the concept that Darwinian evolution through the agent of natural selection is an algorithmic process.  This understanding becomes “universal acid” that, according to Dennett, “eats through just about every traditional concept and leaves in its wake a revolutionized world-view.”
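The algorithmic character of selection is easy to demonstrate in code.  The toy below is my own illustration (not Dennett’s, and not a biological model): it applies nothing but blind variation and differential retention to random bit strings, and “design”–here, simply a high count of 1-bits–accumulates without any designer.

```python
# Toy selection-as-algorithm sketch: variation + differential retention.
# Purely illustrative; fitness is just the number of 1-bits in a genome.
import random

random.seed(42)  # reproducible run

GENOME_LEN = 20

def fitness(genome):
    """Count of 1-bits: our stand-in for adaptedness."""
    return sum(genome)

def mutate(genome, rate=0.05):
    """Blind variation: flip each bit independently with small probability."""
    return [bit ^ (random.random() < rate) for bit in genome]

def evolve(pop_size=50, generations=100):
    """Differential retention: the fitter half seeds each next generation."""
    population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        population = [mutate(random.choice(parents)) for _ in range(pop_size)]
    return max(fitness(g) for g in population)

print(evolve())  # best fitness climbs toward the maximum of 20
```

No step in the loop “knows” the goal; the substrate-neutral procedure alone does the lifting, which is precisely what Dennett means by natural selection being algorithmic.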

One can understand the objection of philosophers and practitioners of metaphysics to this concept, which many of them have characterized as nihilistic.  This, of course, is argument from analogy–a fallacious form of rhetoric.  The objection to the book through these arguments, regardless of the speciousness of their basis, is premature, and a charge to which Dennett effectively responds in his book Consciousness Explained.  It is in that volume that Dennett addresses the basis of the conscious self, “intentionality”, and the concept of free will (and its limitations)–what in the biological and complexity sciences is described as emergence.

What Dennett has done in describing the universal acid of Darwinian evolution is to describe a phenomenon: the explanatory reason for the rapid social change that we have witnessed and are witnessing, and the resulting reaction and backlash to it.  For example, the revolution engendered by the Human Genome Project has confirmed not only our species’ place in the web of life on Earth and our evolutionary place among primates, but also the interconnections deriving from descent from common ancestors of the entire human species, exploding the concept of race and any claim to inherent superiority or inferiority of any cultural grouping of humans.

One can clearly see the threat this basic truth has to entrenched beliefs deriving from conservative philosophy, cultural tradition, metaphysics, religion, national borders, ethnic identity, and economic self-interest.

For it is apparent to me, given my reading not only of Dennett, but also of both popularizers and the leading minds in the biological sciences, including Dawkins, Goodall, Margulis, Wilson, Watson, Venter, Crick, Sanger, and Gould; in physics from Hawking, Penrose, Weinberg, Guth, and Krauss; in mathematics from Wiles, Witten, and Diaconis; in astrophysics from Sandage, Sagan, and deGrasse Tyson; in climate science from Hansen and many others; and in the information sciences from Moore, Knuth, and Berners-Lee, that we are in the midst of another intellectual revolution.  This intellectual revolution far outstrips both the Renaissance and the Enlightenment as periods of human achievement and advancement, if only because the widespread availability of education, literacy, healthcare, and technology, as well as human diversity, both accelerates and expands many times over the impact of each increment in knowledge.

When one realizes that both of those earlier periods of scientific and intellectual advance engendered significant periods of social, political, and economic instability, upheaval, and conflict, the reasons for many of the conflicts in our own times become clear.  It was apparent to me then–and even more apparent to me now–that there will be a great overturning of the institutional, legal, economic, social, political, and philosophic ideas and structures that now exist as a result.  We are already seeing the strains in many areas.  No doubt there are interests looking to see if they can capitalize on or exploit these new alignments.  But for the overarching power structures that exert control, conflict, backlash, and eventual resolution are inevitable.

In this way Fukuyama was wrong in the most basic sense in his thesis in The End of History and the Last Man to the extent that he misidentified ideologies as the driving force behind the future of human social organization.  What he missed in his social “science” (*) is the shift to the empirical sciences as the nexus of change.  The development of analytical philosophy (especially American Pragmatism) and more scientifically-based modeling in the social sciences are only the start, but one can make the argument that these ideas have been more influential in clearly demonstrating that history, in Fukuyama’s definition, is not over.

Among the first shots over the bow from science into the social sciences have come works from such diverse writers as Jared Diamond (Guns, Germs, and Steel: The Fates of Human Societies (1997)) and Sam Harris (The Moral Landscape: How Science Can Determine Human Values (2010)).  The next wave will, no doubt, be more intense and drive further resistance and conflict.

The imperative of science informing our other institutions is amply demonstrated by two facts.

  1. On March 11, 2016 an asteroid large enough to extinguish a good part of all life on earth came within 19,900 miles of our planet’s center.  This was not as close, however, as the one that passed on February 25 (8,900 miles).  There is no invisible shield or Goldilocks Zone to magically protect us.  The evidence of previous life-ending collisions is more apparent with each new high-resolution satellite image of our planet’s surface.  One day we will look up and see our end slowly but inevitably making its way toward us, unless we decide to take measures to prevent such a catastrophe.
  2. Despite the desire to deny that it’s happening, 2015 was the hottest year on record, and 2016 thus far is surpassing it, providing further empirical evidence of the validity of global warming models.  In fact, three of the last four years fall among the four hottest years on record (2014 was the previous hottest year).  The outlier is 2010, another previous high, which is hanging in at number 3 for now.  2013 is at number 4 and 2012 at number 8.  Note the general trend.  As Jared Diamond has convincingly demonstrated, the basis of conflict and societal collapse is usually rooted in population pressures exacerbated by resource scarcity.  We are just about at the point of no return, given the complexity of the systems involved, and can only mitigate the inevitable–but we must act now to do so.

What human civilization does not want is to be on the wrong side of history in how it deals with these challenges.  Existing human power structures and interests would like to keep the scientific community within the box of technology–and no doubt there are still scientists who are comfortable staying within that box.

The fear of allowing science to move beyond the box of technology and general knowledge is of its misuse and misinterpretation, usually by non-scientists–such as the reprehensible meme of Social Darwinism (which is neither social nor Darwinian).**  This fear is oftentimes stoked by people with a stake in controlling the agenda or in interpreting what science has determined.  Science’s contingent nature is also a point of fear.  While few major theories are completely overturned as new knowledge is uncovered, the very process of revision and adjustment to theory is frightening to people who depend on at least the illusion of continuity and hard truths.  Finally, science puts us in our place within the universe.  If there are millions of planets that can harbor some kind of life, and a sub-set of those with the design space to allow for some kind of intelligent life (as we understand that concept), are we really so special after all?

But not only within the universe.  Within societies, if all humans have developed from a common set of ancestors, then our basic humanity is a shared one.  If the health and sustainability of an ecology is based on its biodiversity, then the implication for human societies is likewise found in diversity of thought and culture, eschewing tribalism and extreme social stratification.  If the universe is deterministic with only probability determining ultimate cause and effect, then how truly free is free will?  And what does this say about the circumstances in which each of us finds him or herself?

The question now is whether we embrace our fears, manipulated by demagogues and oligarchs, or embrace the future, before the future overwhelms and extinguishes us–and to do so in a manner that is consistent with our humanity and ethical reasoning.


Note:  Full disclosure.  As a senior officer concerned with questions of AI, cognition, and complex adaptive systems, I opened a short correspondence with Dr. Dennett about those subjects.  I also addressed what I viewed as his unfair criticism (being Dawkins’ Bulldog) of punctuated equilibrium, spandrels, and other minor concepts advanced by Stephen Jay Gould, offering a way that Gould’s concepts were well within Darwinian Theory, as well as being both interesting and explanatory.  Given that less complex adaptive systems that can be observed do display punctuated periods of rapid development–and also continue to have the vestiges of previous adaptations that no longer have a purpose–it seemed to me that larger systems must also do so, the punctuation being on a different time-scale, and that any adaptation cannot be precise given that biological organisms are imprecise.  He was most accommodating and patient, and this writer learned quite a bit in our short exchange.  My only regret was not to continue the conversation.  I do agree with Dr. Dennett (and others) on their criticism of non-overlapping magisteria (NOMA), as is apparent in this post.

Saturday Night Music Interlude — The Marcus King Band performing “Rita is Gone”

The lead singer and guitarist who provides the Marcus King Band’s moniker hails from Greenville, South Carolina, and plays what he calls “soul-influenced psychedelic southern rock,” which is an apt description.  Though only 20 years old, Marcus King comes from a musical line: his father, Marvin King, was a regionally popular blues and gospel singer, and his grandfather was a regional musician as well.  Of growing up, young Marcus told the eastof8th blog: “I was listening to George Jones, Chet Atkins, and Merle Haggard with my granddad.  Later on, I was heavily influenced by jazz cats like Coltrane, Miles Davis, and Jimmy Smith.”

The legendary Warren Haynes, a true believer, has promoted Marcus and his band, performing with them at concerts and inviting Marcus King to perform with him in the band Gov’t Mule.  The band has two albums to its credit: Soul Insight, a gritty blues, southern rock, and prog rock-inflected debut, and the self-titled double disc Marcus King Band on the Fantasy label.

Touring in anticipation of their new album, they impressed at SXSW, jamming out with George Clinton; performed at Mountain Jam, which included electric sets and extended jams with Warren Haynes; and–as a last-minute substitute booking–took the XPoNential Music Festival in Philadelphia by storm, becoming WXPN’s August Artist to Watch.

You can hear the musical influences that informed Marcus’ sound blend together in the mix of horns, drums, keyboards, and guitar, the band’s eclectic mix of blues, soul, prog rock, and southern rock producing a gumbo reminiscent of Tower of Power at their peak mixed in with a bit of Allman Brothers, a slice of John McLaughlin, a dash of Gov’t Mule, and a pinch of Hendrix psychedelia.  While still a bit raw and unfocused at times, this is one talent to watch as he matures and develops his sound.

Here is the band at WFUV performing “Rita is Gone.”


Takin’ Care of Business — Information Economics in Project Management

Neoclassical economics abhors inefficiency, and yet inefficiencies exist.  Among the core sources of inefficiency is the asymmetrical nature of information: it is an accepted cornerstone of economics that when one party to a transaction knows more than the other, the transaction is inefficient.  We can see the effects in our daily lives and employment: knowing whether the used car you are buying is a lemon, measuring risk in the purchase of an investment and, apropos of this post, determining how well our information systems allow us to manage complex projects.

Regarding this last proposition, we can peel the onion down through its various levels: the asymmetry in information between the customer and the supplier, between the board and the stockholders, between management and labor, between individual SMEs and the project team, and so on–it's elephants all the way down.

This asymmetry, which drives inefficiency, is exacerbated in markets dominated by monopoly, monopsony, and oligopoly power.  Informed by the work of Hart and Holmström on contract theory, which recently garnered the Nobel Prize in economics, we have a basis for understanding the internal dynamics of projects as they seek efficiency and productivity.  What is interesting about contract theory is that it incorporates the concept of asymmetrical information (under the label of adverse selection), but extends it to human transactions at the microeconomic level through the concepts of moral hazard and the utility of signalling.

The state of asymmetry and inefficiency is exacerbated by the patchwork quilt of "tools"–software applications designed to address only a very restricted portion of the total contract and project management system–that currently passes for the state of the art.  These tools tend to require a new class of SME dedicated to managing data, essentially reversing the efficiencies of automation by expending direct effort to reconcile differences in data across tools.  This is a sub-optimized system: it discourages the optimization of information across the project, reinforces asymmetry, and is economically and practically unsustainable.

The key in all of this is to ensure that sub-optimal behavior is discouraged, and that activities and behaviors supporting more transparent sharing of information–and therefore contributing to greater efficiency and productivity–are rewarded.  It should be noted that more transparent organizations tend to be more sustainable and healthier, and to enjoy a higher degree of employee commitment.

The path forward where there is monopsony power–a dominant buyer–is to impose the conditions for normative behavior that would otherwise be enforced through practice in a more open market.  For open markets not dominated by one player as either buyer or seller, the key is to institute practices that reward behavior reducing the effects of asymmetrical information, and to build disincentives against opaque behavior into business contracts.

In the information management market as a whole, the trends working against asymmetry and inefficiency involve the reduction of data streams, the construction of cross-domain data repositories (or reservoirs) that can satisfy multiple business stakeholders, and the introduction of systems that are open and adaptable to the needs of the project system as a whole rather than a limited portion of the project team.  These solutions exist, yet their adoption is hindered by the long-lived infrastructure already in place in complex project management, an infrastructure supported by incumbents with an interest in the status quo.  Because of this, the period from the time a market innovation is introduced to the time it is adopted in project-focused organizations usually runs to several years.

This argues for establishing a more nimble environment, which involves adopting a series of approaches to achieve broader information symmetry and efficiency in the project organization.  These are:

a. Institute contractual relationships, both internally and externally, that encourage project personnel to identify risk.  This would include incentives to kill efforts that have breached their framing assumptions, or to consolidate the progress the project has achieved to date–sending it as-is to production–while killing further effort that would breach those assumptions.

b. Institute policy and incentives on the data supply end to reduce the number of data streams.  Toward this end both acquisition and contracting practices should move to discourage proprietary data dead ends by encouraging normalized and rationalized data schemas that describe the environment using a common or, at least, compatible lexicon.  This reduces the inefficiency derived from opaqueness as it relates to software and data.
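To make the "common lexicon" point concrete, here is a minimal sketch in Python.  All tool names, field names, and the mappings are purely illustrative assumptions, not references to any real product: the idea is simply that each tool exports the same underlying concepts under its own labels, and a thin field map translates them into one shared schema instead of leaving each export as a proprietary dead end.

```python
# Hypothetical sketch: normalizing exports from two proprietary PM tools
# into a common schema.  Tool and field names are illustrative only.

COMMON_SCHEMA = {"task_id", "task_name", "budget", "actual_cost", "percent_complete"}

# Each tool describes the same concepts under its own lexicon.
TOOL_A_MAP = {
    "TaskUID": "task_id",
    "Name": "task_name",
    "BCWS": "budget",
    "ACWP": "actual_cost",
    "PctComp": "percent_complete",
}

TOOL_B_MAP = {
    "activity_code": "task_id",
    "activity_desc": "task_name",
    "planned_value": "budget",
    "actual_value": "actual_cost",
    "progress": "percent_complete",
}

def normalize(record: dict, field_map: dict) -> dict:
    """Translate a tool-specific record into the common schema,
    dropping fields the common lexicon does not define."""
    return {field_map[k]: v for k, v in record.items() if k in field_map}

a_record = {"TaskUID": "1.2.3", "Name": "Integrate payload", "BCWS": 100.0,
            "ACWP": 110.0, "PctComp": 0.8}
b_record = {"activity_code": "1.2.4", "activity_desc": "Test payload",
            "planned_value": 50.0, "actual_value": 40.0, "progress": 0.5}

rows = [normalize(a_record, TOOL_A_MAP), normalize(b_record, TOOL_B_MAP)]

# Every normalized record now speaks the common lexicon.
assert all(set(r) <= COMMON_SCHEMA for r in rows)
```

Once both feeds land in the same schema, the reconciliation labor described above disappears into a one-time mapping exercise rather than a recurring manual effort.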

c.  Institute policy and incentives on the data consumer end to leverage the economies derived from the increased computing power of Moore's Law by scaling data to construct interrelated datasets across multiple domains that will provide a more cohesive and expansive view of project performance.  This involves the warehousing of data into a common repository or a reduced set of repositories.  The goal is to satisfy multiple project stakeholders from multiple domains using as few streams as necessary and to encourage KDD (Knowledge Discovery in Databases).  This reduces the inefficiency derived not only from data opaqueness but also from the traditional line-and-staff organization that has tended to stovepipe expertise and information.
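As a minimal sketch of what a common repository enables–table names, column names, and the sample figures here are illustrative assumptions, not a reference to any real system–consider cost and schedule records from two separate domains loaded into one store, where a single cross-domain query answers a question that would otherwise require manual reconciliation between two tools:

```python
# Illustrative sketch: warehousing two domains (cost and schedule) in one
# repository, keyed by WBS element, so one query serves both stakeholders.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cost (wbs TEXT PRIMARY KEY, budget REAL, actual REAL);
CREATE TABLE schedule (wbs TEXT PRIMARY KEY,
                       baseline_finish TEXT, forecast_finish TEXT);
""")
conn.executemany("INSERT INTO cost VALUES (?,?,?)",
                 [("1.1", 100.0, 120.0), ("1.2", 80.0, 60.0)])
conn.executemany("INSERT INTO schedule VALUES (?,?,?)",
                 [("1.1", "2017-06-01", "2017-07-15"),
                  ("1.2", "2017-05-01", "2017-04-20")])

# Which WBS elements are both over cost and forecast to finish late?
rows = conn.execute("""
    SELECT c.wbs FROM cost c JOIN schedule s ON c.wbs = s.wbs
    WHERE c.actual > c.budget AND s.forecast_finish > s.baseline_finish
""").fetchall()
print(rows)  # -> [('1.1',)]
```

The point is not the particular store but the structure: once the domains share keys in one reservoir, cross-domain questions become queries rather than reconciliation projects, which is the precondition for any KDD effort.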

d.  Institute acquisition and market incentives that encourage software manufacturers to engage in positive signalling behavior that reduces the opaqueness of the solutions being offered to the marketplace.

In summary, the current state of project data is one characterized by "best-of-breed" patchwork quilt solutions that tend to increase direct labor, reduce and limit productivity, and drive up cost.  At the end of the day, the ability of the project to handle risk and adapt to technical challenges rests on the reliability and efficiency of its information systems.  A patchwork system fails to meet the needs of the organization as a whole and, quite simply, is not "takin' care of business."